Key Takeaways
- General Comment No. 25 (2021) is the definitive UN legal interpretation confirming that the full range of rights under the UNCRC applies in digital environments — not just a selection of them.
- Developed with direct input from 700+ children across 27 countries, it is one of the most child-inclusive international policy documents ever produced.
- The four core CRC principles — non-discrimination, best interests, survival and development, and respect for children's views — carry authoritative interpretive weight for all 196 states parties.
- States must now report to the UN Committee on how they are implementing children's rights digitally. Non-compliance affects a country's standing in the human rights review cycle.
- For companies, GC25 establishes clear expectations under the UN Guiding Principles on Business and Human Rights — including Child Rights Impact Assessments, safety by design, and data minimization for child users.
- Critically, the document frames digital access itself as a rights issue: excluding children from the digital world to "protect" them can itself constitute a rights violation.
What is General Comment No. 25 — and Why Does It Matter?
On 2 March 2021, the United Nations Committee on the Rights of the Child formally adopted General Comment No. 25 on Children's Rights in Relation to the Digital Environment. It is the most authoritative international statement ever made on how children's rights apply in digital spaces — and it carries legal and moral weight that every government, company, and civil society organization working in digital policy must understand.
The UN Convention on the Rights of the Child (UNCRC), adopted in 1989, is the most widely ratified human rights treaty in history, with 196 states parties — every UN member state except the United States, along with several non-member states. But for more than three decades, the treaty's application to digital environments was largely implicit. As children's lives moved online, a critical interpretive gap opened: did the treaty's protections — the right to privacy, to access information, to protection from exploitation, to be heard — apply in digital spaces? General Comment No. 25 answered that question definitively: yes, fully, and without exception.
General comments are the authoritative interpretations issued by UN treaty bodies. While not independently enforceable through courts, they are legally significant: they define what counts as compliance when states report to the UN, they inform domestic courts interpreting national children's rights legislation, and they set the baseline expectations against which all child protection policy is measured. For the 196 ratifying states, GC25 is not aspirational guidance — it is the Committee's authoritative interpretation of treaty obligations they have already accepted.
700 Children Across 27 Countries Helped Write It
What distinguishes General Comment No. 25 from virtually all other international policy documents is the depth and breadth of direct child participation in its development. Before drafting began, the UN Committee undertook a structured global consultation with children — not about children, but genuinely with them.
More than 700 children across 27 countries participated directly, with additional inputs gathered from civil society organizations representing millions more. The consultation reached children in Kenya, Bangladesh, Brazil, Germany, Australia, Guatemala, and many other nations — spanning different economic contexts, digital access levels, cultural norms, and lived experiences. Importantly, it included children with disabilities, children in rural areas with limited connectivity, and children from marginalized communities who face compounded barriers online.
"The digital environment is not separate from children's lives — it is part of their lives. Rights that do not apply there do not really apply at all."
Children told the Committee three things clearly and consistently. First, that digital access is not a luxury — it is fundamental to their education, friendships, creative expression, and civic participation. Second, that they experience real, serious harms online — harassment, exploitation, invisible data collection, manipulative design — and that existing protections are inadequate. Third, that they want to be protected without being excluded: they want digital spaces that are safe and open, not safe by being closed.
These findings shaped the document profoundly. The resulting General Comment is notable for its emphasis on children's agency and evolving capacities — not just their vulnerability. It explicitly rejects what it calls a "protection-only" approach that treats children purely as objects of adult care rather than rights-holders with their own perspectives and interests.
The Four Core CRC Principles Applied to Digital Spaces
The UNCRC is structured around four general principles that cross-cut all other rights in the treaty. These are not optional features — they are the interpretive lens through which every other article must be read. General Comment No. 25 explains in detail how each applies to digital environments, with significant implications for policy and product design.
1. Non-Discrimination (Article 2)
All children must be able to enjoy their rights in digital environments without discrimination on the basis of any characteristic — race, sex, disability, language, socioeconomic background, geographic location, national origin, or any other status. In practice, this principle has several specific implications:
- Accessibility: Digital services and child safety tools must be accessible to children with visual, auditory, cognitive, or motor disabilities. Reporting mechanisms designed only for text-reading users fail this standard.
- Language: Content warnings, safety information, and reporting tools available only in dominant languages exclude large populations of children. The ITU publishes its COP training resources in all six UN languages precisely in response to this principle.
- The digital divide: Protective measures — age verification systems, parental controls, educational resources — must not deepen inequality by being available only to children with high-end devices, reliable connectivity, or technically sophisticated parents. Protection cannot become a privilege.
- Algorithmic non-discrimination: AI-powered content recommendation systems, content moderation tools, and risk classification algorithms must be audited for discriminatory impact on children based on protected characteristics. Research has repeatedly shown that automated systems can amplify existing social inequalities (a minimal audit sketch appears below).
- Intersectionality: Children who belong to multiple marginalized groups face compounded risks. A disabled child from a low-income family in a rural area faces very different digital safety challenges than a middle-class urban child — and protection frameworks must account for this.
Regulatory connection: The EU Digital Services Act's requirement for systemic risk assessments specifically includes assessment of discriminatory impact — directly implementing the non-discrimination principle at the regulatory level.
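To make the auditing obligation concrete, here is a minimal sketch in Python of a disparate-impact screen for an automated content classifier used on child accounts. The group labels, sample data, and 0.8 screening threshold are illustrative assumptions; GC25 requires auditing for discriminatory impact but prescribes no particular metric.

```python
from collections import defaultdict

def flag_rates(records):
    """Per-group rate at which the classifier flagged users' content."""
    flagged = defaultdict(int)
    seen = defaultdict(int)
    for group, was_flagged in records:
        seen[group] += 1
        flagged[group] += was_flagged
    return {g: flagged[g] / seen[g] for g in seen}

def needs_human_review(records, threshold=0.8):
    """True when the lowest group flag rate falls below `threshold` times
    the highest (a four-fifths-style heuristic, not a legal standard)."""
    rates = flag_rates(records)
    highest = max(rates.values())
    if highest == 0:
        return False  # nothing flagged for any group
    return min(rates.values()) / highest < threshold

# Example: content from group B is flagged at more than twice group A's rate.
sample = [("A", True), ("A", False), ("A", False),
          ("B", True), ("B", True), ("B", True), ("B", False)]
print(flag_rates(sample))          # {'A': 0.33..., 'B': 0.75}
print(needs_human_review(sample))  # True -> escalate for investigation
```

A ratio this far below the threshold does not prove discrimination, but it is exactly the kind of signal that should trigger human investigation rather than silent deployment.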
2. Best Interests of the Child (Article 3)
The best interests of the child must be a primary consideration in all decisions, policies, and actions affecting children in digital environments. This principle has the most direct and profound implications for how companies design and operate digital products.
The "best interests" standard is not the same as "the most restrictive possible approach." It requires a genuine assessment of what serves children's overall wellbeing — which includes their right to access, to learn, to participate, and to develop autonomy, not just their right to be protected from specific harms. This nuance matters: an age verification system that blocks all minors from a platform may reduce one type of harm while violating multiple other rights.
For companies, the best interests standard operationally requires:
- Child Rights Impact Assessments (CRIAs): Before launching any product or feature that children will use, companies should conduct a structured assessment of the likely impact on children's rights. UNICEF's D-CRIA Toolbox is the leading methodology, now referenced in EU DSA implementation guidelines.
- Privacy by default: The most protective available privacy settings must be the default for child users — not opt-in. Default settings determine the experience of the vast majority of users, who never change them.
- Engagement design: Infinite scroll, push notification systems, streaks, and other engagement-maximizing design patterns must be assessed against children's wellbeing. Where they are found to harm children's mental health or disrupt sleep and development, they must be modified or disabled for child users.
- Commercial subordination: Where commercial interests conflict with children's best interests, children's interests take precedence. This principle underlies the EU DSA's ban on profiling-based advertising to known minors and the UK Children's Code's prohibition on using children's data for commercial purposes beyond service provision.
- Age-appropriate design: What serves the best interests of a 6-year-old is very different from what serves the best interests of a 16-year-old. Products accessible across this age range must apply differentiated design — not one-size-fits-all settings (see the settings sketch after this list).
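One way to picture how privacy by default and age-appropriate design combine in practice is a settings schema banded by age. Everything below (the bands, field names, and values) is an illustrative assumption; GC25 states the principles, not a schema.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccountDefaults:
    profile_visibility: str    # "private" | "contacts" | "public"
    dm_policy: str             # "nobody" | "contacts" | "everyone"
    personalised_ads: bool     # profiling-based ads to known minors: banned under the EU DSA
    autoplay_infinite_scroll: bool
    overnight_notifications: bool

def defaults_for(age: int) -> AccountDefaults:
    """Most protective settings by default; restrictions ease gradually with
    evolving capacity instead of flipping at a single adult/child cut-off."""
    if age < 13:
        return AccountDefaults("private", "nobody", False, False, False)
    if age < 16:
        return AccountDefaults("private", "contacts", False, False, False)
    if age < 18:
        return AccountDefaults("contacts", "contacts", False, True, False)
    return AccountDefaults("public", "everyone", True, True, True)

print(defaults_for(12).dm_policy)         # "nobody"
print(defaults_for(17).personalised_ads)  # False: stays off for all minors
```

Note that these are defaults, not locks: older teens can adjust some settings, consistent with evolving capacities, but the vast majority of users never change defaults — which is precisely why defaults must be protective.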
3. Right to Life, Survival and Development (Article 6)
States must ensure that the digital environment does not threaten children's survival, health, or holistic development — cognitive, emotional, social, physical, and moral. This principle grounds the most urgent protective obligations:
- Content harmful to health: The UK Online Safety Act 2023 identifies specific categories of harmful content that platforms must actively suppress for child users — including content promoting suicide, self-harm, and eating disorders. Research published in The Lancet and other peer-reviewed journals has documented links between algorithmic amplification of such content and real-world harm to adolescents.
- Online sexual exploitation: Every form of child sexual abuse material, live-streaming exploitation, grooming, and sextortion threatens children's survival and development in the most severe way. The obligation under Article 6 requires states to criminalize these acts and ensure platforms have technical systems to detect and remove this content.
- Digital wellbeing: Emerging evidence on the impact of social media on adolescent mental health — particularly for girls — has entered mainstream policy debate. Australia's decision to ban social media for under-16s, the UK's requirements for "safe" design in children's codes, and the OECD's 2025 "How's Life for Children in the Digital Age?" report all reflect growing recognition that development harms are real and measurable.
- Violence and extremism: Content that promotes, glorifies, or facilitates violence — including online radicalization pathways — implicates Article 6 directly. Algorithmic systems that amplify extremist content to vulnerable young users are not a neutral technical choice; they are a potential Article 6 violation.
4. Respect for the Views of the Child (Article 12)
Children have the right to express their views freely on all matters that affect them, and to have those views given due weight in accordance with their age and maturity. In digital contexts, this principle has often been honored only nominally — or actively violated by systems designed to extract data from children without their genuine understanding or meaningful consent.
- Meaningful data consent: Children must be able to understand, in age-appropriate language, what data is being collected about them and why — and exercise genuine choice about it. "I agree to the Terms of Service" clicked by a 10-year-old who has never seen the document does not constitute meaningful consent (a consent sketch follows this list).
- Digital identity and data rights: Children have rights over their own digital identities, including the right to have content removed (the "right to erasure" under GDPR is particularly significant for children, who may not understand the long-term implications of sharing content at a young age).
- Child-accessible reporting: Reporting mechanisms must be genuinely usable by children — with simple language, visual cues, low-friction processes, and meaningful responses. A reporting button buried five levels deep in a settings menu does not meet this standard.
- Policy participation: Governments and companies should consult children in the development of policies and products that affect them. The development of General Comment No. 25 itself demonstrated what this looks like in practice — and the resulting document is stronger for it.
- Evolving capacities: A 17-year-old's autonomy interests are significantly greater than a 7-year-old's. The principle requires age-differentiated approaches that respect children's growing capacity to make meaningful decisions — rather than treating all under-18s as uniform "minors."
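A sketch of what purpose-specific, age-appropriate consent might look like in code follows. The purposes, wording tiers, and record format are invented for illustration; GC25 requires that children understand what is collected and why, but specifies no particular mechanism.

```python
from datetime import datetime, timezone

CONSENT_COPY = {
    "location": {
        "child": "Can the app see roughly where you are, to show things near you?",
        "teen":  "Share your approximate location to get nearby recommendations?",
        "adult": "We process coarse location data to provide local content.",
    },
}

def band(age: int) -> str:
    """Age bands here track the evolving-capacities principle."""
    return "child" if age < 13 else "teen" if age < 18 else "adult"

def ask_consent(purpose: str, age: int, agreed: bool) -> dict:
    """One purpose, one plain-language question, one revocable record.
    A blanket 'I agree to the Terms' click grants nothing here."""
    return {
        "purpose": purpose,
        "question_shown": CONSENT_COPY[purpose][band(age)],
        "age_band": band(age),
        "agreed": agreed,
        "revocable": True,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }

print(ask_consent("location", 10, agreed=False)["question_shown"])
```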
What GC25 Requires: States, Companies, and Parents
Obligations for States (Governments)
All 196 UNCRC ratifying states bear primary legal obligations under the treaty. General Comment No. 25 specifies that states must:
- Develop a comprehensive national digital strategy for children's rights — not a standalone "online safety" policy, but an integrated strategy that addresses all dimensions of children's rights in digital environments across education, health, justice, and social protection systems.
- Establish clear regulatory frameworks that hold digital companies accountable for protecting children's rights — including ex ante requirements (design standards), ex post remedies (enforcement), and independent oversight mechanisms.
- Enact data protection legislation with child-specific provisions — including meaningful consent mechanisms, data minimization requirements, and the right to erasure for content created in childhood.
- Ensure adequate law enforcement capacity to investigate and prosecute online child sexual exploitation — and to cooperate internationally, since the most serious harms cross jurisdictions.
- Fund digital literacy programs that equip children to navigate digital environments safely, critically, and confidently — not just as consumers but as informed participants.
- Ensure access to affordable, quality internet connectivity for all children, recognizing that the digital divide itself is a children's rights issue.
- Establish multi-stakeholder governance mechanisms that bring together government, industry, civil society, and — critically — children themselves in ongoing digital governance.
WePROTECT Model National Response: The six-domain MNR framework (legislation, prevention, law enforcement, private sector collaboration, data and research, victim support) is the operational expression of GC25's obligations for states — and has been adopted by 90% of the 42 countries assessed under it. See the Standards section for more; a schematic gap-assessment sketch follows.
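As a schematic illustration only, a gap assessment over the six domains named above might look like the following. The 0-3 maturity scale and the example scores are invented; the real WePROTECT Model National Response defines its own capabilities and indicators.

```python
MNR_DOMAINS = [
    "legislation", "prevention", "law enforcement",
    "private sector collaboration", "data and research", "victim support",
]

def gaps(scores: dict, target: int = 2) -> list:
    """Domains scoring below the target maturity level (0 = absent, 3 = mature)."""
    return [d for d in MNR_DOMAINS if scores.get(d, 0) < target]

example = {"legislation": 3, "prevention": 1, "law enforcement": 2,
           "private sector collaboration": 1, "data and research": 0,
           "victim support": 2}
print(gaps(example))
# ['prevention', 'private sector collaboration', 'data and research']
```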
Corporate Responsibilities
While states bear primary legal obligations, General Comment No. 25 is explicit that companies bear significant responsibilities under the UN Guiding Principles on Business and Human Rights (the "Ruggie Principles"). The Committee calls on states to require companies to conduct human rights due diligence — including children's rights due diligence — as a condition of market access.
Specific company responsibilities articulated in GC25 include:
- Child Rights Impact Assessments (CRIAs): Before launching or significantly modifying any product or service likely to be accessed by children, companies should assess the anticipated impact on children's rights. The EU DSA's systemic risk assessment requirements operationalize this at the regulatory level.
- Safety by Design: Child protection must be built into products from the earliest stages of design and development — not added as a compliance afterthought. This principle is now embedded in regulatory frameworks across three continents. See the Safety by Design article for the full implementation framework.
- Data minimization and purpose limitation: Companies must collect only the data necessary for the stated purpose, and must not use children's data for secondary purposes — including behavioral advertising, AI training, or sale to third parties — without specific, informed consent (see the purpose-limitation sketch after this list).
- Transparency and accountability: Companies must publish meaningful transparency reports, provide accessible information about how children's data is processed, and maintain accountability mechanisms that children and families can actually access.
- Non-exploitation of children's data for advertising: The EU DSA's ban on profiling-based advertising to known minors is the regulatory expression of this obligation — which GC25 establishes as a human rights requirement independent of any specific regulation.
- Robust grievance and redress mechanisms: Children and families must have accessible, effective channels to report harms and seek remedies — including removal of harmful content, account suspension, and escalation to authorities.
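The data minimization and purpose limitation duties in the list above lend themselves to a simple enforcement pattern: check every read of a child's data against the purpose declared at collection time. The sketch below is a minimal illustration; the registry contents and names are assumptions, not an API from any regulation or platform.

```python
# A field collected about a child may only be read for the purpose
# it was declared for at collection time.
DECLARED_PURPOSES = {
    "date_of_birth": {"age_assurance"},
    "watch_history": {"parental_controls"},  # deliberately NOT "ad_targeting" or "model_training"
}

class PurposeViolation(Exception):
    """Raised when data is accessed for an undeclared purpose."""

def read_field(record: dict, field: str, purpose: str):
    if purpose not in DECLARED_PURPOSES.get(field, set()):
        raise PurposeViolation(f"{field!r} was not collected for {purpose!r}")
    return record[field]

child_record = {"date_of_birth": "2012-05-01", "watch_history": ["video123"]}
print(read_field(child_record, "date_of_birth", "age_assurance"))  # allowed
try:
    read_field(child_record, "watch_history", "ad_targeting")
except PurposeViolation as e:
    print("blocked:", e)
```

The design choice worth noting is that the check lives at the access layer, not in a policy document: an undeclared secondary use fails loudly instead of silently succeeding.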
The Role of Parents and Educators
General Comment No. 25 explicitly recognizes the role of parents, caregivers, and educators in supporting children's rights in digital spaces — while carefully situating this within a rights-based framework that preserves children's own agency.
The document calls on states to support parents with the tools, knowledge, and resources needed to guide their children's digital lives. But it is equally explicit that parental authority does not override children's rights — particularly as children grow older. Surveillance-based parenting approaches that involve monitoring every digital interaction without the child's knowledge or agreement raise serious concerns under Article 16 (privacy) and Article 12 (respect for children's views).
The most effective parental role, as reflected in research cited in GC25, is one of open communication, trust-building, and collaborative navigation of digital risks — not control and surveillance. Children who feel they can talk to a trusted adult about online problems are significantly more likely to seek help when they need it.
For parents: The Parents & Educators section provides practical, research-grounded guides on conversations about online safety, age-appropriate device use, and how to recognize warning signs — without undermining your child's trust.
Digital Access as a Human Right
One of the most consequential contributions of General Comment No. 25 is its reframing of digital access as a fundamental right rather than a privilege. The document grounds this in Articles 17 (access to information), 28 (right to education), 31 (right to play and cultural life), and 13 (freedom of expression).
In practical terms, this means:
- Age verification cannot simply be "block all minors": A blanket exclusion of under-18s from an entire platform — without nuanced consideration of the rights at stake — may protect against one harm while violating multiple others. This is a key tension in current age verification policy debates, particularly in Australia and the UK (a proportionate alternative is sketched after this list).
- Connectivity itself is a rights issue: The approximately 2.7 billion people worldwide who lack internet access include a disproportionate share of children — particularly in Sub-Saharan Africa and South Asia. States have obligations to address this infrastructure gap, not just to regulate harmful content.
- Digital literacy is a right: Children who lack the skills to navigate digital environments safely cannot fully exercise their rights. Education systems have obligations to develop children's digital competencies — not as a nice-to-have, but as a rights obligation.
- Over-restriction has costs: A 14-year-old in a rural community for whom the internet is the primary access to educational resources, peer support, and cultural participation suffers real rights harms when protective measures cut off that access indiscriminately.
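The alternative to blanket exclusion mentioned in the list above can be sketched as a routing decision: an age signal with a confidence level sends uncertain cases to the most protective experience instead of blocking access outright. The signal names, thresholds, and tiers here are illustrative assumptions only; "highly effective age assurance" is defined by regulators, not by this sketch.

```python
from dataclasses import dataclass

@dataclass
class AgeSignal:
    estimated_age: int
    confidence: float  # 0.0 - 1.0, from whatever assurance method is in use

def experience_for(signal: AgeSignal, high_confidence: float = 0.9) -> str:
    if signal.confidence >= high_confidence and signal.estimated_age >= 18:
        return "adult"
    if signal.confidence >= high_confidence and signal.estimated_age >= 13:
        return "teen"  # protective teen defaults, access preserved
    # Low confidence or young user: fall back to the most protective tier,
    # keeping education, expression, and participation rights intact.
    return "child-safe"

print(experience_for(AgeSignal(16, 0.95)))  # "teen"
print(experience_for(AgeSignal(20, 0.40)))  # "child-safe" until better assurance
```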
"Protecting children online cannot mean excluding children from the online world. The goal is a digital environment in which children can thrive — not simply one from which they are barred."
Implementation Status: Where Does the World Stand?
In the years since GC25's adoption, implementation has accelerated — but unevenly across regions and stakeholder groups.
Strong Progress: Europe and Anglophone World
The European Union, United Kingdom, and Australia have produced the most substantive regulatory implementation of GC25 principles:
- EU Digital Services Act (2022, in force 2024): Article 28 requires all online platforms accessible to minors to put in place appropriate and proportionate measures ensuring a high level of privacy, safety, and security for minors, and bans profiling-based advertising to known minors; very large platforms must additionally conduct annual systemic risk assessments. The Commission's July 2025 guidelines on the protection of minors explicitly reference the best interests standard from GC25.
- UK Online Safety Act 2023: Ofcom's children's safety codes (in force from July 2025) impose over 40 specific measures on platforms, including "highly effective" age assurance and safer-by-design requirements; the Act's illegal-content duties add obligations to detect and remove CSAM. Penalties reach 10% of global annual turnover — making non-compliance commercially untenable for large platforms.
- UK Children's Code (Age Appropriate Design Code, 2021): 15 standards in a statutory code enforced by the ICO that directly implement best interests, privacy by default, and non-exploitation of children's data.
- Australia's Online Safety Act 2021 and Social Media Minimum Age law (2024): Australia has gone furthest in some respects, requiring platforms to prevent under-16s from creating accounts — though the age cut-off approach raises questions about access rights under GC25.
Emerging Implementation: Global South
Implementation has lagged most significantly in the Global South — the regions where the majority of the world's children live. The ITU's national COP assessment programme (conducted in 13 countries to date) has identified consistent gaps in legislation, enforcement capacity, and institutional coordination. Key challenges include:
- Limited legislative frameworks: many countries lack both the specific child online protection laws and the data protection frameworks that GC25 requires
- Enforcement capacity: law enforcement agencies often lack the digital forensics training and international cooperation frameworks needed to address online child exploitation
- Resource constraints: the comprehensive multi-stakeholder response GC25 envisions requires sustained investment that many lower-income countries cannot provide without international support
ICMEC's work on legislative reform — having supported 100+ countries in developing or strengthening CSAM legislation — and WePROTECT's Maturity Model offer structured pathways for countries at all stages of development.
Industry: Accelerating Under Regulatory Pressure
The technology industry's implementation of GC25-consistent practices has accelerated dramatically in response to regulatory pressure. Meta's Teen Accounts (launched globally in 2024-2025), YouTube's supervised experiences, Discord's Family Center, and Apple's Screen Time framework all represent partial implementations of GC25 principles — though civil society organizations including the 5Rights Foundation continue to document significant gaps between stated commitments and actual design.
What Happens When Rights Are Violated?
Understanding GC25 requires understanding not just what it requires, but what violation looks like — and what consequences follow:
- State-level accountability: States that fail to implement GC25 obligations face periodic scrutiny through the UN's Universal Periodic Review (UPR) and the UNCRC reporting cycle. While direct enforcement through international courts is limited, reputational consequences, diplomatic pressure, and civil society litigation in domestic courts can drive compliance.
- Corporate enforcement: For companies, the consequences are increasingly concrete. The European Commission opened formal DSA proceedings against TikTok in February 2024, with the protection of minors among the grounds. Ofcom opened its first OSA enforcement investigations in 2025. The ICO has fined TikTok and other platforms for data protection violations affecting children. The era of consequence-free non-compliance is over.
- Civil society litigation: In multiple jurisdictions, civil society organizations and affected families have begun bringing litigation against platforms for harms to children. These cases increasingly cite GC25 as the baseline standard of care that companies should have met.