Child Rights in Digital Spaces

How children's fundamental rights — privacy, participation, protection, and development — apply in the digital world. Grounded in the UN Convention on the Rights of the Child.

5 Articles
Foundational · All audiences
A comprehensive analysis of UN General Comment No. 25 (2021) — the most significant child rights document of the digital era, adopted after the UN Committee consulted 700+ children across 27 countries. Explains how the four core CRC principles (non-discrimination, best interests, right to life and development, respect for children's views) apply to digital platforms, algorithmic systems, and data practices. Essential reading for understanding why child online protection is a legally binding human rights obligation — not merely a policy preference — for all 196 signatory states.
Policymakers · Tech companies
A rigorous breakdown of how the four foundational CRC principles translate into concrete digital design and policy obligations. Examines how recommendation algorithms can violate non-discrimination rights by amplifying harmful content to vulnerable groups; how profiling-based advertising undermines the "best interests" standard; and how meaningful child participation in platform governance satisfies the principle of respect for children's views. Referenced by EU DSA impact assessment guidance and Ofcom's children's risk profile methodology. Directly relevant to D-CRIA requirements under UNICEF's industry guidelines.
All audiences · Privacy
An authoritative guide to children's privacy rights in data-driven digital environments — covering how GDPR Article 8, UK AADC Standard 4, and COPPA's verifiable parental consent requirement create a layered global framework. Examines the legal tension between parental control and children's evolving capacity for autonomous decision-making (as affirmed in General Comment 25 ¶64). Includes analysis of ICO enforcement actions against TikTok (£12.7M penalty, 2023), Google's YouTube settlement ($170M, FTC), and the principle that children's data must never be used to build behavioral profiles for commercial purposes.
Policymakers · Parents
Tackles one of the most contested policy challenges in child online protection: how to shield children from harm without infantilizing them or severing their access to education, civic participation, and peer connection. Examines documented failures at both extremes — platforms that block innocuous health information while letting harmful content through, and age-gating systems so porous they offer no real protection. Analyzes how Australia's blanket under-16 social media ban, the UK OSA's tiered content duties, and the EU DSA's risk-based approach represent three distinct philosophical answers to the same question — with different implications for children's right to expression under Article 13 UNCRC.
NGOs · Policymakers
An essential but frequently overlooked dimension of child online protection: the risk that safety measures designed in high-income, English-language contexts inadvertently widen the digital divide for the roughly 2.2 billion children living in the Global South. Covers how ID-based age verification systems exclude children in countries without robust civil registration infrastructure; how content filters trained on English-language corpora fail to detect harm in local languages; and why UNICEF Innocenti research consistently shows that restricting digital access can deepen educational inequality. Draws on ITU connectivity data and WePROTECT's recommendations for context-sensitive national strategies.