Laws & Regulations: The Global Landscape

From the EU Digital Services Act to the UK's Online Safety Act, US COPPA, and Australia's social media ban — a comprehensive guide to the regulatory frameworks governing child online protection worldwide.


Regulatory Convergence

2024–2026 marks a decisive global shift from self-regulation to mandatory, enforceable child safety frameworks. Penalties now reach up to 10% of global annual turnover in the UK and 6% in the EU.

| Jurisdiction | Law/Code | Age Focus | Age Verification | Max Penalty | Status |
|---|---|---|---|---|---|
| EU | Digital Services Act | Under 18 | Required | 6% global revenue | In force (Feb 2024) |
| UK | Online Safety Act | Under 18 | "Highly effective" required | 10% global turnover | Codes in force Jul 2025 |
| UK | Children's Code (AADC) | Under 18 | Age estimation | £17.5M or 4% global | In force since 2021 |
| US | COPPA | Under 13 | Parental consent | $51,744/violation | In force |
| US | KOSA (proposed) | Under 17 | TBD | TBD | Under consideration |
| US (California) | AADC Act | Under 18 | Age estimation | $7,500/violation | Partly in force (2026) |
| Australia | Social Media Law | Under 16 | Required | AUD 49.5M | In force 2024 |
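For compliance teams modeling this matrix in tooling, the table can be encoded as plain data and queried by jurisdiction and user age. This is an illustrative sketch only: the `Regime` structure and `applicable` helper are invented for this example, and real applicability turns on service type, user location, platform thresholds, and statutory definitions — not a simple age comparison.

```python
# Illustrative sketch: encode the comparison table as data and answer
# "which in-force regimes cover a user of age A in jurisdiction J?".
# Simplification: each regime is reduced to a single age threshold.
from dataclasses import dataclass

@dataclass(frozen=True)
class Regime:
    jurisdiction: str
    law: str
    age_under: int      # protections apply to users under this age
    max_penalty: str
    in_force: bool

REGIMES = [
    Regime("EU", "Digital Services Act", 18, "6% global revenue", True),
    Regime("UK", "Online Safety Act", 18, "10% global turnover", True),
    Regime("UK", "Children's Code (AADC)", 18, "£17.5M or 4% global", True),
    Regime("US", "COPPA", 13, "$51,744/violation", True),
    Regime("US", "KOSA (proposed)", 17, "TBD", False),
    Regime("US (California)", "AADC Act", 18, "$7,500/violation", True),
    Regime("Australia", "Social Media Law", 16, "AUD 49.5M", True),
]

def applicable(jurisdiction: str, age: int, include_proposed: bool = False):
    """Return names of regimes whose age threshold covers this user."""
    return [
        r.law for r in REGIMES
        if r.jurisdiction.startswith(jurisdiction)
        and age < r.age_under
        and (r.in_force or include_proposed)
    ]
```

For example, `applicable("US", 12)` returns both COPPA and the California AADC Act, while a 15-year-old US user falls only under the California rules — the "13-17 cliff" discussed under COPPA 2.0 below.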
EU · Tech companies

EU Digital Services Act — Protecting Minors

The EU Digital Services Act is the most comprehensive platform regulation in force globally — and its provisions for child protection are among its most significant. Article 28 prohibits online platforms from presenting advertising based on profiling to users they are aware with reasonable certainty are minors; Articles 34 and 35 require Very Large Online Platforms to assess systemic risks, including negative effects on minors, and to adopt proportionate mitigation measures. The July 2025 Commission guidelines on the protection of minors provide specific implementation guidance, including accepted age assurance methodologies and recommended risk assessment scopes. With penalties reaching 6% of global annual revenue and European Commission proceedings already open against TikTok and Meta, this is a live compliance reality — not a future aspiration.

UK · Tech companies

UK Online Safety Act 2023

The UK Online Safety Act 2023 represents the most detailed mandatory child safety legislation currently in force — establishing a statutory duty of care on platforms and imposing 40+ specific measures in Ofcom's Children's Safety Codes of Practice. Implementation is phased: illegal content duties took effect March 2025, with the full children's safety codes in force from July 2025. The "highly effective" age assurance standard — requiring platforms to robustly prevent children accessing harmful content — sets a higher bar than any comparable regulation and has prompted a significant commercial age verification industry. Maximum penalties of 10% of global annual turnover and criminal liability for senior managers who fail to comply with information notices represent a genuine deterrent. Ofcom's published enforcement strategy targets the 25,000+ in-scope services based on child user prevalence and risk level.

UK · Designers

UK Age Appropriate Design Code (Children's Code)

The UK Age Appropriate Design Code (Children's Code), issued by the ICO and in force since September 2021, established the global template for child-specific data protection design requirements. Its 15 standards — including high privacy by default, a prohibition on nudge techniques that encourage children to share more data, geolocation off by default, and restrictions on profiling — inspired California's AADC Act, EU DSA implementation guidance, and IEEE 2089. The ICO's 2024-2025 enforcement priorities focus on age assurance adequacy and profiling practices, with the £12.7M TikTok enforcement notice (2023) demonstrating willingness to impose significant penalties. Maximum penalties: the greater of £17.5M or 4% of global annual turnover — applicable to any service "likely to be accessed" by children in the UK, regardless of where the company is headquartered.

US · Parents

US COPPA — What You Need to Know

COPPA remains the bedrock of children's online privacy protection in the United States — requiring verifiable parental consent before collecting personal data from children under 13, prohibiting behavioral advertising targeting known children, and mandating data minimization and deletion policies. While enacted in 1998, COPPA's scope has been interpreted expansively through FTC rulemaking: the 2025 COPPA Rule amendments added restrictions on push notifications to children and tightened requirements for "mixed audience" platforms. Key enforcement actions include the 2019 YouTube/Google settlement ($170M), the 2022 Epic Games settlement ($275M — the largest penalty ever obtained for violating an FTC rule), and ongoing investigations into social media platforms' handling of underage users. Per-violation penalties are currently set at $51,744, with no statutory cap on total liability.
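Because the penalty is assessed per violation with no aggregate cap, total exposure scales linearly with violation counts. A back-of-envelope sketch, using the $51,744 figure above and the simplifying assumption of one violation per affected child (actual violation counting is a legal question, not a formula):

```python
# Back-of-envelope COPPA exposure: per-violation penalty times violation
# count. $51,744 is the current inflation-adjusted statutory maximum;
# counting one violation per affected child is a simplifying assumption.
PENALTY_PER_VIOLATION = 51_744  # USD

def max_exposure(violations: int) -> int:
    """Maximum statutory exposure in USD; there is no aggregate cap."""
    return violations * PENALTY_PER_VIOLATION
```

Collecting data from 10,000 children without parental consent would, on this rough model, expose a platform to over $500M in statutory penalties — which is why under-13 age gates are treated as a hard compliance boundary.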

US · Policymakers

US Kids Online Safety Act (KOSA) — Status

The Kids Online Safety Act represents the most ambitious proposed expansion of US federal child protection law in a generation. KOSA's central innovation is a statutory "duty of care" requiring platforms to prevent and mitigate harms to minors — including depression, anxiety, addictive behavior, and self-harm — that are reasonably foreseeable from their design. After passing the Senate 91-3 in July 2024 (the broadest bipartisan vote on any social media legislation), KOSA stalled in the House amid First Amendment concerns; revised versions were reintroduced in 2025. Regardless of final passage, KOSA has already influenced state-level legislation, platform design decisions by major tech companies, and Ofcom's interpretation of the UK OSA's "systems duty." Compliance counsel should treat KOSA's core provisions as the direction of travel for US federal regulation.

US · Policymakers

US COPPA 2.0 — Proposed Changes

COPPA 2.0 addresses the most glaring gap in current US children's privacy law: the "13-17 cliff" where children gain the right to consent to unlimited data collection on their 13th birthday. The proposed legislation would extend COPPA's consent and data minimization requirements to users under 17; create an absolute prohibition on targeted advertising to users under 17 based on their personal data; establish a dedicated FTC Youth Marketing and Privacy Division with a specific enforcement mandate; and introduce a "right to deletion" for minors comparable to GDPR Article 17. The proposal has significant bipartisan support and would, if enacted, create US data protection standards for teenagers that exceed any current international equivalent. Companies with US operations should monitor this closely, as compliance timelines would be compressed.

US · Tech companies

California Age-Appropriate Design Code Act

California's Age Appropriate Design Code Act — signed into law in 2022 and modeled directly on the UK ICO Children's Code — established US state-level design requirements for digital services likely to be accessed by under-18s. The law requires Data Protection Impact Assessments and high-privacy defaults, and prohibits dark patterns targeting children. After NetChoice's First Amendment challenge succeeded at the district court level, the Ninth Circuit Court of Appeals in March 2026 upheld the core design requirements as constitutionally permissible commercial conduct regulations while striking narrower provisions. California's law has proven more durable than critics predicted and has triggered similar legislation in Maryland, Nebraska, Vermont, and seven other states — creating a growing patchwork that is effectively forcing national compliance by large platforms regardless of federal inaction.

Australia · Policymakers

Australia's Approach — Under-16 Social Media Ban

Australia's Online Safety Amendment (Social Media Minimum Age) Act 2024 made Australia the first country to impose a blanket minimum age requirement (16) on social media platform account creation — going further than any comparable legislation globally. Platforms with more than one million Australian users face an obligation to take "reasonable steps" to prevent under-16 account creation, with penalties up to AUD 49.5 million for systemic non-compliance. The eSafety Commissioner holds both enforcement authority and the power to issue industry codes requiring specific Safety by Design measures. The legislation has sparked intense international debate — praised by child safety advocates as setting a new protection standard, criticized by digital rights groups as potentially harmful to adolescent socialization and political engagement. Early implementation data from the eSafety Commissioner's audits is being closely watched by regulators in Canada, the UK, and several EU member states considering similar measures.

Global · Policymakers

Emerging Regulations Worldwide

While EU, UK, US, and Australian regulations dominate headlines, a significant wave of child online protection legislation is now advancing across the Global South and emerging economies — with implications for every global digital platform. Indonesia's 2025 Electronic Systems and Transactions regulation mandates compliance with IEEE 2089 age appropriate design standards for platforms with Indonesian users, covering 270 million people. In Africa, Kenya, Ghana, and Nigeria have enacted or are advancing data protection frameworks with specific child provisions. Brazil's LGPD (Lei Geral de Proteção de Dados) requires parental consent for processing children's data, with active enforcement beginning in 2024. The pattern across all these developments is regulatory convergence toward three core principles: age assurance as a prerequisite for child access to high-risk content; prohibition on behavioral profiling of minors; and mandatory child rights impact assessment before product launch. Companies ignoring non-Western regulatory developments do so at increasing commercial and reputational risk.

All · Reference

Global Regulatory Comparison Table

A comprehensive reference matrix for compliance teams, legal counsel, and policymakers navigating the rapidly expanding global child online protection regulatory landscape. Covers 12+ jurisdictions including EU DSA, UK Online Safety Act, UK Children's Code, US COPPA, KOSA (proposed), COPPA 2.0 (proposed), California AADC, Australia's Social Media Age Law, Brazil's LGPD child provisions, India's DPDP Act, Indonesia's IEEE 2089 mandate, and Canada's proposed Online Harms Act. Columns include: age thresholds (ranging from under-13 to under-18), consent mechanisms required, age assurance/verification standard mandated, maximum penalties, enforcement authority, extraterritorial reach, and implementation timeline. Updated quarterly to reflect legislative developments.