For Parents & Educators

Plain-language guides on keeping children safe online, recognizing warning signs, having age-appropriate conversations about digital risks, and building digital literacy in the classroom.

5 Articles

A Note for Caregivers

Online safety is built through trust, not surveillance. These guides help you support children with age-appropriate guidance, open conversation, and practical tools — while respecting their growing autonomy.

Parents · Practical
A comprehensive, evidence-based guide structured by developmental stage: under-5 (the AAP recommends no digital entertainment media before 18-24 months, and WHO advises against any screen time before age 1; supervised use only after that); 5-8 (co-viewing, child-directed content, no social features, content filtering active); 9-12 (guided digital literacy, family media agreements, monitoring without surveillance, and a privacy settings walkthrough for the five platforms Ofcom data shows children this age use most); and 13-17 (graduated autonomy, honest conversations about commercial manipulation and data exploitation, mental health check-ins). Covers the research evidence on effective parental mediation: active engagement ("what did you watch today?") consistently outperforms passive restriction in longitudinal studies. Includes step-by-step privacy settings guides for YouTube, TikTok, Instagram, Roblox, and Minecraft — and exactly what to do if you discover your child has experienced online harm, including how to preserve evidence and file a report with NCMEC or the IWF.
Parents · Educators
Research from the NSPCC, Childline, and CEOP consistently shows that the most powerful protective factor for children online is not parental control software — it is a trusting relationship with an adult they believe will listen without overreacting. This guide provides specific, tested conversation frameworks for every age group, drawn from child disclosure research. For younger children (5-8): age-appropriate "tricky people" language developed by child protection organizations. For 9-12s: scenarios drawn from Ofcom's annual children's media use report covering group pressure, image sharing, and the permanence of digital content. For teenagers (13-17): destigmatizing conversations about sexting, pornography, and sextortion, and how to support a peer in trouble. Critically, it covers what to do when a child discloses harm: step-by-step evidence preservation guidance, how to contact CEOP about grooming concerns, and why children who disclose to a trusted adult recover significantly better than those who do not.
Educators
UNESCO, OECD, and Ofcom all identify digital literacy education as one of the most effective long-term child protection strategies available — teaching children to critically evaluate online content, understand how platforms profit from their attention, and develop resilience against manipulation. This guide gives educators a practical curriculum integration framework built around five core competencies: critical evaluation of misinformation and algorithmic bias; understanding how platforms collect and monetize personal data; digital relationships, consent, and healthy boundaries; emotional wellbeing and the psychological design features of platforms (infinite scroll, social comparison, notification systems); and the ethical responsibilities of digital content creation. Includes lesson plan templates aligned with UK PSHE and Computing curricula for Key Stages 1-4, a recommended resource stack (Childnet, UK Safer Internet Day, Common Sense Media), and guidance on handling disclosures that arise during classroom discussions — a situation teachers must be specifically prepared for.
Parents · Educators
Research consistently shows that children rarely disclose online harm proactively — particularly grooming and sexual exploitation — so the ability of parents and educators to recognize behavioral warning signs is a critical early-warning capability. This guide consolidates indicators identified by CEOP, the NSPCC, Australia's eSafety Commissioner, and Childline across four harm categories: grooming and sexual exploitation (unexplained gifts, new older contacts, secretive device use, explicit material discovered, distress after going online); cyberbullying (reluctance to attend school, emotional dysregulation linked to device use, sudden social media account deletion); harmful content exposure (concerning language, unusual preoccupation with topics such as eating disorders or self-harm methods); and financial exploitation and sextortion (requests for money, distress about financial matters). Critically, it covers what NOT to do: avoid accusatory responses that shut down disclosure, never confront the suspected perpetrator directly, and never view suspected CSAM yourself. Provides step-by-step guidance on evidence preservation, reporting through CEOP and the IWF, and when to involve the police.
Parents · Tools
A rigorously curated guide to parental control tools and family safety apps, cutting through marketing claims to assess what actually protects children, based on independent research and regulator guidance. Organized by age group: for under-8s, it covers child-safe environments (Amazon Kids+, Google Kids Space), child-safe browsers (Kiddle), and curated platforms (YouTube Kids); for 9-12s, network-level filtering (Circle, Bark Home), device-level controls (iOS Screen Time, Google Family Link), and platform family features (YouTube supervised accounts, Roblox parental controls); for teenagers, it shifts to monitoring and communication tools (Bark's behavior-monitoring approach, Discord's Family Center) that support trust-based supervision rather than surveillance. Each tool is rated against four criteria: effectiveness, bypass resistance, privacy impact on family data, and practical ease of use. Notes which tools have been independently reviewed by the NSPCC and Common Sense Media. Also covers the built-in platform features Ofcom's research identifies as genuinely effective: Instagram's Supervision feature, TikTok's Family Pairing, and Snapchat's Family Center.