Children's Online Safety
The Online Safety Act 2023 and the ICO's Children's Code (Age Appropriate Design Code) create a comprehensive framework protecting children online in the UK. Platforms must assess risks to children, prevent access to harmful content, and apply age-appropriate settings by default. This guide explains your rights as a parent and the protections that should be in place.
Key points
- The Online Safety Act 2023 places specific 'children's duties' on platforms likely to be accessed by under-18s.
- The ICO's Children's Code requires online services to provide high privacy settings by default for users under 18.
- Platforms must implement age assurance mechanisms to prevent children from accessing harmful adult content.
- Parents can report platforms failing their children's safety duties to Ofcom.
- Children have UK GDPR rights including the right to erasure of data collected when they were under 18.
The Online Safety Act 2023: Children's Duties
The Online Safety Act 2023 imposes specific duties on platforms that are "likely to be accessed by children" (under 18). These platforms must:
- Complete a children's risk assessment to identify and evaluate risks to children, including harmful content, contact with adults, and conduct risks (cyberbullying, grooming).
- Implement safe design measures to mitigate identified risks — for example, preventing algorithmic recommendation of harmful content to children.
- Prevent children from accessing primary priority harmful content — including pornographic content and content that promotes eating disorders, self-harm, or suicide.
- Apply age assurance — proportionate measures to verify or estimate a user's age to restrict access to harmful content.
- Provide an accessible, effective complaints process for users, including children and parents.
The ICO's Children's Code
The ICO's Age Appropriate Design Code (Children's Code) applies to online services likely to be accessed by children in the UK. Services must:
- Apply the highest privacy settings by default for child users — geolocation, direct messaging, and public profile settings should be off by default.
- Not use nudge techniques to encourage children to share more personal data than necessary or to spend more time on the platform.
- Not use children's data in ways that are detrimental to their wellbeing — for example, algorithmic personalisation that increases anxiety or exposure to harmful content.
- Provide clear, transparent privacy information in plain language appropriate for the age group.
- Not collect more personal data than is necessary for the service.
The ICO can investigate services that fail to comply and impose fines of up to £17.5 million or 4% of global annual turnover, whichever is higher.
Rights and Options for Parents
As a parent, you have several options if you are concerned about your child's online safety:
- Use parental controls — most devices and internet service providers offer parental control tools to restrict access to certain types of content. The UK's major ISPs (BT, Virgin Media, Sky, TalkTalk) provide free parental control filters.
- Report concerns to platforms via their reporting mechanisms — platforms must have accessible complaints processes under the OSA 2023.
- Complain to Ofcom if a platform is not complying with its Online Safety Act duties — Ofcom can investigate and take enforcement action.
- Exercise UK GDPR rights on behalf of your child — as a parent you can submit subject access requests (SARs) and erasure requests on behalf of children under 13 (or under 18 where the child lacks the capacity to act for themselves).
Reporting Child Online Safety Harms
If your child has been harmed online:
- Online grooming or sexual exploitation: Report to the police (999 if immediate danger, 101 otherwise) and to the Child Exploitation and Online Protection Command (CEOP) at ceop.police.uk.
- Cyberbullying: Report to the platform and, if serious or persistent, to the school and police. The Diana Award's Anti-Bullying Pro programme and Childline (0800 1111) provide support.
- Exposure to illegal content: Report child sexual abuse material (CSAM) to the Internet Watch Foundation at iwf.org.uk. Report other illegal content to the platform and, if serious, to the police.
- Harmful algorithmic recommendations: Report to Ofcom if a platform's recommendation algorithm is exposing children to content that causes harm.
Frequently asked questions
What age can a child consent to their own data being processed for online services?
My child signed up to a social media platform claiming to be 18. Do they still have protection?
Can I have my child's data erased from a platform they used when under 13?
What should I do if my child has been groomed or exploited online?
What to do next
- CEOP — report a concern
Report online grooming or child sexual exploitation to CEOP.
- Internet Watch Foundation
Report child sexual abuse material found online.
- Childline
Free, confidential support for children and young people up to 19.
- UK GDPR rights overview
Understand data protection rights you can exercise on behalf of your child.
Official bodies and resources
Information Commissioner's Office
Regulator
The UK's independent authority for data protection and information rights, enforcing the UK GDPR and the Data Protection Act 2018.
Office of Communications (Ofcom)
Regulator
Regulates UK communications industries including telecoms, broadband, TV, radio, and postal services, and enforces the Online Safety Act 2023.
Home Office
Government
The lead government department for immigration and passports, drugs policy, crime, fire, counter-terrorism, and police.
Related guides
UK GDPR Rights for Individuals
The UK General Data Protection Regulation (UK GDPR) and the Data Protection Act 2018 (DPA 2018) give individuals in the UK eight legally enforceable rights over how organisations collect, store, and use their personal data. These rights apply whether the data is held by a business, public body, or online platform.
Removing Content from Social Media
Getting unwanted content removed from social media can be challenging, but you have a range of legal and practical tools available. Platform reporting mechanisms, UK GDPR erasure requests, defamation takedown notices, and the Online Safety Act 2023 all provide routes to removal depending on the nature of the content.
Dealing with Cyberstalking
Cyberstalking involves using digital technology — social media, email, messaging apps, location tracking, or spyware — to harass, monitor, or stalk a victim. It is illegal in the UK under the Protection from Harassment Act 1997, with specific stalking offences carrying sentences of up to 10 years imprisonment. If you are a victim, a range of criminal and civil protections are available.