
The UK’s Children’s Code: Update for businesses processing children’s data

The ICO has published recommended practices for businesses processing children’s data in the UK, to help ensure compliance with the UK’s Children’s Code.

 

The UK’s Children’s Code (also known as the Age-Appropriate Design Code) sets out key data protection requirements for online services that process children’s personal data. Enforced by the Information Commissioner’s Office (ICO), the code ensures that digital platforms act in the best interests of children and comply with the UK General Data Protection Regulation (UK GDPR). For businesses and organizations handling children’s data, compliance with the Children’s Code is essential to avoid regulatory scrutiny and potential enforcement actions.

The ICO’s 2024/25 strategy focuses on social media platforms and video-sharing platforms, assessing their data protection practices and holding them accountable. The authority recently published a status update regarding its enforcement of the UK’s Children’s Code, including corrective actions taken by various online businesses.

 

The UK’s Children’s Code requires businesses to ensure high privacy settings for children by default, unless there is a compelling reason to do otherwise.

 

The UK’s Children’s Code mandates that online services default to high privacy settings for children unless a compelling reason exists to do otherwise. Platforms should ensure that children’s personal information remains private unless they actively choose to share it. Some services achieve this by making children’s profiles private by default, while others provide strict visibility controls. Recent interventions by the ICO have led platforms like Dailymotion and Twitch to improve their privacy settings, while discussions with other services remain ongoing.

 

The ICO requires platforms to disable geolocation by default under the Children’s Code, prompting several services to improve privacy settings.

 

The ICO has identified growing concerns about children sharing their geolocation data online, as it can pose safety risks. The Children’s Code states that geolocation settings must be turned off by default unless there is a strong justification to enable them. Several platforms, including BeReal, Sendit, and Soda, have made changes to improve their geolocation privacy settings, ensuring that precise location data is not automatically shared. X has also removed the ability for under-18s to opt in to geolocation sharing, further strengthening protections.

 

The Children’s Code mandates that ad profiling be turned off by default for minors, leading companies to strengthen protections.

 

The Children’s Code requires platforms to turn off profiling for targeted advertising by default unless it serves the best interests of the child. While some platforms have eliminated advertising for children entirely, others limit ad targeting based on basic demographic data like age and location. Following ICO interventions, X has ceased serving ads to minors, and Viber has extended its protections to cover 17-year-olds, ensuring they are not targeted with personalized ads.

 

The ICO is investigating recommender systems’ use of children’s data, including TikTok, amid concerns over transparency and potential harm.

 

Recommender systems use children’s personal information to suggest content, raising concerns about the volume of data collected and the potential for exposure to harmful content. The ICO’s review found that many platforms offer little transparency about how these algorithmic systems function. As we reported in our most recent blog, the ICO has written to several companies to clarify their practices and has launched an investigation into how TikTok and other companies use the personal data of users aged 13–17 in their recommender systems. The ICO is also working closely with Ofcom to regulate these systems effectively.

 

The ICO is pushing for better age verification, as many platforms still rely on ineffective self-declaration.

 

UK data protection law requires platforms that rely on user consent for data processing to obtain parental consent for children under 13. The ICO’s review found that many platforms still rely on self-declaration for age verification, which may be ineffective. Some services, like Vero, have committed to introducing age assurance measures, while others, like Imgur and Reddit, are under investigation. The ICO continues to push for stronger age verification mechanisms and is working with international regulators to establish consistent age assurance standards.

 

As the ICO continues to monitor compliance with the Children’s Code, businesses must adapt to avoid penalties and meet data protection requirements.

 

The ICO continues to monitor digital platforms, conducting compliance discussions, investigations, and enforcement actions. Organizations that fail to align with the Children’s Code risk fines, sanctions, or public scrutiny. With further updates expected in 2025/26, businesses must proactively adapt their practices to ensure compliance with evolving data protection laws. Businesses processing children’s data should review their privacy settings, geolocation controls, advertising policies, and age verification methods to align with these legal requirements.

Discover how Aphaia can help ensure your data protection and AI strategy is compliant. We offer full GDPR and UK GDPR compliance, as well as outsourced DPO services. We specialise in empowering organisations like yours with cutting-edge solutions designed to not only meet but exceed the demands of today’s data landscape. Contact Aphaia today.
