The European Commission strengthens child safety regulations with the Digital Services Act and the EU Code of Conduct on Age-Appropriate Design.
In 2024, the European Commission made significant progress in advancing child safety regulations under the Digital Services Act (DSA). As digital platforms play an increasingly central role in children’s lives, the EU has intensified its focus on ensuring their protection from online risks. Through stricter platform accountability, enhanced risk assessments, and the development of the EU Code of Conduct on Age-Appropriate Design, the European Commission has worked to create a safer and more responsible digital environment.
Compliance with child safety regulations builds trust and transparency.
For companies and organizations operating within the EU market, compliance with the ever-evolving regulations is not just a legal necessity but also a vital step in fostering trust with customers and users, particularly younger audiences and their guardians. The measures introduced by the Commission reflect a broader regulatory shift toward transparency, responsible design, and proactive risk management for digital service providers.
Under the DSA, Very Large Online Platforms must take extra steps to protect minors.
The European Commission has strengthened enforcement mechanisms under the DSA, in particular by requiring Very Large Online Platforms (VLOPs) and search engines to take additional measures to protect minors. Companies operating within the EU market must now ensure that children are shielded from harmful content, deceptive design tactics, and targeted advertising. This regulatory push has led to increased audits and compliance checks, with platforms expected to demonstrate sound content moderation policies as well as age-appropriate service adjustments.
Mandatory risk assessments help businesses and organizations identify and address online harms.
A key development under the DSA has been the introduction of mandatory risk assessments for online platforms. These assessments are designed to identify and address potential harms that children may face, such as cyberbullying, exposure to explicit or misleading content, and manipulative design strategies known as dark patterns. The European Commission has been working closely with technology firms to refine best practices in risk mitigation while ensuring that platforms remain innovative and user-friendly. Companies that fail to conduct thorough risk assessments, or that inadequately address identified risks, may face significant regulatory action, including fines or restrictions under the DSA's enforcement framework.
The EU Code of Conduct on Age-Appropriate Design guides platforms in protecting minors.
The EU Code of Conduct on Age-Appropriate Design aligns with the Better Internet for Kids (BIK+) strategy. This voluntary code is designed to guide platforms in implementing child-friendly privacy settings, minimizing data collection, and designing services that prioritize safety and well-being, particularly for minors. While the code is not legally binding, adherence to its principles is seen as a best practice and could become an industry standard. Platforms that fail to align with the code may face increased scrutiny under the Digital Services Act, particularly concerning their obligations to assess and mitigate systemic risks to children.
Collaboration with industry and policymakers strengthens child safety measures.
Beyond regulatory enforcement, the European Commission has prioritized collaboration with industry stakeholders, civil society organizations, and policymakers to refine child safety measures. Public consultations and advisory group meetings have played a crucial role in shaping policies that are both practical for businesses and effective in protecting children. These discussions have helped bridge the gap between compliance requirements and technological feasibility, ensuring that companies can implement safety measures without compromising innovation or user experience.
Companies must take a proactive approach to child safety.
Organizations operating in the EU digital market must remain vigilant and proactive in adapting to these changes. Compliance with the Digital Services Act and alignment with the EU Code of Conduct on Age-Appropriate Design will be critical to maintaining regulatory standing and public trust. Companies should consider reviewing their current child safety policies, enhancing content moderation practices, and conducting regular risk assessments to align with EU expectations. Organizations that take a proactive approach to online child safety will not only achieve legal compliance but also position themselves as industry leaders in ethical digital innovation.