Authors
Joana Lawrence
Associate
The protection of children’s personal data and online safety is increasingly prominent in the European legal landscape, requiring companies to engage with it on an ongoing basis. Because children are a particularly vulnerable group, their protection in the digital world is also of significant societal importance. Following increased regulatory focus in 2025, children’s privacy and online safety will remain a major priority in 2026, with further consultations and enforcement aimed at safeguarding children and their data online.
I. Recap for 2025: Guidance and developments
In 2025, several regulators sharpened their focus on protecting children’s safety and personal data online, responding to growing needs in a fast‑evolving digital environment and the expanding risk landscape posed by online games, AI-driven chatbots, and similar technologies.
1. Regulatory guidance
Against this backdrop, 2025 saw a number of key regulatory developments at the EU level aimed at strengthening the protection of children’s privacy and online safety. In particular, EU institutions and bodies issued guidance clarifying expectations around age assurance, platform design, advertising, and the interaction between the GDPR and the Digital Services Act (DSA), as outlined below.
In February, the EDPB published its statement on age assurance, setting out 10 principles to balance protecting children online with GDPR‑compliant data protection. It emphasizes that age‑assurance measures must be risk‑based and proportionate and must minimize data collection, avoiding unnecessary identification or biometric data.
In July, the European Commission issued final, non‑binding guidelines to protect minors online, introducing stronger age‑assurance and verification requirements (including references to the EU age‑verification solution), expanding obligations for default privacy and safety settings with more granular controls, and emphasizing transparency, accessibility, and child participation to better inform and empower children and their guardians. The guidelines also strengthen moderation and reporting mechanisms by prioritizing rapid response, human oversight, and cross‑platform cooperation.
In September, the EDPB issued guidelines clarifying how the Digital Services Act (DSA) obligations on advertising and recommender systems align with the GDPR. The guidelines state that online platforms accessible to children must ensure a high level of privacy, safety, and security. Profiling‑based advertising is prohibited for users known to be children, and any age‑assurance measures used must be necessary, proportionate, and privacy‑preserving.
2. Enforcement momentum
Protecting children online in 2025 was not merely theoretical. Regulators took action under the DSA and GDPR, making clear that digital players must exercise heightened care and take all necessary measures to protect children's online safety and data‑protection rights.
II. What to expect in 2026
Looking ahead, it is already evident that regulators will maintain a strong focus on protecting children and their personal data online throughout 2026, reinforcing the need for digital players to adopt proactive and robust safeguards. Key topics for 2026 are likely to include:
Age assurance and age verification methods: While the regulatory landscape for age‑verification methods is still developing and will continue to evolve as the DSA and GDPR are interpreted and applied, the EU Commission is piloting an official, open-source age verification “mini wallet,” which allows users to prove they are over 18 by using, for example, digital IDs. This blueprint is built on the same technical specifications as the future European Digital Identity Wallets to ensure full interoperability.
Processing of children's data: The EDPB is currently working on guidelines on the processing of children’s data, as outlined in the EDPB Strategy 2024–2027, and will soon launch a dedicated hub, “Privacy for Kids,” where parents, teachers, and educators can find educational resources to help children understand and protect their digital privacy.
Social media bans: There is also a political shift toward greater protection of children online: several countries are implementing or considering strict social‑media bans for children to address mental‑health and online‑safety risks. Australia is leading with a ban for children under 16, while France is currently considering a similar age threshold.
One thing is clear: safety and privacy issues for children are top of mind for regulators worldwide. Companies should anticipate stricter assessments of necessity, proportionality, and safeguards wherever children may be affected, including within AI‑enabled services. Providers whose services are used by children—or likely to be accessed by them—should track developments closely.