
UAE Child Digital Safety Law – Tighter Regulation for Digital Platform Operators and ISPs

Marking a major step in digital regulation for 2026, the United Arab Emirates (UAE) has enacted a comprehensive framework to safeguard children in the digital environment through Federal Decree-Law No. 26/2025 on Child Digital Safety (CDS Law). 

The law came into effect on 1 January 2026, with a one-year ramp-up compliance period for those subject to it (see below).

The CDS Law has potentially far-reaching operational consequences for platform operators. It follows similar developments globally (such as the EU Digital Services Act, UK Online Safety Act and Australian Online Safety Act 2021 (as amended)) designed to regulate the online space, with a focus on the protection of children. Its issuance is also reflective of broader regional trends we have observed in recent months; notably an increasing willingness of regional media and technology regulators to actively monitor, and issue instructions to, platform operators (including those based outside of the region) where there are perceived content compliance risks. There appears to have been a particular focus across the GCC with regards to the risk of exposure of minors to unsuitable content.

Whilst it remains subject to further detail to be set out in subsequent decisions, the CDS Law provides more actionable and specific requirements than much of the existing UAE legislation in this space (such as Cybercrime Law, Wadeema’s Law and UAE Media Law, each of which remain in force concurrently) and is unambiguous with respect to certain core requirements which will need to be addressed by platform operators through technical and operational measures. 

  1. Who does the CDS Law apply to?

The CDS Law has broad application, including potential extra-territorial application. It applies to internet service providers and digital platforms:

  • operating in the UAE; or
  • targeting users in the UAE,

whenever children (defined as those under 18 years of age, aligning with the recently revised age of majority in the UAE) are using the platforms or are exposed to the platforms’ content or services. Various specific digital platforms are listed non-exhaustively as being within the scope of the CDS law including websites, mobile apps, gaming platforms, streaming platforms, e-commerce platforms and social media platforms.

It also imposes obligations on “caregivers” (i.e. parents or others entrusted with a child's care).

  2. Child Digital Safety Council

The CDS Law provides that a new body, the Child Digital Safety Council, will be created. The Council will play a central role in the rollout of the new framework, with its remit including (without limitation):

  • developing the strategic direction with respect to child digital safety;
  • proposing and reviewing legislative developments; and
  • proposing general standards for children’s digital privacy and security, guidelines and general rules for the safe use of digital platforms by children.
  3. Platform Classification System

The UAE Cabinet will issue a classification system for digital platforms that will set out:

  • criteria for classifying digital platforms into different categories based upon their type, content, level of usage and impact;
  • controls, restrictions and tools those digital platforms must implement (which will vary depending upon the age groups of the children);
  • disclosure requirements; and
  • procedures and mechanisms to verify that digital platforms meet their obligations (as per their classification).
  4. Children's Privacy and Data

Building upon the overarching restrictions under the UAE Personal Data Protection Law, the CDS Law sets out a broad prohibition on collecting, processing, publishing or sharing personal data of children under the age of 13 unless specific steps are taken, including (without limitation):

  • Obtaining verified and documented parental consent (which can be withdrawn);
  • Providing a privacy policy;
  • Collecting only the personal data strictly necessary to provide the service; and
  • Not using such data for commercial purposes (including targeted advertising).  
  5. Age Verification

Digital platform operators must adopt effective, reasonable age‑verification mechanisms commensurate with their risk classification and the potential impact of their content on children. The steps which must be taken will depend upon the classification of the digital platform (see (3) above). 

  6. Restrictions on Betting

The CDS Law sets out obligations with respect to ensuring that children do not have access to “online commercial games”, which would capture not only traditional gambling sites but also games which incorporate mechanisms allowing for wagering or betting for monetary or other valuable consideration. This may extend to loot box mechanics, which are prevalent in many online games.

  7. Digital Platform Operators – Key Obligations

Platform operators must, in accordance with their classification (see (3), above), develop and implement extensive procedures and controls to ensure children’s digital safety, including (without limitation) the following:

  • implementing privacy settings for children’s accounts;
  • implementing tools to enforce age restrictions (including age verification). Note that a content age classification regime is also expected to be published by the UAE Media Council (which, as of December 2024, is now the National Media Authority); it remains to be seen how these requirements and regimes will overlap and operate;
  • implementing blocking and filtering tools;
  • providing parental control tools, including features that allow setting daily usage limits for children on digital platforms;
  • providing tools for reporting child pornography and / or harmful content;
  • leveraging their technical capabilities, including AI systems, for the proactive detection, removal, or reporting of harmful content;
  • immediate reporting to the authorities of any child pornography or harmful content;
  • following orders issued by the authorities to remove or report child pornography, or any content harmful to children; and
  • providing the authorities with periodic statistics and reports on the measures taken and extent of compliance.

The Telecommunications and Digital Government Regulatory Authority will also issue policies directed at ISPs with respect to ensuring digital safety online (including with respect to content filtering and securing parental consent).

  8. Caregiver Obligations

The CDS Law imposes specific obligations on caregivers to monitor children's online activity, and to ensure that they do not access harmful content “whilst respecting the digital autonomy in accordance with their age groups”.

  9. Penalties

The consequences of non-compliance are to be set out under a separate decision. 

What Next?

Whilst the CDS Law indicates that there is detail to follow in subsequent decisions and implementing regulations, there is sufficient detail for those subject to the CDS Law to begin mapping out the extent to which their existing operations and offerings comply. Given the breadth and nature of the requirements, many platform operators will likely need to take significant operational steps to ensure compliance (in particular with respect to the deployment of filtering tools and age gating requirements).

This may be particularly challenging for platforms which primarily offer user-generated content, which is inherently more difficult to monitor and filter.

Should you have any questions with respect to the CDS Law, please contact: 

Jamie Ryder (Partner)  

Alex Mackay (Senior Associate)