Children’s risk assessments
Key learnings from the illegal harms assessments
Ofcom has seen around 70 illegal harms assessments from different organisations already this year, so we asked what insights it had drawn from them. Ofcom confirmed that services have adopted varied approaches to conducting assessments. This isn’t surprising, given that services differ significantly in their design, user base and available evidence. Whilst some leaned more heavily on quantitative data, others focused on qualitative insights or third-party research.
The key is that there is no one-size-fits-all model, and Ofcom supports a flexible approach, provided it is evidence-based and meets the duties set out in the regime. The same applies to children’s risk assessments. Ofcom emphasised the need to focus on the specific harms for children – that is, assessing individually the types of priority content (PC) and primary priority content (PPC), identifying content that isn’t formally designated as harmful (non-designated content, or NDC), and considering how platform functionalities and design may affect children differently depending on their age group.
More risk profile triggers do not automatically mean a service is high risk
The updated Codes include a more expansive set of risk profile triggers, based on new evidence from the Children’s Register of Risks gathered during the consultation process. As a result, many services, even those with minimal user-generated content, may appear to trigger numerous risks.
Ofcom explained that the presence of multiple triggers does not necessarily mean a service is high risk, even where the list of triggers is long compared with the first draft of the Codes. What is crucial is that services take a thoughtful approach to understanding their own specific risks to children and ensure that appropriate mitigations are in place. There is scope to explain why a risk that initially appears high may, given the specifics of a service and the mitigations in place, in fact be low or even negligible.
Notification requirements for NDC
It could be easy to miss, but the OSA includes a specific duty to notify Ofcom where a service provider identifies the presence of NDC in its children’s risk assessment. An email address has now been published for this purpose. Ofcom clarified that the duty applies to any kind of NDC that a service assesses as presenting more than a negligible level of risk to children. Services should refer to the risk assessment guidance and Ofcom’s interactive tool for further clarification.
Take a cautious, evidence-led approach when considering age ranges
A new feature of the final Codes is that services are now required to consider specifically the different age groups of children who may use the service. These age groups are consistent with those set out in the UK Age-Appropriate Design Code from the Information Commissioner’s Office, which governs how children’s personal data is processed.
Ofcom acknowledged that these age bands aren’t rigid – children develop at different rates, and age is just one of several factors that influence risk. However, services are expected to use these categories as a practical framework for assessing risks to children, especially where no specific internal data or other evidence on age group usage currently exists.
In the short term, when preparing this first assessment, services should avoid over-reliance on weak signals such as self-declared age or minimum age thresholds set out in terms and conditions. If there is a possibility that younger children may be accessing the service – whether or not they’re intended users – services should factor in the potential risks to those younger users. Ofcom’s recommendation is to take a precautionary approach and assume broader age group exposure where uncertainty exists. While age-related evidence will improve over time as services gather more data and user insights, the immediate focus should be on making reasonable, risk-based assumptions to ensure children’s safety across different age groups.
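As a rough illustration of that precautionary approach, the sketch below enumerates the ICO’s age bands and falls back to assuming exposure across all of them where robust age evidence is missing. The evidence structure and function names are hypothetical, not drawn from Ofcom’s guidance.

```python
# Illustrative sketch only: the ICO Age-Appropriate Design Code age bands,
# with a precautionary fallback where a service lacks robust evidence of
# which age groups actually use it. Evidence flags here are hypothetical.

ICO_AGE_BANDS = [
    (0, 5),    # pre-literate / early literacy
    (6, 9),    # core primary school years
    (10, 12),  # transition years
    (13, 15),  # early teens
    (16, 17),  # approaching adulthood
]

def age_bands_to_assess(evidence: dict) -> list[tuple[int, int]]:
    """Return the age bands a children's risk assessment should cover.

    Precautionary approach: where there is no robust evidence of which
    age groups use the service, assume exposure across all bands rather
    than relying on weak signals such as self-declared age or a minimum
    age stated in the terms and conditions.
    """
    robust = evidence.get("robust_age_evidence")  # e.g. verified age data
    if not robust:
        return ICO_AGE_BANDS  # assume the broadest possible exposure
    return [band for band in ICO_AGE_BANDS if band in robust]
```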
Age assurance
What does it mean to “prohibit” content?
Under the Codes, for services where the principal purpose is not the dissemination of content harmful to children, the trigger for highly effective age assurance (HEAA) generally hinges on whether certain content types are “prohibited”. Ofcom clarified that, to meet this requirement, prohibitions must be set out clearly and accessibly in the service’s terms.
Service providers do, however, have some flexibility in how they define these prohibitions. They can either adopt the specific categories of harmful content as defined in the OSA and associated harms guidance or use broader categories if these are comprehensive enough to capture the legally defined harms. For example, services could prohibit “nudity and sexualised content” instead of explicitly prohibiting “pornography”. Where broader language is used, Ofcom explained, services must assess whether the terms are sufficiently clear and accessible to users, including children.
It is important to remember that the effectiveness of these prohibitions depends not only on how they are worded but also on whether they are supported by robust moderation processes to enforce them. While broader or simplified phrasing is permitted, services should ensure that the intent and scope of prohibitions are well communicated and actionable in practice.
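To make the coverage question concrete, here is a minimal check of whether a service’s own terms categories, taken together, capture the legally defined harms it must prohibit. The category names and the abbreviated harm list are hypothetical illustrations; a real mapping would need to reflect the full OSA definitions and Ofcom’s harms guidance.

```python
# Illustrative coverage check: do the service's own terms categories,
# taken together, capture all legally defined harms that must be
# prohibited? Names and mappings are hypothetical examples.

TERMS_CATEGORIES = {
    "nudity and sexualised content": {"pornography"},
    "content promoting self-injury": {"suicide", "self_harm", "eating_disorders"},
}

REQUIRED_HARMS = {"pornography", "suicide", "self_harm", "eating_disorders"}

def uncovered_harms() -> set[str]:
    """Return legally defined harms not captured by any terms category."""
    covered = set().union(*TERMS_CATEGORIES.values())
    return REQUIRED_HARMS - covered
```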
In rare cases where prohibited content can’t be removed, HEAA may still apply
Ofcom confirmed that, in a limited set of circumstances, services may still be required to implement HEAA even if they prohibit content harmful to children. This applies where a service is technically unable to take down or moderate such content, although Ofcom emphasised that this scenario is expected to be rare in practice.
Where a service claims that moderation is not feasible, it must document the reasons for this and be prepared for regulatory scrutiny. Ofcom has formal powers to investigate such claims and will assess whether it is technically and operationally possible to implement protective measures. If Ofcom concludes that moderation or HEAA could reasonably be used but has not been, the service may face enforcement action.
Age assurance is evolving fast, and Ofcom is preparing for what comes next
While Ofcom’s immediate focus remains on implementing the OSA, it is actively monitoring international developments, including the European Commission’s draft guidelines on minors under the Digital Services Act (DSA). Ofcom noted that the DSA criteria for effective age assurance closely align with those already embedded in the Codes, which should ease the compliance burden for global platforms.
Looking ahead, Ofcom confirmed that this is only the first iteration of the Codes and that updates will follow as new technologies, evidence and international practices emerge. The regulator plans to publish two upcoming reports on age assurance: one focused on effectiveness, and another on the role of app stores. It is too early to say what the outcome of those reports will be, but watch this space.
Services that take steps now to meet the OSA standards should therefore be well positioned to adapt as regulatory expectations around age assurance inevitably strengthen both at home and abroad.
Other measures
Broad recommender system duties
Ofcom made an important clarification: if a service identifies a medium or high risk for even one type of content harmful to children, it still expects the full suite of recommender system duties under measure PCU E to apply across all categories of the relevant content type, not just the specific harm identified as medium or high risk. In other words, all PC harms must be included if any PC harm is identified as medium or high risk, and likewise for PPC harms, as sketched below.
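A minimal sketch of that expansion logic, assuming abbreviated and illustrative harm lists rather than the statutory definitions:

```python
# Illustrative sketch of the "full suite" expansion under measure PCU E.
# Harm category lists are abbreviated and illustrative, not statutory text.

PPC_HARMS = {"pornography", "suicide", "self_harm", "eating_disorders"}
PC_HARMS = {"abuse_and_hate", "bullying", "violence",
            "harmful_substances", "dangerous_stunts"}

def harms_in_scope_for_recommenders(assessed_risks: dict[str, str]) -> set[str]:
    """Given per-harm risk ratings ("low"/"medium"/"high"), return every
    harm the recommender system duties must cover: one medium/high PPC
    harm pulls in all PPC harms, and likewise for PC harms.
    """
    elevated = {harm for harm, level in assessed_risks.items()
                if level in ("medium", "high")}
    in_scope: set[str] = set()
    if elevated & PPC_HARMS:
        in_scope |= PPC_HARMS  # any elevated PPC harm brings in all PPC harms
    if elevated & PC_HARMS:
        in_scope |= PC_HARMS   # any elevated PC harm brings in all PC harms
    return in_scope
```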
When it comes to implementation, services are expected to take a precautionary approach, using any “relevant information” available to them to exclude or give low prominence to harmful content before it is formally reviewed. This includes relying on signals such as metadata, user reports or internal flags – meaning that even pending content sitting in a moderation queue could still require action if there’s a credible indication of harm.
We discussed in the session that this marks a significant shift from traditional content safe harbour principles based on actual knowledge. Ofcom’s expectation is that services act earlier in the content lifecycle, responding to signals that content may be harmful, and not just after it has been conclusively assessed. If a later review determines the content isn’t in fact harmful, it can then be reinstated.
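To illustrate the signal-led, precautionary handling described above, the following sketch demotes or excludes recommendation candidates on credible signals and reinstates content cleared on review. The signal names and thresholds are hypothetical assumptions, not Ofcom’s specification.

```python
# Illustrative sketch of a precautionary filter applied before recommendation.
# Signal names and thresholds are hypothetical; the point is that credible
# indications of harm (user reports, internal flags, pending moderation
# status) lead to exclusion or low prominence *before* formal review.

from dataclasses import dataclass, field

@dataclass
class Candidate:
    content_id: str
    user_reports: int = 0
    internal_flags: set[str] = field(default_factory=set)
    in_moderation_queue: bool = False
    reviewed_harmful: bool | None = None  # None = not yet reviewed

def recommender_action(item: Candidate) -> str:
    """Return "exclude", "demote" or "allow" for a recommendation candidate."""
    if item.reviewed_harmful is True:
        return "exclude"   # confirmed harmful: never recommend
    if item.reviewed_harmful is False:
        return "allow"     # reviewed and cleared: reinstate
    # Not yet reviewed: act on credible signals rather than waiting
    if item.internal_flags or item.user_reports >= 3 or item.in_moderation_queue:
        return "exclude" if item.user_reports >= 10 else "demote"
    return "allow"
```

On this model, an item sitting in a moderation queue is itself a credible signal, so it is given low prominence rather than recommended at full prominence until review completes.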
Use of proactive technologies still under review
While the OSA includes duties to “prevent” children from encountering certain types of harmful or illegal content, Ofcom has been careful in its current Codes not to recommend the use of proactive technologies such as content scanning or upload filtering.
Ofcom emphasised that its position is guided by proportionality and current legal constraints, and that a further consultation will be held specifically on the use of proactive technologies. Until that process is complete, Ofcom cannot take a definitive stance.
In the meantime, services are not expected to adopt blanket pre-publication filtering. As noted during the session, it is not realistic to expect platforms to remove all harmful or illegal content before it appears, and Ofcom acknowledges the operational limits of even the most sophisticated moderation systems.
No formal grace period for compliance
There is no formal grace period once the duties become enforceable on 25 July 2025. However, Ofcom recognises that full implementation may take time (subject to a hard deadline of six months, as with the illegal harms duties), and it will take a pragmatic approach to enforcement, particularly where services demonstrate genuine, well-documented progress towards compliance. Services are encouraged to take early action on the “quick wins”, such as updating governance processes, T&Cs and complaints procedures – obligations which apply to all services and do not depend on assessment outcomes. Having a clear plan, timeline and documentation of progress will be a helpful indicator of intent and may be considered in any future enforcement context.
Ephemeral content takedown requirements
Ofcom clarified that “taking down” content means ensuring it is no longer accessible to children once the service becomes aware it is harmful. For ephemeral content – such as in-game chat that disappears when a session ends – expiry may therefore already satisfy the requirement, provided the content truly becomes unavailable after a short time. However, services must still take appropriate action when they become aware that ephemeral content is harmful: this includes proactively banning certain types of content and ensuring moderation systems can respond if harmful content has not yet disappeared, as sketched below.
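A minimal sketch of that takedown decision, assuming a simple expiry model and hypothetical parameters:

```python
# Illustrative sketch of the takedown decision for ephemeral content.
# The expiry model and parameter names are hypothetical assumptions.

import time

def takedown_needed(posted_at: float, lifetime_seconds: float,
                    flagged_harmful: bool) -> bool:
    """Return True if active removal is still required.

    If ephemeral content has already expired and is genuinely no longer
    accessible to children, expiry itself satisfies the takedown duty;
    otherwise, an item flagged as harmful must still be actively removed.
    """
    still_accessible = time.time() < posted_at + lifetime_seconds
    return flagged_harmful and still_accessible
```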
Separate assessments required despite overlap across harm categories
While content types such as hate, abuse and bullying, as well as the three distinct subtypes of “violence” (providing instructions for violent content; violent content against humans; and violent content against animals or fictional creatures), may often overlap in practice, services are still required to assess each harm category separately under the OSA. Ofcom acknowledged the practical challenges this creates but noted that this is set out in the legislation itself and confirmed that it will monitor the situation with the potential to review the categories over time.
Alternative measures must achieve the same protective outcomes
While services are not required to follow the Codes’ recommended measures to the letter, any alternative steps they take must deliver the same level of user and child protection. Ofcom noted that it hasn’t yet seen many alternative approaches being proposed but expects this to evolve as services continue with implementation.
Whether an alternative is a minor variation or a completely different approach, the key consideration will be whether it meets the duties under the OSA. Ofcom also stressed that services must keep a clear, separate record of the measures they are relying on. These records will inform the Supervision Team’s view of sufficiency, and some strong alternatives may be incorporated into future versions of the Codes. Flexibility is permitted, but only if the alternative provides an equivalent level of safety.
In-depth 2025-162