ICO fines Reddit and Imgur over failures to protect children's personal data online
Andrea James, Andrew Darwin & Anna McKibbin
18 Mar 2026
Two recent enforcement decisions from the UK Information Commissioner’s Office (ICO) concerning failures to protect children’s personal data online send a clear warning signal to online platforms about UK regulators’ priorities when it comes to children and online safety.
In February 2026, the ICO imposed a £14.47 million fine on Reddit (a well-known user-curated news and discussion platform) and a £247,590 fine on MediaLab, the owner of Imgur (an image hosting and sharing service, designed for easily sharing photos, GIFs, and memes online).
For online platforms and digital services, the decisions provide several practical insights.
Key takeaways:
1. Operationalising the Children’s Code
Both decisions centre on the UK’s Age Appropriate Design Code, often referred to as the Children’s Code (the Code). The Code, issued by the ICO under the Data Protection Act 2018 and the UK General Data Protection Regulation, translates general data-protection principles into 15 design standards for online services likely to be accessed by children. At its core, the Code requires organisations to place the best interests of the child at the centre of service design.
The Reddit and Imgur decisions illustrate how the ICO is enforcing these standards in practice.
In both cases, the ICO found two central failings:
- a failure to carry out adequate risk assessments, including data protection impact assessments (DPIAs), of the risks their services posed to children; and
- a failure to implement effective age assurance, with the result that under-age users could access the services and have their personal data processed without a lawful basis.
The ICO emphasised that the absence of risk assessment meant the platforms had not properly considered how children’s data could expose them to harmful or inappropriate content.
2. How children access platforms with social features
A central theme in both decisions is the inadequacy of self-declared age gates, particularly for platforms with user interaction and user-generated content.
In the Reddit case, the platform prohibited users under 13 in its terms of service, yet the ICO found that it had no effective mechanism to verify users’ ages until July 2025. As a result, under-13 users were able to access the platform, and their personal data was processed without a lawful basis. Imgur presented a similar issue: users could access the service without meaningful age checks despite the platform hosting user-generated content and processing personal data.
The ICO was explicit that simply asking users to declare their age is insufficient where children are likely to access the service. At the same time, the decision highlights a persistent compliance tension: stronger age verification often requires collecting additional personal data, while data protection law emphasises data minimisation.
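By way of illustration only, one common way of easing that tension is to retain just the outcome of an age-assurance check rather than the underlying identity data. The short Python sketch below is hypothetical: the record structure, function names, and the notion of an external age-assurance provider are assumptions for illustration, not features of either platform's actual systems.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AgeAssuranceResult:
    # The only age-related fact retained is the over/under-13 outcome;
    # the date of birth or ID document used to establish it is never stored.
    user_id: str
    is_over_13: bool
    method: str        # e.g. "third_party_estimation" (hypothetical label)
    checked_at: datetime

def record_age_check(user_id: str, over_13: bool, method: str) -> AgeAssuranceResult:
    # Persist only the boolean outcome and an audit timestamp.
    return AgeAssuranceResult(
        user_id=user_id,
        is_over_13=over_13,
        method=method,
        checked_at=datetime.now(timezone.utc),
    )

# Hypothetical usage: an external provider returns a yes/no signal only.
result = record_age_check("u-1024", over_13=True, method="third_party_estimation")

The design point is that the data used to establish the check need never be stored, keeping a stronger age-assurance step broadly consistent with data minimisation.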
3. Increasing regulatory convergence in the UK
These cases also illustrate the convergence of regulatory regimes governing children’s safety online.
In the UK, children's online protection is now addressed through two key overlapping frameworks:
- the data protection framework, comprising the UK GDPR, the Data Protection Act 2018 and the Children's Code, enforced by the ICO; and
- the online safety framework under the Online Safety Act, enforced by Ofcom.
Although these regimes have different legal bases, the regulators are increasingly focusing on similar risks.
The ICO has explicitly stated that it is working closely with Ofcom to coordinate regulatory oversight. For businesses, this creates a realistic possibility that a compliance issue identified by one regulator could trigger scrutiny from another.
4. Parallel trends in the European Union
Although these decisions arise under UK law, the regulatory logic closely mirrors developments in the European Union.
Under the General Data Protection Regulation (GDPR), children's data also receives enhanced protection, and parental consent is required before online services are offered directly to children below the age of digital consent (16 by default, though Member States may set it as low as 13). Meanwhile, the Digital Services Act (DSA) requires platforms to assess and mitigate systemic risks affecting minors, including exposure to harmful content and the design of recommender systems.
As a result, the UK and EU frameworks are converging around similar expectations: risk-based design for services likely to be accessed by minors; documented impact assessments; stronger safeguards for children's personal data; and meaningful age assurance mechanisms.
5. Risk assessments as the foundation of compliance
Both ICO decisions illustrate how risk assessments and DPIAs can serve as a critical compliance safeguard. Under the Code, as well as broader UK (and EU) GDPR obligations, platforms that are likely to be accessed by children are expected to assess and document risks to minors. In practice, such assessments are often the primary evidence that an organisation has identified potential harms and implemented proportionate safeguards. Both Reddit and MediaLab could have significantly reduced their fines had they conducted and documented DPIAs.
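To illustrate what "documented" can mean in practice, a child-risk assessment can be captured as a structured, auditable record. The Python sketch below is hypothetical; the field names are assumptions for illustration and do not reflect any ICO-prescribed DPIA template.

from dataclasses import dataclass

@dataclass
class ChildRiskAssessment:
    # Illustrative record of a documented child-risk assessment;
    # the fields are assumptions, not an ICO-prescribed DPIA format.
    processing_activity: str     # what personal data is processed, and why
    likely_child_access: bool    # is the service likely to be accessed by children?
    identified_risks: list[str]  # e.g. exposure to harmful or inappropriate content
    mitigations: list[str]       # e.g. age assurance, high-privacy defaults
    residual_risk: str           # assessed level after mitigations are applied

example = ChildRiskAssessment(
    processing_activity="personalised content recommendations",
    likely_child_access=True,
    identified_risks=["exposure to age-inappropriate content"],
    mitigations=["age assurance at sign-up", "child-safe default settings"],
    residual_risk="low",
)

Whatever form the record takes, the point the decisions make is that it must exist, be kept current, and demonstrably inform the design of the service.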
More broadly, the regulatory framework governing children online has become increasingly complex. Online services must navigate the Code, guidance under the Online Safety Act, and parallel requirements emerging under EU instruments such as the GDPR and the DSA. The challenge for organisations is therefore less the absence of guidance than the need to map and align these overlapping obligations.
The cases also show that enforcement is directed particularly at services with social or community features. Platforms built around user interaction, messaging, or content sharing fall squarely within the ICO’s priorities. Importantly, a service does not need to view itself as a traditional social network to be treated as such. Reddit’s treatment shows that forum-based or community platforms are increasingly regarded as social platforms with corresponding responsibilities.
For companies operating across both jurisdictions, a single compliance strategy addressing children’s risks across data protection and platform regulation will likely become necessary.
If you have questions or concerns about the protection of children’s safety online, please contact Reign Lee or Thibaut D’hulst.