The UK's Information Commissioner's Office on 5 February 2026 fined MediaLab.AI, Inc. £247,590 for failing to use children's personal information lawfully on Imgur, the image sharing and hosting platform the California-based company owns. The penalty follows a multi-year investigation that found MediaLab allowed children to access Imgur without any of the basic safeguards that UK data protection law requires. The announcement arrived at a moment when regulators across Europe and the United States are pressing platforms hard on child safety - and when the technical and legal standards for age assurance are tightening faster than many operators anticipated.
The fine is relatively modest by the standards of recent enforcement actions against technology companies. Under the UK GDPR, the ICO may issue fines of up to £17.5 million or 4% of an organisation's annual worldwide turnover, whichever is higher. MediaLab's penalty of £247,590 was calibrated against the company's global turnover, the number of children affected, the degree of potential harm, and the duration of the contraventions - which stretched from September 2021 to September 2025, a period of four years. The ICO also took into account MediaLab's acceptance of its provisional findings set out in a Notice of Intent issued in September 2025, as well as the company's commitment to address the infringements if access to the Imgur platform in the UK is restored in future.
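The "whichever is higher" cap is a simple rule worth making concrete. A minimal sketch (the turnover figures below are hypothetical illustrations, not MediaLab's actual figures):

```python
def uk_gdpr_max_fine(annual_worldwide_turnover_gbp: float) -> float:
    """Statutory cap for the higher tier of UK GDPR fines:
    the greater of £17.5 million or 4% of annual worldwide turnover."""
    return max(17_500_000.0, 0.04 * annual_worldwide_turnover_gbp)

# Hypothetical turnover of £100 million: 4% is £4 million,
# so the £17.5 million floor applies.
print(uk_gdpr_max_fine(100_000_000))    # → 17500000.0

# Hypothetical turnover of £1 billion: 4% is £40 million,
# which exceeds the floor.
print(uk_gdpr_max_fine(1_000_000_000))  # → 40000000.0
```

Against either cap, the £247,590 penalty illustrates how far below the statutory maximum the ICO calibrated this particular fine.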
What Imgur is, and who owns it
Imgur launched in 2009 as a simple image hosting service designed to make sharing pictures on Reddit easier. It grew into a large standalone community, hosting hundreds of millions of images and attracting substantial traffic from users posting memes, photography, and other visual content. In 2021, MediaLab.AI, Inc. - a company that acquires and operates consumer internet platforms - purchased Imgur from its founder. MediaLab also owns other well-known internet properties. Although the company is headquartered in the United States, it fell within the territorial scope of UK data protection law because it provided services to users in the UK.
The platform has at various points restricted access to UK users. The ICO noted in its penalty notice that if MediaLab resumes processing the personal data of children in the UK without implementing the measures it has committed to, the regulator may take further enforcement action.
Three specific breaches
The ICO's investigation concluded that MediaLab breached the UK GDPR in three distinct ways.
First, the company failed to implement any measures to check the age of its users. Imgur's terms of service stated that children under 13 could only use the platform with parental supervision, but no mechanism existed to enforce or even approximate that requirement. The platform collected personal data from all users - including children - without having any reliable way to identify who among them was a minor.
Second, MediaLab processed the personal information of children under 13 without parental consent or any other lawful basis. UK law requires that online services wishing to rely on consent as their lawful basis for processing a child's data must obtain that consent from the child's parent or carer. MediaLab had no parental consent mechanism in place. According to the ICO, the company "did not implement any form of age assurance measures to determine the age of Imgur users and did not have measures in place to obtain parental consent where children under 13 used the platform."
Third, the company failed to carry out a data protection impact assessment (DPIA) to identify and reduce the privacy risks children faced when using the service. DPIAs are a formal requirement under UK GDPR for processing that is likely to result in high risk to individuals. Children's data, particularly on platforms with user-generated content of a potentially sensitive nature, plainly meets that threshold.
Content risks the ICO identified
The investigation identified specific categories of harmful content that children using Imgur were exposed to. The ICO found that personal information often drives the content children see online, and that because MediaLab had no way of knowing the age of Imgur users, children "were at risk of being exposed to harmful content on the platform, including content related to eating disorders, homophobia, antisemitism and images of a sexual or violent nature."
This detail is significant. It illustrates the connection between data processing failures and real-world harm to minors. Age assurance is not merely a compliance tick-box; it is the mechanism by which platforms tailor or restrict the content children encounter. Without it, personalisation and recommendation systems - typically driven by behavioural data - operate on children's data in ways that can surface age-inappropriate material.
John Edwards, the UK Information Commissioner, said in the announcement: "MediaLab failed in its legal duties to protect children, putting them at unnecessary risk. For years, it allowed children to use Imgur without any effective age checks, while collecting and processing their data, which in turn exposed them to harmful and inappropriate content."
Edwards continued: "Age checks help organisations keep children's personal information safe and not used in ways that may harm them, such as by recommending age-inappropriate content. This fine is part of our wider work to drive improvements in how digital platforms use children's personal data. Ignoring the fact that children use these services, while processing their data unlawfully, is not acceptable. Companies that choose to ignore this can expect to face similar enforcement action."
The Children's code and its design standards
The ICO's enforcement action sits within a broader framework. The UK's Children's code - formally the Age Appropriate Design Code - translates the legal requirements of UK GDPR into concrete design standards for online services that are likely to be accessed by children under 18. The code requires that services place children's best interests at the forefront and give them a high level of privacy by default.
In December 2025, the ICO reported strong progress on its Children's code strategy, citing a proactive supervision programme to drive improvements in how social media and video sharing platforms handle children's data. The regulator explicitly characterised the MediaLab penalty as "part of a wider intervention by us to improve the safety of children's personal information online."
For organisations uncertain about how to comply, the ICO has issued guidance on age assurance tools - systems that can verify or estimate user age - as a "guardrail to prevent children from accessing online services they shouldn't be using or to help platforms tailor their online experience accordingly." The guidance indicates that organisations can either apply the full protections of the Children's code to all users, or use proportionate age assurance tools to tailor safeguards by age. Where children below a certain age are not allowed to use a service at all, the ICO says organisations must focus on preventing access and enforce their minimum age requirements using "robust age assurance methods."
Further detail on the regulator's expectations is available in its published age assurance opinion. The monetary penalty notice itself was published on 26 February 2026, following a period during which the ICO considered redaction of commercially sensitive and personal information.
The broader regulatory context
The ICO's action against MediaLab does not exist in isolation. Across Europe and North America, the question of how platforms should verify user age - and what happens when they don't - has moved rapidly up the regulatory agenda.
The UK Online Safety Act, which received Royal Assent in October 2023, established sweeping new requirements for platforms serving users in the UK, including mandatory age verification for services hosting adult content. Its enforcement has driven platforms including Bluesky and X to implement age assurance systems. The European Data Protection Board adopted Statement 1/2025 on 11 February 2025, setting out ten principles for GDPR-compliant age assurance, including requirements for data minimisation, the least intrusive verification method available, and a prohibition on additional tracking or profiling through the verification process.
In Germany, Sparkasse partnered with Google in July 2025 to launch the first national wallet-based digital age verification service in the EU, using zero-knowledge proof cryptography to confirm user ages without exposing detailed personal data. The EU's own digital identity framework is tracking a similar path, with the Commission developing continent-wide technical solutions linked to the Digital Services Act and the eIDAS regulation.
In the United States, new COPPA rules published by the FTC on 22 April 2025 took effect on 23 June 2025, with a full compliance deadline of 22 April 2026. Those amendments introduced stricter requirements on consent for third-party data sharing involving children's data - and represented the most significant changes to US children's online privacy protections in over a decade.
Meanwhile, the ICO itself has been active on other fronts. It fined 23andMe £2.31 million in June 2025 following a credential stuffing attack that exposed the personal data of 155,592 UK customers. The UK Data (Use and Access) Act 2025, which received Royal Assent on 10 July 2025, introduced new mandatory complaint reporting obligations for controllers under a framework inserted as section 164B.
Germany's digital economy association BVDW, in a position paper published on 10 February 2026, argued that mandatory age checks should be confined to platforms presenting genuine, high-level risks for minors - principally pornography and gambling - and that advertising-funded editorial outlets serving mixed audiences should not be subject to blanket verification mandates. The paper was backed by a Civey survey of 2,500 people conducted on 3 and 4 February 2026.
What this means for the marketing and advertising industry
For marketing professionals and publishers, the ICO's fine carries practical implications. Programmatic advertising systems rely on user data to personalise content and target audiences. When children's data flows into those systems - unidentified - the risks extend beyond regulatory fines. Advertisers may inadvertently target minors, publishers may violate brand safety standards, and the platforms hosting that inventory may find themselves exposed to enforcement action of exactly the kind MediaLab has now faced.
The ICO's investigation makes clear that ignorance of a user's age is not a defence. It is itself the violation. Platforms that collect personal data from users - and whose services are likely to be accessed by children - must take affirmative steps to understand who those users are and to process their data on a lawful basis. Failing to know is failing to comply.
The fine also signals something about enforcement proportionality. £247,590 is not a catastrophic sum for a company of MediaLab's scale. But the reputational consequences, the ongoing regulatory scrutiny, and the requirement to implement compliant systems before resuming UK operations represent real operational costs. The ICO made clear that similar enforcement action awaits companies that choose to ignore these obligations.
The ICO's broader investigation into real-time bidding has long raised concerns about the personal data - including data that may belong to children - that flows through programmatic advertising systems. The MediaLab case reinforces the point: the regulatory focus is not solely on the infrastructure of ad tech, but on the data practices of the platforms that generate the inventory.
Age assurance, once viewed as a compliance requirement primarily for adult content sites, is becoming a baseline expectation for any platform likely to be used by minors. For the marketing industry, that shift has significant targeting, measurement, and data governance implications.
Timeline
- September 2021 - MediaLab begins processing personal data of children using Imgur in ways the ICO later determines breach the UK GDPR.
- March 2024 - IAB raises concerns with the FTC about proposed COPPA amendments, warning of potential harm to children's online access. (PPC Land)
- January 2025 - ICO publishes guidance on consent-or-pay models, including specific considerations for children. (PPC Land)
- 11 February 2025 - European Data Protection Board adopts Statement 1/2025 establishing ten GDPR-compliant principles for age assurance systems. (PPC Land)
- 22 April 2025 - FTC publishes comprehensive COPPA amendments, taking effect 23 June 2025. (PPC Land)
- 5 June 2025 - ICO fines 23andMe £2.31 million following a data breach affecting 155,592 UK customers. (PPC Land)
- 13 June 2025 - Google's Global Director of Privacy Safety criticises Meta's age verification proposal as creating unnecessary risks for children. (PPC Land)
- 1 July 2025 - Sparkasse and Google announce wallet-based digital age verification partnership at the Global Digital Collaboration Conference. (PPC Land)
- 10 July 2025 - Bluesky announces age verification implementation for UK users under the Online Safety Act. (PPC Land)
- 10 July 2025 - Data (Use and Access) Act 2025 receives Royal Assent. (PPC Land)
- 26 July 2025 - X implements age assurance measures in compliance with the UK Online Safety Act, Irish Online Safety Code, and EU DSA. (PPC Land)
- 22 August 2025 - Bluesky blocks access from Mississippi IP addresses rather than comply with that state's age verification law. (PPC Land)
- September 2025 - ICO issues Notice of Intent to MediaLab, setting out provisional findings. MediaLab accepts the findings.
- September 2025 - MediaLab ceases processing personal data of children in the UK via Imgur.
- 26 September 2025 - Meta announces £2.99 monthly subscription option for UK Facebook and Instagram users following ICO guidance. (PPC Land)
- 4 February 2026 - ICO issues the monetary penalty notice to MediaLab.AI, Inc.
- 5 February 2026 - ICO publicly announces the £247,590 fine against MediaLab for children's privacy failures on Imgur.
- 10 February 2026 - Germany's BVDW publishes position paper on youth protection in digital spaces, calling for risk-proportionate age verification and media literacy over outright platform bans. (PPC Land)
- 26 February 2026 - ICO publishes the monetary penalty notice after considering redaction of personal and commercially sensitive information.
Summary
Who: MediaLab.AI, Inc., the US-based owner of the Imgur image hosting platform, was fined by the UK's Information Commissioner's Office (ICO). The penalty was announced by John Edwards, UK Information Commissioner.
What: A £247,590 monetary penalty under section 155(1) of the Data Protection Act 2018, for three violations of UK GDPR: failing to implement any age assurance measures; processing the personal data of children under 13 without parental consent or any other lawful basis; and failing to carry out a data protection impact assessment. The ICO found that children using Imgur were exposed to content related to eating disorders, homophobia, antisemitism, and images of a sexual or violent nature.
When: The contraventions occurred between September 2021 and September 2025. The ICO issued the Notice of Intent in September 2025. The penalty notice was dated 4 February 2026 and publicly announced on 5 February 2026. The full penalty notice was published on 26 February 2026.
Where: The enforcement action covers the processing of personal data of children in the United Kingdom through the Imgur platform. MediaLab.AI, Inc. is incorporated and headquartered in the United States.
Why: MediaLab failed to implement any form of age assurance to determine the ages of Imgur users, processing children's data without a lawful basis and without conducting the required data protection impact assessment. The ICO determined that this exposed children to potentially harmful content and breached their rights under UK GDPR. The fine forms part of a broader ICO programme to improve how digital platforms handle children's personal data under the UK Children's code.