The European Commission today found Meta's Instagram and Facebook in preliminary breach of the Digital Services Act (DSA), concluding that the company has failed to diligently identify, assess and mitigate the risks posed to minors under 13 years old who access its services. The announcement, made in Brussels on April 29, 2026, marks one of the most direct confrontations yet between EU regulators and Meta over child safety on its core social platforms.
The Commission's findings are preliminary and do not prejudge the final outcome of the investigation. They nonetheless carry significant weight. If the findings are ultimately confirmed, the Commission may issue a non-compliance decision, which can trigger a fine proportionate to the infringement and capped at 6% of the provider's total worldwide annual turnover. The Commission can also impose periodic penalty payments to compel compliance.
What the Commission found
According to the Commission's press release, Meta's own terms and conditions set 13 as the minimum age to access Instagram and Facebook, yet the measures the company has put in place to enforce that restriction do not appear to be effective. They do not adequately prevent minors under 13 from accessing the services, nor do they promptly identify and remove those who have already gained access.
The failures are specific and documented. When creating an account, according to the Commission, minors below 13 can enter a false birth date that makes them appear at least 13 years old, with no effective controls in place to check the correctness of the self-declared date of birth. In other words, a child can simply lie about their age and the system accepts the input without further verification. This is not a theoretical gap - it is a structural one.
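To see why this reads as structural rather than incidental, consider what a purely self-declared age gate reduces to in code. The sketch below is illustrative only - the function names are invented and it is not a description of Meta's systems - but it captures the mechanism the Commission describes: the gate admits whatever birth date the user types, with nothing to corroborate the claim.

```python
from datetime import date

MINIMUM_AGE = 13  # the threshold set by Meta's own terms and conditions

def age_from_birth_date(birth_date: date, today: date | None = None) -> int:
    """Age in whole years, computed from a self-declared birth date."""
    today = today or date.today()
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def self_declared_gate(birth_date: date) -> bool:
    """The structural gap in miniature: the gate trusts the typed date.
    Nothing corroborates the claim, so a child who enters a date
    thirteen or more years in the past passes."""
    return age_from_birth_date(birth_date) >= MINIMUM_AGE

# A child born in 2016 types 2010 instead and is admitted.
print(self_declared_gate(date(2010, 1, 1)))  # True - no verification occurs
```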
Regulators identified a separate problem in the reporting mechanism Meta provides for flagging under-13 users. According to the Commission, the tool for reporting minors under 13 is difficult to use and not effective, requiring up to seven clicks just to reach the reporting form, which is not automatically pre-filled with the reported user's information. And even when a minor under 13 is reported for being under the age threshold, there is often no proper follow-up: the reported minor can simply continue to use the service without any kind of check.
Seven clicks to reach a reporting form. No automatic pre-fill. No guaranteed follow-up. These are the technical realities that regulators have placed on record.
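For contrast, pre-filling is a small engineering step. The sketch below is a hypothetical design, not a description of any Meta system: the report's identifying fields are populated from the profile being reported, so the person reporting confirms rather than transcribes, and the report is explicitly queued for review rather than ending at submission.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    user_id: str
    username: str
    profile_url: str

def prefilled_underage_report(reported: Profile, reporter_id: str) -> dict:
    """Hypothetical pre-filled report: the identifying fields come from
    the profile being reported, not from the reporter's typing. The
    Commission found Meta's actual form is not pre-filled and takes up
    to seven clicks to reach."""
    return {
        "type": "underage_account",
        "reported_user_id": reported.user_id,
        "reported_username": reported.username,
        "reported_profile_url": reported.profile_url,
        "reporter_id": reporter_id,
        "review_required": True,  # follow-up is the other documented gap
    }
```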
The risk assessment problem
Underlying these operational failures, the Commission points to a deeper methodological problem. According to the press release, the failures rest on an incomplete and arbitrary risk assessment, one that inadequately identifies the risk of minors under 13 accessing Instagram and Facebook and being exposed to age-inappropriate experiences.
Meta's own internal assessment, the Commission argues, contradicts large bodies of evidence from across the European Union. According to the Commission's findings, roughly 10-12% of children under 13 are accessing Instagram and/or Facebook. That is a significant proportion of the under-13 population in the bloc, and it stands directly at odds with Meta's apparently more sanguine internal conclusions about the scale of the problem.
The Commission goes further, stating that Meta seems to have disregarded readily available scientific evidence indicating that younger children are more vulnerable to potential harms caused by services like Facebook and Instagram. Regulators are not simply citing compliance boxes left unticked. They are saying the company looked at the evidence and drew conclusions that the evidence does not support.
At this stage, according to the Commission, Instagram and Facebook must change their risk assessment methodology so that it evaluates which risks arise on these platforms in the European Union and how they manifest. The platforms must also strengthen their measures to prevent, detect and remove minors under 13 from the service, and must effectively counter and mitigate the risks such minors could face - ensuring a high level of privacy, safety and security for minors.
What happens next
The DSA grants platforms a right of defence before any final decision. Instagram and Facebook can now examine the documents in the Commission's investigation files and reply in writing to the preliminary findings. The platforms can also take measures to remedy the breaches, in line with the 2025 DSA Guidelines on the protection of minors. In parallel, the European Board for Digital Services will be consulted.
If the Commission's views are ultimately confirmed, it may issue a non-compliance decision, carrying the fine of up to 6% of total worldwide annual turnover and the periodic penalty payments described above.
The preliminary findings are part of the Commission's formal proceedings against Meta, which cover Instagram and Facebook. These formal proceedings are separate from any ongoing investigations into other potential DSA breaches by the company.
Context: a pattern of enforcement
Today's action does not arrive in isolation. The Commission has built a record of DSA enforcement against major platforms over the past two years, with Meta appearing repeatedly across multiple proceedings.
In October 2025, the Commission preliminarily found both TikTok and Meta in breach of DSA transparency obligations, targeting researcher data access and the mechanisms users employ to report illegal content. That proceeding focused on burdensome procedures that leave researchers with partial or unreliable data - a different angle from today's child protection findings, but part of the same broader enforcement push.
In December 2025, the Commission issued a €120 million fine against X - the first non-compliance decision under the DSA - covering deceptive verification design, inadequate advertising repository transparency, and failure to give researchers access to public data.
Prior to that, in April 2025, the Commission fined Meta €200 million for Digital Markets Act violations related to consumer choice and data usage, a decision Meta subsequently appealed.
The DSA has been fully applicable to all platforms since February 17, 2024. Its obligations for Very Large Online Platforms - services with at least 45 million average monthly active users in the EU - include mandatory risk assessments, mitigation measures, external audits, and transparency reporting. Facebook and Instagram comfortably exceed that threshold and have been subject to this most stringent tier of obligations since their designation as Very Large Online Platforms in April 2023.
Meta's prior child safety steps - and their limits
Meta has not been passive on youth safety in recent years, though regulators now say its steps have been insufficient. Instagram launched Teen Accounts in September 2024, placing teen users into strict privacy and safety settings by default, with users under 16 needing a parent's permission to loosen them and parents gaining controls such as time limits and content oversight. By April 8, 2025, Meta reported at least 54 million active Teen Accounts globally, while also announcing tighter restrictions on Instagram Live and direct-message protections for under-16s, alongside plans to extend the framework to Facebook and Messenger.
In October 2025, Meta aligned Teen Account content filtering with PG-13 movie rating standards in the United States, United Kingdom, Australia, and Canada - a move that drew attention for borrowing a familiar cultural reference point rather than deploying purely technical age detection. On April 9, 2026, Instagram expanded its 13+ content rating and a new Limited Content setting to India.
But the Commission's findings today target something distinct from what Teen Accounts address. Teen Accounts apply to users aged 13 to 17. The issue the Commission has identified is what happens below 13 - whether Meta is doing enough to keep children who should not be on the platforms at all from getting there in the first place. The gap between Meta's voluntary youth safety measures and the Commission's legal requirements is precisely that distinction.
The age verification question has also drawn scrutiny from within the technology industry itself. In June 2025, Google's Global Director of Privacy, Safety and Security Policy published a critique of Meta's proposed app store-based age verification system, warning that it would require sharing granular age-band data with millions of developers who do not need it. The critique highlighted fundamental divisions over methodology, not just intent.
What the DSA requires platforms to do
The DSA does not specify which technical solution platforms must use for age verification. It requires Very Large Online Platforms to conduct thorough risk assessments covering systemic risks - including those related to minors - and to put in place reasonable, proportionate and effective mitigation measures. The regulation sets a duty of diligence, not a specific technical standard.
That framing gives platforms flexibility, but it also demands judgment. According to the DSA's broader requirements as described across previous enforcement actions, platforms must enforce their own rules: terms and conditions are not merely written statements but the basis for concrete action to protect users, including children. The Commission's position, as stated in today's press release, is that Meta's terms set the threshold at 13 but the company's systems do not meaningfully enforce it.
The 2025 DSA Guidelines on the protection of minors - referenced in today's press release as a benchmark for any remedial action - provide more detailed guidance on what compliance looks like in practice. Platforms now have the opportunity to act on that guidance before the Commission reaches any final determination.
Why this matters for the marketing community
For advertising and marketing professionals, the DSA's child protection enforcement has direct operational consequences. Platforms that face non-compliance decisions may be required to implement more stringent age detection and verification systems. Tighter controls on who is accessing a platform translate into changes in audience composition data, potential shifts in reach for campaigns targeting specific demographics, and additional compliance requirements for advertisers.
Meta's advertising business in Europe is substantial. The company has previously estimated that its personalized advertising services generated €213 billion in economic activity and supported 1.44 million jobs across the EU in 2024. Any structural changes to how the platforms operate in response to DSA enforcement will move through that ecosystem.
More broadly, the pattern of enforcement the Commission is building - spanning transparency obligations, consumer choice, researcher data access, and now child protection - signals that DSA compliance is not a one-time exercise. Each investigation that results in preliminary findings, and potentially a non-compliance decision, sets a more detailed baseline for what sufficient compliance looks like. The child safety findings today add to that accumulating record.
Timeline
- November 2022 - Digital Services Act enters into force in the European Union
- February 17, 2024 - DSA becomes fully applicable to all platforms (designated Very Large Online Platforms, including Facebook and Instagram, had been subject to their obligations since August 2023)
- September 2024 - Instagram launches Teen Accounts, applying automatic safety settings for users under 16, with global rollout beginning
- November 28, 2024 - Meta releases comprehensive DSA transparency report covering April to September 2024 operations across Facebook and Instagram
- April 8, 2025 - Meta reports 54 million active Teen Accounts globally and announces additional restrictions for under-16s on Instagram, Facebook, and Messenger
- April 23, 2025 - European Commission fines Meta €200 million for DMA violations related to consumer choice
- June 13, 2025 - Google's privacy chief publishes critique of Meta's proposed app store age verification system, warning of data-sharing risks for children
- July 2, 2025 - Meta formally appeals the Commission's DMA compliance decision
- October 14, 2025 - Instagram aligns Teen Account content filtering with PG-13 rating standards in the US, UK, Australia, and Canada
- October 24, 2025 - European Commission preliminarily finds Meta and TikTok in breach of DSA transparency obligations over researcher data access and content moderation reporting
- December 5, 2025 - European Commission issues €120 million fine against X, the first DSA non-compliance decision
- April 9, 2026 - Instagram expands 13+ content ratings and Limited Content setting to India
- April 29, 2026 - European Commission issues preliminary findings that Meta's Instagram and Facebook are in breach of the DSA for failing to prevent minors under 13 from accessing their services
Summary
Who: The European Commission has taken action against Meta Platforms, operator of Instagram and Facebook, two of the largest social media platforms in the European Union by active user base.
What: The Commission today issued preliminary findings that Instagram and Facebook are in breach of the Digital Services Act for failing to diligently identify, assess and mitigate the risks of minors under 13 years old accessing their services. Specific failures identified include the absence of effective controls on self-declared birth dates, a reporting tool requiring up to seven clicks that provides no guaranteed follow-up, and an internal risk assessment methodology that the Commission says contradicts EU-wide evidence showing roughly 10-12% of children under 13 access these platforms.
When: The preliminary findings were announced on April 29, 2026. They form part of formal proceedings under the DSA, which has been fully operational since February 17, 2024. The findings are preliminary and do not prejudge the final outcome.
Where: The proceedings are conducted by the European Commission in Brussels. They affect Meta's operations across all EU member states where Instagram and Facebook are available to users.
Why: The DSA requires Very Large Online Platforms to diligently assess and mitigate systemic risks, including risks to minors. The Commission concluded that Meta's measures to enforce its own minimum age of 13 are not effective - children can bypass them with false birth dates, the reporting mechanism is difficult to access, and the company's risk assessment methodology fails to reflect available scientific evidence on the vulnerability of younger children. If the findings are ultimately confirmed, Meta could face a fine of up to 6% of total worldwide annual turnover.