EU finds TikTok and Meta in breach of Digital Services Act transparency rules
European Commission preliminarily finds TikTok and Meta in breach of DSA transparency obligations on October 24, 2025, with implications for researcher data access and content moderation.
The European Commission announced on October 24, 2025, that it had preliminarily found both TikTok and Meta in breach of transparency obligations under the Digital Services Act. The findings target fundamental aspects of platform accountability, from researcher access to public data to the mechanisms users employ to report illegal content.
According to the Commission's press release, Facebook, Instagram and TikTok may have implemented burdensome procedures that leave researchers with partial or unreliable data. This directly impacts their ability to conduct research on whether users, including minors, are exposed to illegal or harmful content. The platforms now face potential fines of up to 6% of their total worldwide annual turnover if the preliminary findings are ultimately confirmed.
Researcher data access failures
The Commission's preliminary findings reveal that all three platforms (Facebook, Instagram and TikTok) have established procedures and tools for researchers that appear designed to obstruct rather than facilitate access to public data. According to the Commission, this practice undermines an essential transparency obligation under the DSA, which is intended to enable public scrutiny of platforms' potential impact on physical and mental health.
The findings show these burdensome procedures often result in researchers receiving incomplete or unreliable datasets. This limitation directly affects their capacity to investigate critical issues, particularly concerning minors' exposure to illegal or harmful content. The Commission emphasized that allowing researchers access to platforms' data represents a cornerstone of the DSA's transparency framework.
New possibilities for researchers will emerge on October 29, 2025, when the delegated act on data access comes into force. This regulatory measure will grant access to non-public data from very large online platforms and search engines, aiming to enhance their accountability and identify potential risks arising from their activities.
Meta's content reporting mechanisms under scrutiny
The Commission's investigation into Meta uncovered significant deficiencies in how Facebook and Instagram handle illegal content reporting. According to the press release, neither platform appears to provide a user-friendly and easily accessible "Notice and Action" mechanism for users to flag illegal content, such as child sexual abuse material and terrorist content.
The mechanisms Meta currently applies appear to impose several unnecessary steps and additional demands on users. Both Facebook and Instagram appear to rely on "dark patterns" (deceptive interface designs) in their "Notice and Action" flows. The Commission stated that such practices can confuse and dissuade users, potentially rendering Meta's mechanisms for flagging and removing illegal content ineffective.
Under the DSA, "Notice and Action" mechanisms serve as key tools allowing EU users and trusted flaggers to inform online platforms that certain content does not comply with EU or national laws. Online platforms do not benefit from the DSA's liability exemption in cases where they have not acted expeditiously after being made aware of the presence of illegal content on their services.
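To make the obligation concrete, DSA Article 16(2) lists the elements a valid notice must be able to carry. The sketch below is a hypothetical data structure, not any platform's actual API; the field and function names are invented for illustration, but the fields themselves mirror what Article 16(2) requires.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class IllegalContentNotice:
    """Hypothetical notice structure mirroring the elements DSA Article 16(2)
    requires a 'Notice and Action' mechanism to accept. Field names are
    invented; no platform's real API is implied."""
    content_url: str              # exact electronic location of the content
    explanation: str              # substantiated reason the content is illegal
    notifier_name: Optional[str]  # may be withheld for certain offence categories
    notifier_email: Optional[str]
    good_faith_statement: bool    # notifier confirms the notice is accurate and complete

def is_actionable(notice: IllegalContentNotice) -> bool:
    """A platform gains actual knowledge of illegal content (and risks losing
    the liability exemption if it fails to act expeditiously) only once a
    notice is sufficiently precise and substantiated; this check sketches
    that threshold."""
    return bool(notice.content_url and notice.explanation and notice.good_faith_statement)
```

The point of the sketch is that a compliant mechanism must let users supply all of these elements in a few straightforward steps; each extra hurdle before submission pushes the flow toward the "dark patterns" the Commission describes.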
The Commission's views on Meta's reporting tool, dark patterns and complaint mechanism are based on an in-depth investigation, including cooperation with Coimisiún na Meán, the Irish Digital Services Coordinator.
Appeals process limitations
The Commission also identified problems with Meta's content moderation appeals process. According to the findings, the DSA gives users in the EU the right to challenge content moderation decisions when platforms remove their content or suspend their accounts. However, at this stage, the decision appeal mechanisms of both Facebook and Instagram do not appear to allow users to provide explanations or supporting evidence to substantiate their appeals.
This limitation makes it difficult for users in the EU to further explain why they disagree with Meta's content decision. The Commission stated this restriction limits the effectiveness of the appeals mechanism, which is designed to provide users with meaningful recourse when they believe platform decisions were incorrect.
Implications for the marketing community
The enforcement action carries significant implications for digital marketers and advertisers who rely on these platforms for reaching audiences. The Digital Services Act has been reshaping how platforms operate in the European Union since its implementation, requiring unprecedented levels of transparency and accountability. When platforms face potential fines and operational restrictions, advertising ecosystems can experience disruption.
Meta's platforms—Facebook and Instagram—represent critical advertising channels for businesses of all sizes. Any changes to how these platforms handle content moderation or implement new compliance measures could affect ad delivery, targeting capabilities, and overall campaign performance. TikTok, despite being newer to the digital advertising landscape, has rapidly become essential for brands targeting younger demographics.
The DSA's transparency requirements extend beyond content moderation to include how platforms use data and algorithms to serve content and advertisements. Researchers' ability to access platform data helps illuminate these mechanisms, potentially leading to a better understanding of how advertising systems function and their broader societal impacts.
For marketers, the findings suggest increased regulatory scrutiny will continue. Platforms may need to implement significant operational changes to comply with DSA requirements, which could alter familiar workflows and advertising interfaces. The threat of fines up to 6% of global annual turnover creates strong incentives for platforms to address the Commission's concerns.
Technical implementation challenges
The Commission's findings highlight specific technical deficiencies in how platforms have structured their data access systems. For researchers requesting access to public data, the platforms appear to have created multi-step approval processes that result in incomplete datasets. These technical barriers prevent thorough analysis of content distribution patterns, user exposure to harmful material, and algorithmic amplification of specific content types.
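As a rough illustration only, the access pattern the Commission describes resembles a paginated research-API flow in which every truncated page or early cut-off silently shrinks the dataset. All endpoint names, parameters, and the BASE_URL below are invented; neither Meta's nor TikTok's real research API is shown.

```python
import requests  # pip install requests

# Hypothetical research-API flow; paths and fields are invented for
# illustration and do not correspond to any platform's actual API.
BASE_URL = "https://platform.example/research/v1"

def fetch_public_posts(token: str, query: str, max_pages: int = 10) -> list[dict]:
    """Collect public posts page by page. In the pattern the Commission
    describes, rate limits, truncated pages, and expiring approvals mean
    the returned list often undercounts the true population of matching
    posts, leaving researchers with partial or unreliable data."""
    posts, cursor = [], None
    for _ in range(max_pages):
        resp = requests.get(
            f"{BASE_URL}/posts",
            headers={"Authorization": f"Bearer {token}"},
            params={"q": query, "cursor": cursor},
            timeout=30,
        )
        resp.raise_for_status()
        page = resp.json()
        posts.extend(page.get("items", []))
        cursor = page.get("next_cursor")
        if cursor is None:  # API stops early -> partial dataset
            break
    return posts
```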
The "dark patterns" identified in Meta's Notice and Action mechanisms represent deliberate design choices that make reporting illegal content more difficult than necessary. According to the Commission, these interface designs include unnecessary steps and additional demands that confuse users and discourage reporting. Such design choices contradict the DSA's requirement for easily accessible reporting mechanisms.
Meta's appeals process similarly suffers from technical limitations that prevent users from submitting supporting evidence or detailed explanations when challenging content moderation decisions. The Commission's findings suggest these systems were not designed with user empowerment as a priority, instead creating hurdles that limit the effectiveness of appeals.
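A minimal sketch, with invented field names, of what a substantive appeal submission would need to carry. Per the Commission's findings, the current Facebook and Instagram appeal forms effectively omit the last two fields, which is precisely the gap that makes the mechanism less effective.

```python
from dataclasses import dataclass, field

@dataclass
class ModerationAppeal:
    """Hypothetical appeal structure; field names are invented for
    illustration. The DSA right to challenge a moderation decision implies
    users can substantiate their disagreement."""
    decision_id: str   # the content removal or account suspension being contested
    user_id: str
    explanation: str = ""  # free-text reasons the decision was wrong
    evidence_urls: list[str] = field(default_factory=list)  # supporting material
```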
Legal process and potential penalties
The Commission emphasized that these preliminary findings do not prejudge the outcome of the investigation. Facebook, Instagram and TikTok now have the opportunity to examine the documents in the Commission's investigation files and to reply in writing to the preliminary findings. The platforms can also take measures to remedy the identified breaches during this period.
In parallel, the European Board for Digital Services will be consulted on the findings. This consultation process ensures that member states have input into the Commission's enforcement actions and helps maintain consistent application of DSA rules across the EU.
If the Commission's views are ultimately confirmed, the Commission may issue a non-compliance decision. Such a decision can trigger a fine of up to 6% of the total worldwide annual turnover of the provider. The Commission can also impose periodic penalty payments to compel a platform to comply with its obligations under the DSA.
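To put the ceilings in numbers: the 6% fine cap comes from the Commission's press release, while the 5% daily cap on periodic penalty payments is the general ceiling set by DSA Article 76. The turnover figure below is purely illustrative.

```python
def dsa_fine_ceiling(annual_worldwide_turnover: float) -> float:
    """Maximum fine under DSA Article 74: 6% of total worldwide annual turnover."""
    return 0.06 * annual_worldwide_turnover

def dsa_daily_penalty_ceiling(annual_worldwide_turnover: float) -> float:
    """Maximum periodic penalty payment under DSA Article 76: 5% of average
    daily worldwide turnover, per day of continued non-compliance."""
    return 0.05 * (annual_worldwide_turnover / 365)

# Illustrative only: a provider with EUR 100 billion annual turnover would face
# a fine ceiling of EUR 6 billion and a daily penalty ceiling of ~EUR 13.7 million.
print(dsa_fine_ceiling(100e9))           # 6000000000.0
print(dsa_daily_penalty_ceiling(100e9))  # ~13698630.14
```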
Ongoing investigations
According to the Commission, these preliminary findings are part of formal proceedings launched into Meta and TikTok under the DSA. The Commission continues its investigation into other potential breaches that are part of these ongoing proceedings. These formal DSA proceedings are distinct from ongoing investigations against Facebook, Instagram and TikTok concerning compliance with other relevant EU law.
The multiplicity of investigations reflects the comprehensive scope of the DSA's requirements and the Commission's determination to enforce them rigorously. Platforms operating in the EU must now navigate various compliance frameworks simultaneously, each carrying its own penalties for non-compliance.
Executive Vice-President for Tech Sovereignty, Security and Democracy Henna Virkkunen stated on October 24, 2025: "Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice. With today's actions, we have now issued preliminary findings on researchers' access to data to four platforms. We are making sure platforms are accountable for their services, as ensured by EU law, towards users and society."
This statement indicates that the Commission has now issued preliminary findings on researchers' access to data to four platforms in total, though only TikTok and Meta were named in this specific announcement. The emphasis on accountability and transparency signals that the Commission views these cases as fundamental to the DSA's broader mission of platform governance.
Timeline
- October 24, 2025: European Commission announces preliminary findings that TikTok and Meta breached DSA transparency obligations
- October 24, 2025: Commission finds Meta's Facebook and Instagram used "dark patterns" in Notice and Action mechanisms
- October 24, 2025: Henna Virkkunen notes that preliminary findings on researcher data access have now been issued to four platforms in total
- October 29, 2025: Delegated act on data access comes into force, granting researchers access to non-public data from very large online platforms
Summary
Who: The European Commission took action against TikTok and Meta (which operates Facebook and Instagram), affecting researchers, EU users, and trusted flaggers who interact with these platforms.
What: The Commission preliminarily found the platforms in breach of DSA transparency obligations, specifically regarding researcher access to public data, user-friendly mechanisms to report illegal content, and effective appeals processes for content moderation decisions.
When: The Commission announced the preliminary findings on October 24, 2025, with the delegated act on data access set to come into force on October 29, 2025.
Where: The enforcement action applies throughout the European Union, affecting how these platforms operate for EU users and researchers based in member states.
Why: The Commission acted to enforce the DSA's transparency requirements, which aim to ensure platforms empower users, respect their rights, and open their systems to scrutiny, protecting democracies that depend on trust in digital services.