European Commission dismisses censorship claims around Digital Services Act

European Commission spokesperson defends Digital Services Act with concrete data showing a 35% success rate in content challenges, dismissing censorship allegations as "complete nonsense."

EU flag over Brussels with digital shield and breaking chains symbolizing DSA protection

European Commission Spokesperson for Tech Sovereignty Thomas Regnier issued a forceful defense of the Digital Services Act on his LinkedIn profile, categorically rejecting recent censorship allegations as "complete nonsense, completely unfounded, completely wrong."

The statement, posted on August 28, 2025, comes amid mounting criticism from various quarters about the EU's flagship digital regulation. Regnier emphasized that the Digital Services Act functions as the opposite of censorship, providing robust protection for free speech across the European Union.

According to the Commission spokesperson, concrete data demonstrates the DSA's effectiveness in protecting legitimate content. In the second half of 2024, 16 million content removal decisions taken by TikTok and Meta were challenged by users within the EU framework established by the Digital Services Act.

The success rate for these challenges reached 35 percent, meaning more than one-third of the contested moderation decisions were deemed unjustified and reversed. That translates to approximately 5.6 million pieces of content restored to users after successful appeals through the DSA framework.

"This is the opposite of censorship. This is protection of free speech," Regnier stated in his LinkedIn post, which garnered significant attention from digital policy experts and industry professionals.

The Digital Services Act, which entered into force in November 2022, establishes comprehensive rules for online platforms operating within the European Union. The regulation requires Very Large Online Platforms with more than 45 million monthly active users to implement robust content moderation systems while providing users meaningful recourse when they believe content has been wrongfully removed.

For marketing professionals tracking regulatory developments, these figures represent a significant vindication of the DSA's approach to content governance. Unlike traditional regulatory frameworks that rely primarily on platform self-regulation, the DSA creates enforceable rights for users to challenge platform decisions.

The numbers reveal substantial gaps in initial content moderation accuracy by major platforms. Meta's Facebook and Instagram, along with TikTok, collectively processed hundreds of millions of content decisions during the reporting period. The 16 million challenges represent a fraction of total moderation actions, yet the 35 percent success rate indicates systematic over-removal by automated systems.

Technical implementation of the DSA's appeal mechanisms required significant platform modifications. Meta assembled a cross-functional team of over 1,000 professionals specifically to develop DSA compliance solutions, according to industry reports. These investments illustrate both the regulatory burden and the substantial resources that effective content governance demands at scale.

The Commission's data contradicts assertions from critics who characterize the DSA as censorship infrastructure. Recent analysis by the U.S. House Committee on the Judiciary raised concerns about European regulators potentially targeting American political discourse, but the restoration statistics suggest robust protection for legitimate expression.

Platform performance varies significantly across content categories and enforcement mechanisms. The DSA requires detailed transparency reporting from designated platforms, enabling researchers and policymakers to identify patterns in content moderation effectiveness. These reports provide unprecedented visibility into previously opaque algorithmic decision-making processes.

Beyond content restoration, the DSA establishes broader obligations for platform risk management. Very Large Online Platforms must conduct annual risk assessments addressing systemic threats including illegal content, fundamental rights violations, and societal harms. External auditors validate these assessments, creating accountability mechanisms for platform governance.

The regulation's tiered approach applies different requirements based on platform size and societal impact. While smaller platforms face basic transparency obligations, major services like Google Search, YouTube, Facebook, Instagram, and TikTok operate under enhanced scrutiny including algorithmic transparency requirements and researcher data access provisions.

Enforcement mechanisms extend beyond content decisions to advertising practices and recommendation systems. The DSA prohibits targeting advertisements to minors and restricts profiling based on sensitive characteristics including political opinions, religious beliefs, and sexual orientation. These provisions directly impact digital marketing strategies across European markets.

Recent court decisions demonstrate practical DSA implementation challenges. The Düsseldorf Regional Court's January 15, 2025 ruling in Skinport v. Google Ireland Limited established platform liability as "disruptive parties" when failing to prevent trademark violations in advertising. This precedent creates new compliance obligations for advertising platforms operating within the EU.

The Commission emphasized that legitimate content moderation focuses on genuine threats requiring international cooperation. Regnier's statement highlighted priorities including prevention of youth self-harm, terrorist content removal, combating child sexual abuse material, and eliminating financial scams.

These priorities reflect broader challenges facing digital platforms beyond traditional censorship concerns. The DSA's framework addresses systemic risks while preserving fundamental rights protections through procedural safeguards and appeal mechanisms.

Implementation complexity increases as multiple regulatory frameworks overlap. The Digital Services Act operates alongside the General Data Protection Regulation, Digital Markets Act, and emerging AI Act requirements. This regulatory convergence creates substantial compliance challenges for platforms while potentially enhancing user protections.

Industry response varies across different platform categories and business models. Social media platforms invested heavily in compliance infrastructure, while search engines and e-commerce marketplaces adapted existing systems to meet DSA transparency requirements. Advertising technology companies developed new solutions for regulatory reporting and user consent management.

The Commission's defense comes as global regulatory frameworks increasingly scrutinize platform governance. Similar legislation in the United Kingdom, Australia, and other jurisdictions creates international precedents for content moderation oversight. The DSA's success in protecting legitimate expression influences regulatory approaches worldwide.

Market implications extend beyond compliance costs to competitive dynamics. Enhanced transparency requirements enable smaller platforms to demonstrate superior content governance compared to established competitors. This regulatory differentiation potentially benefits platforms emphasizing user rights and algorithmic accountability.

The Commission's data release strategy reflects broader transparency commitments under the DSA framework. Regular reporting requirements create public accountability for both platforms and regulators while enabling evidence-based policy development. These transparency mechanisms distinguish the EU approach from traditional regulatory models relying on industry self-reporting.

Future enforcement priorities include expanding coverage to emerging platform categories and addressing cross-border coordination challenges. The DSA's global influence through extraterritorial application affects content policies worldwide, creating opportunities for regulatory arbitrage while establishing minimum protection standards.

The 35 percent success rate in content challenges represents more than statistical validation of the DSA framework. These figures demonstrate practical implementation of user rights protections while maintaining platform responsibility for illegal content removal. This balance addresses concerns from both free expression advocates and those demanding stronger content governance.

Platform investment in compliance infrastructure continues expanding as enforcement mechanisms mature. The Commission indicated ongoing dialogue with designated platforms regarding implementation best practices while maintaining enforcement authority for non-compliance. Substantial financial penalties provide incentive structures for maintaining adequate content governance systems.

The controversy surrounding censorship allegations reflects broader tensions between national sovereignty and global platform governance. European regulators assert jurisdiction over platform operations affecting EU users while respecting fundamental rights protections established by the Charter of Fundamental Rights of the European Union.

Technical standards development continues through multi-stakeholder processes involving platform operators, civil society organizations, academic researchers, and regulatory authorities. These collaborative approaches aim to establish effective content governance while preserving innovation incentives and competitive dynamics.

For marketers navigating this regulatory landscape, the Commission's emphatic defense provides clarity about enforcement priorities and procedural protections. The DSA framework creates predictable rules for advertising practices while maintaining flexibility for legitimate business operations within established boundaries.

PPC Land explains

Digital Services Act (DSA): The European Union's comprehensive regulatory framework that entered into force in November 2022, establishing uniform rules for online intermediary services across all 27 member states. The DSA creates a tiered approach where different obligations apply based on platform size and societal impact, with the most stringent requirements for Very Large Online Platforms serving more than 45 million monthly users. The regulation mandates transparency in content moderation, algorithmic accountability, and user rights protections while requiring platforms to conduct annual risk assessments addressing systemic threats to democratic discourse and fundamental rights.

Content Moderation: The systematic process by which digital platforms identify, review, and take action on user-generated content that potentially violates platform policies or applicable laws. Under the DSA framework, content moderation encompasses automated detection systems, human review processes, and mandatory appeal mechanisms that enable users to challenge platform decisions. The 35% success rate in overturning initial moderation decisions demonstrates the complexity of balancing free expression with legitimate safety concerns at the massive scale required by global platforms.

Very Large Online Platforms (VLOPs): Digital services designated by the European Commission based on reaching the threshold of 45 million monthly active users within the EU, triggering enhanced regulatory obligations under the DSA. These platforms include major services like Google Search, YouTube, Facebook, Instagram, TikTok, and others that face heightened scrutiny including algorithmic transparency requirements, external auditing obligations, and researcher data access provisions. The VLOP designation reflects the platforms' significant societal impact and potential to influence democratic discourse, economic competition, and fundamental rights across European markets.

European Commission: The executive arm of the European Union responsible for proposing legislation, implementing decisions, and enforcing EU treaties across member states. In the context of digital regulation, the Commission serves as the primary enforcement authority for the DSA, conducting investigations, imposing financial penalties, and coordinating with national regulators to ensure consistent implementation. Thomas Regnier represents the Commission as Spokesperson for Tech Sovereignty, communicating policy positions and defending regulatory approaches against external criticism from various stakeholders.

Platform: Digital intermediary services that connect users with content, services, or other users through technological infrastructure and algorithmic systems. Under DSA terminology, platforms encompass social media networks, search engines, e-commerce marketplaces, app stores, and other online services that host, transmit, or index user-generated content. Platform obligations vary significantly based on user numbers, with enhanced requirements for those meeting VLOP thresholds while maintaining proportionate rules for smaller services to avoid stifling innovation or creating barriers to market entry.

Compliance: The process of meeting regulatory requirements through technical implementation, policy development, and operational changes that align platform practices with legal obligations. DSA compliance involves substantial investments in content moderation infrastructure, transparency reporting systems, user appeal mechanisms, and risk assessment procedures that require cross-functional coordination across legal, technical, and policy teams. Meta's deployment of over 1,000 professionals specifically for DSA compliance demonstrates the resource intensity required for effective regulatory adherence at scale.

Transparency: The DSA's fundamental principle requiring platforms to provide clear, accessible information about their content policies, algorithmic decision-making processes, and enforcement actions to users, researchers, and regulatory authorities. Transparency obligations include detailed reporting on content moderation decisions, advertising practices, recommendation system functioning, and risk mitigation measures that enable external accountability and informed policy development. These requirements represent a significant departure from traditional platform opacity, creating new opportunities for academic research and regulatory oversight.

Regulatory: The comprehensive framework of laws, rules, and enforcement mechanisms that govern digital platform operations within specific jurisdictions, with the DSA representing the EU's flagship approach to platform governance. Regulatory convergence across multiple frameworks including GDPR, Digital Markets Act, and emerging AI Act creates complex compliance landscapes that require sophisticated legal and technical expertise to navigate effectively. The regulatory approach emphasizes ex-ante obligations rather than purely reactive enforcement, establishing proactive requirements for risk management and user protection.

Enforcement: The mechanisms by which regulatory authorities ensure platform compliance with legal obligations, including investigation procedures, financial penalties, and operational restrictions for non-compliant services. DSA enforcement combines regulatory dialogue, formal proceedings, and potential sanctions up to 6% of global annual turnover for serious violations, creating significant incentive structures for maintaining adequate governance systems. The Commission's enforcement strategy emphasizes proportionate responses while maintaining credible deterrent effects against regulatory violations that could harm user rights or democratic institutions.

Framework: The structural approach underlying the DSA's comprehensive regulation of digital services, establishing coherent principles, procedures, and obligations that create predictable rules for platform operations while preserving flexibility for innovation and competitive dynamics. The DSA framework integrates content governance, algorithmic accountability, transparency requirements, and user rights protections into a unified regulatory system that addresses systemic risks while respecting fundamental rights protections established by European Union treaties and the Charter of Fundamental Rights.

Summary

Who: European Commission Spokesperson for Tech Sovereignty Thomas Regnier defended the Digital Services Act against censorship allegations, with data involving TikTok and Meta platform decisions.

What: Regnier dismissed censorship claims as "complete nonsense" while presenting data showing 35% success rate for content challenges, resulting in restoration of 5.6 million pieces of content wrongfully removed by platforms.

When: The defense was posted on August 28, 2025, addressing criticism that has mounted throughout 2025 regarding the DSA's implementation and enforcement mechanisms.

Where: The statement was made via LinkedIn and applies to Digital Services Act enforcement across all 27 European Union member states, affecting global platform operations.

Why: Growing criticism from U.S. lawmakers and free speech advocates prompted the Commission to provide concrete evidence demonstrating the DSA protects rather than restricts legitimate expression through effective appeal mechanisms.