House committee exposes how EU's DSA targets American political speech

Congressional investigation reveals European Union regulators classify common political statements as illegal hate speech under Digital Services Act, potentially affecting global content moderation policies.

EU's Digital Services Act creates global censorship chain from Brussels to US social media platforms.

The U.S. House Committee on the Judiciary released an interim report on July 25, 2025, exposing how the European Union's Digital Services Act serves as a tool for censoring American political discourse. According to committee findings, European regulators have classified statements like "we need to take back our country" as illegal hate speech, raising concerns about the law's extraterritorial reach and impact on global free speech.

The 147-page report details how the DSA, passed in October 2022, requires major social media platforms to identify and mitigate alleged "systemic risks" including "misleading or deceptive content," "disinformation," and "hate speech" – even when such content "is not illegal." According to the committee's investigation, platforms that fail to comply face fines up to six percent of their global revenue, potentially amounting to billions of dollars for major American technology companies.
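As a rough illustration of that exposure (the revenue figure below is a hypothetical placeholder, not a number from the committee report), the six percent ceiling scales directly with global turnover:

```python
# Illustrative arithmetic only: the DSA caps fines at six percent of a
# platform's global annual revenue. The revenue figure is a hypothetical
# placeholder, not a figure from the committee report.
DSA_MAX_FINE_RATE = 0.06

def max_dsa_fine(global_annual_revenue_usd: float) -> float:
    """Maximum DSA fine for a given global annual revenue."""
    return global_annual_revenue_usd * DSA_MAX_FINE_RATE

# A platform with $100B in global revenue could face fines up to $6B.
print(f"${max_dsa_fine(100e9) / 1e9:.1f}B")  # -> $6.0B
```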

The report reveals evidence from a previously undisclosed European Commission workshop held on May 7, 2025, where officials provided specific examples of content they consider problematic. In one exercise, Commission regulators categorized the phrase "we need to take back our country" as "coded language" and "illegal hate speech" requiring platform censorship. This political rhetoric has been used across the American political spectrum by figures including former Vice President Kamala Harris, Senator Elizabeth Warren, and President Donald Trump.

European Commission officials also specifically targeted humor and satirical content during the closed-door session. Workshop materials asked platforms how they could use "content moderation processes" to "address memes that may be used to spread hate speech or discriminatory ideologies." The commission's approach suggests a broad interpretation of harmful content that extends well beyond traditionally illegal material.

The congressional investigation documented how DSA enforcement creates pressure for global content moderation changes. Since major platforms typically maintain unified terms of service worldwide, DSA requirements effectively impose European speech standards on American users. According to the report, Commission regulators expect platforms to "review and update terms and conditions based on the [DSA] risks they identified" and incorporate these changes into their global policies.
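A minimal sketch of that mechanism, with invented restriction categories, shows why a single worldwide rulebook exports the strictest jurisdiction's standards: the effective global policy becomes the union of every region's restrictions.

```python
# Hypothetical sketch: a platform that maintains unified worldwide terms of
# service effectively enforces the union of all regional restrictions.
# Restriction categories are invented for illustration.
regional_restrictions = {
    "US": {"illegal_content"},
    "EU": {"illegal_content", "hate_speech", "disinformation",
           "misleading_content"},
}

# One global rulebook means every user is governed by the superset,
# so the strictest regulator sets the standard for all regions.
global_policy = set().union(*regional_restrictions.values())
print(sorted(global_policy))
# ['disinformation', 'hate_speech', 'illegal_content', 'misleading_content']
```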

The committee obtained internal documents showing European authorities targeting specific political viewpoints. In November 2024, Poland's National Research Institute flagged a TikTok post stating "electric cars are neither ecological nor an economical solution." French authorities ordered removal of commentary about immigration policy following a terrorist attack by a Syrian refugee. German officials classified calls for deportation of criminal aliens as "incitement to hatred" and "incitement to violence."

These enforcement actions demonstrate a pattern of targeting conservative political speech on immigration and environmental issues. The report notes that censorship requests appear "largely one-sided, almost uniformly targeting political conservatives." The Commission's fictional example used an account handle "@Patriot90" as the primary perpetrator of hate speech, suggesting regulatory focus on patriotic or conservative social media personas.

The investigation also revealed how the DSA creates structural incentives for increased censorship through trusted flaggers and mandatory arbitration systems. European authorities designate government-approved third parties as "trusted flaggers" whose content removal requests receive priority processing. Many of these organizations have direct financial relationships with European governments or built-in conflicts of interest with targeted platforms.

For example, the German trusted flagger HateAid is currently in litigation with X, the same platform it flags for content violations. French flaggers e-Enfance and Association Point de Contact receive government funding and work closely with law enforcement agencies. These relationships raise questions about the independence and impartiality of the flagging system.

The out-of-court arbitration process creates additional pressure for content removal. Platforms must bear all costs when they lose arbitration cases, creating financial incentives to remove flagged content before disputes reach arbitrators. Since arbitrators must be certified by European regulators, they face potential conflicts when ruling against government preferences.
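That asymmetry can be expressed as a simple expected-cost comparison; the figures below are hypothetical, chosen only to show why removal dominates even when the platform would usually win a dispute.

```python
# Hypothetical expected-cost comparison; all figures are illustrative,
# not drawn from the report.
def expected_contest_cost(loss_probability: float,
                          arbitration_cost: float) -> float:
    """Expected cost of contesting a flag when the losing platform
    bears the full arbitration cost."""
    return loss_probability * arbitration_cost

removal_cost = 50.0        # near-zero operational cost of removing a post
arbitration_cost = 5000.0  # full cost borne by the platform if it loses

# Even with only a 20% chance of losing, contesting costs 20x more
# in expectation than quietly removing the content.
print(expected_contest_cost(0.20, arbitration_cost))  # 1000.0 vs. 50.0
```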

The DSA's global impact extends beyond content policies to platform design features. X withdrew from the EU's voluntary Code of Practice on Disinformation in May 2023 because it mandated the use of third-party fact-checkers, which X does not employ. Less than two months after DSA obligations became legally binding, the Commission opened an investigation into X's Community Notes feature. The company now reportedly faces more than $1 billion in potential fines.

According to the report, the DSA was specifically designed to target American technology companies while exempting European competitors. The law applies to platforms with more than 45 million EU users, a threshold that captures major American companies while excluding most European services. Spotify avoided designation by claiming its music streaming service operates separately from its podcasting platform, despite integrated functionality.
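The designation mechanics reduce to a single threshold test, sketched here with hypothetical platform names and user counts:

```python
# The DSA designates Very Large Online Platforms at 45 million or more
# average monthly EU users. Platform names and counts are hypothetical.
VLOP_THRESHOLD = 45_000_000

def is_vlop(monthly_eu_users: int) -> bool:
    return monthly_eu_users >= VLOP_THRESHOLD

platforms = {"MajorUSPlatform": 120_000_000, "SmallerEUService": 30_000_000}
for name, users in platforms.items():
    print(f"{name}: {'VLOP' if is_vlop(users) else 'below threshold'}")
```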

The committee found that European regulators attempted to conceal their censorship activities from public scrutiny. Unlike Digital Markets Act workshops, which are open to the public and recorded, the May 2025 DSA workshop was conducted under strict confidentiality rules. Commission officials specifically instructed participants not to "describe the exercise scenarios" used during the event.

European enforcement extends beyond platform policies to individual content decisions. Under the Court of Justice of the European Union's 2019 ruling in Glawischnig-Piesczek v. Facebook Ireland, individual member states can issue global content takedown orders. This authority means that content removed for European users could potentially be blocked worldwide, depending on platform technical capabilities and legal interpretation.

The investigation documented the DSA's roots in response to alleged Russian interference in the 2016 U.S. presidential election and 2017 French presidential election. However, academic studies cited in the report found that Russian social media activities had minimal impact on the 2016 election outcome, raising questions about the factual basis for comprehensive digital censorship legislation.

The DSA's requirements create significant compliance costs for affected platforms. YouTube's Global Head of Trust and Safety testified to the committee that DSA compliance required "very significant effort" from numerous teams within the company. Meta has assigned over 1,000 professionals to develop DSA compliance solutions, investing more than 20,000 hours in audit preparation alone.

The committee's investigation continues as American technology companies produce ongoing communications with European regulators under congressional subpoenas issued in February 2025. These subpoenas cover communications dating back to 2020 and continue in perpetuity, meaning the committee receives real-time documentation of European censorship demands.

Industry observers have noted the potential for the DSA to establish global censorship precedents. According to the Digital Services Act's explicit goals, European officials hope its effects will "extend far beyond Europe, changing company policies in the United States and elsewhere." This extraterritorial ambition represents a direct challenge to American constitutional principles of free speech and democratic discourse.

The revelations come amid broader concerns about foreign interference in American political processes through content moderation. PPC Land previously reported that the European Commission violated privacy laws while conducting targeted political advertising campaigns. The European Data Protection Supervisor found the Commission illegally processed personal data, including political views and religious beliefs, during promotional campaigns for proposed legislation.

The House Judiciary Committee plans to use these findings to inform legislative reforms protecting American free speech rights from foreign censorship regimes. The report represents the first comprehensive investigation into how European digital regulations affect American constitutional rights and democratic discourse in the modern digital public square.

Key Terms Explained

Digital Services Act (DSA) 

The European Union's comprehensive digital regulation law passed in October 2022, requiring online platforms to identify and mitigate "systemic risks" including alleged disinformation and hate speech. The DSA applies to platforms with more than 45 million EU users and authorizes fines up to six percent of global revenue. According to the committee report, the law effectively functions as a censorship mechanism targeting American technology companies while exempting most European competitors through carefully crafted thresholds and exemptions.

Content Moderation

The systematic process by which social media platforms review, restrict, or remove user-generated content based on established community guidelines and legal requirements. Under DSA mandates, platforms must implement risk assessment frameworks and mitigation measures that often result in increased content removal. The committee found that European requirements force platforms to modify their global content moderation policies, effectively exporting EU speech restrictions to American users through unified terms of service.

European Commission

The executive branch of the European Union responsible for enforcing the Digital Services Act and other EU regulations. Commission officials, including former Commissioner Thierry Breton and current Executive Vice-President Henna Virkkunen, have threatened American companies with regulatory retaliation for hosting political content. The committee investigation revealed how Commission regulators use closed-door workshops and enforcement actions to pressure platforms into adopting more restrictive speech policies.

Systemic Risks

Broadly defined categories of potential harm that Very Large Online Platforms must assess and mitigate under DSA requirements. These include "misleading or deceptive content," "disinformation," "hate speech," and "negative effects on civic discourse and electoral processes." The committee found that European regulators interpret these terms expansively to encompass ordinary political debate, satirical content, and conservative viewpoints on immigration and environmental policy, creating pressure for widespread content censorship.

Very Large Online Platforms (VLOPs)

Social media services with more than 45 million monthly EU users subject to the most stringent DSA requirements, including risk assessments, mitigation measures, and direct Commission supervision. The threshold appears designed to capture major American companies like Meta, Google, and X while excluding European competitors. VLOPs must undergo independent audits, provide researcher access to data, and implement enhanced transparency measures that significantly increase compliance costs and regulatory exposure.

Trusted Flaggers

Government-approved third-party organizations that receive priority treatment for content removal requests under DSA Article 22. These entities often have financial relationships with European governments or conflicts of interest with targeted platforms. The committee found examples including the German organization HateAid, which flags X content while simultaneously pursuing litigation against the company, demonstrating how the trusted flagger system can be weaponized against platforms that resist European censorship demands.

House Committee on the Judiciary

The U.S. congressional committee with jurisdiction over civil liberties and constitutional rights, currently investigating foreign threats to American free speech. Led by Chairman Jim Jordan, the committee issued subpoenas to eight major technology companies in February 2025, compelling production of communications with European regulators. The investigation aims to inform legislative reforms protecting American constitutional rights from foreign censorship regimes and regulatory overreach.

Global Content Policies

Unified terms of service and community guidelines that major platforms typically apply worldwide rather than maintaining separate standards for different jurisdictions. This technical reality enables European regulations to affect American users when platforms modify global policies to comply with DSA requirements. The committee found that Commission regulators explicitly expect platforms to update their worldwide terms and conditions based on European risk assessments, effectively globalizing EU speech restrictions.

Political Speech

Core First Amendment-protected expression involving discussion of governmental affairs, political candidates, and public policy issues. The committee investigation documented European targeting of common American political rhetoric, including statements like "we need to take back our country" classified as illegal hate speech. This pattern reveals how European definitions of harmful content conflict with American constitutional principles that afford the highest protection to political discourse and democratic debate.

Enforcement Actions

Regulatory investigations, fines, and compliance proceedings initiated by the European Commission against non-compliant platforms. Current enforcement includes formal proceedings against X, Meta, TikTok, and other American companies, with X reportedly facing over $1 billion in potential penalties. The committee found that enforcement appears politically motivated and one-sided, consistently targeting conservative viewpoints while threatening companies that resist European censorship demands with severe financial consequences designed to compel compliance.

Summary

Who: The U.S. House Committee on the Judiciary, led by Chairman Jim Jordan, investigated European Union officials and technology companies affected by the Digital Services Act.

What: The committee found that EU regulators use the DSA to pressure American social media platforms to censor political speech, including common political statements, humor, and commentary on immigration and environmental issues.

When: The investigation began in August 2024 following threats against X Corp., with the interim report released July 25, 2025, covering events from 2020 to present.

Where: The censorship regime operates from Brussels through the European Commission and extends globally through platform content moderation policies affecting American users.

Why: European officials claim the DSA protects against illegal content and disinformation, but committee findings suggest the law serves to silence political opposition and export European speech restrictions worldwide.