German court ruling shows how EU's Digital Services Act enables business censorship
A Düsseldorf court decision in January 2025 holds Google liable under DSA provisions, while German businesses systematically abuse the same law's review removal mechanisms.

The European Union's Digital Services Act promised to create a safer digital environment, but its implementation reveals troubling patterns of censorship and abuse. In an interim injunction issued in proceedings between Stuttgart-based company Skinport and Google Ireland Limited, the Düsseldorf Regional Court held that the DSA allows platforms to be held liable as "interferers" (Störer) when they fail to prevent trademark violations in advertising.
The court's January 15, 2025 decision (case no. 2a O 112/23) marks the first major ruling on platform liability under the DSA framework. However, this legal precedent emerges alongside widespread abuse of the same regulatory mechanisms by German businesses seeking to manipulate online reviews and silence critics.
Systematic review manipulation threatens online integrity
Multiple sources document how German businesses exploit DSA notification systems to remove unfavorable reviews without judicial oversight. When users receive a takedown notification that mentions defamation and possible legal action, it typically represents not a genuine legal threat but an automated response to a simple form submission.
According to rewboss, a prominent content creator who has analyzed the phenomenon, "The scary stuff in the notification about defamation is just a standard text. It does not mean that anybody's lawyers have become involved — yet — and it doesn't mean the police are going to kick down your door and confiscate your phone."
The analysis reveals that businesses systematically exploit Article 16 of the DSA, which requires platforms to provide mechanisms for reporting allegedly illegal content. rewboss explains that the system creates a simple pathway for abuse: "It means that somebody has flagged your review as potentially defamatory, but at this stage that is just their opinion."
The process requires minimal effort from businesses, according to rewboss's investigation. "A business owner looking to artificially boost its ratings just fills out a simple form; and that form asks the business owner to state what they believe makes the content illegal. So they simply select the option: 'Defamation'."
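Article 16(2) of the DSA spells out how little a valid notice must contain: an explanation of why the content is allegedly illegal, the exact location of the content, the notifier's name and email address, and a statement of good faith. A minimal sketch of such a notice as a data record, using purely illustrative field names rather than any platform's actual API:

```python
# Hypothetical representation of a DSA Article 16 notice. The required
# elements come from Art. 16(2)(a)-(d); the field names are illustrative
# and do not correspond to any platform's real reporting API.
notice = {
    # Art. 16(2)(a): explanation of why the content is allegedly illegal --
    # in practice often just a category picked from a drop-down menu
    "reason": "Defamation",
    # Art. 16(2)(b): exact electronic location of the content
    "url": "https://reviews.example.com/review/12345",
    # Art. 16(2)(c): name and email address of the notifier
    "notifier": {"name": "Example Restaurant GmbH", "email": "owner@example.com"},
    # Art. 16(2)(d): good-faith statement that the notice is accurate and complete
    "good_faith_statement": True,
}
```

Nothing in this structure demands evidence: the substantiation is whatever the notifier types or selects, which is why picking "Defamation" from a menu is enough to put a review into the takedown queue.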
The content creator's analysis reveals how platforms struggle with the volume of requests. "Platforms like Google, TrustPilot and TripAdvisor receive thousands upon thousands of takedown requests every single day; and they simply don't have the time or the resources to review every review, much less predict in each case how a court of law would rule."
Platform liability creates enforcement challenges
Platform operators, faced with this volume, confront difficult choices when evaluating removal requests.
The economic incentives favor removal over legal battles, rewboss observes. "Their corporate lawyer will tell them to play it safe and disable everything that's reported. That way they can eliminate any possibility of being found liable for something like defamation."
Users seeking to challenge removals face significant obstacles. According to rewboss, "You the reviewer will have to attempt to appeal the takedown, but even there the platform is going to err on the side of caution. They have absolutely no interest whatsoever in fighting your legal battles for you."
Germany emerges as censorship epicenter
While the DSA applies across all 27 EU member states, Germany demonstrates unique patterns of abuse, according to rewboss's research. "This is an EU regulation; so you'd expect it to be a problem all over the EU. But for some reason, it's only in Germany that it seems to be happening on an industrial scale."
The content creator documents how commercial services actively promote review removal. "I've received spam from shady companies promising to help me identify and remove bad reviews, even though I don't have a business that you could review. And they're pretty blatant about it: 'One-star reviews are bad for business! Let us help!'"
rewboss suggests that automation makes abuse trivial at scale: "It seems to me that it should be fairly simple to run a script on your business profile that automatically flags every single one-star review that it finds. It wouldn't even need AI."
Legal framework enables abuse
German law provides additional tools beyond defamation claims, rewboss explains. "German law also has the concept of malicious gossip, and if you're sued or prosecuted for that, you're going to have problems." This offence demands less of complainants than a traditional defamation claim, because the burden of proving the statement true falls on the defendant.
According to rewboss's legal analysis, "Malicious gossip is when you say things that are not provably true in order to harm another's reputation. In that case, the restaurant only needs to convince the court that you can't prove that you were telling the truth."
The implications extend beyond simple review disputes. "What this means is that if you are sued for malicious gossip, you will have to prove that you were telling the truth. And that can be very, very difficult," rewboss warns.
Court decision demonstrates platform accountability
The Düsseldorf court ruling provides technical guidance on platform obligations under the DSA. The doctrine of "Störerhaftung" (interferer's liability) enshrined in German law allows third parties who merely contribute to the infringement of a protected right to be held liable.
Platform liability requirements operate within specific parameters. According to Article 8 DSA, providers of intermediary services – such as Google in this case – are not subject to a general obligation to monitor the information they transmit or store, or to actively search for circumstances indicating illegal activity.
However, knowledge triggers responsibility. Before the regional court, Google argued that it had no "knowledge giving rise to liability": according to its lawyers, the company first became aware of the disputed text advertisement of June 8, 2023 when the first court order was served on June 20, 2023.
Broader regulatory implications
The DSA represents part of a broader regulatory framework reshaping digital platforms. LinkedIn discontinued group-based ad targeting in June 2024 following European Commission complaints about potential violations.
Industry organizations voice concerns about regulatory complexity. Europe's advertising industry opposes the Digital Fairness Act, warning that overlapping regulations create compliance challenges while potentially stifling innovation.
Implementation reveals systemic weaknesses
Platform compliance efforts demonstrate the scale of required changes. Meta released its comprehensive Digital Services Act (DSA) transparency report on November 28, 2024, revealing detailed insights into content moderation and platform safety measures across Facebook and Instagram in the European Union.
Resource requirements prove substantial. In 2023, Meta assembled a cross-functional team of over 1,000 professionals to develop DSA compliance solutions. These investments highlight the regulatory burden on platforms while questions remain about effectiveness.
Marketing community impact
The DSA's enforcement mechanisms create new compliance challenges for digital marketing professionals operating across European markets. App Store developers face new requirements as platforms implement trader verification systems.
Transparency requirements extend to advertising operations. IAB Tech Lab released specifications for DSA transparency to help platforms comply with advertising disclosure mandates, though implementation proves complex across the programmatic advertising ecosystem.
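The Tech Lab's approach attaches the required disclosures to each programmatic transaction. A minimal sketch of the buyer-side disclosure object, based on IAB Tech Lab's DSA Transparency extension for OpenRTB (field names follow the published extension as best understood here; the values are illustrative, not from a real transaction):

```python
# Sketch of the DSA disclosure object a buyer can return in an OpenRTB bid
# response under IAB Tech Lab's DSA Transparency extension. All values are
# illustrative; consult the published specification before implementing.
bid_ext_dsa = {
    "behalf": "Example Advertiser",  # entity on whose behalf the ad is shown
    "paid": "Example Payer",         # entity that paid for the ad
    "transparency": [
        {
            "domain": "dsp.example.com",  # party that applied targeting parameters
            "dsaparams": [1, 2],          # identifiers of the parameters used
        }
    ],
    "adrender": 1,  # the buyer, not the publisher, will render the DSA notice
}
```

Every intermediary in the auction chain must preserve and forward these fields, which helps explain why implementation proves complex across the programmatic ecosystem.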
The regulatory framework continues expanding. The EU follows the UK with age verification requirements in 2026, creating additional compliance burdens for platforms and advertisers.
Consumer protection versus business manipulation
The DSA aimed to protect consumers from harmful content and deceptive practices. However, implementation reveals how well-intentioned regulations can become tools for business manipulation and censorship.
rewboss concludes that "while the DSA was intended to protect consumers against false advertising and other forms of misinformation, it seems to have given German businesses an easy way to scam people — especially tourists."
For consumers seeking reliable information, rewboss recommends alternative strategies: "If you're looking for somewhere to eat, I suggest not simply going to the place with the best rating. Instead, look for places that have a mix of good and bad reviews."
The regulatory framework lacks mechanisms to distinguish between legitimate consumer protection and abusive censorship attempts. This fundamental weakness undermines the DSA's stated goals while creating new avenues for suppressing critical feedback.
Timeline
October 2022: Digital Services Act published in Official Journal of the European Union
November 16, 2022: DSA enters into force
February 17, 2024: DSA becomes fully applicable to all platforms
June 2024: LinkedIn removes group targeting in Europe due to DSA compliance
November 2024: IAB Europe publishes DSA transparency solution
December 2024: EU watchdog rules Commission illegally targeted political ads
January 15, 2025: Düsseldorf court holds Google liable under DSA provisions
July 2025: IAB Europe coalition submits Digital Fairness Act challenge letter
PPC Land explains
Digital Services Act (DSA): The European Union's comprehensive regulatory framework that entered into force in November 2022 and became fully applicable in February 2024. The DSA establishes rules for online platforms, search engines, and other intermediary services operating in the EU, requiring them to implement content moderation systems, transparency measures, and risk assessment procedures. While designed to create a safer digital environment, the regulation's broad scope and enforcement mechanisms have created opportunities for abuse by businesses seeking to manipulate online reputation systems.
Störerhaftung (Interferer's Liability): A German legal concept that allows third parties who contribute to the infringement of protected rights to be held liable, even without being direct perpetrators or participants. Under this doctrine, platforms like Google can be held responsible for trademark violations or other rights infringements if they fail to take adequate measures after gaining knowledge of the violations. The Düsseldorf court's January 2025 ruling confirmed this liability framework applies under DSA provisions, creating new obligations for platform operators.
Article 16 Notice-and-Action Mechanism: A specific provision within the DSA that requires platforms to establish clear mechanisms for users and authorities to report allegedly illegal content. This article creates the legal framework that German businesses exploit to remove unfavorable reviews by simply flagging them as "defamatory" through automated forms. The mechanism was intended to protect consumers from genuinely harmful content but has become a tool for systematic censorship when businesses abuse the reporting process.
Malicious Gossip (Üble Nachrede): A German legal concept distinct from defamation that criminalizes statements that cannot be proven true and are made to damage someone's reputation. Unlike defamation, which requires proving deliberate falsehood, malicious gossip shifts the burden of proof to the defendant, who must demonstrate the truthfulness of their statements. This lower legal threshold makes it easier for German businesses to threaten reviewers with legal action, even when reviews contain subjective opinions about service quality.
Content Moderation: The systematic process by which online platforms review, remove, or restrict user-generated content based on community guidelines, legal requirements, or automated detection systems. Under DSA mandates, platforms must implement comprehensive content moderation frameworks that often result in over-removal of content due to liability concerns. The scale of required moderation—with platforms processing thousands of removal requests daily—creates practical challenges that businesses exploit through mass reporting campaigns.
Platform Liability: The legal responsibility that online platforms bear for content posted by third parties on their services. The DSA maintains limited liability protections for platforms that act as neutral intermediaries, but creates specific obligations once platforms gain knowledge of illegal content. This knowledge-based liability framework creates economic incentives for platforms to remove reported content rather than risk legal challenges, fundamentally altering the balance between free expression and platform protection.
Very Large Online Platforms (VLOPs): Digital services with more than 45 million monthly active users in the EU, subject to enhanced DSA obligations including algorithmic transparency, risk assessments, and external auditing requirements. Major platforms like Google, Meta's Facebook and Instagram, and others must comply with stricter content moderation standards, transparency reporting, and researcher data access provisions. These enhanced requirements demonstrate the DSA's tiered approach to regulation based on platform size and societal impact.
Defamation vs. Opinion: The legal distinction between factual claims that can be proven false (defamation) and subjective opinions protected by free speech principles. German businesses systematically blur this distinction by claiming that negative reviews constitute defamatory statements rather than protected consumer opinions. The DSA's broad content reporting mechanisms fail to adequately distinguish between genuine defamation claims and attempts to suppress legitimate criticism, enabling widespread abuse of takedown procedures.
Review Manipulation: The deliberate attempt to artificially influence online ratings and reviews through fake positive reviews, systematic removal of negative feedback, or intimidation of critics. German businesses have industrialized this practice by exploiting DSA reporting mechanisms to remove unfavorable reviews en masse. Commercial services now openly advertise review removal capabilities, transforming what should be authentic consumer feedback into manipulated marketing tools that mislead potential customers.
Regulatory Arbitrage: The practice of exploiting differences in regulatory enforcement or interpretation across jurisdictions to gain competitive advantages. While the DSA applies uniformly across all EU member states, Germany's unique legal concepts like malicious gossip and its business culture's apparent tolerance for aggressive review manipulation create de facto regulatory arbitrage. This allows German businesses to suppress criticism more effectively than competitors in other EU countries, distorting market competition and consumer protection.
Summary
Who: German businesses exploiting DSA notification systems, platform operators managing compliance burdens, consumers facing censored reviews, and the Düsseldorf Regional Court establishing legal precedent.
What: Systematic abuse of the Digital Services Act's content reporting mechanisms to remove unfavorable business reviews, combined with a landmark court ruling holding Google liable for trademark violations in advertising under DSA provisions.
When: The DSA became fully applicable in February 2024, with abuse patterns emerging throughout 2024 and culminating in the January 15, 2025 court decision that clarified platform liability under the regulation.
Where: Primarily Germany, where businesses demonstrate "industrial scale" abuse of review removal systems, though the DSA applies across all 27 EU member states and affects global platforms.
Why: Businesses exploit regulatory mechanisms designed for consumer protection to manipulate online reputation and suppress criticism, while platforms face legal liability for failing to act on reported violations, creating incentives for over-removal of content.