German businesses systematically delete critical reviews using EU Digital Services Act

German companies abuse Digital Services Act notification mechanisms to remove unfavorable reviews, creating artificial rating inflation across platforms.

Restaurant review in Google Maps

German businesses have weaponized the European Union's Digital Services Act to systematically remove unfavorable online reviews, transforming what was designed as consumer protection legislation into a tool for reputation manipulation. According to multiple sources, this abuse has reached what one expert describes as "industrial scale," particularly affecting Google Reviews, TripAdvisor, and Trustpilot platforms.

The Digital Services Act, which became fully operational on February 17, 2024, includes Article 16 provisions requiring platforms to provide mechanisms for reporting potentially illegal content. German businesses now exploit these mechanisms by filing mass complaints against negative reviews, claiming defamation even when reviews describe genuine customer experiences.

"Business owners looking to artificially boost ratings just fill out a simple form," explains rewboss, a prominent YouTube content creator who documented this phenomenon on August 1, 2025. The process involves selecting "Defamation" from a dropdown menu, which typically leads to removal because platforms err on the side of caution to avoid legal liability.

Platforms like Google, TripAdvisor, and Trustpilot receive thousands of takedown requests daily but lack resources to properly evaluate each complaint. "They simply don't have the time or the resources to review every review, much less predict in each case how a court of law would rule," rewboss noted. Corporate lawyers advise platforms to disable reported content rather than risk liability for hosting potentially defamatory material.

Industrial-scale manipulation emerges

The abuse has become so widespread that specialized service companies now offer review removal as a commercial service. "I've received spam from shady companies promising to help me identify and remove bad reviews, even though I don't have a business," rewboss reported. These companies advertise bluntly: "One-star reviews are bad for business! Let us help!"

The Düsseldorf Regional Court's January 15, 2025 decision in the case of Stuttgart-based Skinport versus Google Ireland Limited established important precedent regarding platform liability under DSA provisions. The court ruled that platforms can be held liable as "disruptive parties" when failing to prevent certain violations, further incentivizing conservative content moderation practices.

Former Google review moderation specialist @Buzy_Lizard, commenting on the rewboss video, revealed that while reviews aren't automatically deleted upon complaint, "if the reviewer does not respond within a given timeframe, the review is removed." This creates an information asymmetry favoring businesses over individual consumers.

German legal concepts particularly facilitate this manipulation. Unlike traditional defamation, which requires proof of deliberate falsehood, German law includes a "malicious gossip" (üble Nachrede) provision under which defendants must prove their statements were true rather than prosecutors proving they were false.

"What this means is that if you are sued for malicious gossip, you will have to prove that you were telling the truth. And that can be very, very difficult," rewboss explained. This burden-shifting mechanism makes consumers vulnerable even when providing factual accounts of poor service.

The phenomenon affects multiple business categories beyond restaurants. Commenters on the rewboss video reported successful review removals targeting doctors, hotels, legal firms, and driving schools. One commenter noted: "A restaurant even had my 4-star review deleted. Only 5 stars accepted. The whole rating system became useless."

Platform responses vary

Google's approach involves notifying reviewers about removal requests and providing limited appeal opportunities. However, the appeals process heavily favors removing content rather than defending potentially controversial reviews. A former content moderator explained that reviews are "much more likely to stay online" if reviewers respond to removal notifications, but many consumers never see these warnings.

The European Commission has defended the Digital Services Act against censorship allegations, with spokesperson Thomas Regnier emphasizing a 35% success rate in content challenges. However, this statistic encompasses all content types across all platforms, not specifically review abuse in Germany.

Photographic evidence appears to provide the strongest defense against false removal claims. "The main advice I can give to reviewers to make it more difficult for their reviews to be removed is to post photos from the business as this is considered photographic proof of your visit," the former Google moderator recommended.


Broader implications for digital marketing

This systematic manipulation undermines the reliability of online reviews that marketing professionals rely on for local search optimization and reputation management strategies. When legitimate negative feedback gets removed while potentially fake positive reviews remain, it distorts the authentic consumer sentiment that drives local SEO rankings and purchasing decisions.

The abuse is especially pronounced in tourism, where international visitors are unlikely to know the German legal mechanisms for challenging review removals. Tourist-facing establishments can exploit this information asymmetry to maintain artificially inflated ratings.

Enforcement challenges persist

Despite comprehensive DSA implementation across all EU member states, Germany demonstrates unique patterns of systematic abuse not observed at similar scales in other jurisdictions. The regulation's complexity creates compliance burdens for platforms while potentially enabling the very manipulation it aimed to prevent.

The situation highlights tensions between legitimate content moderation and free expression protection. While the DSA includes appeal mechanisms and procedural safeguards, practical implementation favors content removal over detailed case-by-case analysis due to resource constraints and legal risk aversion.

Multiple commenters reported that automated systems could easily flag all one-star reviews for removal without requiring artificial intelligence. "It wouldn't even need AI," rewboss observed, describing potential scripts that automatically report negative reviews across business profiles.
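No such script appears in the video; as a purely hypothetical sketch of the point rewboss makes, the flagging logic needs nothing more than a rating comparison. All names here (`Review`, `flag_negative_reviews`, the `DEFAMATION` reason code) are illustrative and do not correspond to any real platform API:

```python
from dataclasses import dataclass

@dataclass
class Review:
    review_id: str
    rating: int   # star rating, 1-5
    text: str

def flag_negative_reviews(reviews, threshold=1):
    """Build a takedown report for every review at or below the
    threshold -- a plain comparison, no AI required."""
    return [
        {"review_id": r.review_id, "reason": "DEFAMATION"}
        for r in reviews
        if r.rating <= threshold
    ]

reviews = [
    Review("r1", 1, "Cold food, rude staff"),
    Review("r2", 5, "Wonderful evening"),
    Review("r3", 1, "Waited an hour for the bill"),
]
reports = flag_negative_reviews(reviews)
print(reports)  # both one-star reviews flagged; the five-star review untouched
```

The triviality of the logic is the point: anything a business can select from a dropdown, a loop can select thousands of times.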


Long-term market effects

The manipulation creates broader market distortions affecting consumer choice and business competition. When negative reviews disappear while positive ones remain, consumers lose access to balanced information needed for informed purchasing decisions.

Some users have adapted by specifically seeking businesses with mixed review profiles rather than uniformly high ratings. "Look for places that have a mix of good and bad reviews," rewboss advised. "Two-star or four-star reviews with details on what was good and what wasn't so good are much more likely to be helpful and accurate."

The phenomenon demonstrates how regulatory mechanisms designed for consumer protection can be weaponized by businesses to suppress legitimate criticism. As digital competition frameworks continue evolving, regulators face challenges balancing content moderation effectiveness with protection against abuse.

For consumers encountering poor service, especially health and safety violations, reporting to the relevant authorities often proves more effective than posting an online review. Food safety departments can conduct inspections and take enforcement action against establishments with genuine hygiene problems, providing remedies that no takedown request can make disappear.

The German review manipulation represents a significant challenge for the Digital Services Act's effectiveness and raises questions about whether current appeal mechanisms provide sufficient protection against systematic abuse by businesses seeking to artificially inflate their online reputations.


Summary

Who: German businesses systematically exploit Digital Services Act notification mechanisms to remove critical reviews, while platforms like Google, TripAdvisor, and Trustpilot err on the side of caution to avoid legal liability.

What: Systematic abuse of EU Digital Services Act Article 16 reporting mechanisms to remove unfavorable business reviews through false defamation claims, creating artificial rating inflation across review platforms.

When: The abuse escalated throughout 2024 after the Digital Services Act became fully operational on February 17, 2024, with documentation emerging in August 2025 showing industrial-scale manipulation.

Where: Primarily Germany, where businesses demonstrate unprecedented systematic abuse of review removal systems, though the Digital Services Act applies across all 27 EU member states.

Why: Businesses exploit regulatory mechanisms designed for consumer protection to manipulate online reputations and suppress legitimate criticism, while platforms remove content to avoid potential legal liability rather than conduct thorough case-by-case evaluations.

PPC Land explains

Digital Services Act (DSA): The European Union's comprehensive regulatory framework that entered into force in November 2022, establishing uniform rules for online intermediary services across all 27 member states. The DSA creates a tiered approach where different obligations apply based on platform size and societal impact, with the most stringent requirements for Very Large Online Platforms serving more than 45 million monthly users. The regulation mandates transparency in content moderation, algorithmic accountability, and user rights protections while requiring platforms to conduct annual risk assessments addressing systemic threats to democratic discourse and fundamental rights.

Article 16: A specific provision within the Digital Services Act that requires platforms to provide mechanisms allowing users to report content they believe violates applicable laws. This article establishes the notification system that German businesses systematically abuse by filing mass complaints against negative reviews, claiming defamation even when reviews describe genuine customer experiences. The provision was designed to enable legitimate content reporting but has become the primary tool for review manipulation due to platforms' conservative approach to potential legal liability.

Defamation: The legal concept involving deliberately spreading false statements to harm another person's reputation, requiring proof that the defendant knew their statements were untrue. In the context of German review manipulation, businesses falsely claim that negative but truthful reviews constitute defamation to trigger automatic removal by risk-averse platforms. German law distinguishes between defamation and malicious gossip, with defamation requiring intentional falsehood while malicious gossip shifts the burden of proof to defendants who must demonstrate their statements were accurate.

Platform liability: The legal responsibility that online platforms face for hosting potentially illegal content, particularly under Digital Services Act provisions that can hold platforms accountable as "disruptive parties" when they fail to prevent certain violations. The Düsseldorf Regional Court's January 15, 2025 decision established important precedent by ruling that Google could be held liable under DSA provisions, further incentivizing conservative content moderation practices where platforms remove reported content rather than risk legal consequences.

Review manipulation: The systematic practice of artificially altering online review profiles through false reporting, fake reviews, or suppression of legitimate criticism to create misleading impressions of business quality. In Germany, this manipulation occurs primarily through abuse of Digital Services Act reporting mechanisms, where businesses file mass defamation complaints against genuine negative reviews. The practice undermines consumer decision-making and creates unfair competitive advantages for businesses willing to engage in deceptive practices.

Malicious gossip: A German legal concept distinct from defamation, where defendants must prove their statements were true rather than prosecutors proving they were false, creating a lower standard for business claims against reviewers. Under this framework, restaurants and other businesses only need to convince courts that reviewers cannot prove they were telling the truth, shifting the burden of evidence to consumers who may lack resources or documentation to defend factual accounts of poor service experiences.

Content moderation: The process by which online platforms review, evaluate, and potentially remove user-generated content based on legal requirements, platform policies, and safety considerations. Under the Digital Services Act, platforms must implement systematic content moderation procedures including automated detection, human review, and appeals mechanisms. However, the volume of takedown requests and legal risk aversion leads platforms to err on the side of removing reported content rather than conducting thorough case-by-case evaluations.

Google Reviews: The review system integrated with Google's local business listings and search results, representing one of the most influential platforms for consumer feedback and local search optimization. German businesses particularly target Google Reviews for manipulation because of their prominent placement in search results and significant impact on consumer behavior. Former Google content moderators report that the platform receives thousands of daily takedown requests specifically from German businesses, with reviews being removed if users fail to respond to removal notifications within specified timeframes.

Appeal mechanisms: The procedural safeguards required under the Digital Services Act that allow users to challenge platform decisions regarding content removal, designed to protect legitimate expression while maintaining effective content moderation. Despite these mechanisms existing, practical implementation heavily favors content removal over detailed appeals processes due to resource constraints and platforms' legal risk aversion. The European Commission reports a 35% success rate in content challenges, though this statistic encompasses all content types across all platforms rather than specifically addressing review manipulation in Germany.

Industrial scale: A term describing the systematic, mass-production approach that German businesses have adopted for review manipulation, involving specialized service companies, automated complaint filing, and coordinated campaigns against negative feedback. This scale of abuse distinguishes Germany from other EU member states where similar Digital Services Act provisions exist but are not exploited with comparable intensity or organization. The industrial approach includes spam companies offering review removal services, scripts that automatically flag one-star reviews, and coordinated efforts across multiple platforms and business types.