US websites sue UK regulator over Online Safety Act enforcement
Two prominent American websites filed a federal lawsuit challenging the UK communications regulator's attempts to enforce British content moderation laws against US-based platforms.

4chan Community Support LLC and Lolcow, LLC, which operates Kiwi Farms, filed the civil action in the US District Court for the District of Columbia on August 27, 2025. The complaint targets the UK Office of Communications (Ofcom), alleging violations of First Amendment rights and improper service of process under international law.
According to the lawsuit, Ofcom sent a series of enforcement notices beginning March 26, 2025, demanding compliance with the UK's Online Safety Act 2023. The regulator threatened penalties of up to £18 million or 10% of worldwide revenue, along with potential criminal sanctions including imprisonment.
Enforcement actions target American platforms
The complaint details a timeline of escalating enforcement actions. Ofcom sent 4chan an initial "legally binding information notice" on April 14, 2025, followed by additional demands through August 12, 2025. Kiwi Farms received its first advisory letter on March 26, 2025, with a second demand on July 25, 2025.
Both platforms operate exclusively within the United States and maintain no physical presence, operations, or infrastructure outside American territorial limits. The lawsuit contends that neither Delaware-based 4chan nor West Virginia-based Kiwi Farms answers to UK regulatory authority.
According to court documents, Ofcom demanded that the platforms conduct written risk assessments, implement age verification systems, and remove user-generated content deemed illegal under British law. The regulator also required detailed record-keeping and compliance reports demonstrating adherence to UK content moderation standards.
The two plaintiffs are among Ofcom's first four social media enforcement targets under the Online Safety Act, according to the complaint. All four targeted platforms are American-owned, with SaSu and Gab joining the two plaintiffs.
Constitutional challenges to UK law
The lawsuit raises several constitutional arguments against the UK's regulatory approach. Plaintiffs assert that Ofcom's demands violate First Amendment free speech protections by compelling platforms to remove content that remains legal under US law.
Section 179 of the Online Safety Act creates what the complaint characterizes as a "defamation crime" by criminalizing false communications intended to cause psychological harm. The plaintiffs argue this provision directly conflicts with First Amendment protections, noting that similar defamation crimes were permanently abolished in America when the First Amendment was ratified on December 15, 1791.
The lawsuit also challenges age verification requirements under Section 12(4) of the Online Safety Act. These provisions would prevent users from accessing platforms anonymously or pseudonymously, according to the complaint. The plaintiffs maintain that anonymous political speech receives specific constitutional protection under the First Amendment.
Fourth and Fifth Amendment violations form additional grounds for the legal challenge. According to the complaint, Ofcom's information demands under Section 100 Orders require platforms to provide potentially incriminating information without judicial warrants or proper international legal procedures.
Service of process disputes
Both platforms contest the validity of Ofcom's service of process. The complaint alleges that enforcement notices were improperly delivered through email without following established US-UK Mutual Legal Assistance Treaty procedures.
4chan received enforcement communications through its corporate services vendor, which lacks authorization to accept legal service on behalf of the platform. Kiwi Farms similarly received demands via email rather than through proper international legal channels.
The lawsuit seeks declaratory judgment that all Ofcom service attempts were invalid under US law and international treaty obligations. This procedural challenge could establish precedent for how foreign regulators must serve process on American internet companies.
Technical compliance burdens
The Online Safety Act imposes extensive technical and administrative requirements on covered platforms. Section 9 mandates written risk assessments evaluating user exposure to various content categories, with ongoing updates required for service modifications.
Platforms must implement content moderation systems capable of proactively removing "priority illegal content" while maintaining notice-and-takedown procedures for other material. These technical requirements extend to recommendation algorithms and user interaction systems.
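The Act does not prescribe a particular architecture for these systems. Purely as an illustration, a minimal takedown queue with the kind of record-keeping the complaint describes might resemble the following sketch; the category labels, data fields, and function names are assumptions for illustration, not requirements drawn from the Act or Ofcom's guidance.

```python
# Hypothetical sketch of a notice-and-takedown queue, for illustration only.
# Category names and fields are invented; the Online Safety Act and Ofcom's
# codes of practice define the actual obligations.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class TakedownNotice:
    content_id: str
    reporter: str
    reason: str  # e.g. "priority_illegal" or "other"
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


class ModerationQueue:
    def __init__(self):
        self.pending: list[TakedownNotice] = []
        self.audit_log: list[dict] = []  # record-keeping for compliance reports

    def submit(self, notice: TakedownNotice) -> None:
        self.pending.append(notice)

    def process(self, remove_content) -> None:
        """Review queued notices; remove_content is a platform-supplied callable."""
        for notice in self.pending:
            removed = False
            if notice.reason == "priority_illegal":
                removed = remove_content(notice.content_id)  # proactive removal path
            # "other" material would instead go to human review under
            # ordinary notice-and-takedown procedures.
            self.audit_log.append({
                "content_id": notice.content_id,
                "reason": notice.reason,
                "removed": removed,
                "reviewed_at": datetime.now(timezone.utc).isoformat(),
            })
        self.pending.clear()


# Example usage with a stub removal function:
if __name__ == "__main__":
    queue = ModerationQueue()
    queue.submit(TakedownNotice("post-123", "user-report", "priority_illegal"))
    queue.process(remove_content=lambda content_id: True)
    print(queue.audit_log)
```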
Age verification systems present particular technical challenges. The Act requires robust verification methods meeting Ofcom's technical guidance, effectively eliminating anonymous platform usage. Implementation costs and privacy implications create additional compliance burdens for affected platforms.
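Ofcom's guidance leaves the choice of verification method to platforms. The hypothetical sketch below shows why such checks conflict with anonymous use: the gate admits only users whose age a third-party provider has already confirmed. The provider interface, token format, and age threshold here are assumptions for illustration.

```python
# Hypothetical age-gate check, for illustration only. The verification
# provider, token format, and minimum age are assumptions; Ofcom's technical
# guidance governs what actually qualifies as effective verification.
from typing import Optional

MIN_AGE = 18  # assumed threshold for age-restricted sections


def verify_age(session_token: Optional[str], provider_lookup) -> bool:
    """Return True only if a third-party provider confirms the user's age.

    provider_lookup is a callable mapping a verification token to a confirmed
    age (or None). In practice that lookup ties access to a verified identity,
    which is why anonymous or pseudonymous use is effectively eliminated.
    """
    if session_token is None:
        return False  # no verification token: deny access
    confirmed_age = provider_lookup(session_token)
    return confirmed_age is not None and confirmed_age >= MIN_AGE


# Example usage with a stub provider:
if __name__ == "__main__":
    stub_provider = {"token-abc": 21, "token-xyz": 16}.get
    print(verify_age("token-abc", stub_provider))  # True
    print(verify_age("token-xyz", stub_provider))  # False
    print(verify_age(None, stub_provider))         # False
```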
Ofcom's commercial enforcement model
The complaint characterizes Ofcom as a commercial enterprise funded through fees charged to regulated companies. Large platforms with qualifying worldwide revenue exceeding £250 million face an "Ofcom Service Fee" of approximately 0.02% of global turnover.
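Taking the complaint's figures at face value, the fee arithmetic works out as follows; the revenue figure in the example is invented for illustration.

```python
# Illustrative fee arithmetic based on the figures cited in the complaint.
# The £250 million threshold and 0.02% rate come from the article; the
# example revenue figure is invented.
FEE_THRESHOLD_GBP = 250_000_000   # qualifying worldwide revenue threshold
FEE_RATE = 0.0002                 # approximately 0.02% of global turnover


def ofcom_service_fee(worldwide_revenue_gbp: float) -> float:
    """Return the approximate annual fee for a platform above the threshold."""
    if worldwide_revenue_gbp <= FEE_THRESHOLD_GBP:
        return 0.0
    return worldwide_revenue_gbp * FEE_RATE


# A hypothetical platform with £1 billion in qualifying worldwide revenue
# would pay roughly £200,000 per year under this model.
print(ofcom_service_fee(1_000_000_000))  # 200000.0
```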
This funding model creates what the lawsuit describes as a "statutory monopoly on content moderation policy-writing services." Unlike traditional government agencies, Ofcom operates through industry fees rather than taxpayer funding, according to its own public statements.
The commercial enforcement structure raises jurisdictional questions about Ofcom's status under US law. The complaint argues that these activities constitute commercial activity within the United States, potentially limiting any sovereign immunity claims.
Industry-wide implications
The enforcement actions extend beyond the two plaintiffs to major American social media companies. According to court documents, Ofcom has also targeted larger platforms including X, Rumble, and Reddit with similar enforcement actions.
PPC Land previously reported that X's Global Government Affairs team published a comprehensive critique of the Online Safety Act on August 1, 2025, characterizing the legislation as creating an unnecessarily aggressive regulatory environment.
The UK's approach demonstrates how European regulators increasingly seek to enforce domestic content moderation standards against American platforms. Similar patterns emerged with the European Union's Digital Services Act, which PPC Land documented as targeting conservative political speech through content removal demands.
Payment processor enforcement mechanisms
The Online Safety Act grants Ofcom authority to coordinate with payment processors in enforcing compliance measures. PPC Land reported that Visa and Mastercard have implemented UK government censorship policies across digital platforms, including Steam and adult content sites.
These business disruption measures allow regulators to restrict platform revenue without direct platform cooperation. Payment processor coordination represents a significant enforcement tool that extends regulatory reach beyond traditional content moderation requirements.
Legal precedent considerations
The lawsuit represents the first major constitutional challenge to extraterritorial application of the Online Safety Act against American companies. Court decisions could establish important precedent for international internet regulation and jurisdictional limits.
PPC Land covered the Wikimedia Foundation's legal challenge to the Act's categorization regulations in July 2025. However, that case focused on platform classification rather than jurisdictional authority over foreign companies.
The constitutional arguments raised in this lawsuit could influence similar regulatory disputes between European authorities and American technology companies. European enforcement actions increasingly target American platforms through extraterritorial application of domestic laws.
Public opposition to UK regulations
The Online Safety Act faces growing public resistance within the UK. A petition calling for repeal has gathered over 251,000 signatures, according to PPC Land reporting, arguing that the legislation's scope is broader than necessary in a free society.
VPN usage surged 1,400% in the UK following the Act's implementation on July 25, 2025, demonstrating significant user resistance to the new regulatory framework. The dramatic increase in circumvention tools suggests widespread public rejection of the legislation's restrictions.
Implementation triggered substantial compliance costs across the technology sector. The enforcement approach differs significantly from previous voluntary frameworks by establishing legally binding obligations backed by criminal penalties.
Relief sought
The plaintiffs seek declaratory judgment that Ofcom's service attempts were improper and invalid under international law. They request permanent injunctions prohibiting further enforcement actions without proper service through established treaty procedures.
Additional relief includes declarations that Ofcom's orders are unenforceable in the United States as inconsistent with constitutional protections and federal law. The lawsuit specifically cites conflicts with the First, Fourth, and Fifth Amendments, along with the Communications Decency Act and SPEECH Act.
The case will test the limits of extraterritorial regulatory enforcement in the digital age. Court decisions could establish important boundaries for how foreign regulators may legally compel compliance from American internet companies.
Timeline
- October 26, 2023: UK Online Safety Act receives Royal Assent
- February 2025: Online Safety Act categorization regulations take effect
- March 17, 2025: Ofcom gains full enforcement powers for illegal content duties
- March 26, 2025: Ofcom sends advisory letter to Kiwi Farms
- March 28, 2025: Ofcom threatens enforcement action in Recorded Future statement
- April 14, 2025: Ofcom sends first legally binding information notice to 4chan
- April 30, 2025: Ofcom sends failure to respond letter to 4chan
- June 9, 2025: Ofcom notifies 4chan of investigation opening
- June 16, 2025: Ofcom sends final legal notice to 4chan
- July 9, 2025: Ofcom sends preliminary contravention email to 4chan
- July 10, 2025: Bluesky announces UK age verification implementation
- July 17, 2025: Wikimedia Foundation files legal challenge against categorization rules
- July 25, 2025: UK Online Safety Act enforcement triggers 1,400% VPN surge
- July 25, 2025: Ofcom sends second demand to Kiwi Farms
- August 1, 2025: X publishes comprehensive critique of Online Safety Act
- August 12, 2025: Ofcom sends provisional decision notice to 4chan
- August 27, 2025: 4chan and Kiwi Farms file federal lawsuit against Ofcom
Summary
Who: 4chan Community Support LLC and Lolcow, LLC (operating Kiwi Farms) filed suit against the UK Office of Communications (Ofcom) in US District Court for the District of Columbia.
What: The American websites challenge Ofcom's enforcement of the UK Online Safety Act against US-based platforms, alleging constitutional violations and improper service of process. Ofcom demanded compliance with UK content moderation laws, threatening fines up to £18 million and criminal penalties.
When: The lawsuit was filed on August 27, 2025, following a series of enforcement actions beginning March 26, 2025. The UK Online Safety Act received Royal Assent on October 26, 2023, with enforcement powers taking effect March 17, 2025.
Where: The case was filed in the US District Court for the District of Columbia. The plaintiffs operate exclusively within the United States (Delaware and West Virginia), while Ofcom operates from the United Kingdom under British regulatory authority.
Why: The platforms argue that Ofcom's enforcement violates First Amendment free speech protections, Fourth Amendment privacy rights, and Fifth Amendment due process guarantees. They contend that UK regulators lack jurisdiction over American companies and improperly served legal process without following international treaty procedures.
PPC Land explains
Online Safety Act: The comprehensive UK legislation enacted on October 26, 2023, that establishes statutory duties for internet platforms to protect users from harmful content. The Act creates legally binding obligations for social media companies, search engines, and adult content platforms operating in the UK, marking a fundamental shift from voluntary industry standards to mandatory compliance requirements backed by criminal penalties. The legislation grants Ofcom extensive enforcement powers including substantial financial penalties and business disruption measures that extend far beyond traditional content moderation approaches.
Ofcom: The UK Office of Communications, established as a statutory corporation under the Communications Act 2003, which serves as Britain's communications regulator with expanded authority under the Online Safety Act. Operating as an industry-funded entity rather than a traditional government agency, Ofcom collects fees from regulated companies to fund its enforcement operations. The organization possesses comprehensive regulatory powers including the ability to impose fines up to £18 million or 10% of worldwide revenue, coordinate with payment processors to restrict platform operations, and make criminal prosecution referrals.
First Amendment: The constitutional provision protecting freedom of speech, press, religion, assembly, and petition in the United States, which forms the primary legal defense against Ofcom's content moderation demands. The plaintiffs argue that UK requirements to remove user-generated content and conduct risk assessments violate their constitutional rights to host protected speech without government interference. The amendment's protections extend to anonymous political speech and editorial decisions about platform content, creating direct conflicts with UK regulatory requirements.
Section 100 Orders: Legal demands issued by Ofcom under the Online Safety Act compelling platforms to provide information about their compliance status, risk assessments, and content moderation practices. These orders carry criminal penalties for non-compliance, including potential imprisonment up to two years, and can require platforms to disclose potentially incriminating information without judicial oversight. The plaintiffs argue these demands violate Fourth Amendment protections against unreasonable searches and Fifth Amendment rights against self-incrimination.
Constitutional violations: The legal theory underlying the lawsuit that UK enforcement actions conflict with fundamental American constitutional protections including free speech, privacy, and due process rights. The complaint alleges that Ofcom's demands would force platforms to act as government censors, removing speech that remains fully protected under US law. These violations encompass both direct content restrictions and compelled speech requirements that force platforms to adopt UK-approved content moderation policies.
Platforms: Internet services including social media companies, forums, and content-sharing sites that fall under the Online Safety Act's regulatory scope. The legislation applies broadly to services with UK user connections regardless of their physical location or incorporation jurisdiction. According to Ofcom, over 100,000 online services worldwide are potentially subject to the Act's requirements, with most of these platforms being US-based rather than British companies.
Enforcement actions: Ofcom's systematic regulatory activities designed to compel platform compliance with UK content moderation requirements through escalating penalties and threats. These actions include legally binding information notices, investigation openings, provisional decision notices, and coordination with payment processors to restrict non-compliant platforms. The enforcement approach represents a significant departure from previous voluntary frameworks by establishing government authority to compel platform behavior through legally binding obligations.
Service of process: The legal procedure for officially delivering court documents and regulatory demands to parties in legal proceedings, which forms a key procedural challenge in the lawsuit. The plaintiffs argue that Ofcom improperly served enforcement notices through email without following established US-UK Mutual Legal Assistance Treaty procedures required for international legal process. This procedural dispute could establish important precedent for how foreign regulators must properly serve legal documents on American companies.
Risk assessments: Mandatory compliance documents that platforms must prepare under the Online Safety Act evaluating user exposure to various categories of potentially harmful content. These assessments must be kept current with service modifications and made available to Ofcom upon demand, creating ongoing administrative burdens for covered platforms. The requirements represent a fundamental shift toward proactive content evaluation rather than reactive moderation, forcing platforms to continuously analyze their services through UK regulatory frameworks.
Age verification: Technical systems that platforms must implement to prevent minors from accessing adult content, representing one of the most technically challenging and privacy-invasive requirements under the Online Safety Act. These systems must meet robust standards defined by Ofcom and effectively eliminate anonymous platform usage by requiring identity verification for access. The verification requirements create direct conflicts with First Amendment protections for anonymous political speech and impose significant technical implementation costs on affected platforms.