Federal court allows child exploitation case against X to proceed
Court ruling opens door for tech platforms to face liability for defective reporting systems.

The Ninth Circuit Court of Appeals on August 1, 2025, delivered a mixed but significant ruling in Doe v. Twitter, Inc., allowing key portions of a child exploitation case to proceed against X Corp (formerly Twitter). The decision represents a notable crack in the digital advertising ecosystem's longstanding Section 230 protections.
Court reverses lower court dismissal
The federal appeals court reversed portions of a district court decision that had dismissed the entire case under Section 230 of the Communications Decency Act. The plaintiffs, identified as John Doe #1 and John Doe #2, were minors when child sexual abuse material depicting them was posted and distributed on Twitter's platform.
According to court documents filed January 20, 2021, the case originated when a 13-year-old plaintiff was coerced through Snapchat into producing explicit content. The materials later surfaced on Twitter in 2019, receiving over 167,000 views and 2,223 retweets before removal.
"Twitter assigned the report claim number 0136403334. It did not take action against the @StraightBross account," the complaint states, referring to early warnings about child exploitation material.
Platform's delayed response draws scrutiny
The court examined Twitter's response after the company was notified about the illegal content. On January 28, 2020, Twitter initially told the plaintiff: "We've reviewed the content, and didn't find a violation of our policies, so no action will be taken at this time."
Twitter only removed the content after intervention by the U.S. Department of Homeland Security on January 30, 2020. "Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center on Missing and Exploited Children," according to the complaint.
Legal implications for digital platforms
The Ninth Circuit found that two specific claims could proceed beyond Section 230 immunity protections. First, the court allowed product liability claims based on Twitter's defective reporting infrastructure design. The platform made it "extremely difficult" to report child sexual abuse material, requiring users to locate specialized forms rather than using standard reporting mechanisms.
Second, the court permitted negligence per se claims to proceed. The appeals court found that the duty to report child pornography to NCMEC is distinct from Twitter's role as a publisher, meaning the obligation under 18 U.S.C. § 2258A arises separately from its content publication activities.
However, the court affirmed dismissal of claims alleging Twitter knowingly benefited from sex trafficking, ruling those activities remained protected under Section 230.
Challenging Section 230's scope
The ruling marks a departure from traditional Section 230 interpretations that have broadly protected platforms from liability related to third-party content. For digital advertising stakeholders, the decision signals potential accountability for platform design choices that facilitate harmful content distribution.
"Twitter does not allow users to report a tweet for CSAM through the easily-accessible report function," according to analysis by the Canadian Centre for Child Protection cited in court filings. "One must first locate the child sexual exploitation report form."
Twitter received the lowest rating among major platforms for its child sexual abuse material reporting structure, according to the centre's December 2020 study.
Technical architecture under examination
The court scrutinized Twitter's search and recommendation systems. Plaintiffs alleged the platform's algorithms suggested hashtags commonly used to distribute illegal content. When users searched for hashtags such as "#megalinks," a term associated with child exploitation material, Twitter's system would recommend related terms and accounts.
"Twitter's software is designed so that a search for the #megalinks hashtag returns suggestions for other hashtags that are related to CSAM and users that use the #megalinks hashtag to discuss or distribute CSAM," the complaint states.
Revenue model concerns
Court documents detail how Twitter monetized all content on its platform through advertising, regardless of legality. "As long as content on Twitter's platform remains live, Twitter monetizes that content," the filing alleges.
This business model created financial incentives to maintain engagement-driving content, even when that content violated platform policies or federal law. Twitter generated over $936 million in advertising revenue during the third quarter of 2020, representing approximately 80% of total revenue.
Narrow but meaningful precedent
Legal experts note the ruling's limited scope while acknowledging its potential impact. The decision doesn't eliminate Section 230 protections but creates exceptions for specific platform design failures and statutory reporting violations.
The case specifically addressed situations where platforms have "actual knowledge" of illegal content but fail to meet federal reporting requirements. This standard differs from broader liability theories that the court rejected.
Industry response and implications
For digital advertising platforms, the ruling suggests increased scrutiny of content moderation infrastructure and compliance systems. Platforms may need to redesign reporting mechanisms and strengthen relationships with law enforcement agencies to avoid similar liability.
The decision also highlights potential risks in algorithmic content promotion systems that may inadvertently facilitate illegal content distribution. Platform operators will likely review recommendation algorithms for similar vulnerabilities.
Timeline
- January 20, 2021: Initial complaint filed in Northern District of California
- August 19, 2021: District court grants motion to dismiss most claims
- May 3, 2023: Ninth Circuit issues interlocutory appeal decision
- December 11, 2023: District court dismisses remaining claims
- January 9, 2024: Plaintiffs appeal final dismissal
- August 1, 2025: Ninth Circuit issues current ruling allowing some claims to proceed
Legal representation
The plaintiffs are represented by attorneys from the National Center on Sexual Exploitation, The Haba Law Firm, and The Matiasic Firm. X Corp is represented by Quinn Emanuel Urquhart & Sullivan LLP.
Looking ahead
The case now returns to the district court for proceedings on the surviving claims. The outcome could establish important precedents for platform liability in cases involving defective safety infrastructure and federal reporting violations.
This development represents another evolution in the complex relationship between digital advertising, platform accountability, and content safety. While Section 230 protections remain largely intact, platforms face increasing pressure to invest in robust content moderation and reporting systems.
Key terms explained
Section 230: This federal law provides broad immunity to internet platforms from liability for content posted by third-party users. Enacted as part of the Communications Decency Act of 1996, Section 230 has been called "the twenty-six words that created the internet" for its role in enabling modern social media platforms. The law protects platforms from being treated as publishers of user-generated content, but the Ninth Circuit found specific exceptions where platforms' own design choices and compliance failures create liability.
Child Sexual Abuse Material (CSAM): The legal term for visual depictions of minors engaged in sexually explicit conduct, previously referred to as child pornography. CSAM encompasses images and videos created through exploitation, coercion, or trafficking of children. Federal law requires electronic service providers to report known CSAM to the National Center for Missing and Exploited Children within specific timeframes, a requirement that became central to this case.
Ninth Circuit Court of Appeals: The federal appellate court with jurisdiction over nine western states, including California where this case originated. The Ninth Circuit is known for handling many technology and internet law cases due to Silicon Valley's location within its territory. This court's ruling creates binding precedent for all federal courts within its jurisdiction and significant persuasive authority for courts nationwide.
Product liability: A legal theory holding manufacturers responsible for defective products that cause harm to users. In this case, plaintiffs argued Twitter's platform design constituted a defective product because it made reporting CSAM extremely difficult while simultaneously facilitating the distribution of such material through its search and recommendation systems. The court found these design defects could create liability separate from Twitter's role as a content publisher.
Negligence per se: A legal doctrine where violation of a statute automatically establishes negligence if the plaintiff belongs to the class the statute was designed to protect. Here, plaintiffs argued Twitter's failure to promptly report CSAM to authorities violated federal reporting requirements under 18 U.S.C. § 2258A. The court agreed this statutory violation could proceed as it didn't depend on Twitter's publishing activities.
NCMEC (National Center for Missing and Exploited Children): A federally mandated clearinghouse that receives reports of child exploitation from electronic service providers, law enforcement, and the public. Under federal law, platforms must report known CSAM to NCMEC "as soon as reasonably possible" after obtaining actual knowledge. Twitter's delay in reporting the plaintiffs' case became a key factor in the court's ruling.
Reporting infrastructure: The technical systems and user interfaces that platforms provide for users to report harmful or illegal content. The court found Twitter's reporting system defectively designed because it required users to navigate complex forms rather than using standard reporting mechanisms. This infrastructure defect created liability because it hindered detection and removal of illegal content.
Federal reporting requirements: Statutory obligations under 18 U.S.C. § 2258A requiring electronic communication service providers to report apparent child exploitation to NCMEC upon gaining actual knowledge. These requirements exist independently of platforms' content moderation activities and create specific duties that cannot be circumvented through Section 230 immunity, according to the court's analysis.
Interlocutory appeal: A legal procedure allowing appeals of specific rulings before final judgment in a case. The district court certified questions about the scope of FOSTA (the Allow States and Victims to Fight Online Sex Trafficking Act) and TVPRA standards for immediate appellate review, leading to the Ninth Circuit's 2023 decision that was then applied to the current ruling. This process allowed important legal questions to be resolved while the case continued.
TVPRA (Trafficking Victims Protection Reauthorization Act): Federal legislation creating civil remedies for sex trafficking victims, including claims against those who knowingly benefit from trafficking ventures. While the court dismissed most TVPRA claims under Section 230 immunity, the law's intersection with platform liability remains an evolving area of legal development as courts balance victim protection with internet platform immunity.
Summary
Who: Two minor plaintiffs (John Doe #1 and #2) suing X Corp (formerly Twitter)
What: Federal appeals court allows product liability and negligence claims to proceed despite Section 230 protections
When: Ninth Circuit ruling issued August 1, 2025, regarding case filed January 20, 2021
Where: Northern District of California (trial court), Ninth Circuit Court of Appeals
Why: Court found Twitter's defective reporting infrastructure and failure to meet federal reporting requirements fall outside Section 230 immunity protections