The Massachusetts Supreme Judicial Court today ruled that Meta Platforms, Inc. and Instagram, LLC cannot use a key federal internet immunity law to escape a state lawsuit alleging the company deliberately designed Instagram to addict children to its platform. The April 10, 2026, decision, written by Justice Wendlandt, marks one of the most detailed appellate examinations of Section 230 of the Communications Decency Act of 1996 as it applies to platform design - rather than user-generated content - and has direct implications for how social media companies structure advertising-dependent products aimed at young audiences.
The case, Commonwealth v. Meta Platforms, Inc., was originally filed in Suffolk Superior Court on October 24, 2023. A Superior Court judge denied Meta's motion to dismiss, and the company sought immediate appellate review. The Supreme Judicial Court accepted the case for direct review, limiting its examination to whether a denial of Section 230 protection is immediately appealable and, if so, whether that protection bars the Commonwealth's claims. The court today answered both questions: yes to the first, no to the second.
The revenue model at the center of the case
According to the complaint, Meta derives substantially all its revenue from selling advertising to third parties who seek to target promotional material to likely consumers using data Meta collects about users' preferences. Third parties pay Meta per "advertisement impression" - each instance an advertisement appears on screen for a member of a target audience. This revenue structure gives Meta a direct financial incentive to increase users' screen time: the more time a user spends on the platform, the more advertisement impressions Meta can sell.
Instagram is used by over 33 million young people in the United States. The platform has over 300,000 daily active users in Massachusetts alone who are between thirteen and seventeen years old, according to the complaint's allegations, which the court accepted as true for purposes of the motion to dismiss.
The Commonwealth alleges this advertising-driven model led Meta to design Instagram with four specific features intended to exploit neurological vulnerabilities in young users. First, Instagram enables approximately forty types of default audiovisual and haptic notifications - alerts triggered by likes, follow requests, messages, new stories, and reel uploads. According to the complaint, the volume is designed to overwhelm young users and compel them to repeatedly reopen the application. Second, an "infinite scroll" feature delivers a continuous stream of posts and advertisements without requiring any affirmative act from a user, while an "autoplay" feature automatically advances through stories and reels. Third, certain "ephemeral" features - stories that disappear after twenty-four hours and live streams only available in real time - are alleged to induce fear of missing out, or FOMO, among teenage users. Fourth, the complaint identifies two intermittent variable reward, or IVR, mechanisms that the court compared to a slot machine: notifications delivered on an unpredictable schedule alongside neutral ones, and a feed refresh function that manufactures a deliberate delay before displaying new content following a user's swipe.
Four counts, four allegations
The Commonwealth structured its complaint into four counts under Massachusetts General Laws chapter 93A, section 2, the state's consumer protection statute. Count I alleges Meta employed these design features to increase young users' time on the platform in violation of that statute. Count II alleges Meta made public statements - attributed to its chief executive officer, its global head of safety, and the head of Instagram - that the platform is safe and nonaddictive, while simultaneously being aware from internal studies that the design features induced addiction and harmed young users' health. Count III addresses children under thirteen: it alleges Meta falsely claimed to exclude underage users through age-gating while knowing that hundreds of thousands of underage users possessed Instagram accounts and that age-gating measures were ineffective - and that Meta refused to invest in effective mechanisms despite knowing of the acute harm caused by the platform's features to that age group. Count IV alleges that the conduct across the first three counts created a public nuisance of youth addiction.
The Section 230 question
Section 230(c)(1) of the Communications Decency Act states that no provider of an interactive computer service shall be treated as "the publisher or speaker of any information provided by another information content provider." Since its enactment as part of the Telecommunications Act of 1996, this provision has shielded platforms from liability for content generated by their users. Meta argued that this immunity extended to its design choices, including algorithmic features that determine how and to whom content is displayed.
The court's analysis turned on whether the Commonwealth's claims sought to hold Meta liable as an intermediary for third-party content or for its own conduct. The court reviewed the legislative history of Section 230 at length, tracing it to two pre-1996 cases: Cubby, Inc. v. CompuServe, Inc. from 1991, in which a federal district court found CompuServe functioned more like a library than a publisher because it exercised little editorial control, and Stratton Oakmont, Inc. v. Prodigy Services Co. from 1995, in which a New York court held that a service provider was liable as a publisher because it had undertaken to remove some offensive comments. Congress enacted Section 230 in response to this second case, concerned that service providers would either limit speech or abandon efforts to remove inappropriate content to avoid liability.
The court drew a distinction between two competing interpretations. Meta argued that the phrase "treated as the publisher" extends to any claim implicating editorial choices - including decisions about how and to whom to distribute content. The Commonwealth argued that publisher liability at common law has always required both a dissemination element and a content element: meaning the claim must seek to hold someone liable based on the content of the information published, not merely on the fact of publication. According to the decision, the court adopted the common-law reading. "Protection under Section 230(c)(1) extends only to bar certain claims imposing liability for specific information that another party provided," the court stated, citing Henderson v. Source for Public Data, L.P., a 2022 Fourth Circuit decision.
Applied to the four counts, the court reached consistent conclusions. The unfair business practices claim in Count I does not allege harm stemming from any specific piece of third-party content - the harm alleged flows from the design features themselves, which function to prolong engagement independently of what any particular user happens to post. The deceptive business practices claim in Count II is based on Meta's own speech, not on third-party content. Count III similarly rests on Meta's affirmative misstatements about age-gating and on the design of the age-gating mechanism itself, not on any published user information. Because Section 230 does not bar Counts I through III, the court held it also does not bar the public nuisance claim in Count IV, which depends on the same alleged conduct.
The court also addressed a procedural question: whether Section 230 creates immunity from suit - meaning the right not to be haled into court at all - or merely immunity from liability at trial's end. This distinction matters because only immunity from suit permits an immediate interlocutory appeal before a case reaches judgment. The court concluded, joining courts including the Fourth, Ninth, and First Circuits, that Section 230(e)(3) provides immunity from suit, because the statutory text says "no cause of action may be brought" as a phrase separate from and additional to "no liability may be imposed." This reading allowed Meta's appeal to proceed but ultimately did not help the company.
Context: a broadening litigation landscape
The Massachusetts ruling arrives as Meta faces mounting legal pressure in multiple jurisdictions over platform design and data practices. In 2024, the Third Circuit limited Section 230 immunity in a case against TikTok, finding that the statute does not protect algorithmically promoted content that a platform knows to be harmful. On March 3, 2026, a California Superior Court entered a $50 million final judgment against Meta over Facebook user data shared with third-party developers. In Germany, the Dresden Higher Regional Court issued final rulings against Meta's Business Tools tracking practices on February 3, 2026. A Madrid court ordered Meta to pay €479 million to 87 Spanish digital publishers over behavioral advertising that violated the GDPR. And in December 2024, Texas launched investigations into Meta over children's privacy practices.
The Massachusetts case directly concerns the platform's relationship with its advertising revenue model. The complaint acknowledges that Instagram's main feed consists of a continuous stream of posts from followed accounts, suggested posts from accounts the user does not follow, and advertisements; that the explore page similarly mixes organic content with advertising content; and that the stories banner also mixes followed accounts' posts with advertisements. Each of these surfaces - the very surfaces the design features are alleged to have been engineered to maximize engagement with - doubles as advertising inventory.
Meta's full-year 2025 advertising revenue reached $196.2 billion, with Q4 2025 alone generating $58.1 billion, representing 24% year-over-year growth. The platform has also introduced a series of teen safety features in recent years, including Instagram Teen Accounts in September 2024, enhanced restrictions announced in April 2025, and PG-13 content rating alignment introduced in October 2025. None of these developments were before the court in today's case, which addresses conduct alleged from the platform's earlier period.
What the decision does and does not decide
The court was careful to note the limited scope of its ruling. At the motion to dismiss stage, it accepted all allegations as true and drew all reasonable inferences in the Commonwealth's favor. It did not assess the merits of any claim, nor did it determine that Meta is liable for any of the conduct alleged. The decision says only that Section 230 does not require the case to be dismissed before discovery and trial. Meta retains the ability to contest each count on its merits.
The court also addressed a pending federal counterpart. A multidistrict litigation raising similar claims is currently before the United States Court of Appeals for the Ninth Circuit - California v. Meta Platforms Inc., Nos. 24-7032 and related cases - following oral argument on January 6, 2026. The Massachusetts court noted it was not persuaded by the reasoning of a Northern District of California judge who had reached a different conclusion on Section 230 in those proceedings.
The case returns to the Suffolk Superior Court for further proceedings. The Commonwealth will need to prove that the design features it identifies actually induced compulsive use in young users, that Meta knew of the harm, that public statements attributed to executives were materially false, and that Meta failed to implement effective age-gating despite knowing of the risks to children under thirteen. These are contested factual questions. Today's decision means those questions will be answered in litigation rather than foreclosed by federal immunity.
For the advertising and marketing community, the decision represents a judicial framework that distinguishes between platform design choices made to maximize advertising revenue and the content that fills the resulting inventory. A platform's decisions about how to pace notification delivery, how long content remains available, and how to structure feed refresh functions are, according to this court, product design choices - not publishing decisions shielded by Section 230. That distinction, if it holds through further appeals and is adopted in other circuits, could affect the range of state-level challenges that social media advertising platforms must litigate across the United States.
Timeline
- October 24, 1991 - Cubby, Inc. v. CompuServe, Inc. decided; federal court finds CompuServe functions as distributor, not publisher, because it exercised little editorial control over third-party newsletters in its electronic library
- May 25, 1995 - Stratton Oakmont, Inc. v. Prodigy Services Co. decided; New York court holds Prodigy liable as publisher because it undertook to moderate content on its bulletin board
- February 8, 1996 - Section 230 of the Communications Decency Act enacted as part of the Telecommunications Act of 1996, designed in part to reverse the Prodigy decision
- September 1, 2024 - Third Circuit Court limits Section 230 immunity for TikTok in case involving algorithmic recommendations of harmful content (PPC Land)
- September 2024 - Instagram launches Teen Accounts with built-in protections for users under 18; rollout begins in the United States (PPC Land)
- December 12, 2024 - Texas Attorney General Ken Paxton launches investigations into Meta and fourteen other companies over children's privacy practices (PPC Land)
- January 26, 2025 - Meta removes detailed targeting exclusions for ad accounts (PPC Land)
- February 11, 2025 - Instagram expands Teen Accounts to India on Safer Internet Day (PPC Land)
- April 8, 2025 - Meta announces enhanced Teen Account restrictions and expansion to Facebook and Messenger (PPC Land)
- October 14, 2025 - Instagram aligns Teen Account content filtering with PG-13 movie rating standards (PPC Land)
- October 24, 2025 - European Commission finds Meta in breach of Digital Services Act transparency obligations (PPC Land)
- November 19, 2025 - Madrid Commercial Court No. 15 orders Meta to pay €479 million to 87 Spanish publishers for GDPR violations in behavioral advertising (PPC Land)
- January 6, 2026 - Ninth Circuit holds oral argument in California v. Meta Platforms Inc., the federal multidistrict litigation raising similar child addiction claims
- January 28, 2026 - Meta reports Q4 2025 advertising revenue of $58.1 billion, 24% year-over-year growth; full-year total reaches $196.2 billion (PPC Land)
- February 3, 2026 - Dresden Higher Regional Court issues final rulings against Meta, awarding €1,500 per plaintiff for Business Tools tracking violations (PPC Land)
- March 3, 2026 - San Francisco Superior Court enters $50 million final judgment against Meta over Facebook user data shared with third-party developers (PPC Land)
- April 10, 2026 - Massachusetts Supreme Judicial Court rules Section 230(c)(1) does not bar the Commonwealth's claims against Meta over Instagram's alleged addictive design targeting children
Summary
Who: The Commonwealth of Massachusetts, represented by the Attorney General's office, brought the case against Meta Platforms, Inc. and Instagram, LLC. The Supreme Judicial Court's decision was written by Justice Wendlandt, with Chief Justice Budd and Justices Gaziano, Kafker, Georges, and Wolohojian also participating. Amicus briefs were filed by twenty-five state and territorial attorneys general, NetChoice, the Electronic Privacy Information Center, TechFreedom, and others.
What: The Massachusetts Supreme Judicial Court ruled today that Section 230(c)(1) of the Communications Decency Act does not bar the Commonwealth's four-count lawsuit alleging Meta designed Instagram to induce compulsive use by children through specific product features, publicly misled consumers and regulators about the platform's safety, falsely claimed to exclude users under thirteen through ineffective age-gating, and created a public nuisance of youth addiction. The court affirmed the Superior Court's denial of Meta's motion to dismiss on Section 230 grounds.
When: The Superior Court complaint was filed on October 24, 2023. The motion to dismiss was heard by Judge Peter B. Krupp. The Supreme Judicial Court accepted direct appellate review, and the decision was issued today, April 10, 2026.
Where: The case was litigated in the Suffolk Superior Court Department and appealed directly to the Massachusetts Supreme Judicial Court in Boston. The events underlying the complaint concern Instagram's operations in Massachusetts, where the platform is alleged to have more than 300,000 daily active users aged thirteen to seventeen.
Why: The case matters because it tests whether state consumer protection laws can reach social media platform design choices that are alleged to exploit the neurological vulnerabilities of children in pursuit of maximizing advertising revenue. The court concluded that Section 230 was designed to protect platforms acting as intermediaries for third-party content - not to shield platforms from liability for their own product design decisions and their own public statements. The case now returns to the Superior Court for proceedings on the merits.