Massachusetts can press its lawsuit claiming Instagram’s design hooked minors. The state’s top court said Meta can’t use Section 230 to stop the case — at least not yet.

Court narrows reach of Section 230

On Friday, the Massachusetts Supreme Judicial Court rejected Meta Platforms Inc.’s bid to block Attorney General Andrea Joy Campbell’s lawsuit by invoking Section 230 of the Communications Decency Act, the federal immunity that shields platforms from liability for content posted by their users.

The unanimous opinion, written by Associate Justice Dalila Argaez Wendlandt, drew a bright line between claims that attack third-party content and claims that target how a company built its product. The court said a plaintiff can pursue allegations that Meta designed Instagram in ways that made it addictive for young users and then misled consumers about safety — even though the platform delivers third-party posts.

That distinction matters. If it sticks, it could carve out a path for state claims that focus on product design and company statements rather than on the words or images posted by users.

What the state alleges

Campbell filed the suit in 2023, arguing that features such as infinite scroll, autoplay and intermittent rewards were engineered to keep minors watching and returning.

The complaint says those design choices exploit developmental vulnerabilities in children and teenagers and that Meta downplayed or misrepresented the risks.

Campbell’s office argues those practices amount to unfair or deceptive business conduct under Massachusetts law — not merely editorial decisions protected by federal law. The state also challenges Instagram’s age-verification tools, saying they’re ineffective and expose underage users to heightened harm.

“Big tech companies like Meta have designed platforms that harm young people, all while downplaying the risks,” Campbell said in a statement released after the ruling. “Today’s victory is a major step in holding these companies accountable for practices that have fueled the youth mental health crisis and put profits over kids.”

Court’s legal reasoning

Justice Wendlandt wrote that Section 230 is narrower than Meta urged. The immunity applies when a lawsuit seeks to hold a platform liable for third-party content it hosted. But the court said the state’s claims rest on Meta’s own actions and statements, such as the design of the app and the company’s assurances about safety, and therefore fall outside that shield.

The opinion acknowledged that some features the state challenges are tied to publishing activities — algorithms decide what content appears and in what order. But the court rejected the idea that any claim touching publication must automatically be barred. “The fact that a claim concerns publishing activities, including the use of algorithms in connection with publishing activities, isn’t enough to bring the claim within the immunity provided by Section 230,” the court wrote.

So the door stays open for Massachusetts to press evidence about how Instagram’s mechanics work, how Meta described those mechanics to the public, and whether the company misled users and parents.

Context: a wave of recent rulings and suits

The state’s victory comes after a pair of high-profile jury verdicts for plaintiffs who claimed social platforms harmed young people. In California, a jury awarded $6 million to a 20-year-old woman who said she became addicted to Instagram as a child; Google’s YouTube was also a defendant in that case. A separate jury in New Mexico ordered Meta to pay $375 million, finding the company liable under that state’s consumer-protection law for harms to children.

Those recent outcomes have legal observers and policymakers on edge. Meta has appealed both verdicts. The Massachusetts ruling reinforces a trend: courts are wrestling with whether established federal protections should block state-level claims that attack product design or corporate statements rather than third-party posts.

At least 40 other states and the District of Columbia filed related litigation in 2023, and lawmakers in several states are pursuing stricter limits on children’s access to social platforms. In Massachusetts, the House approved a bill that would bar social media accounts for kids under 14 and require parental permission for 14- and 15-year-olds; the measure awaits Senate action and the signature of Governor Maura Healey.

Reactions and legal stakes

Meta has argued broadly for Section 230 protection, saying the law should shield platforms whenever content and platform features intersect. The company maintains the distinction the court rejected is artificial — that design and content are often inseparable in social apps. Meta’s public statements have emphasized efforts to improve safety and to work with parents, experts and law enforcement.

But the Massachusetts court signaled skepticism about that expansive view. In oral arguments last December, justices questioned whether features like infinite scroll were simply ways to publish or whether they were product choices meant to attract and hold attention. The opinion reflected that skepticism.

Legal experts warn the litigation could reshape how platforms operate. If states prevail on claims that focus on algorithmic design or company statements, tech firms might be pushed to change core features, alter how they disclose risks, or face substantial damages. Meta has estimated potential damages in some suits could reach into the high tens of billions of dollars, according to filings and industry reports.

Free speech and broader concerns

Civil libertarians have raised alarms. Some advocacy groups argue that allowing states to treat editorial judgments as product defects risks chilling free expression by punishing companies for promoting or amplifying lawful speech. They warn that courts should be cautious about recharacterizing editorial choices in ways that bypass federal protections.

The Massachusetts ruling didn’t resolve those First Amendment issues. It focused on legal thresholds for Section 230 at the pleading stage — not on the merits of the state’s claims or potential constitutional limits. The court left room for Meta to press those defenses later in the case.

What to watch next

The decision sends the case back to the lower court, where the parties will dig into discovery: documents, internal research and internal communications about product design and youth safety. If the state uncovers evidence that supports its allegations, the litigation could move toward trial.

Meta will likely press appeals and defenses tied to both federal law and the First Amendment. The company says the ruling is procedural and doesn’t resolve the underlying facts, and it has said it has taken steps intended to protect young users.

Meanwhile, state attorneys general and plaintiffs nationwide are watching. A favorable ruling for Massachusetts could tilt other courts to allow similar cases to proceed. But outcomes will vary by jurisdiction, and federal appeals courts could issue contrasting interpretations of Section 230.

One thing is clear: the legal fight over how to hold platforms accountable for harms tied to product design is far from settled. And the implications for social media business models are potentially far-reaching, from feature design to transparency about risks.
