On March 26, a Los Angeles jury found Meta and YouTube negligent in contributing to a young person's social media addiction, awarding $6 million in damages. One day earlier, a New Mexico jury found Meta violated the state's unfair practices act by designing features that harm children. Two verdicts in two states, on two different legal theories, in the same week. Meta stock fell 8% — roughly $150 billion in market cap — in a single day. The ratio of market cap destroyed to damages awarded is approximately 25,000 to 1. That ratio is the story.
The Leak
This arc begins in September 2021, when the Wall Street Journal published the Facebook Files: internal documents showing that Facebook's own research had found Instagram made body-image issues worse for roughly one in three teen girls. Within weeks, the whistleblower behind the leak, Frances Haugen, went on 60 Minutes, and Facebook entered crisis mode.
The internal research was damning not because it proved harm; the evidence was more nuanced than headlines suggested. It was damning because it proved knowledge. Facebook had studied the effects of its products on adolescents, identified risks, and continued operating without meaningful changes. In courtrooms, knowledge is what separates ordinary negligence from conscious indifference.
The Filings
The lawsuits began within months. By mid-2022, Meta faced eight new lawsuits in a single week claiming its algorithms caused addiction and related harms. By September, more than 70 lawsuits had been filed against Meta, Snap, TikTok, and Google, with parents making product liability claims, the same legal theory used against tobacco and opioids.
- Sep 2021: The Wall Street Journal publishes the Facebook Files. Internal research shows Instagram harms teens. Frances Haugen goes public on 60 Minutes.
- May 2022: California passes a bill letting parents sue social media companies over children's addiction.
- Sep 2022: 70+ lawsuits filed against Meta, Snap, TikTok, and Google. Product liability theory, the same used against tobacco.
- Oct 2023: A California judge allows a group of lawsuits to proceed, consolidating cases. California and dozens of states sue Meta in federal court. NYC sues separately.
- Nov 2023: A US judge rejects efforts by Alphabet, ByteDance, Meta, and Snap to dismiss the cases. Section 230 does not apply.
- Apr 2024: Zuckerberg avoids personal liability in about two dozen cases, but the corporate cases survive.
- Aug 2024: A court rules TikTok must face a lawsuit over a 10-year-old's death after watching choking-challenge videos.
- Oct 2025: A judge orders Zuckerberg, Spiegel, and Mosseri to testify in person.
The Shield That Didn't Hold
For two decades, Section 230 of the Communications Decency Act was Big Tech's legal shield: platforms couldn't be held liable for content their users posted. The social media companies expected it to protect them here too.
The courts said no. The distinction: Section 230 protects platforms from liability for what users post. It does not protect them from liability for how they designed the product that delivers it. Algorithms that maximize engagement, notifications that interrupt sleep, infinite scroll that prevents natural stopping points — these are product design choices, not user content. And product design has never been protected by Section 230.
By September 2024, legal analysts were writing that Section 230's protections were in jeopardy. The shield didn't break. It simply didn't apply.
The Split
Facing this legal landscape, the defendants split. In January 2026, Snap settled its social media addiction lawsuit. Days later, TikTok settled a California case ahead of the bellwether trial. They ran the numbers and chose to pay rather than let juries decide.
Meta and Google fought.
The LA trial — a bellwether chosen to represent the broader multi-district litigation — opened in February. The 20-year-old plaintiff testified about years of compulsive use. In New Mexico, a separate trial examined Meta's practices under the state's consumer protection law. On March 5, testimony revealed that Zuckerberg had downplayed Meta's own internal findings.
Then, in one week, both trials concluded.
March 25: New Mexico. Unfair practices act violation. March 26: Los Angeles. Negligence. Two juries, two states, two legal theories, one conclusion: Meta designed products it knew could harm children, and did not adequately change course.
Twenty-Five Thousand to One
The $6 million doesn't matter to a company with $165 billion in annual revenue. The Wall Street Journal framed the verdict as "a win for the plaintiffs bar, not kids" — and in the narrow sense, it's right. $6 million doesn't change a product, fund a therapy program, or redesign an algorithm.
But the market didn't react to $6 million. It reacted to what $6 million means when multiplied across every pending case in the multi-district litigation. The bellwether trial exists for exactly this purpose: to signal whether the broader pool of cases has merit. The signal was unambiguous.
The market priced the precedent, not the verdict. And the precedent is that juries will find liability.
Snap and TikTok settled because they understood this arithmetic before the jury did. Meta and Google fought and learned it in public.
What Congress Couldn't
The New York Times observed that as lawmakers struggle to pass online safety laws, juries are doing what Congress won't. This is precisely correct — and it's worth understanding why.
For five years, social media regulation followed a script: a hearing would generate headlines, a bill would be introduced, and nothing would pass. A bipartisan Senate bill to amend Section 230, introduced in November 2025, was the latest iteration. The structural problem was always the same: any federal law broad enough to be effective would face First Amendment challenges, lobbying pressure, and the sheer difficulty of legislating technology that changes faster than Congress can draft.
The trial system has none of these constraints. It doesn't need to pass a law. It doesn't need bipartisan support. It doesn't need to define "addiction" or "harmful" in statutory language. It needs twelve people to look at evidence and answer a question: did the company know, and did it act reasonably?
Two juries answered. The market extrapolated. And in the space between a $6 million verdict and a $150 billion market reaction, you can see the shape of what's coming — not regulation from above, but liability from below, case by case, jury by jury, until the financial math of not changing the product becomes worse than the financial math of changing it.
The $6 million that the Wall Street Journal dismissed is the opening bid. The $150 billion the market erased is the forecast. Between those two numbers lies the answer to a question Congress spent five years avoiding: who regulates social media's impact on children? This week, two juries volunteered.