A 20-Year-Old Is About to Tell a Jury What Instagram Did to Her Brain
The first plaintiff in the landmark social media addiction trial takes the stand. Here's why this case matters for everyone.
She started scrolling before she turned 10. By the time she was a teenager, she couldn't stop.
Now she's 20 years old, known in court documents only as KGM, and this week she'll tell a Los Angeles jury exactly what happened to her mind during the decade she spent inside Instagram and YouTube. Her testimony is the first of its kind — the opening act in what legal experts are calling the most consequential tech trial since the tobacco industry faced a courtroom reckoning in the 1990s.
And whether you use social media or not, the outcome could change how every app on your phone works.
The Trial Nobody Can Look Away From
Here's the setup. More than 1,600 plaintiffs — families, individuals, over 250 school districts — have filed lawsuits against Meta (which owns Instagram and Facebook), YouTube, TikTok, and Snap. They all share a central claim: these companies deliberately designed their platforms to hook young users, and the mental health damage was a foreseeable consequence.
KGM's case was selected as the first "bellwether" trial — a legal term for a test case that helps both sides gauge how a jury might react. Think of it as the opening shot. If the jury sides with KGM, it could shape outcomes for thousands of similar cases. If it doesn't, the companies breathe easier.
TikTok and Snap settled with KGM's legal team before the trial started. The terms weren't disclosed. That leaves Meta and YouTube as the remaining defendants.
Both deny wrongdoing. Meta says KGM's mental health struggles predated her social media use. YouTube says providing young people with a "safer, healthier experience has always been core to our work."
The jury will decide who's telling the truth.
"These Companies Built Machines"
The trial opened earlier this month with a line that's already getting quoted everywhere.
"These companies built machines designed to addict the brains of children," said Mark Lanier, the plaintiff's attorney, standing in front of children's blocks spelling out A-B-C: Addicting, Brains, Children. "And they did it on purpose."
It's dramatic courtroom theatre. But the evidence behind it is what makes this case different from previous attempts to hold tech companies accountable.
Internal Meta documents shown to the jury revealed that 11-year-olds were four times more likely than older users to keep returning to Meta's apps. Another document, from 2015, showed an estimated 30% of 10- to 12-year-olds in the U.S. were already using Instagram — a platform that officially requires users to be 13.
A 2018 internal Meta memo put it bluntly: "If we wanna win big with teens, we must bring them in as tweens."
And then there's the Zuckerberg factor. The Meta CEO testified in person on February 18th, and things got heated. When confronted with a 2015 email where he demanded "time spent increases by 12%" across Meta's platforms, Zuckerberg grew visibly irritated. "You're mischaracterizing what I'm saying," he told the plaintiff's lawyer repeatedly.
On beauty filters — the appearance-altering tools that KGM's lawsuit says contributed to her body dysmorphia — Zuckerberg acknowledged that Meta's own hired experts confirmed the filters contributed to body-image problems among young girls. But he refused to remove them, calling removal "paternalistic."
The moment that may linger longest: Lanier had six lawyers unspool a 35-foot-wide collage of hundreds of selfies KGM had posted to Instagram. He asked Zuckerberg to look at them. Was her account ever flagged for this level of use as a child?
Zuckerberg didn't answer.
Why This Isn't Just About One Person
KGM's story is specific. She alleges she developed a compulsion to scroll through photos and videos for hours every day before she was 10. Depression, anxiety, body dysmorphia followed. Her mother tried to block the apps. It didn't work. The notifications kept pulling her back.
Meta's defence team argues that KGM's mental health issues stem from a difficult home life — domestic violence, therapy beginning at age three. They're not wrong that her circumstances were complicated. But the plaintiff's legal team isn't claiming social media was the only factor. They're arguing it was a "substantial factor" — and that, not sole causation, is the legal standard the jury will be asked to apply.
This distinction matters because it mirrors something researchers have been wrestling with for years. No serious scientist claims social media is the sole cause of the youth mental health crisis. But a growing body of evidence suggests it's a meaningful contributor — especially when apps are designed to maximise time spent rather than wellbeing.
The numbers paint a picture. Teen depression and anxiety rates have climbed steadily since the early 2010s, tracking closely with smartphone adoption and social media use. A study published earlier this year found that students who relied heavily on AI and social media tools scored 17% lower on comprehension tasks. The American Psychological Association has called for warning labels on social media platforms.
None of this proves causation on its own. Together, it forms a pattern that's increasingly difficult to dismiss.
The Tobacco Parallel — and Where It Breaks Down
Legal observers keep comparing this trial to the tobacco lawsuits of the 1990s, and it's easy to see why. In both cases, internal documents showed companies knew about risks and marketed to young people anyway. In both cases, the companies argued that individual choice, not product design, was to blame.
But there's a critical difference. Cigarettes deliver nicotine. The biological pathway from product to addiction is well-understood. Social media's mechanisms are subtler. Infinite scroll, autoplay, notification loops, algorithmic recommendation — these features don't inject a chemical into your bloodstream. They exploit psychological patterns: variable reward schedules, social comparison, fear of missing out.
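To make "variable reward schedules" concrete: the idea, borrowed from behavioural psychology, is that unpredictable payoffs reinforce a habit more strongly than predictable ones. Here is a minimal, purely illustrative Python sketch — no real platform code, and the hit rate is invented — of a feed where most scrolls show filler and a random minority deliver an engaging post:

```python
import random

def variable_reward_feed(scrolls, hit_rate=0.3, seed=42):
    """Simulate a slot-machine-style feed: each scroll pays off
    unpredictably. Most scrolls return 'filler'; a random minority
    return 'reward' (an engaging post). The unpredictability of
    the payoff, not its size, is what drives the next scroll.
    (Illustrative only: hit_rate and seed are made-up values.)"""
    rng = random.Random(seed)
    return ["reward" if rng.random() < hit_rate else "filler"
            for _ in range(scrolls)]

feed = variable_reward_feed(10)
print(f"{feed.count('reward')} rewarding posts out of {len(feed)} scrolls")
```

A fixed schedule (say, every third post is engaging) would let a user stop at a natural point; randomising the payoff removes that stopping cue, which is the behaviour Lanier's "digital casino" analogy is gesturing at.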
Lanier called Instagram and YouTube "digital casinos," comparing endless swiping to pulling a slot machine handle. It's a vivid analogy. Whether a jury finds it legally sufficient is another question.
The companies will argue they're platforms, not products — that the content users see is generated by other users, not by Meta or YouTube themselves. Section 230 of the Communications Decency Act has traditionally protected tech companies from liability for user-generated content. Whether that shield holds when the complaint is about the recommendation algorithm itself — not the content, but the system that decides what you see and when — is one of the core legal questions this trial will test.
What Happens Next
KGM is expected to testify Thursday. After her, the jury will hear from her mother, her sister, a former therapist, former Meta whistleblowers, and expert witnesses. The trial is scheduled to last six weeks.
Meanwhile, 29 state Attorneys General have filed a separate case demanding that a federal judge force Meta to remove all accounts belonging to users under 13 and fundamentally alter how its platforms operate for young people.
And this isn't just an American story. The EU's Digital Services Act already requires platforms to assess and mitigate risks to minors. Australia passed a law in late 2024 banning social media for children under 16. The UK's Online Safety Act went into effect in 2025. Each country is running the same experiment with different rules — trying to figure out where the line falls between personal responsibility and corporate accountability.
The Question That Won't Go Away
Here's what makes this trial genuinely interesting, beyond the legal drama.
We've spent the last 15 years building information systems that compete for human attention. Billions of dollars depend on keeping people scrolling. The algorithms got smarter. The content got more personalised. The notifications got more precise. And an entire generation grew up inside these systems before anyone fully understood what the systems were doing.
KGM was 10. She didn't choose to be part of an attention experiment. Neither did the other 1,600 plaintiffs. Neither did most of us.
The trial in Los Angeles isn't really about one woman's Instagram use. It's about whether the companies that designed these attention systems bear any responsibility for what happens inside them. It's about whether "we didn't mean to" is a sufficient answer when your own internal data shows you knew what was happening.
The jury will deliver a verdict in a few weeks. But the bigger question — how to build information systems that respect human attention instead of exploiting it — will be with us for a lot longer than that.