A historic trial scheduled to begin this week in Los Angeles could set a legal precedent concerning whether social media platforms intentionally design their services to be addictive to children.
Jury selection will start on Tuesday in California’s state court for what is being called a “bellwether” case—meaning its outcome might influence a wave of similar lawsuits nationwide.
The defendants include tech giants Alphabet (owner of YouTube), ByteDance (which owns TikTok), and Meta (the parent company of Instagram). Mark Zuckerberg, Meta’s co-founder and CEO, is expected to testify.
Hundreds of lawsuits accuse social media companies of addicting minors to content that has contributed to issues like depression, eating disorders, psychiatric hospitalization, and even suicide.
Legal teams representing the plaintiffs are adopting strategies reminiscent of those used against the tobacco industry in the 1990s and early 2000s, arguing that these companies are selling a defective product.
The trial, overseen by Judge Carolyn Kuhl, is set to begin in the first week of February once the jury is chosen. It centers on allegations that a 19-year-old woman, identified by the initials KGM, suffered serious mental health issues due to social media addiction.
“This is the first case where a social media company has faced a jury over harm caused to children,” said Matthew Bergman, founder of the Social Media Victims Law Center, which is involved in over 1,000 similar cases.
The center is dedicated to holding social media companies accountable for harms inflicted on young users. Bergman added, “Having KGM and her family stand in a courtroom alongside these powerful, wealthy corporations is a meaningful victory in itself. We are confident in our ability to prove, by a preponderance of the evidence, that their design choices caused harm to her—and we’re ready to do so.”
A key focus of the case is whether the platforms’ design—rather than simply their content—causes harm. A favorable verdict could serve as a crucial data point, potentially encouraging settlement of many similar cases.
Netflix last week disclosed that it reached an agreement to avoid a civil trial accusing it, along with Meta, TikTok, and YouTube, of promoting addictive behavior among youth. The terms of that settlement have not been made public.
While these companies claim protections under Section 230 of the federal Communications Decency Act—which generally shields them from liability for user-posted content—they are now being challenged on grounds that their business models are built to grab attention and push content that damages mental health.
Bergman explained, “Our focus isn’t about failing to remove harmful content. It’s about how these platforms are purpose-built to addict kids, with algorithms that show kids what they can’t stop watching, not just what they want to see.”
Lawsuits of this nature are also progressing through federal courts in Northern California and various state courts nationwide. The companies involved have not responded to requests for comment.