A landmark trial is underway in Los Angeles County Superior Court, pitting social media giants Meta (Instagram) and Google (YouTube), along with, until a recent settlement, TikTok, against allegations that their platforms are intentionally addictive and harmful to young users. The case, brought by a 19-year-old plaintiff identified as KGM, could set a precedent for over a thousand similar lawsuits and potentially force significant changes to how these platforms operate.
The core of the lawsuit centers on claims that the design of these social media applications, with features like infinite scrolling, algorithmic recommendations, and auto-playing videos, is deliberately engineered to maximize user engagement, particularly among children and adolescents. KGM alleges that her use of these platforms, beginning at age eight, led to addiction, depression, anxiety, and suicidal thoughts. Her legal team argues that this negative impact is not accidental, but a direct result of business strategies focused on increasing time spent on the apps.
TikTok settled with KGM just before the trial began, but remains a defendant in other personal injury cases, according to Joseph VanZandt, co-lead counsel for the plaintiff. Snap, the parent company of Snapchat, also reached an undisclosed settlement last week. The trial will now proceed against Meta and YouTube, marking the first time these major social media companies will present their defense before a jury. Jury selection is expected to take several days, with 75 potential jurors questioned daily.
The legal strategy employed by KGM’s team echoes that used against tobacco companies in the 1990s, when it was demonstrated that companies had concealed information about the harmful effects of their products. That litigation ultimately resulted in a $206 billion settlement, the largest in U.S. history. The plaintiffs in the current case argue that social media platforms, like cigarettes, constitute a “defective product” due to their addictive potential and the resulting harm to young people’s mental health. Specific features cited include recommendation algorithms, autoplay, and beauty filters, which are accused of promoting harmful comparisons and body dysmorphia.
Meta and YouTube are defending against these claims by arguing that their products are not inherently addictive and that there is no conclusive scientific evidence of a direct link between platform use and mental health issues. They are also invoking Section 230 of the Communications Decency Act, the federal law that shields digital platforms from liability for content posted by their users. Meta has stated that the lawsuits oversimplify the complex issue of adolescent mental health, attributing it solely to social media while overlooking other contributing factors such as academic pressure, school insecurity, socioeconomic challenges, and substance abuse.
This trial arrives amid growing international scrutiny of the impact of social media on children and teenagers. Several governments are taking steps to limit minors’ exposure to these digital environments. France recently enacted a law setting the minimum age for opening a social media account at 15, and Australia has banned users under 16 from these applications.
In the United States, while Congress has debated stricter regulations, efforts have largely stalled. Some states, including California, Texas, and Ohio, have passed laws aimed at protecting minors, but technology companies have successfully blocked many of these measures in court, citing freedom of speech concerns.
The outcome of this case could be a “bellwether” for future litigation, potentially forcing tech giants to overhaul their platforms. According to Clay Calvert, a nonresident senior fellow of technology policy studies at the American Enterprise Institute, the trial will also serve as a test case for determining what damages, if any, may be awarded to plaintiffs. VanZandt stated that this trial represents “the ground zero of our fight against social media,” where society will establish new expectations and standards for how social media companies treat children.
