Los Angeles is currently hosting a landmark trial that could reshape the legal landscape for social media companies. Meta, the parent company of Instagram, and Google, the owner of YouTube, face accusations that their platforms were intentionally designed to be addictive, leading to mental health issues in young users. The case, unfolding in California state court, is the first of approximately 1,500 similar lawsuits against social media giants to reach trial.
The plaintiff, a 20-year-old woman identified as Kaley (KGM), alleges that addictive features on Instagram and YouTube contributed to her development of anxiety, body dysmorphia, and suicidal thoughts. Her legal team argues that the platforms prioritize engagement over user well-being, employing techniques akin to those used in “digital casinos” to keep users scrolling indefinitely. Mark Lanier, Kaley’s lawyer, described the “endless scroll feature” as creating dopamine hits that can lead to addiction, comparing the act of swiping on a phone to pulling the handle of a slot machine, but for “mental stimulation” rather than monetary gain.
The core of the argument centers on the deliberate design choices made by Meta and Google. Lanier presented jurors with internal company documents, including studies and emails, suggesting that the companies were aware of the potential for addiction and its disproportionate impact on vulnerable users, particularly those experiencing trauma or stress. One internal Google document reportedly likened YouTube to a casino, while a Meta employee communication described Instagram as “like a drug,” with employees acting as “pushers.” A Meta study, dubbed “Project Myst,” reportedly found that parental controls had limited impact on preventing addiction.
The defense, representing Meta and Google, is expected to argue that Kaley’s mental health challenges stemmed from a difficult family life rather than from social media use. This sets the stage for a battle of narratives: the plaintiff seeking to demonstrate a direct causal link between platform features and her mental health decline, and the defendants attributing her struggles to external factors.
The trial’s significance extends beyond Kaley’s individual case. The outcome could establish legal precedents that influence how future lawsuits against social media companies are resolved. Several other major players in the social media space have already settled similar claims: TikTok reached a settlement just before this trial began, and Snap also settled for an undisclosed sum. That both companies chose to settle rather than face a jury suggests they saw meaningful legal risk in these claims.
The case is attracting significant attention from parents, safety advocates, and legal experts, who view it as a crucial moment for accountability in the tech industry. Executives from Meta and Google, including Meta CEO Mark Zuckerberg, Instagram head Adam Mosseri, and YouTube CEO Neal Mohan, are scheduled to testify in the coming weeks, potentially providing insight into the companies’ internal decision-making and their understanding of the potential harms associated with their platforms.
The broader context of this trial is a growing wave of concern regarding the impact of social media on mental health, particularly among young people. The legal challenges represent a shift in strategy for those seeking to address these concerns, moving beyond calls for self-regulation and towards legal accountability. The Atlantic recently questioned whether Instagram could “ruin your life,” highlighting the increasing scrutiny of the platform’s effects on users.
This trial is one of several landmark cases expected this year that aim to hold social media companies responsible for harms to children. The outcome will likely be closely watched by investors, regulators, and the public alike, as it could have far-reaching implications for the future of social media and the responsibilities of tech companies to protect their users.
The legal proceedings are taking place against a backdrop of increasing awareness of the addictive nature of social media platforms. The comparison to “digital casinos” underscores the argument that these platforms are not simply neutral tools but are actively engineered to maximize user engagement, even at the expense of mental well-being. The case hinges on whether jurors accept the argument that this deliberate design constitutes negligence or intentional harm.
