Meta Fined $375M in Child Safety Case: Social Media Trials Heat Up
Meta Hit with $375 Million Penalty in New Mexico Child Safety Case
A New Mexico jury delivered a significant blow to Meta Platforms on Tuesday, finding the social media giant liable for violating state consumer protection law and ordering the company to pay $375 million in civil penalties. The verdict, stemming from a lawsuit brought by New Mexico Attorney General Raúl Torrez, centers on allegations that Meta misled users about the safety of its platforms – Facebook, Instagram, and WhatsApp – and enabled child sexual exploitation.
While the $375 million penalty represents a small fraction of Meta’s $201 billion revenue in 2025, the case marks the first time a jury has ruled against the company on these specific claims. It signals growing public and legal scrutiny of social media companies’ responsibility to safeguard young users.
The New Mexico case distinguished itself through a novel investigative approach. Attorney General Torrez’s team reportedly posed as children on Meta’s platforms, documenting instances of sexual solicitation and the company’s response – or lack thereof – to those interactions. This evidence formed a core component of the state’s argument that Meta prioritized profit over child safety.
The jury found Meta engaged in “unconscionable” trade practices, unfairly exploiting the vulnerabilities and inexperience of children. Specifically, the state argued Meta misrepresented the safety of its platforms and failed to adequately protect young users from harmful content and predatory behavior. Torrez intends to pursue further action in May, seeking court orders requiring Meta to implement platform changes designed to protect children, as well as additional financial penalties.
Meta has stated its disagreement with the verdict and plans to appeal. A company spokesperson acknowledged the challenges of identifying and removing harmful content but maintained that Meta “works hard to keep people safe on our platforms.” Despite the legal challenge, Meta shares saw a slight increase of 0.8% in after-hours trading following the announcement.
Broader Legal Challenges Facing Social Media Companies
The New Mexico verdict arrives as Meta and other social media companies face a wave of lawsuits alleging harm to young people. A separate, ongoing case in Los Angeles centers on claims that Meta and YouTube intentionally designed their platforms to be addictive, particularly for young users. That case, which has already seen settlements from TikTok and Snap, focuses on the mental health impacts of prolonged social media use.
The Los Angeles case revolves around the experiences of a 20-year-old plaintiff, whose case is being used as a bellwether to determine the potential outcomes of thousands of similar lawsuits. Attorneys representing plaintiffs argue that platform design features are deliberately engineered to maximize engagement, even at the expense of users’ well-being.
Beyond individual lawsuits, school districts across the country are also joining forces in a multidistrict litigation against social media companies. This case, scheduled to go to trial this summer in California, alleges that social media platforms contribute to addiction and mental health problems among students. Attorneys involved in this litigation draw parallels to previous legal battles against pharmaceutical companies over the opioid crisis, arguing that social media companies knowingly downplayed the risks associated with their products.
The Future of Section 230 and Tech Regulation
These legal challenges could have far-reaching implications for the future of social media regulation. Their outcomes could erode the protections afforded to tech companies under Section 230 of the Communications Decency Act, which currently shields them from liability for content posted by users. A weakening of Section 230 could open the door to increased legal accountability for social media companies.
The cases also raise questions about the First Amendment rights of social media platforms. While companies argue that they have a right to moderate content, plaintiffs contend that this right does not extend to knowingly facilitating harm to children. The resolution of these cases could reshape the legal landscape for social media companies and influence how they operate in the years to come.
As the legal battles unfold, the debate over social media’s impact on mental health continues. Some researchers question whether “addiction” is the appropriate term for heavy social media use, which is not a formally recognized disorder. However, growing concern among parents, educators, and lawmakers is driving calls for greater regulation and increased corporate responsibility.
