Meta: $375M Fine for Child Safety Concerns & Exploitation Claims
- A New Mexico jury has ordered Meta, the parent company of Facebook, Instagram and WhatsApp, to pay $375 million in civil penalties after finding the tech giant misled consumers about the safety of its platforms.
- The lawsuit, filed in December 2023, alleged that Meta knowingly designed its platforms in ways that endangered children, exposing them to harmful content, including sexually explicit material, and facilitating contact with potential sexual predators.
- Attorney General Torrez described the verdict as a “historic victory” for children and families, stating that Meta executives were aware of the harm their products caused and disregarded warnings from their own employees.
A New Mexico jury has ordered Meta, the parent company of Facebook, Instagram and WhatsApp, to pay $375 million in civil penalties after finding the tech giant misled consumers about the safety of its platforms, particularly concerning children. The verdict marks the first time a state has successfully sued Meta over child safety issues, according to New Mexico Attorney General Raúl Torrez.
The Core of the Case
The lawsuit, filed in December 2023, alleged that Meta knowingly designed its platforms in ways that endangered children, exposing them to harmful content, including sexually explicit material, and facilitating contact with potential sexual predators. The jury found Meta liable for violating New Mexico’s Unfair Practices Act and imposed the maximum penalty of $5,000 per violation, which, across 75,000 violations, totaled $375 million. The case was bolstered by a two-year investigation by The Guardian, published in April 2023, which revealed how Facebook and Instagram had become marketplaces for child sex trafficking. That investigation was directly cited in the complaint.
Attorney General Torrez described the verdict as a “historic victory” for children and families. “Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough,” Torrez said.
Internal Concerns and Whistleblower Testimony
The trial revealed internal Meta research and employee testimony highlighting the company’s awareness of the risks its platforms posed to young users. Arturo Béjar, a former engineering leader at Meta who left the company in 2021 and became a whistleblower, testified about experiments he conducted on Instagram demonstrating that underage users were being served sexualized content. He also recounted how his own daughter was contacted by a stranger on Instagram. According to court documents, internal research at one point showed that 16% of Instagram users reported being shown unwanted nudity or sexual activity in a single week.
Meta’s defense centered on the challenges of identifying and removing harmful content and bad actors from its platforms. A Meta spokesperson stated the company disagrees with the verdict and intends to appeal, asserting its commitment to keeping people safe online. “We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors and harmful content. We remain confident in our record of protecting teens online,” the spokesperson said.
Broader Implications and Future Steps
This ruling arrives at a time of increasing scrutiny of social media companies and their impact on young people’s mental and emotional well-being. The case echoes growing concerns about the addictive nature of these platforms and their potential to expose children to harmful content. The $375 million penalty represents a significant financial blow to Meta, but the long-term implications extend beyond the monetary cost.
The New Mexico Attorney General’s office indicated that it will pursue additional financial penalties and seek court-ordered changes to Meta’s platforms in the next phase of the case. This could involve implementing stricter age verification measures, enhancing content moderation practices, and increasing transparency regarding algorithms and data collection.
The outcome of this case is likely to influence similar legal challenges against other social media companies. It also underscores the growing pressure on tech firms to prioritize user safety, particularly for vulnerable populations, over profit maximization. Observers will be watching closely to see how Meta responds to the verdict and whether the ruling leads to meaningful changes in the company’s policies and practices. The appeal process will be a key area to monitor, as will any legislative responses to the ruling.
