News Directory 3
Instagram DM Nudity Filter: Meta Knew of Risks to Teens for 6 Years

February 24, 2026 | Lisa Park, Tech Editor

A lawsuit alleging that Meta’s Instagram is addictive and harmful to teens has revealed a significant delay between the company’s internal recognition of risks associated with direct messaging and the implementation of safety features. Newly unsealed court documents show Instagram head Adam Mosseri acknowledged the potential for harmful content, including unsolicited explicit images, in DMs as early as August 2018, yet a feature to automatically blur such images wasn’t launched until April 2024.

The revelation came during a deposition related to a federal lawsuit in the U.S. District Court for the Northern District of California, where plaintiffs argue that social media platforms are intentionally designed to maximize user engagement, leading to addictive behaviors in teenagers. Meta is named as a defendant alongside Snap, TikTok and Google's YouTube. Similar cases are underway in Los Angeles County Superior Court and in New Mexico, all seeking to hold big tech companies accountable for the well-being of young users.

According to the court filings, Mosseri was questioned about an email exchange with Guy Rosen, Meta’s VP and Chief Information Security Officer. The email highlighted the potential for “horrible” things to occur within Instagram’s direct messaging system. The plaintiff’s lawyer specifically referenced “dick pics,” and Mosseri confirmed the validity of that concern. Despite this awareness, Meta did not introduce the automatic blurring of explicit images in DMs until nearly six years later.

During the deposition, Mosseri defended the company's approach, saying Meta had tried to balance user privacy with safety concerns and arguing that problematic content could be sent through any messaging app, not just Instagram. The plaintiffs' lawyers, however, focused on the extended gap between recognizing the issue and taking action, rather than on the current state of the platform's safety features.

The lawsuit also brought to light statistics on harmful activity experienced by young Instagram users. One survey found that 19.2% of respondents aged 13 to 15 reported seeing unwanted nudity or sexual images on the platform, and 8.4% of the same age group said they had witnessed self-harm or threats of self-harm within the previous seven days of using the app. These figures underscore the potential for significant harm within the platform's ecosystem.

While the nudity filter represents one of several updates implemented by Instagram to protect teenagers, the delay in its deployment has drawn criticism. The case centers on the allegation that Meta prioritized growth and engagement over the safety of its youngest users. Lawyers involved in the litigation are attempting to demonstrate that the company was aware of the risks but deliberately delayed implementing solutions.

Meta spokesperson Liza Crenshaw responded to the allegations by emphasizing the company’s ongoing efforts to enhance teen safety. “For over a decade, we’ve listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most,” Crenshaw stated. “We use these insights to make meaningful changes—like introducing Teen Accounts with built-in protections and providing parents with tools to manage their teens’ experiences. We’re proud of the progress we’ve made, and we’re always working to do better.”

The introduction of "Teen Accounts" on Instagram, launched in September 2024 and subsequently expanded to Facebook and Messenger in April 2025, is a key component of Meta's safety strategy. These accounts include built-in privacy and content controls. A newer requirement for parental approval before a teen can initiate an Instagram Live broadcast further strengthens these protections. The automatic blurring of potentially explicit images in DMs, now enabled by default for users under 16, is another significant update.

However, the timing of these developments coincides with increasing scrutiny from lawmakers and regulators in both the U.S. and Europe. The legal challenges and legislative efforts reflect growing concern about the impact of social media on youth mental health and well-being. The debate extends to age verification, with various laws being proposed and enacted across the United States to limit minors' access to potentially harmful content.

The deposition also touched upon internal discussions within Meta regarding user addiction. An email from a Facebook intern in 2017 expressed a desire to identify “addicted” users and explore potential interventions. This internal focus suggests an early awareness of the potential for compulsive behavior linked to the platform, even before the specific concerns about direct messaging surfaced.

The outcome of these lawsuits could have significant implications for the future of social media regulation and the responsibilities of tech companies regarding the safety of their users, particularly adolescents. The cases are likely to shape the ongoing debate about the balance between user engagement, privacy, and the protection of vulnerable populations.
