News Directory 3
Meta Widens Teen Account Safety on Facebook & Messenger

April 8, 2025 · Catherine Williams, Chief Editor · Tech

Meta Introduces ‘Teen Accounts’ Amid Scrutiny Over Youth Safety

Table of Contents

  • Meta Introduces ‘Teen Accounts’ Amid Scrutiny Over Youth Safety
    • Legislative Pressure Mounts
    • Enhanced Safety Measures
    • Legislative Landscape
  • Meta’s Teen Accounts: A Deep Dive into Safety and Scrutiny
    • What are “Teen Accounts” on Facebook and Messenger?
    • Why is Meta launching Teen Accounts?
    • What specific safety measures are being implemented?
    • When will these updates be implemented?
    • Is this a reaction to legislative pressure?
    • What legislation is being discussed?
    • Have there been any lawsuits against Meta related to youth safety?
    • Are other social media platforms addressing youth safety?
    • Key Points of the New Teen Accounts
    • What is the current stance of the House of Representatives on child online safety?

MENLO PARK, Calif. – Meta Platforms (META) is launching its “Teen Accounts” feature on Facebook and Messenger, the company announced Tuesday. The move comes as Meta faces increasing criticism for allegedly failing to adequately protect young users from online harm.

The company stated that these enhanced data protection and child safety features, building on similar measures introduced on Instagram last year, are designed to address concerns about the amount of time young people spend on social media platforms.

Legislative Pressure Mounts

Meta’s expansion of safety features for adolescents coincides with growing calls from lawmakers for legislative action, including proposals like the Kids Online Safety Act (KOSA), aimed at shielding children from the potential harms of social media.

Meta, along with ByteDance’s TikTok and Google’s YouTube (GOOG), already faces numerous lawsuits filed on behalf of children and school districts, alleging that the platforms’ addictive nature contributes to negative outcomes.

In 2023, a coalition of 33 U.S. states, including California and New York, sued Meta, raising concerns about the dangers its platforms pose and accusing the company of misleading the public.

Enhanced Safety Measures

Meta said that users under the age of 16 will now require parental permission before making their accounts public. The company is also deactivating a feature that automatically displays images potentially containing nudity in direct messages.

“We will introduce these updates in the next few months,” the company stated.

Legislative Landscape

In July 2024, the U.S. Senate introduced two online safety bills – KOSA and the Children and Teens’ Online Privacy Protection Act – that would hold social media companies accountable for the impact their platforms have on children and adolescents.

While the Republican-led House of Representatives rejected KOSA last year, lawmakers have indicated plans to advance new legislation aimed at protecting children online.

Most major platforms, including Facebook, Instagram, and TikTok, currently allow users aged 13 and older.

Meta’s Teen Accounts: A Deep Dive into Safety and Scrutiny

What are “Teen Accounts” on Facebook and Messenger?

Meta Platforms (META) has introduced a new feature called “Teen Accounts” on Facebook and Messenger. The feature is designed to enhance data protection and child safety for younger users. According to Meta, the goal is to address concerns about the amount of time young people spend on social media platforms, as stated in the company’s announcement on Tuesday.

Why is Meta launching Teen Accounts?

Meta is launching “Teen Accounts” in response to increasing criticism and scrutiny regarding youth safety on its platforms. The company faces allegations of failing to adequately protect young users from online harm. This move mirrors similar safety measures previously introduced on Instagram.

What specific safety measures are being implemented?

Meta is implementing several key safety measures for teen users:

  • Parental consent will be required before users under 16 can make their accounts public.
  • A feature that automatically displays images potentially containing nudity in direct messages will be deactivated.

When will these updates be implemented?

Meta stated that these updates will be introduced in the next few months, as mentioned in its announcement.

Is this a reaction to legislative pressure?

Yes, the launch of “Teen Accounts” and the introduction of new safety features coincide with growing calls from lawmakers for legislative action on children’s online safety. Lawmakers are pushing for stricter regulations to protect children from the potential harms of social media platforms.

What legislation is being discussed?

Several legislative measures are being considered, including:

  • The Kids Online Safety Act (KOSA): This proposal is aimed at shielding children from the potential harms of social media.
  • The Children and Teens’ Online Privacy Protection Act: This Act would hold social media companies accountable for the impact their platforms have on children and adolescents.

Have there been any lawsuits against Meta related to youth safety?

Yes. Meta, along with TikTok and YouTube, faces numerous lawsuits filed on behalf of children and school districts. These lawsuits allege that the platforms’ addictive nature contributes to negative outcomes for young users. Moreover, a coalition of 33 U.S. states, including California and New York, sued Meta in 2023, citing the dangers its platforms pose and accusing the company of misleading the public.

Are other social media platforms addressing youth safety?

Yes, many major platforms, including Facebook, Instagram, and TikTok, are implementing safety measures and allow users aged 13 and older. However, regulations and implementations vary across platforms. Notably, Instagram already had similar measures in place before these Teen Account updates.

Key Points of the New Teen Accounts

Here’s a quick summary of the changes:

  • Age Requirement: Parental permission needed for users under 16
  • Account Visibility: Parental controls apply before an account can be made public
  • Direct Message Safety: Automatic display of images potentially containing nudity is deactivated
  • Legislative Context: Coincides with calls for legislative action such as KOSA

What is the current stance of the House of Representatives on child online safety?

The Republican-led House of Representatives rejected KOSA last year, but lawmakers have indicated plans to advance new legislation aimed at protecting children online.
