
Congress Wants Big Tech to Parent Your Kids

by Lisa Park - Tech Editor

Lawmakers in Washington are once again focusing on kids, screens, and mental health. But according to Congress, Big Tech is somehow both the problem and the solution. The Senate Commerce Committee recently held a hearing on "examining the effect of technology on America's youth." Witnesses warned about "addictive" online content, mental health, and kids spending too much time buried in screens. At the center of the debate is a bill from Sens. Ted Cruz (R-TX) and Brian Schatz (D-HI) called the Kids Off Social Media Act (KOSMA), which they say will protect children and "empower parents."

That's a reasonable goal, especially at a time when many parents feel overwhelmed and nervous about how much time their kids spend on screens. But while the bill's press release contains soothing language, KOSMA doesn't actually give parents more control.

Rather than respecting how most parents guide their kids toward healthy and educational content, KOSMA hands the control panel to Big Tech. That's right: this bill would take power away from parents and hand it over to the very companies that lawmakers say are the problem.

Kids Under 13 Are Already Banned From Social Media

One of the main promises of KOSMA is simple and dramatic: it would ban kids under 13 from social media. Based on the language of the bill's sponsors, one might think that's a big change, and that today's rules let kids wander freely into social media sites. But that's not the case.

Every major platform already draws the same line: kids under 13 cannot have an account. Facebook, Instagram, TikTok, X, YouTube, Snapchat, Discord, Spotify, and even blogging platforms like WordPress all say essentially the same thing: if you're under 13, you're not allowed. That age line has been there for many years, mostly because of how online services comply with a federal privacy law called COPPA.

Of course, everyone knows many kids under 13 are on these sites anyway. The real question is how and why they get access.

Most Social Media Use by Younger Kids Is Family-Mediated 

If lawmakers picture under-13 social media use as a bunch

TikTok's Potential Account Restrictions for Users Under 18

TikTok is facing increasing pressure to restrict the accounts of users under 18, potentially leading to account locks, suspensions, or terminations if users cannot verify they are adults, and it may require intrusive forms of identification to do so. This stems from concerns about data privacy and compliance with evolving regulations regarding children's online safety.

The Children’s Online Privacy Protection Act (COPPA) and TikTok

The core of the issue revolves around the Children's Online Privacy Protection Act (COPPA) of 1998. COPPA places specific requirements on websites and online services directed to children under 13, including obtaining verifiable parental consent before collecting, using, or disclosing personal information. TikTok, while not explicitly *directed* to children, attracts a meaningful young user base, creating compliance challenges.

TikTok settled a COPPA violation case with the Federal Trade Commission (FTC) in February 2019, agreeing to pay $5.7 million. The FTC's complaint alleged that TikTok illegally collected personal information from children without parental consent. As part of the settlement, TikTok was required to implement measures to comply with COPPA.

Potential Verification Methods and Privacy Concerns

To comply with COPPA and address broader concerns about age verification, TikTok is exploring various methods. These include requesting scans of government-issued IDs, biometric data, or other forms of intrusive verification. The concern is that requiring such data poses significant privacy risks for users, even those over 13.

In December 2023, TikTok paused a feature that would have required users to verify their age with a state-issued ID. Reuters reported that this pause came after privacy experts raised concerns about the security of the collected data and the potential for misuse. TikTok stated it was pausing the rollout to address these concerns.

Impact on “Family” Accounts

The potential restrictions notably affect accounts shared by families, where multiple users, including those under 18, may access the platform. TikTok may view these accounts as non-compliant if all users cannot verify their age. This could lead to the account being locked, suspended, or terminated, forcing families to create separate accounts and potentially provide intrusive verification for each user.

As of January 22, 2026, TikTok has not fully implemented a mandatory age verification system. TikTok's Newsroom provides updates on policy changes, but currently does not detail a finalized plan for widespread age verification beyond ongoing efforts to educate users about age restrictions and parental controls.

Current Status (as of ‌January 22, 2026)

TikTok continues to navigate the complex landscape of children's online privacy. While the company has taken steps to comply with COPPA and address age verification concerns, a definitive solution remains elusive. The company is actively exploring options, but is balancing compliance with user privacy and data security. There have been no major policy changes implemented since the pause of the ID verification feature in December 2023.
