Europe Social Media & Kids: New Rules
European Union nations are poised to implement new social media rules to safeguard children online. France, Spain, and Greece are leading the charge, proposing stricter regulations that could fundamentally change how minors access platforms. The core of the proposal includes mandatory age verification and robust parental controls, responding to growing concerns about the impact of addictive content on children’s mental and physical health. The EU is already scrutinizing platforms such as Meta’s Facebook and Instagram, as well as TikTok, under the Digital Services Act (DSA), investigating their efforts to protect children. These potential changes follow France’s 2023 law requiring parental consent for users under 15, illustrating the EU’s proactive stance on digital child safety.
EU Countries Seek Social Media Limits for Children’s Online Safety
Mounting concerns over the impact of social media on children’s well-being have prompted several European countries to push for stricter regulations. France, Spain, and Greece are spearheading an effort to limit minors’ access to online platforms across the European Union.
The proposal, set to be presented to EU officials in Luxembourg, aims to establish a uniform age of digital adulthood. This would require parental consent for children to access social media, addressing worries about addictive content and its potential harm to mental and physical health. Digital Minister Dimitris Papastergiou of Greece emphasized the need for Europe to act swiftly to protect children.
France has already taken steps to regulate online content, including a 2023 law mandating parental consent for users under 15. Additionally, adult websites now require age verification to prevent access by minors. TikTok recently banned the #SkinnyTok hashtag, responding to pressure regarding content promoting extreme thinness.
The countries are concerned about algorithmic design that increases children’s exposure to harmful content, potentially worsening anxiety and self-esteem issues. They advocate for an EU-wide request supporting parental controls, age verification, and limitations on certain apps for minors. The European Commission plans to launch an age-verification app that protects personal data.
The EU is actively investigating Meta’s Facebook and Instagram, as well as TikTok, under the Digital Services Act (DSA). These probes focus on whether the platforms are adequately preventing children from accessing harmful content. Concerns exist regarding the effectiveness of Meta’s age-verification tools, and investigations have been launched into pornographic platforms over similar issues.
What’s next
The EU’s digital watchdog will finalize guidelines for platforms to protect minors after a public consultation. These guidelines include setting children’s accounts to private by default and simplifying user blocking and muting. Discussions continue regarding a law to combat child sexual abuse material, though disagreements persist over privacy concerns.