Global Governments Accelerate Push for Digital Regulation
- Governments across Europe, North America, and Oceania accelerated efforts this week to restrict children’s access to social media, introducing new verification tools, convening international summits, and issuing legal warnings to platforms.
- The momentum follows years of concern over the impact of social media on minors’ mental health, privacy, and exposure to harmful content.
- Thierry Breton, European Commissioner for Internal Market, said a new EU pilot age-verification app aims to harmonize enforcement of the Digital Services Act’s youth protection provisions across the bloc.
Governments across Europe, North America, and Oceania accelerated efforts this week to restrict children’s access to social media, introducing new verification tools, convening international summits, and issuing legal warnings to platforms as part of a growing global debate over youth safety online.
The momentum follows years of concern over the impact of social media on minors’ mental health, privacy, and exposure to harmful content. On April 15, the European Commission announced the launch of a pilot age-verification app designed to prevent users under 16 from accessing major social networking services without parental consent. The tool, developed in collaboration with national regulators and tech firms, uses government-issued digital identity systems already in place in several EU member states to confirm age at point of login.
Thierry Breton, European Commissioner for Internal Market, said the app aims to harmonize enforcement of the Digital Services Act’s youth protection provisions across the bloc. “We cannot rely on self-declaration by users,” Breton stated in a press briefing. “This system provides a secure, privacy-preserving way to verify age without collecting unnecessary personal data.” The pilot will run in Germany, France, and Spain through the summer, with potential expansion to all 27 EU countries pending evaluation.
Meanwhile, on April 17, a virtual summit hosted by the United Kingdom brought together leaders from Canada, Japan, New Zealand, and the European Union to discuss coordinated action on regulating underage social media use. UK Technology Secretary Michelle Donelan chaired the meeting, emphasizing the need for interoperable standards. “We are not acting in isolation,” Donelan said. “If one country raises the bar, others must follow to prevent platforms from simply shifting younger users to less regulated spaces.”
The summit resulted in a joint statement affirming support for age-assurance technologies, increased transparency requirements for platforms regarding underage user data, and exploration of common frameworks for parental consent mechanisms. Participants agreed to reconvene in September to review progress.
In Australia, the government took a more confrontational approach, issuing a formal legal notice to Meta Platforms Inc. on April 16 alleging repeated violations of the country’s Online Safety Act. The Australian eSafety Commissioner warned that Facebook and Instagram may have allowed children under 13 to create accounts despite age restrictions, potentially exposing them to harmful algorithms and predatory behavior. Meta has 28 days to respond or face civil penalties of up to 10% of global annual turnover under the legislation.
Australia’s stance reflects its position as a pioneer in holding social media companies accountable for youth safety. In 2023, it became the first nation to impose significant fines on TikTok for systemic failures in protecting underage users. The current action signals a willingness to escalate enforcement as platforms face scrutiny in multiple jurisdictions.
Industry representatives have expressed caution about fragmented regulatory approaches. Nick Clegg, Meta’s President of Global Affairs, told Reuters that while the company supports age-appropriate experiences, “a patchwork of national rules risks creating confusion and undermining the very protections we aim to strengthen.” He advocated for global baseline standards developed through multilateral forums such as the OECD and the Global Partnership on Artificial Intelligence.
Child advocacy groups welcomed the governmental actions but urged broader systemic reforms. Sonia Livingstone, professor of social psychology at the London School of Economics and advisor to the UK’s Information Commissioner’s Office, said age verification alone is insufficient. “We need design changes that prioritize children’s well-being over engagement metrics,” Livingstone stated. “That means default private settings, limits on notification frequency, and transparent algorithmic audits — not just barriers at the door.”
As of April 19, no country has implemented a blanket ban on social media for minors, but the convergence of legislative, technological, and diplomatic efforts marks a significant shift in how democracies are addressing the challenges of growing up online. With further summits planned and legal actions underway, the coming months will test whether coordinated international action can keep pace with the rapid evolution of digital platforms.
