Australia Demands Social Media Account Deletion Reports
Australia’s Landmark Online Safety Law for Children Takes Effect
– Updated December 12, 2025, 02:58:33 AM PST
Overview of the New Legislation
Australia has enacted the world’s first legislation specifically designed to safeguard minors in the digital realm. The new law, which came into force on December 10, 2025, places significant obligations on social media platforms to verify the age of users and obtain parental consent for those under 16. The aim is to create a safer online environment for children and address growing concerns about exposure to harmful content and online predators.
Key Requirements and Compliance
The legislation mandates that platforms take “reasonable steps” to verify users’ ages. While the law doesn’t prescribe a specific technology, it emphasizes the need for effective age assurance mechanisms. Platforms are also required to obtain parental consent before collecting, using, or disclosing personal details of users under the age of 16. This means obtaining verifiable parental consent, going beyond simply relying on a child’s self-declaration of age.
According to the eSafety Commissioner’s Industry Code of Practice, platforms must implement a range of measures, including age estimation technologies, parental consent processes, and robust reporting mechanisms for harmful content.
Financial Penalties for Non-Compliance
Non-compliance with the new regulations carries substantial financial risks for social media companies. Platforms found in violation face fines of up to 49.5 million Australian dollars (approximately 28.1 million euros as of December 12, 2025). This significant penalty underscores the Australian government’s commitment to protecting children online and serves as a strong deterrent against inaction. Major platforms, including Facebook (Meta), Instagram (Meta), TikTok, X (formerly Twitter), and YouTube (Google), have publicly stated their intention to comply with the law.
The Australian Competition and Consumer Commission (ACCC) will also play a role in enforcing the law, alongside the eSafety Commissioner.
Data Collection and Monitoring by eSafety Commissioner
The eSafety Commissioner, Australia’s online safety regulator, is actively monitoring platform compliance. On December 11, 2025, the eSafety Commissioner requested data from ten major platforms, comparing the number of new user accounts created on December 9 with those created on December 11. This initial data request aims to establish a baseline and assess the immediate impact of the law.
Platforms are required to submit monthly reports to the eSafety Commissioner for the following six months, providing ongoing transparency into their age verification and parental consent processes. The eSafety Commissioner will publish a public overview of the platforms’ responses within two weeks of receiving the initial data.
International Interest and Potential Adoption
Australia’s pioneering legislation has garnered significant international attention. The European Commission and countries including France, Denmark, Greece, Romania, Indonesia, Malaysia, and New Zealand are actively considering similar restrictions to protect children online. Australian Communications Minister Anika Wells has welcomed this global interest, encouraging international collaboration to address this critical issue.
The European Union’s Digital Services Act (DSA) includes provisions related to the protection of minors, but Australia’s law is considered more specific and proactive in mandating age verification.
Challenges and Legal Scrutiny
Despite widespread support, the law faces challenges. The Digital Freedom Project, an advocacy group focused on digital rights, has announced its intention to challenge the legislation in the High Court of Australia on constitutional grounds. The group argues that
