
Age Verification Regulations Submitted to Australian Government for Review


It seems age verification is coming to Australia as well.

eSafety has received seven draft Codes from industry groups aimed at protecting children from online pornography and other high-impact material, including themes of suicide, self-harm, and disordered eating.

eSafety is the eSafety Commissioner, an independent Australian government agency responsible for promoting online safety and regulating harmful digital content. Established under the Enhancing Online Safety Act 2015, eSafety works to protect all Australians—especially children—from online risks such as cyberbullying, image-based abuse, and exposure to harmful or illegal content.

These codes cover a wide range of online services and seek to enhance digital safety measures.


Industry groups submitted the drafts in compliance with an extension granted by eSafety in December 2024. The development of these enforceable Codes was initiated in July 2024 as part of the second phase of eSafety’s industry Codes and Standards program.

Phase 2: Protecting Children from Harmful Online Content

While the first phase of the program focused on combating child sexual abuse material and pro-terror content—now addressed under eight enforceable Codes and Standards—the second phase is centered on preventing children’s exposure to inappropriate content online. Additionally, it aims to empower all internet users to manage their digital experiences effectively.

The draft Phase 2 Codes are currently under review by eSafety to determine whether they provide adequate safeguards for the Australian community. If deemed sufficient, the eSafety Commissioner can register the Codes, making them mandatory and enforceable. If not, the Commissioner has the authority to establish new Standards.

A short extension has been granted to app distribution service providers, whose draft Code is now due by 4 p.m. on 28 March 2025. The extension gives the industry time to align its safety measures with evolving international regulatory developments.

Social Media Platforms Face $50 Million Fines for Non-Compliance

Under the proposed new Codes, social media and technology companies would have six months to implement protective measures or risk penalties of up to $50 million. The Codes require platforms that allow pornography to prevent access by minors and introduce age verification mechanisms.

Social media platforms that prohibit pornography must ensure the detection and removal of adult content, including self-harm material and violent imagery. The requirements apply across multiple levels of internet interaction, with separate Codes for social media platforms, gaming services, websites, search engines, internet service providers, and hardware manufacturers.

  • Device manufacturers and operating system providers must enable parental controls and default safety settings on child accounts.
  • Search engines must implement “safe search” functions by default for accounts identified as belonging to children.
  • Internet hosting services must enforce content laws and take proportional action against users violating regulations.
  • Gaming and online service providers must incorporate appropriate age restrictions and content moderation measures.

Draft Codes were initially released for consultation in October 2024, but their submission was delayed by two months to align with the Australian government’s planned social media age restrictions for users under 16.

The Codes, developed by industry groups, must be approved and registered by eSafety before taking effect. They address online safety concerns, including exposure to pornography, self-harm, suicide, eating disorders, and violence.

Jennifer Duxbury, Director of Policy and Regulatory Affairs at Digi, a leading digital industry association, emphasized the importance of these measures:

“Online spaces and communication tools provide valuable opportunities for children to learn, connect, and explore the world. However, children should be protected from exposure to pornography and material that encourages harmful behaviors, such as eating disorders, suicide, and self-harm.”

Duxbury noted that the industry has worked closely with eSafety and stakeholders to develop practical solutions, including age assurance measures and enhanced protections across various technology platforms.

If approved, companies will have six months to implement the necessary measures before facing enforcement actions under the federal Online Safety Act, with penalties of up to $50 million for non-compliance.