Australia requires online age verification to protect children
New Australian rules mandate age checks on online adult content
SYDNEY: (Web Desk) – Australia’s online safety regulator has warned pornography websites that they must block users under 18 starting Monday under sweeping new restrictions aimed at protecting children.
Some websites had already begun restricting non-members and refusing new registrations on Friday, ahead of the mandatory rollout of age verification technology to prevent underage access.
The new rules are part of an expansion of Australia’s online child safety measures, following the country’s December 10 ban on children under 16 joining social media platforms. The crackdown limits child access to “age-inappropriate content,” including pornography, high-impact violence, suicide, and eating disorder material.
The regulations cover porn websites, search engines, app stores, gaming providers, and generative AI systems, including chatbots.
“Make no mistake, where we see failures or foot-dragging, we will hold companies to account,” said eSafety Commissioner Julie Inman Grant. Failure to comply can result in penalties of up to Aus$49.5 million (US$35 million) per breach.
Under the rules, users must confirm their age when accessing age-restricted material. Simply clicking “I am 18 years or older” is no longer sufficient, aligning Australia with similar international safeguards.
Inman Grant emphasized that society has long agreed on the need for age barriers to protect children. “We don’t allow children to walk into bars, adult stores, or casinos, but online spaces they frequently visit had no such safeguards. That changes for Australian kids,” she said.
The regulator noted that industries will be required to maintain consistent standards across all platforms to prevent accidental exposure to harmful content. AI chatbots generating sexually explicit, violent, or self-harm content must verify user ages. App stores and online gaming platforms must also block under-18s from adult-only content.
Users of search engines such as Google who are not logged in will see blurred results for pornography and high-impact violence. Searches related to suicide or eating disorders will prioritize referrals to mental health support services.
The eSafety Commissioner will monitor compliance and take enforcement action against systemic breaches. Inman Grant added, “No piece of regulation will eliminate all risks immediately, but these codes create meaningful protections for children across the tech ecosystem. The government’s commitment to a digital duty of care will further strengthen protections in the future.”