Australia's internet regulator has issued warnings to five major social media platforms for failing to adequately enforce the country's groundbreaking ban on users under 16, marking the first major compliance review since the law took effect in December.

The eSafety Commissioner identified significant gaps in how Facebook, Instagram, Snapchat, TikTok and YouTube are implementing age verification measures. The regulator found platforms were allowing children who had previously declared themselves under 16 to repeatedly attempt age verification, while failing to prevent new underage accounts from being created.

The compliance issues extend beyond technical failures. Platforms have not established effective reporting mechanisms for parents and others to flag underage users who maintain access to their services. Some companies offered children multiple opportunities to prove they were over 16 after initially declaring themselves younger.

"The evidence must show the platform has not implemented appropriate systems and processes"
eSafety Commissioner on enforcement standards

"While social media platforms have taken some initial action, I am concerned through our compliance monitoring that some may not be doing enough to comply with Australian law."
— Julie Inman Grant, eSafety Commissioner (BBC)

The regulator reported that 4.7 million accounts were restricted or removed in the first month after the ban took effect on December 10. However, this figure represents only initial enforcement actions, with ongoing monitoring revealing persistent compliance challenges across the industry.

How the world sees it: 3 perspectives (mostly Analytical — 2 Analytical, 1 Supportive)

United Kingdom — BBC (Analytical)

The BBC presents the story as a regulatory compliance issue, focusing on technical implementation challenges. Coverage emphasizes the international significance of Australia's pioneering approach to social media age restrictions.

Australia — SBS News (Supportive)

SBS News frames the story as necessary enforcement action against non-compliant tech companies. The outlet emphasizes government criticism of platforms and includes expert commentary supporting the ban's mental health objectives.

Germany — Tagesschau (Analytical)

Tagesschau provides brief coverage focusing on the regulatory investigation aspect. The German outlet presents the story as a straightforward enforcement action by Australian authorities against tech giants.

AI interpretation: Perspectives are synthesized by AI from real articles identified in our sources. Each outlet and country reflects an actual news source used in the analysis of this story.

Australia's legislation targets 10 platforms and carries penalties of up to $49.5 million for companies that fail to take reasonable steps to prevent underage access. The law has drawn international attention, with countries including the United Kingdom closely monitoring its implementation and effectiveness.

Meta, which owns Facebook and Instagram, acknowledged the industry-wide challenge of accurate age determination while arguing for verification systems at the app store level. The company emphasized its commitment to compliance but highlighted technical limitations in current age verification methods.

"It is not good enough for big tech to offer kids multiple attempts to get in through photo scanning."
— Anika Wells, Communications Minister (SBS News)

Snap reported locking 450,000 accounts since the ban's implementation, with daily removals continuing. The company's response illustrates the scale of enforcement required across platforms with millions of Australian users.

The regulator is now transitioning from monitoring to active enforcement, gathering evidence to determine whether platforms have implemented adequate systems and processes. The legal test does not require showing that no children access the platforms; it requires proving that companies failed to establish reasonable preventive measures.

Mental health experts have supported the ban's objectives, citing social media's addictive algorithms and constant accessibility as particular risks for young users. Research suggests platforms can become problematic when used to manage anxiety, depression or other emotional challenges, potentially creating pathways to more serious addictive behaviors.