Australia’s online safety authority has raised “significant concerns” around how well the digital giants are complying with new social media laws.
eSafety said it was gathering evidence to “inform potential enforcement action” against the owners of Facebook, Instagram, Snapchat, TikTok and YouTube.
This was based on concerns around compliance with Australia’s Social Media Minimum Age (SMMA) obligation, which bans children under 16 from having social media accounts.
eSafety said its first compliance report, published this week, showed there had been some progress in the first three months, including large-scale account removals and more visible underage reporting pathways.
However, “major gaps” remained.
eSafety Commissioner Julie Inman Grant said the new laws required “durable, generational change” and this took time.
“But these platforms have the capability to comply today and we certainly expect companies operating in Australia to comply with our safety laws,” she said.
“They can choose to do so or face escalating consequences, including profound reputational erosion with governments and consumers globally.”
Ms Inman Grant said, while the onus was on age-restricted platforms to take reasonable steps to keep children under 16 from having accounts, parents were proving to be pivotal partners in this “cultural reset”.
“We have heard from parents who have said the law is empowering them to say no to requests by their kids to have social media accounts,” she said.
“Any cultural change that pushes against the powerful interests and revenue potential of entrenched industry players – whether car manufacturers, Big Tobacco or Big Tech – will meet resistance. Those players will push back, but we continue to push ahead.
“We are committed to seeing further action taken by these social media giants, either through significant safety uplift in line with Australia’s laws, or in response to enforcement action.”
eSafety has observed a number of poor practices that give rise to compliance concerns outlined in the report. These include:
- Prompting children to attempt age assurance even where their declared age prior to December 10, 2025 was under 16.
- Enabling children aged under 16 to repeatedly attempt the same age assurance method to ultimately obtain a 16+ outcome.
- Failing to provide accessible or effective pathways for reporting age-restricted accounts.
- Insufficient measures to prevent new under-16 accounts being created.
Ms Inman Grant said that, while social media platforms had taken some initial action, she was concerned that some may not be doing enough to comply with Australian law.
“As a result, we are now moving into an enforcement stance,” she said.
“Any enforcement action requires sufficient evidence, which takes time to gather.
“The evidence must establish the platform has not taken reasonable steps to prevent children aged under 16 from having an account.”
eSafety has a range of enforcement powers, including infringement notices, enforceable undertakings, public Platform Provider Notifications and civil penalties of up to $49.5 million.