Social media companies have been ordered to provide information on how many Australian children have signed up to their platforms.
Under expanded powers, Australia’s eSafety Commissioner has requested information from the services to find out how many Australian children were on their platforms and what age assurance measures they had in place to enforce their own age limits.
Commissioner Julie Inman Grant said a series of questions had been sent to Google's YouTube, Meta's Facebook and Instagram, TikTok, Snap, Reddit, Discord and Twitch.
Ms Inman Grant said the requests were sent as a result of expanded transparency powers under the Government's recently updated Basic Online Safety Expectations Determination.
She said the recently amended Determination broadened the areas eSafety could ask industry to report on and established expectations that companies would take steps to keep users safe online, including being transparent when asked questions by the regulator.
“We know that when it comes to keeping children safe online, we need a multi-pronged approach,” Ms Inman Grant said.
“Imposing age limits is on the table, but we also need better information to understand what will be effective and what the unintended consequences could be, and we must absolutely support children in building their digital resilience and critical reasoning skills.
“We are having a really important conversation in this country right now about the potentially damaging effects social media might be having on our children. Our research shows that almost two-thirds of 14-17-year-olds have viewed potentially harmful content in the past year, including drug use, self-harm and violent images, but we also know that teens get many benefits from social media.”
Ms Inman Grant said a key aspect of the conversation was having solid data on how many children are on these platforms and the range of their ages.
“But we also want to assess age assurance readiness and find out how these platforms accurately determine age, both to prevent children under the permitted age from gaining access and to ensure appropriate protections for those who are allowed on the services.
“Most of these platforms have their own age limits in place, commonly 13, but we also want to know how they are detecting and removing underage users and how effective this age enforcement is.”
She said eSafety research showed that almost a quarter of eight-to-10-year-olds used social media weekly or more often, while close to half of 11-to-13-year-olds reported using it at the same rate.