Tech giants put on notice to act on abuse

Tech giants are being legally compelled to report on their progress in tackling online abuse. | Photo: nycshooter (iStock)

Major technology corporations have been issued legal notices compelling them to report on their actions to tackle online child sexual abuse.

This follows concerns that the companies have not made enough meaningful changes over the past two years to deal with these issues.

eSafety Commissioner Julie Inman Grant announced today that the notices would require the companies to report on their safety measures every six months.

The companies and services targeted include Apple, Google, Meta, Microsoft, Discord, Snap, Skype and WhatsApp.

The directive was issued under Australia’s Online Safety Act and requires all recipients to explain how they are tackling:

  • Child abuse material
  • Livestreamed abuse
  • Online grooming
  • Sexual extortion
  • Production of “synthetic” or deepfaked child abuse material created using generative AI.

Ms Inman Grant said the companies were chosen partly because of answers many of them provided to eSafety in 2022 and 2023, which exposed a range of safety concerns around protecting children from abuse.

“We’re stepping up the pressure on these companies to lift their game,” Ms Inman Grant said.

“They’ll be required to report to us every six months and show us they are making improvements.

“When we sent notices to these companies back in 2022 and 2023, some of their answers were alarming but not surprising, as we had suspected for a long time that there were significant gaps and differences across services’ practices.

“In our subsequent conversations with these companies, we still haven’t seen meaningful changes or improvements to these identified safety shortcomings.”

Ms Inman Grant said Apple and Microsoft reported in 2022 that they did not attempt to proactively detect child abuse material stored in their widely used iCloud and OneDrive services.

This was despite it being, in her words, “well-known” that these storage services were a haven for child sexual abuse and pro-terror content.

“We also learnt that Skype, Microsoft Teams, FaceTime, and Discord did not use any technology to detect live-streaming of child sexual abuse in video chats,” Ms Inman Grant said.

“This is despite evidence of the extensive use of Skype, in particular, for this long-standing and proliferating crime.”

eSafety also found that eight different Google services, including YouTube, were not blocking links to websites that were known to contain child abuse material.

Compliance with a notice is mandatory, and services that do not respond face financial penalties of up to $782,500 a day.

The companies will have until February 15, 2025 to provide their first round of responses.