eSafety report shows that while tech giants have made some progress, they still have a long way to go in stamping out online child sexual abuse

eSafety Commissioner

Tech giants, including Apple, Google, Meta and Microsoft, are still failing to address critical safety gaps in detecting and preventing child sexual exploitation and abuse (CSEA) on their services despite some improvements, according to a new eSafety transparency report.

The report, the second in a series of four to be published by eSafety, summarises tech company responses to a series of questions about how they are tackling the worst-of-the-worst online content, including real and AI-generated child sexual abuse material, livestreamed child sexual abuse, online grooming and sexual extortion.

In July 2024, eSafety gave eight tech companies legally enforceable periodic transparency notices under Australia’s Online Safety Act requiring each to answer questions about how they are dealing with these issues. The eight companies are Apple, Discord, Google, Meta, Microsoft, Skype, Snap and WhatsApp.

Some of the key risks highlighted in the report include inadequate detection of live online CSEA in commonly used video calling services, insufficient proactive detection of newly created CSEA material, and limited application of language analysis tools to detect sexual extortion.

eSafety Commissioner Julie Inman Grant said that while there had been some welcome improvements, it was disappointing to see some companies still not putting in place measures to detect and remove new CSEA material, and others still not using or developing tools to detect live child sexual abuse taking place over popular video calling services.

“We have been engaging with these companies for a long time on these issues, which were already highlighted in our previous transparency reports, so it’s disappointing to see so little progress being made,” Ms Inman Grant said. “I call on industry to address these safety gaps and take steps to stop harmful child sexual exploitation and abuse material and activity from taking place on their services.

“This is not just a question of legality across most jurisdictions; it is a critical matter of corporate conscience and accountability. Without significant uplift in available safety technologies and practices, we will continue to see devastating consequences for children.

“I think the Australian public has an expectation that tech companies should be doing all they can, including innovating new technologies, to protect children from sexual exploitation and abuse. This is particularly important for features such as live video calling and livestreaming, where proactive tools have not yet been implemented or developed across all services.

“While we’ve seen some positive strides from some services in detecting livestreamed online child sexual abuse, they need to go further. These companies are no strangers to innovation and have the technological capability to build tools to detect this harm and help end the live online abuse of children. This represents a failure of industry will, not of technical capability.

“It’s also disappointing to see a mature company like Apple still not deploying technology to proactively detect and remove newly created child sexual abuse material across any of its services, instead relying on user reporting, while Google, Microsoft and Snap used tools to detect new CSEA on some of their services (or parts of their services) but not others.

“We also hoped to see more progress by some services on the use of language analysis to pick up sexual extortion of kids and adults.

"eSafety Intelligence and Investigations have provided a number of these companies sexual extortion language indicators, the most common scripts, kill chains and commonly used fake imagery and it beggars belief that these have not yet been deployed to detect organised criminal gangs from targeting Australian young men on this platforms.”

Despite these shortfalls, the report did reveal some positive improvements since the publication of the first periodic transparency report in August last year, including improvements in detection of ‘known’ CSEA material. ‘Known’ CSEA includes material which has been previously identified by law enforcement and child abuse hotlines and continues to circulate online.
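
Detection of known material generally works by hash matching: each upload is hashed and compared against industry-shared lists of hashes of previously identified material. The sketch below uses an exact cryptographic hash for simplicity; production systems such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding, and the hash set here is a hypothetical stand-in.

    # Minimal sketch of hash matching against a list of 'known' material.
    # Production systems use perceptual hashes (e.g. Microsoft's PhotoDNA)
    # that tolerate resizing and re-encoding; SHA-256 here is a
    # simplification, and the hash set is a hypothetical stand-in for
    # industry-shared hash lists.
    import hashlib

    KNOWN_HASHES: set[str] = set()  # populated from an industry-shared hash list

    def file_hash(data: bytes) -> str:
        """Exact-match hash of the uploaded file's bytes."""
        return hashlib.sha256(data).hexdigest()

    def is_known_material(data: bytes) -> bool:
        """True if the upload matches previously identified material."""
        return file_hash(data) in KNOWN_HASHES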

The report also showed improved response times to reports of CSEA, better industry information sharing to reduce its prevalence and a welcome expansion of systems like Apple’s Communication Safety and Google’s Sensitive Content Warnings that blur nude images and videos.

“These improvements, while more incremental than monumental, show that these platforms can improve when it comes to protecting the most vulnerable in our society,” Ms Inman Grant said.

“These companies have the resources and technical capability to make their services safer, not just for children but for all users. That is what these transparency reports are designed to achieve, and are achieving.”

Key shortfalls include:

  • Inadequate detection of live online CSEA in video calling services – Meta did not use tools to detect live CSEA on Messenger and neither did Google on Google Meet.
  • Apple’s FaceTime, Microsoft’s Teams, Snapchat, WhatsApp and Discord (Go Live and video calls) still did not use any proactive detection tools for live online CSEA (one possible detection loop is sketched after this list).
  • Apple still did not use tools to detect new CSEA (material not already identified by law enforcement) on any of its services, and Google did not use any tools on Google Meet, Chat, Messages or Gmail. Microsoft did not use tools on OneDrive, Outlook or Teams.
  • Apple, Google’s Chat, Meet and Messages, Microsoft Teams, Skype (shut down in May 2025) and Snap still did not use language analysis to proactively detect sexual extortion. Discord had previously trialled language analysis technology but has discontinued its use.
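
In outline, the kind of proactive live-video detection the first items above refer to could work by sampling frames from the stream at intervals, scoring each with an image classifier, and escalating high-scoring frames to human review. A minimal sketch of that loop follows; the classifier is a placeholder, since no vendor’s actual pipeline is public.

    # Minimal sketch of one possible proactive-detection loop for live video:
    # sample a frame every few seconds, score it with an image classifier,
    # and escalate high scores to human review. The classifier is a
    # placeholder; no vendor's actual pipeline is public, and real systems
    # must also address accuracy, privacy and legal constraints ignored here.
    import time

    def classify_frame(frame: bytes) -> float:
        """Placeholder for a proprietary CSEA image classifier (0.0-1.0)."""
        raise NotImplementedError("stand-in for a model that is not public")

    def escalate_for_review(frame: bytes) -> None:
        """Queue the frame for trust-and-safety review (stub)."""

    def monitor_call(get_frame, interval_s: float = 5.0, threshold: float = 0.9) -> None:
        """Sample frames from a live call until it ends, escalating high scores."""
        while (frame := get_frame()) is not None:   # None signals the call ended
            if classify_frame(frame) >= threshold:
                escalate_for_review(frame)
            time.sleep(interval_s)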

Key improvements include:

  • Microsoft reported an expansion in its use of tools to detect known (already identified) CSEA material in OneDrive and in email attachments sent worldwide via Outlook.
  • Snap reduced the time taken for moderators to reach an outcome after receiving a report of CSEA material from 1 hour and 30 minutes to 11 minutes. Snap also reduced the time that material was available on Snapchat from 3 hours and 21 minutes to 18 minutes.
  • Apple enabled its Communication Safety feature by default for accounts whose users declared themselves as under 13, with plans to extend this default to under-18s in the near future. Communication Safety uses on-device machine learning to blur nude photos and videos and send warning messages (see the blur-before-view sketch after this list).
  • Google fully launched its Sensitive Content Warnings, which blur incoming images containing nudity before viewing and prompt users with resources and options.
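
Both features follow the same blur-before-view pattern: an on-device classifier scores an incoming image, and anything over a nudity threshold is blurred behind a warning until the user opts in to view it. A minimal sketch of that flow follows; the classifier and threshold are placeholders, as neither Apple nor Google publishes its on-device models.

    # Minimal sketch of the blur-before-view pattern behind features like
    # Communication Safety and Sensitive Content Warnings. The classifier
    # and threshold are placeholders; neither Apple nor Google publishes
    # its on-device models.
    from dataclasses import dataclass

    @dataclass
    class IncomingImage:
        pixels: bytes
        blurred: bool = False
        warning_shown: bool = False

    def nudity_score(pixels: bytes) -> float:
        """Placeholder for an on-device ML classifier returning 0.0-1.0."""
        raise NotImplementedError("stand-in for a proprietary on-device model")

    def screen_image(image: IncomingImage, threshold: float = 0.8) -> IncomingImage:
        """Blur the image behind a warning until the user chooses to view it."""
        if nudity_score(image.pixels) >= threshold:
            image.blurred = True        # stays blurred until the user opts in
            image.warning_shown = True  # warning offers resources and options
        return image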

The eight companies are required to report back to eSafety two more times, in March 2026 and August 2026. eSafety will publish transparency reports summarising their responses in the coming months.

Providers that fail to comply with a mandatory transparency notice given by eSafety may incur financial penalties of up to AUD$825,000 a day.

For more information or to request an interview, please contact:
Phone: 0439 519 684 (virtual line – please do not send texts) or [email protected]
