
Stronger online safety laws needed to protect children, with new report exposing tech giants' failure to meet basic standards

International Justice Mission Australia

6 August 2025

The Australian Government must deliver on its promise to introduce digital duty of care legislation, with a new eSafety Commissioner report exposing a seemingly industry-wide failure to meet basic child protection standards.

According to eSafety Commissioner findings released today, drawn from periodic transparency notices on child sexual exploitation and abuse (August 2025), tech companies continue to do nothing to protect children from sexual abuse that is streamed live on their services, including in video calls.

The failure continues “[d]espite the availability of technology to help detect child sexual exploitation and abuse in livestreams or video calls,” according to the eSafety Commissioner’s latest report.


“I think this is hugely concerning. This is illegal content. This is literally the rape, torture of children and they’re enabling it and turning a blind eye,” said Julie Inman Grant, the eSafety Commissioner.

The Australian Institute of Criminology has identified online video calling services as “a central vector for livestreaming child sexual exploitation and abuse.”  


International Justice Mission’s 2023 study, Scale of Harm, conducted with the UK’s Nottingham Rights Lab, found that nearly half a million children in the Philippines are sexually abused to create new abuse material, especially in livestreams. Little wonder: tech companies continue to allow these live crime scenes on their platforms.

In their submission to the 2024 review of Australia’s Online Safety Act, Philippine Survivor Network leaders stated: “We have experienced different forms of online sexual exploitation such as livestreamed sexual abuse and production of child sexual abuse materials in exchange for money received by facilitators from online paying customers.”

“We also specifically ask that digital services be held accountable for livestreaming child sexual abuse – that such criminal activity be proactively detected and disrupted on their platforms.”

“Every Australian should find it deeply disturbing that the wealthiest, most powerful tech companies in the world are doing nothing to protect children from child sexual abuse streamed live in video calls,” said John Tanagho, Executive Director, IJM Center to End Online Sexual Exploitation of Children. 

“Companies can use artificial intelligence and machine learning tools, along with a range of signals and red flags, to detect and disrupt in real-time the rape of children through the services and platforms they provide freely to the world.  They must be required to do so.”

Online service providers are required to report, via transparency notices from the eSafety Commissioner, on how they are implementing the Basic Online Safety Expectations set out by the Australian Government.

eSafety’s latest report also found that many of the tech companies:

  • Failed to use tools to proactively detect new child sexual exploitation and abuse images and videos;
  • Have safety deficiencies in user reporting to identify child sexual exploitation and abuse and the resulting illegal images and videos;
  • Did not use hash-matching to proactively detect known child sexual exploitation and abuse material; and
  • Did not use language analysis tools to proactively detect sexual extortion.

IJM Australia CEO David Braga said the findings send a clear signal that stronger online safety laws are needed to protect children.

“These findings are shocking and are a powerful reminder of why digital duty of care legislation is urgently needed,” Mr Braga said.

“Tech companies are among the wealthiest and most sophisticated organisations in the world and have the resources to prevent illegal activity and content on their platforms. They need to lift their game and do better to protect children online.

“Many of these companies aren’t doing enough to combat child sexual abuse or exploitation material on their platforms, apps, and devices. Self-regulation isn’t enough. It needs to be enshrined in law.

“Our laws must be fit-for-purpose in the digital age and cater for existing and emerging technologies.”

The Government announced in November 2024 its intention to legislate a digital duty of care, which is a key recommendation of the independent statutory review of the Online Safety Act 2021.

Under the changes, digital platforms would be required to take reasonable steps to prevent foreseeable harms on their platforms and services, with the framework to be underpinned by risk assessment and risk mitigation and informed by safety-by-design principles.

These platforms would also be obligated to continually identify and mitigate potential risks, as technology and service offerings change and evolve.

The eSafety Commissioner’s snapshot report is available here: BOSE-Snapshot-first-regular-report-on-CSEA-sexual-extortion-periodic-notices-August2025.pdf, and the full report is available here: A Baseline for online safety transparency: The first regular report on child sexual exploitation and abuse, and sexual extortion (see pp. 38-40 for livestreamed child sexual exploitation and abuse).

Media Contact: Briony Camp | [email protected] | 0468 308 696
