
Second set of tech giants falling short in tackling child sexual exploitation material, sexual extortion, livestreaming of abuse

eSafety Commissioner

Australia’s eSafety Commissioner has released its second report under world-leading transparency powers showing some of the biggest tech companies aren’t living up to their responsibilities to tackle the proliferation of child sexual exploitation, sexual extortion and the livestreaming of child sexual abuse. 

In February, eSafety issued legal notices to Twitter (subsequently rebranded as “X”), Google, TikTok, Twitch and Discord under Australia’s Online Safety Act. The new transparency powers required the tech companies to answer questions about measures they have in place to deal with the issue. 

The report, which summarises their answers, highlights serious shortfalls in how some companies detect, remove and prevent child sexual abuse material and grooming; inconsistencies in how companies deal with this material across their different services; and significant variations in the time it takes them to respond to public reports. 

eSafety Commissioner Julie Inman Grant said the proliferation of online child sexual exploitation is a growing problem both in Australia and globally, and that technology companies have a moral responsibility to protect children from sexual exploitation and abuse being stored, shared and perpetrated on their services.

“We really can’t hope to have any accountability from the online industry in tackling this issue without meaningful transparency, which is what these notices are designed to surface,” Ms Inman Grant said.

“Our first report featuring Apple, Meta, Microsoft, Skype, Snap, WhatsApp and Omegle uncovered serious shortfalls in how these companies were tackling this issue.

“This latest report also reveals similar gaps in how these five tech companies are dealing with the problem and how they are tackling the rise in sexual extortion, and we need them all to do better.

“What we are talking about here are serious crimes playing out on these platforms, committed by predatory adults against innocent children, and the community expects every tech company to be taking meaningful action.

“Importantly, next year we will have industry codes and standards in place which work hand-in-hand with these Basic Online Safety Expectations transparency powers to ensure companies are living up to these responsibilities to protect children.” 

eSafety also found that two providers, Twitter/X and Google, did not comply with the notices given to them, with both companies failing to adequately respond to a number of questions in their respective notices.

Google has been issued a formal warning notifying it of its failure to comply; the company provided a number of generic responses to specific questions and supplied aggregated information when asked about specific services.

Twitter/X’s non-compliance was found to be more serious, with the company failing to provide any response to some questions, leaving some sections entirely blank. In other instances, Twitter/X provided responses that were incomplete and/or inaccurate. 

Twitter/X did not respond to a number of key questions including the time it takes the platform to respond to reports of child sexual exploitation; the measures it has in place to detect child sexual exploitation in livestreams; and the tools and technologies it uses to detect child sexual exploitation material.

The company also failed to adequately answer questions relating to the number of safety and public policy staff still employed at Twitter/X following the October 2022 acquisition and subsequent job cuts.

Twitter/X has been issued with an infringement notice for $610,500 and has 28 days to request the withdrawal of the infringement notice or to pay the penalty. If Twitter/X chooses not to pay the infringement notice, it is open to the Commissioner to take other action. eSafety has also published a statement, called a service provider notification, about the non-compliance by Twitter/X.

Ms Inman Grant said Twitter/X and Google’s non-compliance was disappointing, especially as the questions relate to the protection of children and the most egregious forms of online harm.

“Twitter/X has stated publicly that tackling child sexual exploitation is the number 1 priority for the company, but it can’t just be empty talk; we need to see words backed up with tangible action.

“If Twitter/X and Google can’t come up with answers to key questions about how they are tackling child sexual exploitation, either they don’t want to answer for how it might be perceived publicly or they need better systems to scrutinise their own operations. Both scenarios are concerning to us and suggest they are not living up to their responsibilities and the expectations of the Australian community.”

Some of the key findings in the report featuring Twitter/X, TikTok, Google, Twitch and Discord include: 

  • While YouTube, TikTok and Twitch are taking steps to detect child sexual exploitation in livestreams, Discord is not, saying it is ‘prohibitively expensive’. Twitter/X did not provide the information required.

  • TikTok and Twitch use language analysis technology to detect CSEA activity such as sexual extortion across all parts of their services whereas Discord does not use any such detection technology at all. Twitter/X uses tools on public content, but not on direct messages. Google uses technology on YouTube, but not on Chat, Gmail, Meet and Messages.

  • Google (with the exception of its search service) and Discord are not blocking links to known child sexual exploitation material, despite the availability of databases from expert organisations like the UK-based Internet Watch Foundation.

  • YouTube, TikTok and Twitch are using technology to detect grooming, whereas Twitter/X, Discord and Google’s other services (Meet, Chat, Gmail and Messages) are not.

  • Google is not using its own technology to detect known child sexual exploitation videos on some of its services – Gmail, Chat, Messages.

  • In the three months after Twitter/X’s change in ownership in October 2022, its proactive detection of child sexual exploitation material fell from 90% to 75%. The company said its proactive detection rate had subsequently improved in 2023.

  • For Discord and Twitch, which are partly community-moderated services, professional safety staff are not automatically notified when a volunteer moderator identifies child sexual exploitation and abuse material.

  • Significant variations exist in median response times to user reports of child sexual exploitation material – TikTok says it responds within 5 minutes for public content, Twitch in 8 minutes and Discord in 13 hours for direct messages, while Twitter/X and Google did not provide the information required.

  • Significant variation exists in the languages covered by content moderators. Google said it covers at least 71 languages and TikTok 73. In comparison, Twitter/X said it covered only 12 languages, Twitch reported 24 and Discord reported 29. This means that some of the top 5 non-English languages spoken at home in Australia are not covered by default by Twitter/X, Discord and Twitch moderators. This is particularly important for harms like grooming or hate speech, which require context to identify.

The full report can be found here.

For more information or to arrange an interview, please phone 0439 519 684 or email media@esafety.gov.au
