IJM Australia has welcomed an eSafety Commissioner report that revealed several of the world’s largest tech companies are failing to take adequate steps to detect and address online child sexual exploitation and abuse.
In February, the eSafety Commissioner issued legal notices to five tech companies under its world-leading transparency powers in Australia’s Online Safety Act, requiring X (formerly Twitter), Google, TikTok, Twitch and Discord to answer questions about measures they have in place to address child sexual exploitation and abuse material on their platforms.
Tragically, since 2015, Australia has consistently ranked third in the world by both volume and value of transactions to procure the online sexual exploitation of children from the Philippines*.
IJM Australia CEO, Steve Baird, said, “I congratulate Australia’s eSafety Commissioner for taking these world-leading steps to hold big tech responsible for addressing the online safety of children on their platforms.
“IJM welcomes the eSafety Commissioner’s action against the two tech companies who have not complied with transparency notices issued to them relating to measures they have in place to address online child safety.”
Tech companies must be transparent and report how they detect and address child sexual exploitation and abuse; otherwise, the true scale of this issue will never be understood. They alone hold the information about their own internal systems, processes and tools.
As the eSafety Commissioner’s report states: “We can only know the true scale of the global problem if all online services use readily available technologies and human moderation to detect child sexual exploitation and abuse material, video livestreaming of abuse, grooming of children and sexual extortion.”
“I am appalled at the serious shortfalls in company practices uncovered in this report, including the failure of certain big tech companies to detect livestreamed child sexual abuse, despite existing technology,” Mr Baird said.
Every time a photo or video of a child being sexually abused is accessed and distributed, that child is revictimised, compounding the trauma they experienced in person.
“It is staggering to me that two of the world’s leading tech companies are not blocking links to known child sexual exploitation material, despite the availability of databases from expert organisations that identify links to such material and the websites that facilitate it,” Mr Baird said.
“Not only are known links to child sexual exploitation material not being taken down, but in some cases, tech companies are taking 13 hours to respond to user alerts,” Mr Baird said.
In September 2023, IJM released a world-first report, Scale of Harm*, which found that half a million children in the Philippines, or 1 in 100, were subjected to online sexual exploitation in the preceding year alone.
“The sad reality is that Australians are paying for the online sexual abuse of children, including in the Philippines, over popular tech platforms that are failing to comply with basic child safety standards under Australian law,” Mr Baird said.
The Scale of Harm report recommends ensuring tech companies use technology designed to prevent or disrupt livestreamed child sexual abuse and other child sexual abuse images and videos on their platforms.
IJM has advocated for and applauded the world-leading powers of Australia’s eSafety Commissioner; however, these powers cannot deliver meaningful transparency and accountability unless tech companies take the notices seriously and disclose the information the Commissioner requests.
“Transparency is the first step in big tech taking responsibility for the real-world child sexual abuse taking place on their platforms.”
IJM endorses any further action required by the eSafety Commissioner to ensure that tech companies operating in Australia meet their basic online safety reporting obligations.