
eSafety urges schools to report deepfakes as numbers double


eSafety Commissioner Julie Inman Grant has issued an urgent call for schools to report deepfake incidents to appropriate authorities as the rapid proliferation of ‘nudify’ apps online takes a growing toll on communities around Australia.  

The Commissioner has written to education ministers urging them to ensure schools adhere to state and territory child protection legislation and mandatory reporting obligations.  

To help address the threat of AI-generated abuse in Australian classrooms, reports of which have steadily increased over the past 18 months, eSafety has today released an updated Toolkit for Schools, including a step-by-step guide for dealing with deepfake incidents.

The guide strongly encourages educators to prioritise the wellbeing of children and targeted staff and report any potential criminal offence to local police. 

“I’m calling on schools to report allegations of a criminal nature, including deepfake abuse of underage students, to police and to make sure their communities are aware that eSafety is on standby to remove this material quickly,” Ms Inman Grant said.

“It is clear from what is already in the public domain, and from what we are hearing directly from the education sector, that this is not always happening.”

“Our response guide helps schools prepare for and manage deepfake incidents, taking into account the distress and lasting harms these can cause to those targeted.  

“It also encourages schools to openly communicate their online safety policies and procedures, and the potential for serious consequences, including criminal charges in some instances, for perpetrators who may be creating synthetic child sexual abuse material,” Ms Inman Grant said. 

eSafety has today also issued a new Online Safety Advisory (LINK) to alert parents and schools to the recent proliferation of open-source AI ‘nudify’ apps that are easily accessible to anyone with a smartphone.

“Creating an intimate image of someone under the age of 18 is illegal. This includes the use of AI tools. Parents and carers can help educate their children that this behaviour can lead to criminal charges.” 

Additionally, eSafety is hosting a series of webinars (LINK) throughout July and August for parents, educators and youth-serving organisations on AI-assisted image-based abuse and navigating the deepfake threat.  

AI proliferation 

New data reveals that reports to eSafety’s image-based abuse scheme about digitally altered intimate images, including deepfakes, from people under the age of 18 have more than doubled in the past 18 months compared with the total number of reports received in the seven years prior. Four out of five of these reports involved the targeting of females.

While the rapid rise in reports is cause for concern, the reality may be worse, Ms Inman Grant warned. 

“We suspect what is being reported to us is not the whole picture,” Ms Inman Grant said.  

“Anecdotally, we have heard from school leaders and education sector representatives that deepfake incidents are occurring more frequently, particularly as children are easily able to access and misuse nudify apps in school settings,” Ms Inman Grant said. 

“With just one photo, these apps can nudify the image with the power of AI in seconds. Alarmingly, we have seen these apps used to humiliate, bully and sexually extort children in the school yard and beyond. There have also been reports that some of these images have been traded among school children in exchange for money.” 

“We have already been engaging with police, the app makers and the platforms that host these high-risk apps to put them on notice that our mandatory standards come into full effect this week and carry up to a $49.5 million fine per breach, and that we will not hesitate to take regulatory action.” 

eSafety’s role 

eSafety’s clear role as Australia’s independent online safety regulator is to seek the removal of this kind of distressing material as quickly as possible, and to provide support to those impacted. 

Australians who have experienced image-based abuse (the non-consensual sharing online of intimate images, including deepfakes) are encouraged to report it. Allegations of a criminal nature should be reported to local police and then to eSafety at eSafety.gov.au.

“Our specialist teams can provide advice, support, and help to remove harmful content wherever possible,” Ms Inman Grant said. 

“We have a very high success rate in removing harmful material - up to 98 per cent in cases of image-based abuse. Those affected consistently tell us that swift content takedown is their main concern.” 

Alongside removal actions, eSafety has remedial powers that can be used to require a perpetrator to take further, specific actions.

eSafety also supports schools through its education and professional learning programs, training and resources.  

“Our world-first industry standards will tackle the most harmful online content, including deepfakes and ‘nudify’ apps that use AI to generate explicit material depicting a child under 18,” Ms Inman Grant said.

“These standards require high-risk AI tools, including nudify apps, to implement robust safeguards that prevent misuse, including child exploitation.” 

Non-compliance may result in civil penalties of up to $49.5 million per breach. 

“Safety by Design needs to be a fundamental tenet in the creation and deployment of these apps which is why eSafety also continues to push for an end to the industry’s ‘release first, fix later’ mindset, particularly when the damage already wrought by these tools is so apparent,” Ms Inman Grant said. 

“We are looking forward to the Government moving forward with a digital duty of care and will continue to call for stronger industry adoption of eSafety’s Safety by Design principles, which ensure platforms are built with embedded safety features such as privacy settings, effective moderation and age-appropriate protections from the outset, not as an afterthought.  

“Ultimately, we need a holistic response that combines regulation, platform responsibility, education and cultural change to ensure emerging technologies are not used to shame, exploit or harm others, especially children.”  

Role of police 

It is important to emphasise that eSafety’s role as a regulator differs from the role of police.  

“While we act swiftly to remove harmful content, it is police who are responsible for pursuing criminal charges. We also refer serious matters to the appropriate law enforcement agencies and often work in parallel with them during investigations,” Ms Inman Grant said.  
