
Modern sex education must address AI risks

Act for Kids

The growing risks associated with artificial intelligence must be incorporated into age-appropriate sex education talks with kids, a leading child protection organisation has warned.

Act for Kids says parents, carers, schools and educators need to be informed and prepared, with generative AI emerging as a strong trend in online child abuse and exploitation and sextortion material.

The warning follows a spate of serious incidents in which Australian school students used AI technology to generate illegal child abuse material depicting classmates and peers.

In 2024, the National Centre for Missing and Exploited Children (NCMEC) flagged a staggering 1325 per cent rise in incidents involving AI-generated child sexual abuse and exploitation content, according to its CyberTipline Report.

Similarly, Australia's eSafety investigators noted a 218 per cent increase in reports of AI-generated child sexual abuse and exploitation content between 2023 and 2024.

Act for Kids CEO Dr Katrina Lines said it was disturbing how rapidly generative AI had become accessible, easy to use and widely available across online platforms.

“Addressing this issue involves much-needed legislative change, but it also presents a new and important opportunity for parents and carers to educate themselves about the risks and impacts of AI,” Dr Lines said.

Act for Kids national ambassador and former detective inspector Jon Rouse APM has dedicated his career to protecting children from sexual abuse.

His research, conducted with the Canadian Centre for Child Protection, found child sexual abuse material distributed online could continue to circulate on the internet two decades after a case was investigated and marked as “solved”.

“Artificial intelligence adds another obstacle in the ongoing fight against online circulation of child abuse material,” Mr Rouse said.

Act for Kids is encouraging everyone who has kids or works closely with kids to talk with them about the appropriate use of AI or incorporate this into age-appropriate conversations around sex education and consent.

“Young people need to know that there are real-life consequences when AI is used to generate child abuse material – it’s a crime and they could face very real penalties,” Dr Lines said.

Mr Rouse added: “This is a stark reminder of the pervasive trauma this offending can cause survivors and, overarchingly, how significant the global failure has been to address it.”


Act for Kids’ tips on how to speak to kids about sex education:

• Start conversations with your child from a young age.

• Keep conversations open and age appropriate – follow your child’s lead and use words they can understand.

• Use the correct anatomical words for body parts.

• Answer questions in a calm, casual manner.

• Ask your child what they already know about AI so you can ensure they have the appropriate information.

• Don’t make it awkward – it’s important to remember that if you don’t talk to them, they may get their information online or from an unsafe or unreliable source.

• Talk regularly, rather than having ‘the chat’.

• Explain the importance of consent, especially in a sexual context – ‘yes’ means yes, ‘no’ means no, and ‘maybe’ means no.

• Seek resources if you’re unsure about a topic.

• Remind your child they can always ask you questions and talk to you.

For children needing support:

• Kids Helpline: 1800 551 800 or www.kidshelpline.com.au

• Lifeline: 13 11 14 or www.lifeline.org.au

If you are experiencing online harm or abuse, whether or not generative AI is involved, you can make a report to eSafety.


Contact details:

Melanie Whiting

0427 794 666

[email protected]
