Monash University experts are available to comment on the Australian government’s pledge to invest nearly $1 billion to combat violence against women. Experts can speak about technologies to counter tech-based abuse, deepfakes, addressing the link to child sexual abuse, online age verification, misogynous content on social media and the proposed government support payments.
COUNTERING TECH-BASED ABUSE
Associate Professor Campbell Wilson, Co-Director AI for Law Enforcement and Community Safety (AiLECS) Lab, Faculty of Information Technology
Contact details: +61 450 501 248 or media@monash.edu
Read more of Associate Professor Wilson’s commentary at Monash Lens
- Digital forensics
- Technology to counter online child abuse material
- Using Artificial Intelligence (AI) to support law enforcement and communities
The following can be attributed to Associate Professor Wilson:
“We welcome investment in the use of technology to counter technology-facilitated abuse.
“It is important that regulatory bodies and law enforcement be adequately resourced. This is particularly urgent given the pace at which AI is evolving, and examples of its weaponisation such as deepfake pornography and image-based abuse.”
DEEPFAKES
Dr Asher Flynn, Associate Professor of Criminology, Faculty of Arts
Contact details: +61 402 257 926, or asher.flynn@monash.edu
Read more of Dr Flynn’s commentary at Monash Lens
- New legislation to ban deepfake pornography
The following can be attributed to Dr Flynn:
“Sexual deepfake abuse silences women, causing lasting harm. The announcement from the National Cabinet today that federal laws will be introduced criminalising the non-consensual production, creation and distribution of sexualised deepfakes (known as sexual deepfake abuse) is a welcome response that recognises the significant harms of this type of technology-facilitated violence.
“It is likely that such laws will also go some way towards curbing the accessibility of sexualised deepfake technologies, like nudify apps, which use AI to remove the clothing from any uploaded image. These apps are advertised freely in people’s social media feeds and are widely accessible. Making it illegal to create or produce non-consensual deepfake imagery in Australia would likely reduce the ready availability of technologies focused specifically on sexualised deepfake content.
“It is important that any new or amended laws are introduced alongside other responses that incorporate regulatory and corporate responsibility, education and prevention campaigns, training for those tasked with investigating and responding to sexualised deepfake abuse, and technical solutions that seek to disrupt and prevent the abuse.”
ADDRESSING LINK TO CHILD SEXUAL ABUSE
Ms Kelly Humphries, Research Fellow, AI for Law Enforcement and Community Safety (AiLECS) Lab, Faculty of Information Technology
Contact details: +61 450 501 248 or media@monash.edu
*Note: Ms Humphries is a survivor of child sexual abuse and also has 16 years of law enforcement experience
The following can be attributed to Ms Humphries:
“This announcement is an important step forward to support adult victims and survivors of domestic and family violence. However, we must also acknowledge more needs to be done to help survivors of child sexual abuse and exploitation.
“Around 28.5 per cent of our community is sexually abused before the age of 18, and only 5-13 per cent of victims and survivors ever report their abuse. We often see a link between being abused as a child and enduring domestic and family violence as an adult. We cannot address violence against women without addressing violence against children.
“The insights of victims and survivors of sexual abuse are crucial in developing updated, fit-for-purpose policy and practice, and a system that truly educates and reforms harmful attitudes.
“We cannot fix bullet holes with bandaids, and at its core this issue is child abuse left without intervention or support. Traumatised children without specialised support become traumatised adults, who then become statistics in intimate partner violence and the epidemic you now see.”
“Measures to prevent early exposure to pornography are welcome. Such exposure can, especially without adequate context or education, normalise grooming behaviours by making certain manipulative tactics seem commonplace or acceptable. This lack of awareness makes young people more susceptible to online grooming, as they might not question the intentions or actions of those who mimic these behaviours. It can leave them incredibly vulnerable and, left unaddressed, can lead to harmful sexual behaviours. Without appropriate intervention, such traumatic experiences can alter interpersonal behaviours and attachment styles, leading to dysfunctional relationships later in life and, in turn, a growing epidemic of domestic and family violence.”
ONLINE AGE VERIFICATION
Professor Jon Rouse, AI for Law Enforcement and Community Safety (AiLECS) Lab, Faculty of Information Technology
Contact details: +61 450 501 248 or media@monash.edu
*Note: Professor Rouse has 38 years of experience in Australian law enforcement and worked to combat child sexual abuse and exploitation as part of Task Force Argos
- Technology to counter online child abuse material
- Child protection
- Online safety for children
The following can be attributed to Professor Rouse:
"While the news is received positively, it is regrettable that the Government delayed action on the recommendations proposed by the eSafety Commissioner in the March 2023 Roadmap for Age Verification, which included a pilot to shield children from harmful content. The Roadmap pinpointed a correlation between mainstream pornography and attitudes and behaviours that could contribute to gender-based violence. It also highlighted potential dangers linked to online pornography, such as harmful sexual behaviours and risky or unsafe sexual practices. The connection between exposure to adult pornography and violence against women cannot be underestimated."
MISOGYNOUS CONTENT ONLINE
Dr Stephanie Wescott, Lecturer, School of Education, Culture and Society, Faculty of Education
Contact: +61 3 9903 4840 or media@monash.edu
Read more of Dr Wescott’s commentary at Monash Lens
The following can be attributed to Dr Wescott:
“It is affirming to see the government’s announcement of a range of measures to target the influence of misogynist online figures as part of the funding to combat gender-based violence.
“We know from our research, as well as research happening in other contexts, that algorithms deliver misogynist content to the profiles of boys and young men on social media platforms whether they search for it or not, and that this is having a direct impact on their views and attitudes towards women and girls. The behaviours currently being seen in schools are grounded in sexism and misogyny, and connecting these instances to the broader issue of gender-based violence is essential to fully address the problem.
“However, the government has not acknowledged that this misogynist content is already doing harm to women and girls in schools, and that this requires a specific and targeted response that not only supports teachers and school staff, but also outlines a long-term, meaningful strategy to combat existing attitudes and behaviours. We would like to see a specific acknowledgement of the problem of gender-based violence in schools, as well as a plan to respond to it through both curriculum and whole-school cultural approaches.”
PROPOSED GOVERNMENT SUPPORT PAYMENTS
Professor Jacqui True, Director, ARC Centre of Excellence for the Elimination of Violence against Women (CEVAW), Faculty of Arts
Contact: +61 416 078 199 or jacqui.true@monash.edu
Read more of Professor True’s commentary at Monash Lens
The following can be attributed to Professor True:
“Support for women seeking to leave violent situations is always welcome; however, while a lump sum of $5000 may help, it is unlikely to change women’s overall material situation or the structural gendered economic inequality that limits their choices, including their choice to leave a violent situation, often with their children. For instance, women victim-survivors could not afford alternative housing with this amount. Moreover, accessing this payment through Medicare and Centrelink may present significant new barriers to women’s exit from violence, given the waiting times involved with these services, not to mention that the payment may be weaponised by their violent partners.”
Professor Kyllie Cripps, Chief Investigator, ARC Centre of Excellence for the Elimination of Violence Against Women (CEVAW), and Director, Monash Indigenous Studies Centre, Faculty of Arts
Contact: +61 400 142 730 or Kyllie.Cripps@monash.edu
Read more of Professor Cripps’ commentary at Monash Lens
The following can be attributed to Professor Cripps:
“If the expansion of the crisis payment continues to be administered through Centrelink on the same basis as it currently is, this may not be the ‘fix’ or ‘solution’ the government thinks it will be. According to the criteria, you essentially have to be eligible for a Centrelink payment to then be eligible for the crisis payment. The criteria are also prescriptive about the victim leaving the home or the offender being out of it. This will be challenging in environments where housing is in short supply, or in situations involving children that necessitate changes in care arrangements. It is also important to reflect that, given the diversity of the Australian community and how violence impacts people of all incomes, not all victims of violence will be eligible for this payment at all, yet they may equally still be in a crisis situation with no funds, no housing and no way of supporting themselves.”
For any other topics on which you may be seeking expert comment, contact the Monash University Media Unit on +61 3 9903 4840 or media@monash.edu. For more Monash media stories visit our news & events site.