A Monash expert is available to comment ahead of new legislation the Federal Attorney General will introduce tomorrow that will criminalise the distribution and creation of non-consensual sexualised deepfake images in Australia.
Dr Asher Flynn, Chief Investigator and Deputy Lead, Australian Research Council Centre for the Elimination of Violence Against Women, and Associate Professor of Criminology, Faculty of Arts
Contact details: +61 402 257 926 or asher.flynn@monash.edu
Read more of Dr Flynn’s commentary at Monash Lens
- Deepfakes
- Digital crime
- AI-facilitated abuse
- Revenge pornography
The following can be attributed to Associate Professor Flynn:
“The Australian government’s decision to introduce laws criminalising the non-consensual production or creation of a sexualised deepfake of an adult will address a major legal gap: until now, the creation of sexualised deepfakes was not illegal anywhere in Australia except the state of Victoria. The new laws will see Australia once again leading the way in legal responses to online sexual harm.
“The laws may also go some way towards curbing the accessibility of sexualised deepfake technologies. If it is illegal to create or produce non-consensual deepfake imagery, there is likely to be less scope for technologies like free nudify apps to be advertised.
“It is important that these proposed laws are introduced alongside other responses that incorporate regulatory and corporate responsibility, education and prevention campaigns, training for those tasked with investigating and responding to sexualised deepfake abuse, and technical solutions that seek to disrupt and prevent the abuse.
“Through a survey conducted in 2019, we found that across Australia, the United Kingdom and New Zealand, 14.1 per cent of respondents aged between 16 and 84 had experienced someone creating, distributing or threatening to distribute a digitally altered image representing them in a sexualised way. People with disabilities, Indigenous Australians and LGBTIQA+ respondents, as well as younger people aged 16 to 29, were among the most victimised.
“Sensity AI has been monitoring online sexualised deepfake video content since 2018 and has consistently found that around 90 per cent of this non-consensual video content featured women. And this year, we have had numerous reports of sexualised deepfakes being created and shared involving women celebrities (including Taylor Swift), young women and teenage girls.
“Responsibility should also be placed onto technology developers, digital platforms and websites that host and/or create the tools used to develop deepfake content, to ensure safety by design and to put people above profit.”
For any other topics on which you may be seeking expert comment, contact the Monash University Media Unit on +61 3 9903 4840 or media@monash.edu. For more Monash media stories visit our news & events site.