
Federal government responds to AI consultation paper

RMIT University

The federal government has today released its response to a consultation paper, published last year, on safe and responsible AI in Australia. 

Experts available: 

  • Professor Lisa Given: the government’s approach and comparisons internationally 
  • Dr Dana McKay: the potential of AI – good and bad  
  • Dr Nicole Shackleton: the lack of consideration of AI use in sex and intimate technologies 
  • Dr Nataliya Ilyushina: the costs of delay in regulation 
  • Professor Mark Sanderson: the importance of understanding diversity when legislating AI 

Full comments below. 

Professor Lisa Given, Director of the Social Change Enabling Impact Platform and Professor of Information Sciences 

“The Australian government appears to be taking a proportional approach to potential risks of generative AI by focusing, at least initially, on application of AI technologies in high-risk settings (such as healthcare, employment, and law enforcement).  

“This approach may be quite different to what other countries are considering; for example, the European Union is planning to ban AI tools that pose ‘unacceptable risk,’ while the United States has issued an executive order to introduce wide-ranging controls, such as requirements for transparency in the use of AI generally.  

“However, the Australian government will also aim to align its regulatory decisions with those of other countries, given the global reach and application of AI technologies that could affect Australians directly. 

“Taking a proportional approach enables the government to address areas where the potential harms of AI technologies are already known (e.g. potential gender discrimination when used in hiring practices to assess candidates’ resumes), as well as those that may pose significant risks to people’s lives (e.g. when used to inform medical diagnoses and treatments). Focusing on workplaces and contexts where AI tools pose the greatest risk is an important place to start. 

“The creation of an advisory body to define the concept of ‘high-risk technologies’ and to advise government on where (and what kinds of) regulations may be needed is very welcome. It will complement other initiatives that the Australian government has taken recently to manage the risks of AI.” 

Professor Lisa Given is an interdisciplinary researcher in human information behaviour. Her work brings a critical, social research lens to studies of technology use and user-focused design.  

Dr Dana McKay, Senior Lecturer in Innovative Interactive Technologies 

“AI is affecting more of people’s lives than they realise. It affects the search results we get, the healthcare we receive, the jobs we apply for, and how much money we can borrow. In some countries, AI has even been used to determine prison sentences. AI is also a matter of national security.  

“When AI can affect our health, wealth and happiness, it is essential that it be regulated to protect personal wellbeing. 

“While the negative consequences of AI are large, so are the potential benefits.  

“Automating tasks that can be done by machines frees up human capacity and intellect for more complex or human-oriented tasks.  

“Ultimately, AI is a tool like any other, and needs principles-based legislation to ensure that it is beneficial for all of Australian society, not just those who benefit most from productivity gains, or those who own the technologies.” 

Dr Dana McKay studies the intersection of people, technology and information. Her focus is on ensuring advances in information technology benefit society as a whole. 

Dr Nicole Shackleton, Lecturer, Law 

“The Australian Government’s Interim Response to the consultation into the Safe and Responsible Use of AI makes promising steps towards proactive regulation of high-risk AI technologies.  

“What is concerning, however, is the lack of consideration of AI use in sex and intimate technologies, which is a growing market internationally and in Australia.  

“Beyond the Government’s focus on AI-generated pornography or intimate images (often referred to as deepfake pornography), which are increasingly being developed and used without consent to bully and harass, the interim report shows little interest in issues of sexual privacy, the safe use of AI in sexual health education technologies, or the use of AI in sex technologies such as personal and intimate robots.  

“It is vital that any future AI advisory body be capable of tackling such issues, and that the risk-based framework employed by the Government does not result in unintended consequences which hinder potential benefits of the use of AI in sex and intimate technologies.” 

Dr Nicole Shackleton is a socio-legal researcher focused on gender and sex, technology and regulation. Her research aims to inform law reform to prevent online abuse, and the regulation of technology companies. 

Dr Nataliya Ilyushina, Research Fellow, Blockchain Innovation Hub 

“Australia's unacceptable delay in developing AI regulation represents both a missed chance for its domestic market and a lapse in establishing a reputation as an AI-friendly economy with a robust legal, institutional and technological infrastructure globally. 

“The consultation process for responsible AI regulation concluded six months ago. Australia endorsed the Bletchley Declaration at the AI Summit in the UK last November, and EU officials forged a provisional agreement on the world's first comprehensive legislation on AI regulation on the 8th of December.  

“The adoption of AI is affordable and accessible, which is particularly essential for the growth of small businesses – the cornerstone of the Australian economy.  

“Employing AI to augment human jobs has demonstrated a capacity to enhance productivity, providing a direct solution to Australia's challenges of stagnant productivity growth, the cost-of-living crisis and labour shortages. 

“While businesses prefer voluntary codes and frameworks, other stakeholders – especially those working on risks related to cybersecurity, misinformation, fairness and biases – seek more stringent regulations.  

“Over-regulation of AI might incentivise businesses to relocate their operations overseas, potentially causing greater job losses than the implementation of AI itself. 

“Too little regulation, however, can lead to market failure, in which cybercrime and other risks run high – risks that stifle business growth, impose steep costs and can even harm individuals.” 

Dr Nataliya Ilyushina is a Research Fellow at the Blockchain Innovation Hub and ARC Centre of Excellence for Automated Decision-Making and Society (ADM+S). Her work investigates decentralised autonomous organisations and automated decision making, and the impact they have on labour markets, skills and long-term staff wellbeing. 

Professor Mark Sanderson, Dean of Research and Innovation, Schools of Engineering and of Computing Technologies 

“As smart as AI has become, these computer systems are still prompted and controlled by something smarter, human beings. As important as it is to be concerned about AI algorithms, it is also critically important to monitor how people interact with AI systems and observe how those systems react.  

“Across a population as diverse as Australia’s, the way people ask AI systems to take on tasks will differ widely in both expression and language.  

“Understanding how AI reacts to that diversity of interaction needs to be a critical component of the planned legislation.” 

Professor Mark Sanderson’s research covers search engines, usability, data and text analytics. He is also a Chief Investigator at the RMIT University node of the ARC Centre of Excellence for Automated Decision-Making & Society (ADM+S). 


Contact details:

General media enquiries: RMIT Communications, 0439 704 077 or [email protected] 

Interviews:
Professor Lisa Given, 0458 340 908 or [email protected]  

Dr Dana McKay, 0420 422 215 or [email protected]  

Dr Nicole Shackleton, 0437 727 782 or [email protected] 

Dr Nataliya Ilyushina, 0433 737 594 or [email protected] 

Professor Mark Sanderson, 0428 096 666 or [email protected]
