
CDU EXPERT: Discussion paper misses mark on taking AI threats seriously, expert says

Charles Darwin University

18 JANUARY, 2024

Who: Charles Darwin University Computational and Artificial Intelligence expert Associate Professor Niusha Shafiabady discusses the Australian Government’s Safe and Responsible AI in Australia paper. Associate Professor Shafiabady is an internationally recognised expert and developer of AI data analysis platform Ai-Labz.

Topics:

  • The Australian Government’s Safe and Responsible AI in Australia discussion paper.
  • Artificial Intelligence, machine learning, data analysis, modelling, deep learning and more. 
  • Combining academic knowledge and research into industrial applications.
  • The impact of Artificial Intelligence in the workplace.

Contact details: Call +61 8 8946 6721 or email [email protected] to arrange an interview.

Quotes attributable to Associate Professor Niusha Shafiabady:

“The discussion paper talks about issues with AI that everyone who reads the news has heard about, like misinformation and disinformation, and proposes collaboration with industry to ‘develop options for voluntary labelling and watermarking of AI-generated materials’.

“It also mentions that firms like ‘Microsoft, Google, Salesforce and IBM’ are ‘already adopting ethical principles’ in using AI.

“It is good fun to read the executive order on the safe, secure and trustworthy use of AI that US President Joe Biden signed on October 30, 2023, and compare it with the Australian Government’s discussion paper.

“To me, the threats that AI and technology pose to people fall into two types: long-term and short-term. The paper our government has released overlooks the long-term threats altogether. AI is changing our education system and the way kids grow up and learn at school.

“AI will be displacing many jobs. To what extent are we going to allow it to be integrated into our lives? Are we thinking strategically, or putting our faith in the hands of big firms because they say they are ‘already adopting ethical principles’? How are we going to create mandatory guardrails for ‘testing’, ‘transparency’ and ‘accountability’ through collaboration with industry? Who are the industry experts who will verify whether an AI system is ‘biased’ or not?

“The first question we should ask ourselves, in my opinion, is how are these AI systems being created, and who are the people developing them? These days, the so-called ‘AI experts’ are often people who have learnt to use free or paid toolboxes that are essentially distributed by big firms like Microsoft, Google, Salesforce and IBM.

“What are these tools doing? Do the developers even know that there are ways to avoid issues like ‘bias’ in AI systems? Do they have enough knowledge and training to develop systems that are less prone to making mistakes? Can the big firms that governments are paying millions of dollars to use their services be trusted?

“This paper, Safe and Responsible AI in Australia, is itself stored in Google’s cloud space.

“We need regulations and enforcement, not just talk about things that are good ideas. Misinformation and disinformation are serious threats. We need regulations that mandate watermarking of fake material.

“In the US government’s executive order, they specifically mention what they are implementing and what needs to be done. Here, we are putting our faith, and the fate of the people, in the hands of industry’s good faith. Sorry, but this wouldn’t work. If we don’t take the threat of technology seriously and come up with mandatory regulations, we will feel the blow as a nation. It is time to act now.”


Contact details:

Raphaella Saroukos (she/her)

Marketing, Media & Communications
Larrakia Country
T: +61 8 8946 6721
E: [email protected]
W: cdu.edu.au
