
Will AI close or widen the gap? New report examines opportunities and risks for First Nations communities

Paul Ramsay Foundation

Australia’s first comprehensive report to examine the future impact of artificial intelligence (AI) on First Nations communities warns that the technology could either widen inequality or help close the gap, depending on who holds the power to shape it.

The report by Wiradjuri researcher and Paul Ramsay Foundation Fellow Dr Kyle Turner draws on Australia’s 17 Closing the Gap targets to systematically assess how AI may influence outcomes across health, education, employment, justice and culture.

The report finds that while AI has the potential to improve lives, it also carries significant risks, particularly when communities are excluded from design and governance.

“AI’s promise is real. It can improve health outcomes for families, support learning across distance, assist rangers on Country and help strengthen languages at risk,” said Dr Turner, who previously created world-first AI technology to scan photos of teeth and gums for common dental problems, improving access to free dental check-ups and oral health education.

“But we must proceed with caution, because the dangers of AI are just as real. Predictive policing models trained on historic datasets can amplify discrimination rather than address it. In child protection, automated systems risk mistaking surveillance for safety, turning families into data points instead of people.

Even housing tools meant to improve maintenance can unintentionally exclude those already struggling to secure shelter. Efficiency can come at the cost of equity if systems aren’t culturally safe and community governed.”

The report’s release comes as governments work to establish national AI guardrails and the Senate Select Committee on Adopting AI calls for stronger protections for marginalised communities. The latest Australian Digital Inclusion Index shows that a digital inclusion gap persists between First Nations people and other Australians, alongside a considerable affordability gap.[i]

The report sets out a roadmap for responsible AI adoption, emphasising Indigenous Data Sovereignty so that First Nations peoples control their own data. It calls for accelerating AI in health, education and environmental management; adapting it for employment and housing; and pausing its use in justice and child protection until cultural safety, consent and accountability are guaranteed. The roadmap is a reminder that responsible technology must be shaped by fairness, inclusion and the voices of First Nations communities.

Dr Turner, who completed his PhD in Epidemiology at the University of Oxford and has published widely on the burden of chronic diseases, highlighted the stark unevenness of AI’s future impact, warning that technology will almost always advantage those who hold the power to shape it.

“Every AI initiative should pass a simple test: does this shift power, benefits, and decision-making towards First Nations communities? If the technology cannot clear that bar — no matter how sophisticated — then it has no place in closing gaps that colonisation opened.

Where First Nations communities shape the brief, set the boundaries, and retain the right to walk away, AI can amplify existing strengths. Where those conditions are absent, it becomes another vector for extraction and control.”

PRF’s Chief First Nations Officer Michelle Steele said Dr Turner’s work is a crucial reminder that technology must be guided by fairness and inclusion.

“This report reminds us that technology is never neutral,” she said. “When First Nations peoples lead, AI can strengthen self-determination. When they’re excluded, it risks repeating the mistakes of the past.”

Despite these challenges, the report offers hope, identifying strong examples of community-controlled AI in action, from telehealth services and ranger programs to language centres building their own digital archives under strict cultural protocols.

Dr Turner was hosted by the Brotherhood of St Laurence during his Fellowship.

The Future Impact of Artificial Intelligence on First Nations Communities: Opportunities and Risks is available now here.



[i] Thomas, J., McCosker, A., Parkinson, S., Hegarty, K., Featherstone, D., Kennedy, J., Ormond-Parker, L., Morrison, K., Rea, H., & Ganley, L. (2025). Measuring Australia’s Digital Divide: 2025 Australian Digital Inclusion Index. Melbourne: ARC Centre of Excellence for Automated Decision-Making and Society, RMIT University, Swinburne University of Technology, and Telstra.


About the Paul Ramsay Foundation

PRF is a philanthropic foundation.

The late Paul Ramsay AO established the Foundation in his name in 2006 and, after his death in 2014, left most of his estate to continue his philanthropy for generations to come.

At PRF, we work for a future where people and places have what they need to thrive. 

With organisations and communities, we invest in, build, and influence the conditions needed to stop disadvantage in Australia.

Find out more about how we work at www.paulramsayfoundation.org.au 


Contact details:

Pia 0412 346 746
