
How artificial intelligence is unmasking bias throughout the recruitment process

Monash University

New research from the Monash Business School has found that, throughout the job recruitment process, women believe artificial intelligence assessments reduce bias, while men fear they remove an advantage.

 

Professor Andreas Leibbrandt, from the Department of Economics, investigated how artificial intelligence recruitment tools affect existing biases and whether there is a way to dismantle the barriers that prevent underrepresented groups from reaching their full potential and securing their desired roles.

 

“People in minority groups have inferior market outcomes – they earn less, and they have a harder time finding and keeping a job. It’s important to understand why that is the case so that we can identify and remove the barriers,” Professor Leibbrandt said.

 

One major hurdle lies in the recruitment process itself, which is undergoing a shift alongside the rise of AI. “We know that a large majority of organisations now use AI in their recruitment process,” he said.

 

To uncover recruitment barriers, the first-of-its-kind study focused on two key areas: applicant behaviour and recruiter bias.

 

In one field experiment, more than 700 applicants for a web designer position were informed whether their application would be assessed by AI or by a human.

 

“Women were significantly more likely to complete their applications when they knew AI would be involved, while men were less likely to apply,” he said.

 

A second experiment focused on the behaviour of 500 tech recruiters.

 

“We found that when recruiters knew the applicant’s gender, they consistently scored women lower than men. However, this bias completely disappeared when the applicant’s gender was hidden,” he said. 

 

When recruiters had access to both the AI score and the applicant’s gender, there was also no gender difference in scoring. 

 

“This finding shows us they use AI as an aid and anchor – it helps remove the gender bias in assessment.” 

Professor Leibbrandt said a crucial aspect of the study was that, in contrast to the vast majority of current research, it focused on human interaction with AI rather than the algorithm behind it.

 

“My research isn’t just about dismantling bias, it’s about building a future of work where everyone has the opportunity to thrive,” he said. 

 

Professor Leibbrandt is exploring other frontiers in the fight for workplace inclusion. 

 

One project will test the impact of informing job applicants who are being assessed by AI about potential bias in the AI’s training data.

 

He also plans to tackle the concept of ‘narrative discrimination’, where unconscious stereotypes influence hiring decisions in the tech industry, as well as explore the potential for bias in remote work settings.

- ENDS -

MEDIA ENQUIRIES 

Helena Powell

Media Communications Officer, Monash University 

M: +61 474 444 171

E: [email protected] 

 

GENERAL MEDIA ENQUIRIES

Monash Media

T: +61 (0) 3 9903 4840

E: [email protected]

For more Monash media stories, visit our news and events site.
