AI threatens men’s advantage in recruitment, study finds

By Kace O'Neill | 5-minute read

Research has revealed that women believe artificial intelligence (AI) can directly combat bias in job recruitment.

New research from the Monash Business School has found that, throughout the job recruitment process, women believe AI assessments reduce bias, while men fear they remove an advantage.

Professor Andreas Leibbrandt, from the Department of Economics, undertook the study, investigating how AI recruitment tools affect the biases that are prevalent in the recruitment space.


From this, Leibbrandt examined whether there was a way to dismantle the barriers that prevent underrepresented groups from securing the roles they aspire to.

“People in minority groups have inferior market outcomes, they earn less, [and] they have a harder time finding and keeping a job. It’s important to understand why that is the case so that we can identify and remove the barriers,” said Leibbrandt.

“We know that a large majority of organisations now use AI in their recruitment process.”

With AI being adopted in the recruitment process across a number of Australian organisations, Leibbrandt believes it can do much to level the playing field. In an attempt to uncover the biases in recruitment that often restrict minority groups, his study focused on applicant behaviour and recruiter bias.

In one field experiment, over 700 applicants for a web designer position were informed whether their application would be assessed by AI or by a human. Another experiment focused on the behaviour of 500 tech recruiters.

“Women were significantly more likely to complete their applications when they knew AI would be involved, while men were less likely to apply,” said Leibbrandt.

“We found that when recruiters knew the applicant’s gender, they consistently scored women lower than men. However, this bias completely disappeared when the applicant’s gender was hidden.”

Once the recruiters had access to both the AI score and the gender of the applicants, there was no gender difference in the scoring.

“This finding shows us they use AI as an aid and anchor – it helps remove the gender bias in assessment,” said Leibbrandt.
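A practical counterpart to the masked condition in the experiment is “blind screening”, where gender-identifying fields are stripped from an application before a human reviews it. The sketch below is a purely illustrative example of that idea; the field names and applicant data are hypothetical and are not taken from the study.

```python
# Hypothetical sketch of "blind screening": remove gender-identifying
# fields from an application record before it reaches a human reviewer.
# Field names and data are illustrative only.

IDENTIFYING_FIELDS = {"gender", "title", "pronouns", "photo_url"}

def redact_application(application: dict) -> dict:
    """Return a copy of the application without gender-identifying fields."""
    return {k: v for k, v in application.items() if k not in IDENTIFYING_FIELDS}

applicant = {
    "name": "A. Candidate",
    "gender": "female",
    "pronouns": "she/her",
    "portfolio": "https://example.com/portfolio",
    "years_experience": 6,
}

print(redact_application(applicant))
# {'name': 'A. Candidate', 'portfolio': 'https://example.com/portfolio', 'years_experience': 6}
```

In a fuller implementation, names would typically be masked as well, since they can also signal gender.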

A key difference between this study and others on AI in recruitment is that Leibbrandt focused on how humans interact with AI rather than on the algorithm behind it.

“My research isn’t just about dismantling bias; it’s about building a future of work where everyone has the opportunity to thrive,” said Leibbrandt.

As previously reported on HR Leader, algorithmic bias is a real threat to recruitment practices that use AI. It refers to systematic, replicable errors in computer systems that produce unequal or discriminatory hiring outcomes based on legally protected characteristics such as race and gender.

Zhisheng Chen, a researcher at Nanjing University of Aeronautics and Astronautics, explained that the primary source of algorithmic bias is data: “The primary source of algorithmic bias lies in partial historical data. The personal preferences of algorithm engineers also contribute to algorithmic bias.”

“Despite algorithms aiming for objectivity and clarity in their procedures, they can become biased when they receive partial input data from humans. Modern algorithms may appear neutral but can disproportionately harm protected class members, posing the risk of agentic discrimination.”
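For a concrete sense of how such disproportionate harm can be checked for, the sketch below (not drawn from either study, with entirely hypothetical data) compares an AI screening tool’s shortlisting rates by gender and applies the common “four-fifths” rule of thumb as a flag for possible adverse impact.

```python
# Hypothetical sketch of a simple bias audit on an AI screening tool's
# outputs: compare shortlisting rates by gender and flag a large gap
# using the "four-fifths" (80%) rule of thumb. Data is made up.

from collections import defaultdict

# Each record: (gender, shortlisted_by_model)
candidates = [
    ("female", True), ("female", False), ("female", True), ("female", False),
    ("male", True), ("male", True), ("male", False), ("male", True),
]

selected = defaultdict(int)
total = defaultdict(int)
for gender, shortlisted in candidates:
    total[gender] += 1
    selected[gender] += int(shortlisted)

# Shortlisting rate per group.
rates = {g: selected[g] / total[g] for g in total}
print("Shortlisting rates:", rates)

# Disparate impact ratio: lowest group rate divided by highest group rate.
# A ratio below roughly 0.8 is commonly treated as a signal worth investigating.
ratio = min(rates.values()) / max(rates.values())
print(f"Disparate impact ratio: {ratio:.2f}")
```

An audit like this only surfaces a disparity in outcomes; diagnosing whether biased training data or other factors caused it requires further analysis.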


Kace O'Neill

Kace O'Neill is a Graduate Journalist for HR Leader. Kace studied Media Communications and Maori Studies at the University of Otago and has a passion for sports and storytelling.