Study shows AI tools have limited effect on hiring bias
Researchers from the University of Cambridge, writing in the journal "Philosophy & Technology", note that "there is growing interest in new methods that can solve problems such as interview bias." Yet while practical applications of AI in recruiting are increasingly common, they argue the technology remains at a "pseudo-scientific" level when it comes to selecting candidates from videos or application metrics.
Some human resources professional bodies have told the media that AI technology has the potential to combat bias.
A 2020 cross-national survey of 500 human resources professionals found that nearly a quarter of practitioners were using AI to "automate the selection of talent." But this approach to reducing bias does not appear to be working. Kerry Mackereth, a postdoctoral researcher at the University of Cambridge Centre for Gender Studies, said in an interview that such claims are complete "bullshit".
She explained: "These tools cannot be trained to identify only job-related characteristics, nor can they strip gender and race out of the hiring process, because the qualities humans associate with a good employee are inevitably bound up with gender and race."
The research also points out that some companies are discovering problems with the tools themselves. In 2018, Amazon announced that it was abandoning an AI-driven recruitment engine after finding that it inferred gender from resumes and discriminated against female applicants.
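To see how this can happen, consider a minimal, purely illustrative sketch (not Amazon's system, whose internals are not public): a screener trained on historically biased hiring outcomes learns to penalise a gendered proxy word such as "women's", even though no gender field is ever given to the model. The resumes, labels and weights below are invented for illustration only.

```python
# Illustrative sketch only (not Amazon's system): a resume screener trained on
# historically biased outcomes learns to penalise a gendered proxy word, even
# though no explicit "gender" field is ever provided.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Invented toy data: past decisions that favoured resumes without the proxy word.
resumes = [
    "captain of chess club, built compilers",
    "women's chess club captain, built compilers",
    "led robotics team, shipped backend services",
    "women's robotics team lead, shipped backend services",
]
hired = [1, 0, 1, 0]  # biased historical labels; gender is never stated directly

vectorizer = CountVectorizer()            # plain bag-of-words features
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The proxy token ends up with a negative weight: the model has "learned" the bias.
idx = vectorizer.vocabulary_["women"]
print("learned weight for token 'women':", round(model.coef_[0][idx], 3))
```

Removing names or an explicit gender column does nothing in a setup like this, because the bias rides in on correlated vocabulary, which is the point Mackereth makes above.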
The researchers also pay special attention to tools that claim to "analyze the details of a candidate's speech and body movements," on the theory that these subtle cues reveal whether the subject meets the requirements of a specific position.
Research co-author Dr. Eleanor Drage said that this type of video and image analysis has "no scientific basis," dismissing it as "modern phrenology", the discredited theory that personality and intelligence can be read from the shape of the skull.
She emphasized: "The idea that personality can be read by looking at a face makes these tools deeply unreliable. AI cannot 'see through' the face to the real person inside."
[Image: screenshot of the Cambridge researchers' demonstration tool]
The researchers also worked with six computer science students to build their own simplified version of such an AI recruitment tool, which rates candidate photos on the so-called "Big Five" personality traits (openness, conscientiousness, extroversion, agreeableness and neuroticism). The resulting ratings, however, were swayed by a range of variables irrelevant to the job.
Dr. Drage wrote: "When using these tools, you can see that simply adjusting the contrast, brightness or saturation of the image changes the personality conclusions the AI draws."
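As a rough illustration of that fragility, the sketch below uses an invented stand-in "scorer" driven by nothing more than average pixel values; it is not the Cambridge tool, the photo is synthetic, and the trait weights are random. Even so, brightening the image or raising its contrast is enough to change the "personality" scores for the same picture.

```python
# Toy illustration (not the Cambridge tool): a stand-in "personality scorer"
# driven by simple pixel statistics, showing that purely photographic changes
# such as brightness or contrast shift the scores for the same photo.
import numpy as np
from PIL import Image, ImageEnhance

rng = np.random.default_rng(0)

# Synthetic stand-in for a candidate photo; a real tool would load an image file.
photo = Image.fromarray(rng.integers(0, 256, (128, 128, 3), dtype=np.uint8))

TRAITS = ["openness", "conscientiousness", "extroversion", "agreeableness", "neuroticism"]
WEIGHTS = rng.normal(size=(5, 3))  # invented weights standing in for a trained model

def score_personality(img: Image.Image) -> dict:
    """Map crude colour statistics to 'Big Five' scores, a caricature of
    photo-based personality inference."""
    channel_means = np.asarray(img, dtype=np.float64).reshape(-1, 3).mean(axis=0) / 255.0
    return dict(zip(TRAITS, np.round(WEIGHTS @ channel_means, 3)))

print("original :", score_personality(photo))
print("brighter :", score_personality(ImageEnhance.Brightness(photo).enhance(1.4)))
print("contrast :", score_personality(ImageEnhance.Contrast(photo).enhance(1.6)))
```

The three printed score sets differ even though nothing about the "candidate" has changed, only the image processing applied to the photo.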
The Register noted that other investigations have reached similar conclusions.
A German public broadcaster even found that wearing glasses or a headscarf in a video could affect a candidate's final score.
Hayfa Mohdzaini of the Chartered Institute of Personnel and Development pointed out that the institute's research shows only 8% of employers use AI technology to screen job candidates.
She said: "AI can indeed help organizations improve diversity by increasing the overall size of the candidate pool, but it can also miss out on many good candidates if the rules and training data are incomplete or inaccurate."
She concluded: "Currently, AI software used to analyze candidates' speech and body language is still in its infancy. Of course, there are huge opportunities, but there are also risks associated with it."