Job applicants perceive AI-powered hiring processes as fairer when they are blind to race or gender, new study finds
Job applicants can be suspicious of the hiring process when a company uses artificial intelligence to pre-screen candidates and inform hiring decisions, a Northeastern University expert says, but their perception improves when they learn that the algorithm is “blind” to characteristics such as gender, race or age.
A group of researchers, including Yakov Bart, a professor of marketing at Northeastern, conducted a behavioral experiment to see how people’s perception of fairness changes depending on what they are told about the algorithm used in the hiring process.
“Our findings indicate that people perceive hiring algorithms as procedurally fairest when companies adopt a ‘fairness through unawareness’ approach to mitigating bias,” Bart says. “They are also likely to view companies who use this approach more positively and are more motivated to apply for open positions.”
---
Continue reading at Northeastern Global News.