For job seekers, finding a new position is complicated not only by today’s economic uncertainty but also by the artificial intelligence employers use in their hiring process, according to Insider. To reduce costs and the time spent on manual tasks, many companies have turned to automation and AI tools to review applications. That, in turn, is leading candidates to experience more angst around their job search and more trouble landing new roles.
A study from SHRM found that 42% of large employers use AI hiring support, meaning job seekers may virtually interview with or be prescreened by an artificial intelligence program. Hiring platforms and job boards, such as LinkedIn, also use “language-processing” AI tools to filter applicants.
Discrimination and Bias
At the same time, research shows that these same AI tools may be injecting bias and discrimination into the hiring process. Researchers at the University of California, Berkeley, estimate that AI decision-making systems have a 44% chance of embedding gender bias and a 26% chance of displaying both gender and racial bias, and note that they may be prone to screening out applicants with disabilities.
In addition, some AI tools have proven unable to identify quality candidates because of limitations in their training and the datasets they were built on. A 2021 Harvard Business School study found that 88% of executives know their AI tools screen out qualified candidates but continue to use them because they’re cost-effective.
As a result, legislation and guidelines are slowly being put into place to mitigate the effects of AI on hiring. For instance, the Equal Employment Opportunity Commission recently began an initiative to aid organizations in their use of AI to “ensure that these technologies are used fairly and consistently with federal equal employment opportunity laws.” Meanwhile, New York City’s law on AI, requiring bias audits for automated employment decision tools, will go into effect in July.
HR leaders and the C-suite need to be aware of the ethics of using AI tools to make hiring decisions. The Harvard Business Review said, “It is important to create internal processes based on how one’s organization defines fairness in algorithmic outcomes, as well as setting standards for how transparent and explainable AI decisions within the organization need to be.” Others recommend that people always have the final say in AI decision-making, adding a human touch and double-checking the work that’s been done.
By Mark Feffer
Mark Feffer is executive editor of RecruitingDaily and the HCM Technology Report. He’s written for TechTarget, HR Magazine, SHRM, Dice Insights, TLNT.com and TalentCulture, as well as Dow Jones, Bloomberg and Staffing Industry Analysts. He likes schnauzers, sailing and Kentucky-distilled beverages.