
Could AI be perpetuating discrimination in the job hiring process?

By Kace O'Neill | 6 minute read

Artificial intelligence has been deployed as a tool to assist in the recruitment process. Now, accusations have arisen that AI bias is resulting in blatant discrimination.

AI discrimination lawsuit

Workday is facing renewed claims that its AI tools directly discriminate against job applicants at the many major companies that recruit through its platform. Derek Mobley, who says he was turned down for more than 100 jobs he applied for through Workday’s platform, filed a complaint in San Francisco federal court.

Mr Mobley, a black man, said in the complaint that by using Workday’s platform for recruitment, employers essentially hand over their authority to make hiring decisions to the company, which enables such discrimination to occur.

“Because there are no guardrails to regulate Workday’s conduct, the algorithmic decision-making tools it utilises to screen out applicants provide a ready mechanism for discrimination,” said Mr Mobley’s lawyers in the complaint.

The company has denied wrongdoing and said when the lawsuit was filed that it engages in an ongoing “risk-based review process” to ensure that its products comply with applicable laws and do not discriminate.

A wide range of employers and recruitment agencies use AI in the hiring process, with roughly 80 per cent of US employers reporting that they do so. That includes software made by Workday and other firms that can review large volumes of job applications and screen out candidates for a variety of reasons.

Algorithmic bias

The process of screening applicants out is where problems can arise. Of course, AI-enabled recruitment has the potential to enhance quality and efficiency and reduce transactional work. However, algorithmic bias can result in discriminatory hiring practices based on gender, race, and colour, which is what Mr Mobley alleges in his lawsuit.

Algorithmic bias refers to systematic and replicable errors in computer systems that lead to unequal and discriminatory hiring practices based on legally protected characteristics, such as race and gender.

Zhisheng Chen, an academic at Nanjing University of Aeronautics and Astronautics, explained that the bias begins with the data itself: “The primary source of algorithmic bias lies in partial historical data. The personal preferences of algorithm engineers also contribute to algorithmic bias.”

“Despite algorithms aiming for objectivity and clarity in their procedures, they can become biased when they receive partial input data from humans. Modern algorithms may appear neutral but can disproportionately harm protected class members, posing the risk of agentic discrimination,” said Mr Chen.
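
Mr Chen’s point about partial historical data can be illustrated with a toy simulation. The sketch below is purely hypothetical: it uses synthetic data, assumes the numpy and scikit-learn libraries, and does not describe Workday’s or any other vendor’s actual system. A screening model is trained on past hiring decisions that favoured one group, and even though the protected attribute is never given to the model, a correlated “proxy” feature lets the old bias carry through into its recommendations.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected attribute (0 = group A, 1 = group B), deliberately never shown to the model.
group = rng.integers(0, 2, size=n)

# An apparently neutral feature (think postcode or school) that correlates with
# group membership and so acts as a proxy for it.
proxy = group + rng.normal(0.0, 0.5, size=n)

# Genuine job-relevant skill, identically distributed in both groups.
skill = rng.normal(0.0, 1.0, size=n)

# Historical hiring labels that encode a past preference for group A --
# "partial historical data" in Mr Chen's sense.
hired = (skill - 0.8 * group + rng.normal(0.0, 0.5, size=n) > 0).astype(int)

# Train a screening model on the neutral-looking features only.
features = np.column_stack([proxy, skill])
model = LogisticRegression().fit(features, hired)
recommended = model.predict(features)

# Despite equal skill distributions, the simulated model recommends group A
# at a higher rate than group B.
for g, label in ((0, "group A"), (1, "group B")):
    print(label, "recommendation rate:", round(float(recommended[group == g].mean()), 2))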

Strategies can be deployed to help ensure such algorithmic discrimination does not occur. Technical measures, such as constructing unbiased data sets and enhancing algorithmic transparency, can be implemented to combat discriminatory hiring outcomes.
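
One concrete example of such a transparency measure is a routine adverse-impact audit of screening outcomes. The minimal sketch below, in plain Python with hypothetical numbers, applies the “four-fifths rule” from US Equal Employment Opportunity Commission guidance, which flags any group whose selection rate falls below 80 per cent of the highest group’s rate.

from collections import defaultdict

def adverse_impact_ratios(outcomes, threshold=0.8):
    # outcomes: iterable of (group, selected) pairs, where selected indicates
    # whether the applicant passed the automated screen.
    counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
    for group, selected in outcomes:
        counts[group][0] += int(selected)
        counts[group][1] += 1
    rates = {g: sel / total for g, (sel, total) in counts.items()}
    best = max(rates.values())
    # Impact ratio relative to the most-selected group; falling below the
    # threshold is a conventional signal to review the screen.
    return {g: (rate / best, rate / best >= threshold) for g, rate in rates.items()}

# Hypothetical screening outcomes for two applicant groups.
outcomes = ([("A", True)] * 60 + [("A", False)] * 40
            + [("B", True)] * 35 + [("B", False)] * 65)
for group, (ratio, ok) in adverse_impact_ratios(outcomes).items():
    print("group", group, "impact ratio", round(ratio, 2), "ok" if ok else "flag for review")

In practice, an audit like this would run over real screening logs and feed into the kind of external, third-party testing Mr Chen goes on to recommend.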

Workday says it continually self-regulates its AI tools through an ongoing “risk-based review process”. However, Mr Chen believes self-regulation alone is not enough to tackle the issue.

“Although self-regulation can help reduce discrimination and influence lawmakers, it has potential drawbacks. Self-regulation lacks binding power, necessitating external oversight through third-party testing and the development of AI principles, laws, and regulations by external agencies,” said Mr Chen.

Third-party oversight is imperative to removing algorithmic bias because it provides accountability. It is equally important, however, to dispel the entrenched belief that artificial intelligence offers inherently “objective” and “neutral” practices, especially when those systems can reflect the biases we humans hold.

If not held to account, algorithms may continue to exacerbate inequalities and perpetuate discrimination against minority groups in the hiring process.

RELATED TERMS

Discrimination

According to the Australian Human Rights Commission, discrimination occurs when one person or group of people is treated less favourably than another because of their background or certain personal characteristics. It is also discrimination when a rule or policy that applies to everyone unfairly disadvantages some people because of a shared personal characteristic.

Kace O'Neill

Kace O'Neill is a Graduate Journalist for HR Leader. Kace studied Media Communications and Māori studies at the University of Otago and has a passion for sports and storytelling.