Friday, November 15, 2024

AI recruitment systems to be investigated over racial bias


Britain’s data protection regulator said it will investigate whether the use of artificial intelligence (AI) systems in recruitment results in the wrongful denial of opportunities on the basis of race.

The Information Commissioner’s Office said it is also considering the impact of the use of AI in recruitment on neurodiverse people who were not part of the testing of this software.

The investigation is part of ICO25 – a three-year plan setting out the watchdog’s regulatory approach and priorities.

The regulator’s decision comes amid concerns that the use of algorithms to sift through job applications affects employment opportunities for people from ethnic minorities, the Guardian reported.

The jobs website ZipRecruiter revealed to the newspaper that at least three-quarters of all CVs submitted for jobs in the US are read by algorithms.


“We will be investigating concerns over the use of algorithms to sift recruitment applications, which could be negatively impacting employment opportunities of those from diverse backgrounds,” the ICO said.

John Edwards, who took over as the UK’s information commissioner earlier this year, said the regulator would be “looking at the impact AI use could be having on groups of people who aren’t part of the testing for this software, such as neurodiverse people or people from ethnic minorities”.

David Leslie of The Alan Turing Institute, which focuses on data science and artificial intelligence, said: “The use of data-driven AI models in recruitment processes raises a host of thorny ethical issues, which demand forethought and diligent assessment on the part of both system designers and procurers.

“Most basically, predictive models that could be used to filter job applications through techniques of supervised machine learning run the risk of replicating, or even augmenting, patterns of discrimination and structural inequities that could be baked into the datasets used to train them,” Leslie told the Guardian.
