Is AI in recruitment good for diversity?

As recruiters start to rely more and more on automated technologies to select and hire candidates, Equate Scotland wants employers to be aware of the impact artificial intelligence could be having on the diversity of the candidates selected.

Recruiting a new employee is a notoriously expensive and laborious process. By the time you take into consideration crafting a job description, advertising the role, screening applicants, selecting candidates, interviewing, making the final decision, waiting out a notice period and then getting your new employee settled in, it can reportedly cost an organisation anything between £12,000 and £30,000. So the idea that an employer can significantly reduce these costs, in both money and time spent selecting a candidate, through artificial intelligence is a no-brainer, right?

Well, not always. Especially if your organisation values diversity and recognises that diverse teams improve profitability, foster innovation and make for better team dynamics. The problem with artificial intelligence is that bias has not been eliminated from technology. At the end of the day we must remember that artificial intelligence is programmed by humans with biases of their own, and often by a programming workforce which is male-dominated.

In the first instance it may seem that using technology can reduce unconscious bias in the hiring process, as it means assumptions about a candidate based upon their name, CV or LinkedIn profile are not immediately made by the individuals sifting through applications, and that 'gut feeling' does not interfere with decision making. In theory, candidates would be chosen based on their skills and experience, not their gender, appearance or background. However, if the biases we have are programmed into the technology we use, we risk further embedding inequality and discrimination into the recruitment process rather than eliminating it.

What do we mean by this? AI tools are automated systems designed to work like the human brain. They have been developed to do the work and make decisions so humans don't have to. But to be able to do that, the technology needs to be 'trained'. This is done by feeding the technology very large data sets and asking it to make predictions based upon the information it receives. These predictions are compared to answers from humans, and the system is told whether they are 'right' or 'wrong'. Therefore AI replicates human decision making, which we know is influenced by conscious and unconscious bias.
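To make the mechanism concrete, here is a deliberately simplified sketch, with an invented toy data set, of how a system trained on historical hiring decisions inherits the bias in those decisions. The scoring method (word counts rather than a real machine-learning model) is our own illustration, not any specific vendor's tool:

```python
from collections import Counter

# Hypothetical historical data: past hires were mostly male,
# so words associated with women appear mainly in rejections.
hired = [
    "software engineer chess club captain",
    "developer rugby team java python",
    "engineer java maths society",
]
rejected = [
    "software engineer women's chess club captain",
    "developer women's technology society python",
]

def train(hired, rejected):
    """Score each word by how much more often it appears in hired CVs
    than in rejected ones -- the system 'learns' from past decisions."""
    h = Counter(word for cv in hired for word in cv.split())
    r = Counter(word for cv in rejected for word in cv.split())
    return {word: h[word] - r[word] for word in set(h) | set(r)}

def score(weights, cv):
    """Higher score = more like historically hired candidates."""
    return sum(weights.get(word, 0) for word in cv.split())

weights = train(hired, rejected)

# Two CVs with identical skills; one mentions "women's".
print(score(weights, "engineer python java"))          # 3
print(score(weights, "engineer python java women's"))  # 1
```

Nothing in the code mentions gender explicitly, yet the word "women's" ends up with a negative weight simply because of who was hired in the past. This is essentially the pattern reported in the Amazon case described below.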

Last year, Amazon had to scrap an artificial intelligence tool it had developed to automatically sort through CVs. After being trained with a decade's worth of CVs, predominantly from male candidates, the tool was found to favour men and penalise CVs that included the word 'women'. It marked down CVs containing references such as 'women's chess club captain' or 'member of women's technology society', as well as the names of all-female schools and universities candidates had attended.

More recently, Apple released its first branded credit card in partnership with the New York finance giant Goldman Sachs. Like any other credit card, it offers a credit limit based upon your income, financial history and credit rating. However, the technology determining an individual's credit limit has been accused of sexist decision making by users and tech entrepreneurs. Tech entrepreneur David Heinemeier Hansson tweeted that the card gave him 20 times the credit limit it gave his wife, and Apple co-founder Steve Wozniak tweeted that he too had been offered more credit than his wife, despite the couple sharing all their cards, accounts and assets. With an investigation currently underway, Apple and Goldman Sachs are yet to confirm whether the AI used is gender biased. However, the outdated stereotype that women do not manage money well may well be living on through artificial intelligence.

There is some technology working to support diversity in the recruitment process. There are tools available to help identify gender-biased language that has been shown to put women off applying for certain roles. Equate Scotland encourages employers to use these tools as a starting point. However, as with all aspects of recruitment, the human touch should not be lost. Language 'decoders' do not always take into consideration the context or tone of the writing or job description. Instead of relying on technology, at Equate Scotland we use our combined 14 years of experience of working to improve the representation of women in STEM to engage meaningfully with employers on their recruitment process. We do this by providing guidance on how to improve their job descriptions, making them read as more inclusive, taking tone and context into consideration, as well as identifying where unconscious bias may exist in the recruitment process.
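Most language decoders work on the principle sketched below: matching a job advert against lists of words that research has found to be masculine- or feminine-coded. The word lists here are short, invented examples, not the research-based lists real tools use, and the sketch also illustrates the limitation mentioned above: a flat word match cannot see tone or context, which is why human review still matters.

```python
import re

# Illustrative mini word lists only; real decoders draw on much
# longer, research-derived lists of coded terms.
MASCULINE_CODED = {"competitive", "dominant", "rockstar", "ninja", "aggressive"}
FEMININE_CODED = {"supportive", "collaborative", "nurturing", "interpersonal"}

def flag_coded_language(job_ad):
    """Return the masculine- and feminine-coded words found in an advert."""
    words = set(re.findall(r"[a-z']+", job_ad.lower()))
    return {
        "masculine": sorted(words & MASCULINE_CODED),
        "feminine": sorted(words & FEMININE_CODED),
    }

ad = "We need a competitive, dominant rockstar developer."
print(flag_coded_language(ad))
# {'masculine': ['competitive', 'dominant', 'rockstar'], 'feminine': []}
```

Note that the checker would flag "competitive" even in a sentence like "we offer a competitive salary", where the word is not describing the candidate at all; deciding whether a flagged word is actually off-putting in context is exactly the judgement a human reviewer adds.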

AI is a hugely important and growing part of our everyday world. That's why we want to see efforts to embed equality, diversity and ethics into its development, so it can be considered not just a technological advancement, but one that contributes to creating a better and fairer society. Whilst some AI tools are helpful and can be used in the recruitment process, there is still a lot of work to do before employers and recruiters can fully rely on them to make fair decisions. Mostly, that work needs to come from addressing unconscious bias in ourselves, and in those responsible for developing, programming and testing AI. That includes questioning what is informing our decision making, as well as considering how we could be doing things differently.