On Wednesday 5th May, Juggle founder Romanie Thomas hosted a panel discussion as part of CIPD’s 2021 Diversity and Inclusion Conference.
Alongside her were: Dr Andrew Marcinko, a Teaching Fellow and Behavioural Scientist at Durham University; Tali Shlomo, Inclusion and Diversity Consultant / VP EMEA at SwissRe; and Louise Hooper, Human Rights Barrister at Garden Court Chambers.
Despite being hosted online due to the pandemic, the discussion was as enlightening and nuanced as ever — and translated easily into best practices for creating inclusive working environments.
The panel focused on one burning question in the hiring space: “How can AI prevent unconscious bias in recruitment?” Here are their takes.
What is unconscious bias, and what are we dissecting?
Andrew: The human brain didn’t evolve to consciously deal with the abundance of information we are processing in the 21st century. To get around this, it essentially ‘automates’ some of those processes to navigate through the complex world we live in. This is what results in schemas, stereotypes and biases. And that’s what we need to fight when recruiting.
“We all subconsciously think that the best person to do the job is ‘someone like us’.”
How do you see bias manifest itself in your world?
Tali: It’s a habit, an automatic mental shortcut to make decisions. Affinity bias happens frequently in recruitment — people look for people who are similar to themselves (hobbies, school, etc.). Then there’s confirmation bias — where you see the results you expect. Biases crop up everywhere and can be a problem even if you are aware of them.
How can AI mitigate this?
Andrew: We can’t eliminate unconscious bias. Think of walking to a shop – you don’t consciously process each step. It’s the same cognitive process with biases. We automatically take cues and use them to help us automate aspects of human interactions.
AI has some clear benefits. It allows us to process significantly more applications than a human could, and to access a broader talent pool. But consider the systemic side: AI and machine learning are built by humans, and ultimately try to replicate human thought and decision-making. If the teams building them include no women or minority programmers, the AI they design will reinforce the same systemic biases that are holding us back today.
There are possibilities, but it is not a ‘cure all’.
“AI is only as unbiased as its human developers.”
Louise: It’s vital to have diverse teams looking at the data, asking questions and building the technology. If it is just white men building AI then they are at risk of perpetuating bias, because we all subconsciously think that the best person to do the job is ‘someone like us’.
Tali: We know teams learn better when they are diverse. One part of this is attracting and recruiting diverse candidates. More importantly, we then need the right environment for those candidates to feel included. Yes, AI has a role. But human compassion and care also need to feature here.
Are we premature in thinking AI can solve some of these problems?
Tali: We can accelerate change with technology, but we need to continue to ask ourselves the important questions: “Will this have the right impact? What are the limitations? Is it good enough?”
Thanks again to Tali, Andrew and Louise for their great insight on AI in 2021. With the Government recently announcing it was scrapping unconscious bias training, it’s crucial the private sector does everything it can to reduce prejudice in the workplace.
As the panel explored, AI will be one part of this fight. But there is still so much more that can be done. For more insight on D&I from Juggle, take a look at our blog on the problems facing Chief Diversity Officers — or our broader Diversity Blog.
Sign up to Juggle for free today to get access to a diverse pool of candidates within 24 hours.