Using Artificial Intelligence to tackle unconscious bias


INTRO:
Can AI help to bring efficiency to the HR process?
Can it help to tackle affinity bias?
Will it eventually automate the hiring process?
Will we ever be able to trust an algorithm to take responsibility for a process that will, inevitably, be called into question time and time again?
Is it more trouble than it’s worth?
We LOVE this interview snippet because each of our speakers has a different opinion:
Lauren is all for it! She believes that AI could be fantastic, and longs for the day when an algorithm can eliminate affinity bias. Lewis is totally against it. He believes that only a human should be responsible for a process that heavily affects human lives. Sarah @Snap Inc. plays a wonderful ‘devil’s advocate’, calling upon her knowledge and experience to mediate the debate and offer her expert opinion.
TRANSCRIPTION:
Lewis: So, we’ve figured that firstly talking about it, and then taking action from the top down, is really important. And then putting the right procedures and protocols in place that think about people’s long-term career, regardless of background, colour, ethnicity, race, or whatever it might be. I’ve heard a lot of talk recently about AI being involved in the interview process in order to try and help with bias. What are your thoughts on that?
Sarah: So this actually reminded me of Amazon’s big HR project that was scrapped in 2015. To me, when I think about AI helping and bringing efficiency into the HR process, I do think there are gains to be had there. But do I think it’ll be a fully automated process? No! And if you read why it failed, it’s because, well, there are maybe other industries where the workforce is pretty equal, pretty diverse, and in those industries it might help. But in tech, which has been a male-dominated field, or in any field that is overly saturated with some type of person, whoever that may be, the data will inherently push those biases into the outcomes and the predictions that the algorithms produce. And so that’s the heart of the problem, and combating it is so hard.

And then on top of it, there’s the aspect of interpretability and explainability: what is the algorithm doing, and why is it ranking candidates in some particular way? Whether you know the bias exists or not, eventually it’ll come to light and you’re going to have to explain why the algorithm did what it did. That piece, I think, is something most companies will not want to own, because they know it will inherently be biased and it won’t create equal opportunities. So from that perspective, HR will always have a place. And the thing is, we see this across the board: it’s very rare to see situations where something is one hundred percent automated with no manual intervention.

So from a place of automating, let’s say there are systems that go in, read the resumes and highlight keywords, etc. That part can be automated: hey, is there a high match rate between what I’m looking for and what this resume is presenting? But then there are also issues with that, because sometimes the descriptions aren’t even put together by the hiring manager. So there is a matching problem from the get-go when the description is off, and then the resumes may be tailored so that you’re just matching for the job at that point.
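The keyword matching Sarah describes can be pictured with a tiny sketch. This is an illustrative Python snippet under our own assumptions, with hypothetical function names; it is not any vendor’s actual screening system, just a naive “match rate” between a job description and a resume.

import re

def tokens(text: str) -> set[str]:
    """Lowercase word tokens, ignoring punctuation."""
    return set(re.findall(r"[a-z]+", text.lower()))

def match_rate(job_description: str, resume: str) -> float:
    """Share of job-description keywords that also appear in the resume."""
    jd, cv = tokens(job_description), tokens(resume)
    return len(jd & cv) / len(jd) if jd else 0.0

# If the description itself is off (or gendered), the score inherits that
# problem: it rewards resumes that echo the same wording back.
jd = "We need a rockstar engineer who dominates aggressive deadlines"
cv_a = "Engineer who thrives under aggressive deadlines, a real rockstar"
cv_b = "Collaborative engineer experienced in delivering reliable software"
print(match_rate(jd, cv_a))  # high score
print(match_rate(jd, cv_b))  # low score, despite a plausibly strong candidate

Lauren’s point follows directly: if the description carries gendered wording, the “best matches” simply echo it back.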
Lauren: Yeah, for sure. And the interesting thing about the job description as well, Lewis, you maybe know a bit more about this because you worked with a company that talks about gendered language, and how it cues you in to selecting the right words. And that’s the same problem: you teach a system to match from a job description to a CV, and if one is gendered, it’s going to find it in the other, you know, and it just continues the same bias throughout the system. But I do think AI has the potential to be amazing.
Sarah: And that’s why I feel like, I think they took it a step further and said maybe AI could take in the video footage of the interviews, take visual cues from how someone is responding and the things they’re saying, and put that through a system. I don’t know. Again, it’ll all depend on what people are putting into the algorithm in terms of the things they think are good predictors. And people have different personalities. Like, what if I’m shy? Is that going to go against me?