Analysis

December 17, 2018

Will AI take human bias out of hiring?

Britain's Applied says "No". France's Clustree says "Yes". And America's Pymetrics has ploughed into Europe anyway.


Kitty Knowles

6 min read

Photo by Brooke Lark on Unsplash

Designing algorithms is easy to do badly—just look at Amazon. The American retail giant failed spectacularly this year when its experimental recruiting AI turned out to be biased against female job applicants. This happens when a machine inadvertently learns to mechanise biases already present in the employee data it is trained on, explains British entrepreneur Kate Glazebrook. “It is easy to come up with something which appears to deliver the outcome you're looking for but has phenomenally bad unintended consequences,” she says.

Glazebrook is co-founder and CEO at Applied, a startup spun out of Britain's Behavioural Insights Team (an organisation set up to improve government policy and services). Since 2016, Applied has used behavioural science to mitigate human biases and support decision-making across 65,000 job candidates for more than 60 organisations, including the British Government, Hilton Hotels and publisher Penguin Random House. It closed a £1.5m investment round in November 2018.

Kate Glazebrook presenting Applied at the DWP-led Women In Digital conference.

Glazebrook—who previously completed a Masters in public policy at Harvard University—describes Applied's approach as “redesigning the decision process, rather than using AI to determine the outcome”. The platform doesn't rely on algorithms to identify and predict high performance. Instead, it implements more than 30 anti-bias “nudges” to counteract common cognitive biases. For example, to curb the “Halo/Horns effect” (when one detail about a candidate unduly influences how the rest are judged), Applied presents candidates' answers in batches so that reviewers compare candidates on one topic at a time. It also randomises the order in which answers are shown to counter decision fatigue (when earlier answers are judged more favourably than later ones). And it removes distractions such as names and socio-demographic details, which can otherwise lead hiring managers to pick clones of themselves (known as affinity bias).
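None of this requires machine learning. As a rough illustration (not Applied's actual system; the field names and data are invented), a review workflow built around those nudges might simply strip identifying fields and serve each question's answers in a shuffled batch:

```python
import random

def anonymise(application):
    """Drop identifying details that trigger affinity bias (illustrative fields only)."""
    return {k: v for k, v in application.items()
            if k not in {"name", "age", "gender", "school"}}

def review_batches(applications, questions, seed=None):
    """Serve answers one question at a time, shuffled per question, so reviewers
    compare candidates topic by topic and no application is always read last."""
    rng = random.Random(seed)
    for question in questions:
        answers = [(app["id"], app["answers"][question]) for app in applications]
        rng.shuffle(answers)
        yield question, answers

applications = [
    anonymise({"id": 1, "name": "A. Smith", "answers": {"q1": "answer one", "q2": "answer two"}}),
    anonymise({"id": 2, "name": "B. Jones", "answers": {"q1": "answer one", "q2": "answer two"}}),
]
for question, batch in review_batches(applications, ["q1", "q2"]):
    print(question, [candidate_id for candidate_id, _ in batch])
```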

60% of candidates selected using Applied would have ended up in the ‘no’ pile based on their CVs alone.

Glazebrook’s research within the Behavioural Insights Team suggests that 60% of candidates selected using Applied would have ended up in the ‘no’ pile based on their CVs alone. Applied could adopt AI to streamline human decision making, adds Glazebrook, but she remains wary of its flaws. “The test data you build an algorithm from itself needs to be free from bias: I'm not aware of any existing data set that is free from bias,” she concludes.

Despite this, startups across Europe are determined to use artificial intelligence to promote, rather than prevent, greater employee diversity.

Bénédicte de Raphélis Soissan, founder of Clustree.

Bénédicte de Raphélis Soissan is the founder of the French AI firm Clustree, which focuses on internal hiring and promotion. The Paris-based startup uses natural language processing (NLP) to predict skills based on job titles, and develops deep learning algorithms to extract insights from candidate profiles. It also automatically recalibrates its recommendations based on user feedback and, like Applied, blinds its recommendations to details such as gender, age or educational prestige. Clustree has already supported the internal hiring of 200,000 employees across billion-euro companies like pharmaceuticals group Sanofi, telecoms operator Orange, and Carrefour, Europe's largest retailer.

I realised that HR people didn't use data to put skills at the centre of decisions.

De Raphélis Soissan—who has two STEM Masters degrees—says she realised she needed to transform the hiring landscape after years spent working in small firms left her feeling stuck. “I was a woman, I had no big logos in my education, or among my employers—I was convinced that my resume alone would never lead to an interview,” she tells Sifted. In response, she put job titles to one side and instead analysed what skills she had in common with 500 other professional profiles. “That's how I had the idea of Clustree,” she explains. “I realised that HR people didn't use data to put skills at the centre of decisions.”

Clustree's algorithms analyse 250 million career paths to help match existing employees to internal job openings, development and mentorship opportunities. But with such a big back catalogue of potentially biased data, surely Clustree risks falling into the Amazon trap? De Raphélis Soissan disagrees. Just one of Clustree's recommendation algorithms is pre-trained on that existing pool of career-path data, she explains, while seven others assess independent parts of an employee profile, such as job title, job level and language used. She also points to Clustree's NLP algorithms, which detect “invisible data” not listed on a CV (a marketing role in the digital industries might equate to strong data mapping skills, for example). “We are convinced at Clustree that although today we define an employee by their job title, tomorrow we will define an employee with a basket of skills,” she says.
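Clustree does not publish its models, but the structure described above (several independent scorers combined, rather than a single model trained end-to-end on historical hiring outcomes) can be sketched roughly. The skill map, weights and levels below are invented for illustration only:

```python
# Toy ensemble in the spirit of "several independent scorers"; not Clustree's actual models.
TITLE_SKILLS = {
    "digital marketing manager": {"data mapping", "campaign analytics", "seo"},
    "data analyst": {"sql", "data mapping", "reporting"},
}

def skills_from_title(job_title):
    """NLP stand-in: infer 'invisible' skills implied by a job title."""
    return TITLE_SKILLS.get(job_title.lower(), set())

def skill_overlap_score(candidate_skills, role_skills):
    """Share of the role's required skills the candidate covers."""
    return len(candidate_skills & role_skills) / max(len(role_skills), 1)

def seniority_score(candidate_level, role_level):
    """Independent scorer on job level alone (1.0 = same level)."""
    return 1.0 / (1 + abs(candidate_level - role_level))

def match_score(candidate, role, weights=(0.7, 0.3)):
    """Combine independent scorers; no historical hiring outcomes involved."""
    skills = candidate["skills"] | skills_from_title(candidate["title"])
    return (weights[0] * skill_overlap_score(skills, role["skills"])
            + weights[1] * seniority_score(candidate["level"], role["level"]))

candidate = {"title": "Digital Marketing Manager", "skills": {"seo"}, "level": 3}
role = {"skills": {"data mapping", "campaign analytics"}, "level": 4}
print(round(match_score(candidate, role), 2))  # 0.85
```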

One of the biggest pioneers of AI hiring in Europe is the American firm Pymetrics (which has raised $56.6m to date). The New York firm, founded by Julie Yoo and Frida Polli, trains its algorithms on results from neuroscience assessment games in order to pinpoint job candidates who share prevalent and distinctive traits held by existing high performers. It’s already being used in Europe by professional services company Accenture, HR consultancy firm Randstad, and Danish dairy producer Arla Foods, as well as Unilever, LinkedIn and Tesla in the US (Elon Musk thinks college degrees are overrated, remember).

Alex Terry, Head of Occupational Psychology EMEA at Pymetrics, says that the firm's exercises only measure traits that are stable across genders and ethnicities and do not change over time (things like risk-taking indicators, cognitive processing consistency or memory capacity). The “ideal” set of traits prioritised is based on test results from existing employees. However, Terry argues, Pymetrics avoids too much homogeneity because it tracks a complex range of data points, and there are multiple ways in which a candidate can “fit” the algorithm, which promotes diversity. Over time, successful candidates may bring their own idiosyncrasies with them, and could even alter the algorithm. “You're always allowing surprises through and the surprises that come through may set the next parameters,” Terry explains.
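One way to read “multiple ways to fit” is that candidates are compared against several high-performer profiles rather than a single average. The sketch below is an interpretation with invented traits and thresholds, not Pymetrics' actual model:

```python
import math

# Illustrative only: "fit" as resemblance to any of several archetypes, not one mould.
ARCHETYPES = {
    "deliberate": {"risk_taking": 0.2, "processing_consistency": 0.9, "memory": 0.7},
    "fast_mover": {"risk_taking": 0.8, "processing_consistency": 0.6, "memory": 0.5},
}

def distance(a, b):
    """Euclidean distance between two trait profiles with the same keys."""
    return math.sqrt(sum((a[t] - b[t]) ** 2 for t in a))

def fits(candidate_traits, threshold=0.35):
    """A candidate passes if they resemble any archetype, not a single average profile."""
    return any(distance(candidate_traits, archetype) < threshold
               for archetype in ARCHETYPES.values())

print(fits({"risk_taking": 0.75, "processing_consistency": 0.55, "memory": 0.5}))  # True
```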

Unlike Amazon, Pymetrics also pre-tests its algorithms for potential bias (using the US Equal Employment Opportunity Commission's “four-fifths” rule) before launch, removing any learned patterns that would reinforce the “pale, male and stale” mould. “If we detect any bias by race or gender we go back to square one… we will always trade off the power of the prediction to ensure it's bias-free,” she explains. Later stages involving human assessment will also always be pivotal to Pymetrics' success. “Those at the next stage of the process have to be open-minded, otherwise it undoes all our good work,” Terry notes.
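The four-fifths rule itself is simple to state: no group's selection rate should fall below 80% of the best-performing group's rate. A minimal check, with made-up numbers (Pymetrics' real audit is more involved), looks like this:

```python
def four_fifths_check(selected, applied):
    """EEOC four-fifths (adverse impact) check.

    selected/applied: dicts mapping group name -> counts. Each group's
    selection rate should be at least 80% of the highest group's rate."""
    rates = {g: selected[g] / applied[g] for g in applied if applied[g]}
    best = max(rates.values())
    return {g: (round(rate / best, 2), rate / best >= 0.8) for g, rate in rates.items()}

# Hypothetical numbers for illustration
print(four_fifths_check(selected={"men": 48, "women": 30},
                        applied={"men": 120, "women": 100}))
# {'men': (1.0, True), 'women': (0.75, False)}
```

In this example women are selected at 75% of the men's rate, so the screen fails the check and, in Pymetrics' terms, the model would go “back to square one”.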

What do Europe's AI leaders make of all this? Machine learning expert Loubna Bouarfa, a member of the European Commission's High-Level Expert Group on Artificial Intelligence, says that B2B AI businesses like Clustree and Pymetrics will likely thrive, but warns that European firms will ultimately be slower than their American counterparts to adopt AI in hiring. This, she argues, is a good thing: “Even if companies are using a layer of AI algorithms, there will still be a layer of human decision. We don't move to automation right away, especially when it is about individuals and their needs,” she says.

If done successfully, AI hiring could have a special role to play in Europe in supporting the cross-pollination of talent across countries divided by cultural and language barriers. “We need to look outside of our comfort zones,” says Bouarfa. “If we want 50-50 gender targets, we can design systems that support us to reach those outcomes.”
