Bias in the hiring process is prevalent and hard to eradicate. Even computer programs designed to eliminate human subjectivity sometimes fail.
Scientists have found evidence that data-driven recruitment algorithms have the potential to learn our prejudices. Unless carefully monitored, their filters detect patterns of underrepresentation and reproduce them, perpetuating biases against already disadvantaged groups such as older and disabled workers, women and ethnic minorities.
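The mechanism the scientists describe can be sketched in a few lines. This is an illustrative toy, not any vendor's actual system: a naive screening "model" trained on historical hiring decisions simply learns each group's past hire rate, so candidates from historically under-hired groups score lower regardless of individual merit.

```python
from collections import Counter

def train_screen(past_decisions):
    """past_decisions: list of (group, hired) pairs from historical data."""
    applied = Counter(group for group, _ in past_decisions)
    hired = Counter(group for group, was_hired in past_decisions if was_hired)
    # The "model" is nothing but the historical hire rate per group.
    return {group: hired[group] / applied[group] for group in applied}

def score(model, group):
    # A candidate inherits their group's historical rate:
    # past under-representation becomes future rejection.
    return model.get(group, 0.0)

# Hypothetical history in which equally qualified women were hired less often:
history = ([("men", True)] * 60 + [("men", False)] * 40
           + [("women", True)] * 30 + [("women", False)] * 70)
model = train_screen(history)
print(score(model, "men"), score(model, "women"))  # 0.6 0.3
```

Real recruitment algorithms are far more sophisticated, but the failure mode is the same: unless the training data is corrected or the model is constrained, the past pattern is reproduced.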
The deep-rooted nature of human bias has led some academics and consultants to question the value of diversity training that pins its hopes on educating people to rise above their prejudices. In her book What Works, Professor Iris Bohnet, a behavioural economist at Harvard University, argues that rather than trying to alter people, employers should change their processes to limit the opportunity for bias.
One company experimenting with different ways of making its processes and practices bias-proof is Vodafone, the telecoms group. Catalina Schveninger, global head of resourcing, highlights a pilot the company is running, initially in India, to test the effect of removing gender from the CVs of job applicants.
Historically, local managers had assumed they were failing to appoint women into tech roles because there were not enough qualified women to recruit. But the data told another story. Plenty of highly qualified women were applying but were not getting interviews. “If the [Indian] pilot shows that by [gender] blinding CVs we can move the needle then we will share the results and encourage other markets to adopt the same practice,” says Ms Schveninger.
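In its simplest form, gender-blinding a CV means stripping titles and pronouns before a reviewer sees it. The sketch below is an assumed, minimal illustration; the term list and regex are the author's own, not Vodafone's implementation, which would also need to handle names and other indirect signals.

```python
import re

# Illustrative sample of gendered markers; a real system would need
# a far broader list plus name redaction.
GENDERED_TERMS = [
    r"\b(mr|mrs|ms|miss)\b\.?",          # titles, with optional full stop
    r"\b(he|she|his|her|hers|him)\b",    # pronouns
    r"\b(male|female)\b",
]

def blind_cv(text: str) -> str:
    """Replace gendered words and titles with a neutral placeholder."""
    for pattern in GENDERED_TERMS:
        text = re.sub(pattern, "[redacted]", text, flags=re.IGNORECASE)
    return text

print(blind_cv("Ms Rao led her team of 12 engineers."))
# "[redacted] Rao led [redacted] team of 12 engineers."
```

The point of the pilot is precisely that such a small mechanical change to the process, rather than to the people reviewing CVs, may be enough to move interview rates.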
At professional services firm KPMG, crunching the numbers on internal promotions revealed that proportionately more men than women were being promoted to senior roles. This produced a gender imbalance that worsened with each step up.
However, this was not simply a matter of male bosses appointing men in their own image. There was also a gender dynamic. “Where the men would apply for a role if they had 80 per cent of the [required] skills, women would think they were missing 20 per cent and not bother,” says Martin Blackburn, people director at KPMG UK. Now when a promotion is advertised, line managers are encouraged to check whether their high-potential female colleagues have applied, and if not ask why.
The analysis highlighted that men holding job offers from competitors were more likely than women to ask for and receive a financial bonus to prevent them leaving, and this was contributing to a gender pay gap. Rather than offering money, line managers are now expected to offer career development — and if that does not work, to let people go.
Trying to limit bias with hard-hitting training can have the opposite effect. Studies suggest that if people feel coerced into changing their opinions, their biases may become more entrenched, provoking a backlash.
At advertising agency Dentsu Aegis Network — whose Japanese parent company is being investigated by the government following an overworked employee’s suicide in 2015 — managers have been taking part in a form of bias training that tries to engage people rather than blame them. Watched by an audience, actors recreate stories of workplace bias gathered from the business. As the scene unfolds, the onlookers are invited to discuss what is going on and then rewrite the script to produce a better outcome.
“The point isn’t to get people to accept that they have biases, but to get them to see [for themselves] that those biases have negative consequences for others,” says Theresa McHenry, HR director at Microsoft UK, which also uses the technique. The idea is that by teaching people decision-making disciplines — called “bias interrupters” — they will be better equipped to counter the brain’s tendency to fall back on the known and familiar when making choices.
Developments in supercomputing are enabling other methods of countering human blind spots and social disadvantage. At Vodafone, recruiters run job postings through Textio, a tool that sniffs out corporate jargon and words — such as “competitive” and “drive” — that research suggests can put off female applicants.
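The core idea behind such tools can be illustrated with a simple scan. The word list below is a tiny assumed sample (seeded with the two words the article mentions), not Textio's actual research-derived vocabulary or API.

```python
# Hypothetical sample of words research links to deterring female applicants.
FLAGGED_WORDS = {"competitive", "drive", "dominant", "aggressive", "rockstar", "ninja"}

def flag_posting(text: str) -> list[str]:
    """Return flagged words found in a job posting, in order of appearance."""
    found = []
    for token in text.lower().replace(",", " ").replace(".", " ").split():
        if token in FLAGGED_WORDS and token not in found:
            found.append(token)
    return found

print(flag_posting("We want a competitive self-starter with drive."))
# ['competitive', 'drive']
```

Commercial tools go well beyond keyword matching, scoring tone and phrasing against outcome data, but flagging known off-putting words is the starting point.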
The business is also trialling Headstart, an algorithm-driven recruitment platform, which matches graduates to job opportunities based on psychometrics and an analysis of mutual needs that does not hinge on which university the candidate attended. The technology contains some optional features that allow employers to prioritise applications from under-represented groups.
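One plausible way to match candidates to roles on psychometrics rather than university is profile similarity. The sketch below is an assumption about the general technique (cosine similarity plus an optional employer-configured boost), not Headstart's actual method.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length psychometric profiles."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_score(candidate_profile, role_profile, underrepresented=False, boost=0.05):
    """Similarity score, optionally boosted for under-represented groups."""
    score = cosine(candidate_profile, role_profile)
    # The optional prioritisation feature the article mentions, modelled
    # here (hypothetically) as a small additive boost.
    return score + (boost if underrepresented else 0.0)
```

Note that nothing in the score depends on which university the candidate attended; only the measured profile and, optionally, the prioritisation flag enter the calculation.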
Though the trial is still at an early stage, Ms Schveninger says she is starting to see promising candidates coming forward for interview from universities that were not previously on Vodafone’s radar.