Companies are more likely to hire women for senior roles when they apply anonymously, according to a study of over 2,000 successful job applicants by recruitment software firm Applied.
The company found that using a hiring process that strips out identifying materials such as CVs and cover letters led to an almost 70 per cent rise in successful female applicants for leadership roles.
Instead of providing a CV, candidates answered a series of questions about how they would handle potential scenarios at work, took cognitive tests and sat standardised, structured interviews.
Questions that could reveal details like personal interests or "cultural fit" were not allowed.
The study looked at the gender identities of 2,260 candidates at organisations in Australia, the United States and the UK. It found that 52 per cent of the applicants hired for senior-level positions through this process were women, compared with an estimated global average of 31 per cent.
The reason, Applied's CEO Khyati Sundaram told Euronews Next, is all in our heads.
"There are biases that all of us have, and that's evolutionary," she said.
"We're all meant to have that and we're meant to take quick decisions in a lot of aspects of our life. But when you translate the same biases into a work context, it does have catastrophic issues".
The issue, Sundaram argued, is that no matter how good the intentions of hiring managers, no matter how strong an employer's commitment to equality and diversity in its workforce, the subconscious judgments people make are all but impossible to avoid.
"If I'm walking down the street, and there's a car hurtling towards me at 100 miles per hour, I will move out of its way and that is a bias. It is a shortcut. And I'm completely right to take that decision in that moment," she said.
"But if I apply a similar shortcut, when I have a CV that probably doesn't sound right - I can't pronounce their name, I don't understand which school they've gone to, they're probably from an ethnic minority - I'm not going to call them for an interview because I've already made a bad impression of what they can and can't achieve at work".
The skills for the job
The solution, Sundaram's company claims, is to take that sifting process out of human hands altogether.
Applied's hiring software presents candidates with three to five questions about their skills. It anonymises and randomises the answers, presenting recruiters with individual responses to which they can assign a score. A higher score means a candidate progresses to the next stage.
"You remove biases related to how people sound, where they've gone to school, where they've worked before. You're removing all kinds of personal identifiers from that decision infrastructure," Sundaram said.
"You're making a decision solely based on merit, or what should be merit, i.e. does this person have the skills for the job?"
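The flow the article describes, anonymising answers, randomising their order and scoring each one independently, can be sketched in a few lines of Python. This is a minimal illustration only; every function name, scoring rule and threshold below is hypothetical and is not Applied's actual implementation.

```python
import random
import statistics
import uuid

def anonymise(candidates):
    """Strip identifying fields, keeping only answers keyed by a random ID."""
    return {uuid.uuid4().hex: c["answers"] for c in candidates}

def score_round(answers_by_id, rate_answer):
    """Score answers question by question, in shuffled order each time."""
    scores = {cid: [] for cid in answers_by_id}
    n_questions = len(next(iter(answers_by_id.values())))
    for q in range(n_questions):
        # Reviewers see one question's answers at a time, in shuffled order,
        # so no candidate's answers are ever read as a single profile.
        order = list(answers_by_id)
        random.shuffle(order)
        for cid in order:
            scores[cid].append(rate_answer(q, answers_by_id[cid][q]))
    return {cid: statistics.mean(s) for cid, s in scores.items()}

# Toy usage: three candidates and a dummy reviewer that scores by length.
candidates = [
    {"name": "A", "answers": ["short", "a longer, fuller answer", "ok"]},
    {"name": "B", "answers": ["detailed scenario response", "thorough", "good plan"]},
    {"name": "C", "answers": ["hi", "fine", "yes"]},
]
anonymous = anonymise(candidates)
results = score_round(anonymous, lambda q, text: min(len(text) / 10, 5))
shortlist = [cid for cid, s in results.items() if s >= 0.9]
```

The point of the shuffle is that a reviewer compares answers to the same question side by side, rather than forming an overall impression of one person, which is where the shortcuts Sundaram describes tend to creep in.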
'The holy grail'
Algorithm-driven recruitment is nothing new but it is controversial. Previous attempts have ended up recreating the discrimination seen in human-led hiring decisions.
In 2018, Amazon was forced to retire an automated hiring tool that awarded candidates a score from one to five based on their CVs.
"Everyone wanted this holy grail," an Amazon source told Reuters at the time. "They literally wanted it to be an engine where I’m going to give you 100 resumes, it will spit out the top five, and we’ll hire those".
There was one problem: the artificial intelligence (AI) that sorted job applicants was trained on CVs from past Amazon applications, most of which came from men.
This effectively taught it to discriminate against women, leading it to downgrade CVs that featured the word "women's" and even to lower the scores of graduates of women's universities.
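This failure mode is easy to reproduce in miniature. The sketch below trains a naive keyword scorer on a fabricated, deliberately skewed "historical hires" dataset (none of this is Amazon's actual model or data): because past hires rarely contain the token "women's", the scorer learns a negative weight for it and ranks otherwise identical CVs apart.

```python
from collections import defaultdict

def train_keyword_scorer(history):
    """Learn a per-word weight: hire rate among CVs containing the word,
    minus the overall hire rate. A skewed history yields skewed weights."""
    overall = sum(hired for _, hired in history) / len(history)
    seen, hires = defaultdict(int), defaultdict(int)
    for text, hired in history:
        for word in set(text.lower().split()):
            seen[word] += 1
            hires[word] += hired
    return {w: hires[w] / seen[w] - overall for w in seen}

def score(cv, weights):
    return sum(weights.get(w, 0.0) for w in set(cv.lower().split()))

# Fabricated history: mostly male past applicants, and the few CVs
# mentioning "women's" happen not to have led to hires.
history = [
    ("captain chess club", 1),
    ("captain debate team", 1),
    ("software engineering intern", 1),
    ("captain women's chess club", 0),
    ("women's coding society lead", 0),
    ("debate team member", 0),
]
weights = train_keyword_scorer(history)
# Two CVs identical except for one word end up ranked apart.
gap = score("captain chess club", weights) - score("captain women's chess club", weights)
```

The model never sees a gender field; the bias arrives entirely through the training data, which is why auditing outcomes, not just inputs, matters.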
'Starting the conversation'
So, what can employers do to tackle a challenge of this scale?
"Even if they don't have budgets, or they don't have access to tech tools such as ourselves, they could review their hiring policies, so what are they testing on if they're using CVs? What kind of people aren't getting through the door? Are there biases that are in play here?" Sundaram said.
"Just starting the conversation internally is important for lots of different companies".
But while companies can start with looking inward at how - and who - they hire, the problem is society-wide, Sundaram told Euronews Next.
Last year, in recognition of the growth of automated hiring processes, New York City Council introduced a law requiring recruiters to carry out annual audits of their hiring software to check for bias.
But the regulations themselves have been accused of having their own bias: critics, including the think tank Center for Democracy and Technology, have said that while the new law covers characteristics like race and gender, it does not oblige employers to check for bias on grounds of age, disability or sexuality.
"I don't believe we've made a lot of progress, and that comes down to fighting this social construct. It is trying to deconstruct this entire social construct that has been with us for centuries," Sundaram said.
"And so how do we adapt to the new realities of what the workforce of 2050 is going to bring us? That's the conversation we need to have, a small part of that bigger, very wide conversation".