- Talent Management
Recently, scandals at Uber and other tech companies have drawn renewed attention to the fact that gender equity in the workplace remains an ongoing struggle, and one that won’t be easily overcome with good intentions or even progressive policies. A key problem is that gender bias breeds gender bias. Men are more likely to hire other men, which perpetuates the gender gap in traditionally male-dominated fields. Unfortunately, in some sectors, including tech, women are more likely to hire men too. This has led some tech companies to experiment with recruitment processes that strip gender out of hiring. The problem is that so far, these extreme measures have not yielded the desired results.
Decades ago, orchestras were populated primarily by men. Indeed, even in the 1970s, women made up only about 5% of many U.S. orchestras, and even highly accomplished musicians were not making the final cut. In the 1970s and 1980s, anonymous auditions (once known as “blind hiring”) increasingly became the norm. As a result, women now hold a larger share of seats in the nation’s major orchestras, though they are still outnumbered by men.
More recently, the tech sector has started to experiment with something similar: a form of interviewing that conceals not only candidates’ bodies but, in this case, also their voices. Not unlike the voice modulators used when a protected witness appears on a crime show, what recruiters hear is a voice that sounds machine-generated rather than human. In theory, taking the pitch out of a human voice should eliminate cues to the speaker’s gender, but in practice, this is not necessarily the case.
Over the past two years, several hiring platforms and tech companies have attempted to use voice modulation to address gender bias. So far, the results suggest the experiment is not working.
interviewing.io is a platform that enables people to practice technical interviewing anonymously and find jobs based on their interview performance, not their resumes. As explained in a 2016 post, to tackle gender bias in tech, interviewing.io made men sound like women and vice versa. By their own account, the experiment did not work out as planned:
The setup for our experiment was simple. Every Tuesday evening at 7 PM Pacific, interviewing.io hosts what we call practice rounds. In these practice rounds, anyone with an account can show up, get matched with an interviewer, and go to town. And during a few of these rounds, we decided to see what would happen to interviewees’ performance when we started messing with their perceived genders…We ended up with 234 total interviews (roughly 2/3 male and 1/3 female interviewees)…After running the experiment, we ended up with some rather surprising results. Contrary to what we expected (and probably contrary to what you expected as well!), masking gender had no effect on interview performance with respect to any of the scoring criteria (would advance to next round, technical ability, problem solving ability). If anything, we started to notice some trends in the opposite direction of what we expected: for technical ability, it appeared that men who were modulated to sound like women did a bit better than unmodulated men and that women who were modulated to sound like men did a bit worse than unmodulated women.
interviewing.io speculated that their failed experiment may have to do with another aspect of their platform. While men, even men who consistently interview badly, persist on the platform, women who interview badly are seven times more likely than men to simply leave the platform. One theory for the men’s success in the voice modulation experiment is that the men on interviewing.io simply have more interviewing experience. However, other tech companies that have experimented with voice modulation suggest that linguistics may offer a better explanation.
When asked about collaborative projects, women interviewees tend to use more collective language (e.g., “we developed this program…”) while men tend to use language that conveys their leadership (e.g., “I developed this program…”). While using collective pronouns should work in one’s favor when asked about collaboration, in reality, the opposite holds true. Both women and men interviewers tend to rate applicants higher when they use language suggesting they were entirely in charge, and this holds true even when the question is explicitly about collaborative projects.
What is the takeaway from the tech world’s most recent attempt to solve its gender equity issues? Bias, it appears, runs deep and structures both women’s and men’s perspectives on hiring. In the end, awareness training may prove more useful than algorithms or anonymous hiring practices.