tl;dr: Diverse companies outperform non-diverse companies. Here’s one solid way to help diversify your workforce.
Promoting diversity in the workplace is a widely supported cause, and one that comes with quantifiable benefits for people and businesses. Aside from promoting a fairer workplace, increasing diversity has been extensively shown to improve a company’s ROI: Diverse companies consistently outperform their non-diverse peers.
While many companies have tried to increase diversity by improving internal culture, instituting unconscious-bias training, or implementing C-suite quotas, fewer have taken steps to remedy the problem where it often actually begins: the hiring process.
Numerous studies reveal bias against women and minorities in hiring, much of it unconscious. This bias is well documented when women seek jobs in STEM careers or raise funding from VC firms. Similar issues affect ethnic minorities: for example, studies have shown that resumes with traditionally African-American names are less likely to receive interview callbacks than those with traditionally Caucasian names.
Fixing this requires deliberately redesigning the hiring process. One proven approach is the “blind audition”: when orchestras began evaluating musicians behind a screen, judging them solely on their abilities rather than their background, gender, or ethnicity, female participation rose from 5% to 35%. A comparably drastic improvement in diversity could greatly benefit other industries.
How That Might Work
So how do we bring “blind auditions” to hiring? Pre-hire assessments, or tools that assess a candidate’s job aptitude at the earliest stages of the application process, are currently used by over 40% of US companies. These tools should function as the “blind auditions” for hiring, but unfortunately, many of the popular assessments used for hiring decisions are perpetuating bias.
How is this possible? The regulations set forth by an interagency committee led by the Equal Employment Opportunity Commission (EEOC) in 1978 had the direct goal of preventing assessments from showing adverse impact against women and minorities. However, the regulations carve out an exception: adverse impact, or bias, is allowed so long as it’s shown to be “job related.” As one expert in the field puts it: “Some of the biggest predictors of job performance also demonstrate increased levels of adverse impact [bias].”
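The 1978 guidelines include an informal test for adverse impact known as the “four-fifths rule”: if one group’s selection rate falls below 80% of the highest group’s rate, that is generally taken as evidence of adverse impact. A minimal sketch of the arithmetic, using hypothetical applicant numbers:

```python
# Four-fifths rule sketch, the applicant counts below are hypothetical.

def selection_rate(hired, applied):
    """Fraction of applicants from a group who were selected."""
    return hired / applied

def adverse_impact_ratio(group_rate, reference_rate):
    """Ratio of a group's selection rate to the highest group's rate."""
    return group_rate / reference_rate

men_rate = selection_rate(hired=60, applied=100)    # 0.60
women_rate = selection_rate(hired=40, applied=100)  # 0.40

ratio = adverse_impact_ratio(women_rate, men_rate)  # about 0.667
shows_adverse_impact = ratio < 0.8                  # True: fails the rule
```

An assessment producing these selection rates would flag adverse impact, yet under the exception above it can still be used if the criteria are shown to be “job related.”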
There is strong reason to believe the adverse impact is a methodology problem. Pre-hire assessments predict performance by comparing applicants to current high performers in a role. That would be fine if every job had equal representation across genders and ethnicities. But when a homogeneous population dominates an industry, its traits can be spuriously linked to top performance and nonetheless become baked into pre-hire assessments, perpetuating bias against minorities both structurally and legally.
Consider a highly oversimplified example. Among Fortune 500 CEOs, there are more men named John than there are women. A simple model predicting CEO potential might therefore treat being named John as an important predictor of CEO aptitude, and a stronger one than being female. That reflects the status quo, but it is exactly what we don’t want baked into future predictive models.
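The example above can be sketched in a few lines. With a made-up historical dataset (entirely hypothetical, just for illustration), a naive one-rule “model” that scores features by how often they appear among past CEOs ends up ranking a demographic proxy above a protected attribute:

```python
# Toy illustration with hypothetical data: a naive model trained on a
# skewed historical sample learns a spurious demographic proxy.

from collections import Counter

# Each past CEO represented as a set of observed features.
past_ceos = [
    {"named_john", "male"},
    {"named_john", "male"},
    {"named_john", "male"},
    {"male"},
    {"male"},
    {"female"},
]

# "Train": count how often each feature appears among past CEOs.
feature_counts = Counter(f for ceo in past_ceos for f in ceo)

# The naive model ranks "named_john" above "female" purely because the
# historical sample is skewed: the status quo is baked into the predictor.
print(feature_counts["named_john"] > feature_counts["female"])  # True
```

Nothing about being named John causes CEO success; the correlation exists only because of who held the role historically, which is precisely how bias gets encoded.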
Luckily, improvements in both assessment and analytics mean that pre-hire assessment technology can be both predictive and bias-free, providing the true “blind audition” for hiring. Two things are critical: using machine learning, with its focus on prediction rather than description, and feeding hiring models gender- and ethnicity-neutral data inputs. In the simple example above, we would remove the demographically biased data and look for the gender-neutral commonalities that still predict job success.
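One concrete piece of this, stripping demographic fields from candidate records before they reach any predictive model, can be sketched as follows. The field names here are hypothetical, not from any particular assessment:

```python
# Minimal sketch: remove demographic fields from a candidate record so
# only job-relevant measurements reach the model. Field names are made up.

DEMOGRAPHIC_FIELDS = {"name", "gender", "ethnicity", "age"}

def debias_inputs(candidate: dict) -> dict:
    """Return a copy of the record with demographic fields removed."""
    return {k: v for k, v in candidate.items() if k not in DEMOGRAPHIC_FIELDS}

candidate = {
    "name": "John",
    "gender": "male",
    "memory_score": 0.82,
    "risk_tolerance": 0.64,
}

features = debias_inputs(candidate)
print(sorted(features))  # ['memory_score', 'risk_tolerance']
```

Dropping explicit demographic fields is only the first step; other inputs can act as proxies for them, so residual bias in the resulting model still has to be measured and removed statistically.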
There is a crop of new technology startups looking to de-bias different parts of the hiring process.
(At pymetrics, we measure a candidate’s cognitive and emotional traits and compare them to top performers using 100% bias-free technology. We do this by using bias-free inputs, removing applicant demographic information, and using statistical tools to remove residual bias from the algorithms. We are able to provide the true blind audition for hiring: our final predictive model presents a gender- and ethnically-balanced set of candidates that is pre-screened to be a great fit for the role. And this is in keeping with EEOC guidelines prescribing the use of the least biased predictive technology.)
The data shows what we all know to be true: Improving diversity isn’t just the right thing to do, it’s the smart thing to do.
If companies want to hire people from all backgrounds, genders, and ethnicities, it’s time to adopt hiring technologies with the power to stop perpetuating bias and actively increase diversity.
Dr. Frida Polli, PhD, MBA is a Harvard- and MIT-trained award-winning neuroscientist turned CEO and co-founder of pymetrics. You can find her on Twitter here.