For far too long, diversity hiring has been synonymous with “meeting a quota.” Most companies have used a “plug and play” approach to identify candidates who have the necessary characteristics of a diversity hire, whether it’s based on race, ethnicity or gender. Typically, these candidates have been sourced from major universities.
Not only is this manner of recruiting for diversity offensive, it’s also ineffective. By tapping only a university talent pool, organizations deny themselves talent that doesn’t come through those institutions. Furthermore, because most major universities are accessible only to a privileged few, talent recruited in this way is rarely truly diverse. For a long time, employers have been fishing in a pool of 1% of the talent and ignoring the other 99%.
AI assessments can help close these gaps. Truly diverse talent exists in unexpected places, and AI makes finding that talent possible. Here’s how.
Valuing Skills Over Background
Online assessments that leverage predictive analytics can identify candidates with the skills needed for a specific job, giving you a truer sense of the best candidates for a position. Furthermore, AI is capable of sourcing from a talent pool that better reflects the demographics of an organization’s client and customer base. There is proven value in building workforces that reflect the demographics of your customers.
These assessments vastly expand the talent pool compared with traditional diversity recruiting, and give currently overlooked candidates a chance, particularly if the assessments are online and mobile-friendly.
Combating Implicit Bias
AI-based assessments are scored automatically via a set of objective algorithms based on behavioral characteristics linked to the role. They mimic ordinary job interviews, and 95% of the time move forward the same candidates that expert interviewers would. Because scoring is based exclusively on skills and capabilities, the AI can identify the best candidate for each position.
But unlike human interviewers, AI doesn’t get tired, bored or disengaged. These elements are impossible to remove from human scoring, and often contribute to implicit biases. Even the least-biased recruiter can make a mistake when hand-sorting hundreds of applications. Furthermore, AI doesn’t bring any assumptions into play — a human recruiter may privilege one educational experience over another, for example, but the AI is programmed to focus on skills.
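To make the skills-focus idea concrete, here is a minimal sketch of how such a scorer might work. The feature names, weights, and candidate profiles are illustrative assumptions, not any vendor’s actual model; the point is only that the scoring function never reads background fields such as name or university, so they cannot influence the result.

```python
# Hypothetical sketch of skills-only candidate scoring.
# Weights and feature names are invented for illustration.

SKILL_WEIGHTS = {
    "problem_solving": 0.40,
    "communication": 0.35,
    "domain_knowledge": 0.25,
}

def score_candidate(profile: dict) -> float:
    """Score a candidate using only whitelisted skill features.

    Fields like 'name' or 'university' are never read, so they
    cannot bias the score.
    """
    return sum(w * profile.get(feat, 0.0) for feat, w in SKILL_WEIGHTS.items())

candidates = [
    {"name": "A", "university": "Ivy U", "problem_solving": 0.6,
     "communication": 0.7, "domain_knowledge": 0.5},
    {"name": "B", "university": "", "problem_solving": 0.9,
     "communication": 0.8, "domain_knowledge": 0.7},
]

# Candidate B ranks first on skills alone, despite listing no university.
ranked = sorted(candidates, key=score_candidate, reverse=True)
```

In this toy example, the candidate with no university listed outranks the Ivy League candidate purely on skill scores, which is the behavior the paragraph above describes.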
Even though AI assessments and scoring work in favor of all candidates by producing a skills-based and bias-free assessment experience, there is still fear of the unknown. This can be hard to overcome, and it requires complete transparency. In the past many organizations have favored a “black box” approach to recruiting, where the inner workings of the selection process were shielded from the candidate. Now it’s time for a change.
It’s critical to take a “glass box” approach to using AI for recruiting. Candidates must be informed ahead of time how their data will be collected, what the AI is reading and scoring, and how bias is being filtered out. It’s only natural for human candidates to have doubts regarding AI’s role in recruiting — and only by using this glass-box approach can you truly meet the needs and overcome the hesitations of your candidate base.
Are you interested in reading more about AI and hiring for diversity? I have also published an article in The Times Special Report: Future of talent management.
About the Author: Achim Preuss