#1 Be a responsible consumer
When it comes to choosing AI technology for your recruitment process, it’s essential to take a closer look at your options. Talk to potential providers about their algorithms and how rules are set. Understanding what underpins these rules and how the provider ensures that bias is eliminated is key.
We are very transparent about this at impress.ai. We employ rules-based, supervised algorithms, with rules grounded in widely recognised organisational psychology research that demonstrably combats bias. Our system evaluates each candidate's merits within a prescribed mandate rather than replicating hiring norms that may be influenced by bias.
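To make the distinction concrete, here is a minimal sketch of what "rules-based evaluation within a prescribed mandate" can look like. The criteria, weights, and field names below are hypothetical illustrations, not impress.ai's actual rules: the point is that every decision traces back to an explicit, auditable requirement rather than to patterns learned from historical hires.

```python
# Hypothetical rules-based screening: each candidate is scored against
# an explicit, auditable mandate rather than against patterns learned
# from historical hiring data (which may encode past bias).
MANDATE = {
    "min_years_experience": 2,
    "required_skills": {"python", "sql"},
}

def evaluate(candidate):
    """Return (passed, reasons) for a candidate record.

    Every rejection reason maps to a stated mandate requirement,
    so the decision can be explained and audited.
    """
    reasons = []
    if candidate["years_experience"] < MANDATE["min_years_experience"]:
        reasons.append("insufficient experience")
    missing = MANDATE["required_skills"] - set(candidate["skills"])
    if missing:
        reasons.append("missing skills: " + ", ".join(sorted(missing)))
    return (not reasons, reasons)

passed, reasons = evaluate(
    {"years_experience": 3, "skills": ["python", "sql", "excel"]}
)
```

Because the mandate is declared up front, any candidate who meets it passes, and any who does not receives specific, rule-linked reasons rather than an opaque score.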
#2 Routinely assess algorithms
As with any core business process, AI is not 'set and forget'. To ensure your AI is delivering for your organisation and treating candidates fairly, have your algorithms routinely assessed by accredited bodies.
Assessing your intelligent decision-making tools is no different to performing employee evaluations. The tools are doing a job for your organisation, and you want to make sure they are doing the best possible job and remaining compliant.
This gives you peace of mind that you are getting the best possible outcomes. It can also help you navigate legal considerations, as employment law in most countries is still catching up with advancements in technology.
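One widely used check that such assessments often include is the adverse impact ratio, sometimes called the 'four-fifths rule' from US employment guidance: if any group's selection rate falls below 80% of the highest group's rate, the outcome is flagged for review. The sketch below shows the arithmetic on made-up audit data; the group names and figures are illustrative only.

```python
def selection_rates(outcomes):
    """outcomes: dict mapping group name -> (selected, total)."""
    return {g: sel / tot for g, (sel, tot) in outcomes.items()}

def adverse_impact_ratios(outcomes):
    """Compare each group's selection rate to the highest rate.

    A ratio below 0.8 is a common flag for review
    (the 'four-fifths rule').
    """
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

# Hypothetical audit data: group -> (candidates advanced, candidates screened)
audit = {"group_a": (45, 100), "group_b": (30, 100)}
ratios = adverse_impact_ratios(audit)
flagged = [g for g, r in ratios.items() if r < 0.8]
```

A flag is not proof of bias on its own, but it tells an assessor exactly where to dig deeper, which is why running this kind of check routinely, rather than once at deployment, matters.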
#3 Be aware of inherent biases in biometric tools
In the last few years, we have seen a rise in organisations using biometric tools such as facial recognition within the recruitment process. These tools can be used for everything from simple identification to reading and interpreting expressions in video interviews.
And while some of these technologies are incredibly impressive, it is crucial to be aware that they can carry inherent biases.
Take facial recognition, for example. In their infancy, many of these tools were trained predominantly on Caucasian male faces, so those were the features the machine-learning models learned first. Unfortunately, this can still lead to incorrect and unfair analysis today. The Gender Shades project, for instance, reported error rates of up to 34.7% when commercial systems analysed darker-skinned women, compared with under 1% for lighter-skinned men.
This doesn’t mean you have to abandon the use of biometric tools. However, it is critical to be aware of this potential and consider insights from these tools alongside human interaction with the candidate.
At impress.ai, we are committed to ensuring that our clients get the most out of their technology, saving time and money while increasing accuracy and eliminating bias. We’d love to show you how it works. Contact our team to organise a demo or discuss your specific requirements.