The use of AI, automation and other data-driven tools has been growing in the recruitment sector for a few years now, as it has across the rest of the business world. As we move into 2022, you might even be considering investing in these kinds of new technologies for your business.
Certainly these tools have the potential to provide some important benefits for staffing companies – helping to source great candidates through targeted advertising, sift through applicants by screening CVs, and standardise the evaluation and assessment process. The potential to remove a degree of human bias from the recruitment process, helping to find and select the best person for a role, makes these technologies a very attractive prospect.
However, these tools also carry significant risks. AI and automation technologies still have to be designed by humans, and they learn by examining past human behaviour, which can perpetuate or even exacerbate existing biases. Many of you will have read about cases like Amazon’s AI recruiting tool that discriminated against women, because it had been trained on historical data which showed tech as an industry dominated by men. The AI then set this uneven gender divide as its ideal baseline, not knowing that it was behaving in a discriminatory way.
It is no wonder then that when the Centre for Data Ethics and Innovation (CDEI) conducted a review into bias in algorithmic decision-making, recruitment was a key area of interest. Their polling found that fewer than two in five people were aware that algorithms could be used to support recruitment decisions, and only 14 per cent thought they would be aware if an automated decision had been made about one of their recent job applications.
The CDEI’s review concluded that while the use of algorithmic tools was increasing in recruitment, there was a lack of clear and consistent understanding about how to do this fairly and without entrenching existing biases. However, they did say that recruitment firms are relatively good at collecting data to monitor the outcomes from these tools, compared to other sectors.
We must recognise that data-driven tools are not a silver bullet. Given how important the recruitment process is for both businesses and jobseekers alike, it’s vital that recruiters think carefully about how AI and other tech should be implemented in order to mitigate these risks and ensure that outcomes are fair for everyone.
Guidance and standards
With that in mind, the REC has been working with the CDEI over the past year to produce guidance to help recruiters implement these technologies effectively and fairly. One of the key pillars of our work at the REC is about keeping standards high in the industry. This guidance falls squarely in that area, giving recruiters a number of recommendations to follow to ensure that data tools and AI are being deployed in the best way possible.
The first part of the guidance provides an introduction to the kinds of technologies that exist in the sector and the risks that they bring. It then moves on to recommend key actions at four stages of the development cycle.
Before starting to purchase or develop one of these tools, it’s essential to articulate what your objectives and requirements are, and how the technology will ideally help to achieve them. This should also include developing a baseline for assessing the tool’s performance later on. You should then be in regular dialogue with the vendor or developer about the tool so you can articulate any concerns and ensure it complies with data protection and equalities legislation.
Test to assess
Once the tool is in place, the business should run a pilot to test that it is working correctly, assessing the tool against the objectives that were put in place earlier. Only once that has been successfully completed should the tech be rolled out fully – and then it should be monitored closely and regularly assessed to make sure it continues to fulfil what it was designed to do.
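As a rough illustration of what regularly assessing a tool against its objectives might look like in practice, the sketch below applies the well-known "four-fifths" rule of thumb for adverse impact to shortlisting data from a pilot. The group names, figures and 0.8 threshold here are assumptions chosen purely for illustration – they are not part of the REC/CDEI guidance, and any real monitoring approach should be agreed with your legal and data protection advisers.

```python
# Illustrative sketch only: monitoring a screening tool's pilot outcomes
# for disparity between candidate groups, using the four-fifths rule of
# thumb. Group labels, counts and the 0.8 threshold are assumptions.

def selection_rates(outcomes):
    """outcomes: dict mapping group -> (selected, total).
    Returns each group's selection rate."""
    return {group: sel / tot for group, (sel, tot) in outcomes.items()}

def adverse_impact(outcomes, threshold=0.8):
    """Flag groups whose selection rate is below `threshold` times the
    highest group's rate (the classic four-fifths check)."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: r / top for g, r in rates.items() if r / top < threshold}

# Hypothetical pilot data: (candidates shortlisted, candidates screened)
pilot = {"group_a": (45, 100), "group_b": (30, 100)}
flagged = adverse_impact(pilot)
# group_b's rate (0.30) is roughly 0.67 of group_a's (0.45), so it is
# flagged for further investigation under the 0.8 threshold.
```

A check like this is only a starting point: a flagged ratio does not by itself prove the tool is biased, but it tells you where to look more closely before rolling the technology out fully.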
Finally, if you are using these types of data-driven tools, it’s important that candidates and clients are made aware of this. Providing this information should help to increase levels of trust and also allow you to get feedback from both sides on how the technology has worked from their perspective.
Whether buying in an automation or AI system or building one in-house, it’s vital that recruitment companies innovate with care and consideration. These tools have the potential to greatly improve efficiency and reduce the amount of human bias in the hiring process. We hope this guidance will help make that possible, and the REC will continue to engage with the CDEI and others in this area to ensure recruiters are well-prepared as the automation and AI revolution continues.