Originally posted on Medium, 13 October 2016 (read part 1 and part 2).
Over the past few weeks we’ve taken you through our findings from our validation experiment, testing whether using Applied moves the dial in terms of hiring more intelligently and more fairly. We’ve shown evidence that it can do both. Now to our third test: does it make hiring easy?
One of the central tenets of behavioural science is that people won’t do things unless they’re easy. Even things we really want to do or think we should do (think about all those gym memberships gathering proverbial dust in your wallet).
So we’ve spent a fair bit of time trying to make the platform as easy to use as we can: building in smart defaults to guide users and automating as much administration as possible. This final blog post in the series provides data on how well we did on the ease measure. We decided to keep this part of the experiment simple and just looked at some basic time-spent measures.
Hiring people is time-consuming but worthwhile when you get it right. So while it’s always nice to reduce how much time you spend in any process, it’s just as important to think about how that time is spent. There are lots of things humans do better than machines, so automation isn’t always the answer.
A quick recap on what we’re comparing here. The Behavioural Insights Team had over 700 candidates apply to its graduate role earlier this year. After completing a multiple choice test, the best 160 were taken through to detailed review. To allow a fair test of the two approaches, and to ensure no candidate could even potentially be affected by being reviewed on different information, we ran a parallel process: all candidates were simultaneously (but independently) reviewed both on their CVs and on their responses to work-based tasks on Applied. The former was done business-as-usual, with no blinding or chunking of candidate information: a pre-sift by the head of HR to identify the best third, followed by a more granular review of that shortlist by two members of the team assessing separately. The latter followed Applied’s unbiased review process: candidate responses to the work sample tasks were blinded, chunked horizontally by question, and randomly assigned to three independent reviewers in random order.
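For the curious, here is a minimal sketch in Python of how that kind of allocation could work. This is our illustration of the process described above, not Applied’s actual code; the ids, task names and reviewer names are all placeholders.

```python
import random

# Sketch of the unbiased review flow (illustrative only): responses are
# blinded, chunked by question, and each reviewer scores every response
# one question at a time, in their own independently shuffled order.

num_candidates = 160
questions = ["task_1", "task_2", "task_3"]              # hypothetical work-sample tasks
reviewers = ["reviewer_a", "reviewer_b", "reviewer_c"]  # three independent reviewers

# Blinded responses: candidates are identified only by an anonymous id.
responses_by_question = {
    q: [f"anon_{i}" for i in range(num_candidates)] for q in questions
}

# Build each reviewer's queue: one question at a time ("chunked"), with
# the responses shuffled so presentation order can't bias the scores.
review_queues = {}
for reviewer in reviewers:
    for question in questions:
        order = responses_by_question[question][:]
        random.shuffle(order)
        review_queues[(reviewer, question)] = order
```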
A lot of people worry that getting three people to read candidate information will add to what they already see as a time-consuming process.
We found that even looking at the sifting process in isolation, reviewing responses on Applied was actually faster than CV sifting. More importantly, that time was spent far more intelligently: on the things humans are better at doing. On Applied, 75% of the time went on actually reviewing candidates. In contrast, 61% of the time in the CV sift went on administration (from printing and collating hundreds of pieces of paper to inputting scores back into spreadsheets).
For those drawn to cost-benefit analyses, we calculate that we would have needed to review 3 times as many CVs to uncover the same number of highly rated candidates. So if it’s outcomes we’re interested in, the time saved was even more striking: it would have taken 63 hours of CV sifting to identify as many excellent candidates as we found in 18 hours of Applied sifting.
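To make the arithmetic behind those figures explicit (this is our reconstruction from the numbers above; the underlying per-CV timings aren’t stated in the post):

```python
applied_hours = 18        # stated: hours spent sifting on Applied
equivalent_cv_hours = 63  # stated: CV-sifting hours needed to match Applied's yield
cv_multiplier = 3         # stated: 3x as many CVs needed for the same number of top candidates

implied_cv_sift_hours = equivalent_cv_hours / cv_multiplier  # ~21 hours for the actual CV sift
efficiency_ratio = equivalent_cv_hours / applied_hours       # ~3.5x more CV time per excellent candidate found
```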
And that’s before accounting for Applied’s automation of how candidates selected interview slots, which we estimate saved a further 3 days’ worth of admin time.
Putting just these two areas of savings together, we found that we saved around 30 minutes per candidate.
But don’t just take this one experiment’s word for it. One of our clients has told us they’ve seen a more than 300% improvement in interview hit rate since using Applied (that is, the percentage of people interviewed who get hired), and that their admin time has been cut by 95%.
Things look even more promising if you assume that people hired through Applied are also more likely to stay in the job, which isn’t that ludicrous an assumption. Applied not only allows organisations to find the best candidate more objectively; it also exposes candidates to what the job actually entails. Unlike more traditional, passive forms of recruitment such as sending in a CV, the work sample questions used in Applied have the dual benefit of being more predictive of success and simply allowing candidates to self-select better.
Improving person-job fit isn’t just a good thing to aim for on a personal level; it’s also critical to the bottom line. Industry estimates suggest that up to one in three people leave a new job within the first year, and that the cost of these hiring mistakes can be as high as three times that person’s first-year salary (counting recruitment, onboarding, training, salary costs and so on). Improving retention is a business imperative for most organisations.
We think this is just the beginning of the ways we can improve the efficiency of hiring. Best practice recruitment shouldn’t be hard to do.
What’s next?
We pitted Applied head-to-head against the CV sift and saw positive results: over 50% of the candidates we hired wouldn’t have been hired without the platform, a staggering result. But that was just one experiment, and this is just the beginning. Next, we’re working with other organisations to test how Applied could help them improve their hiring practice. Even if the number there is only 20%, that’s still 1 in 5 people being hired on merit who otherwise wouldn’t have been. That’s worth getting out of bed for.
Fancy taking the Applied platform for a spin? Start your free trial here.