Boasting over 500,000 customers, JobTestPrep is one of the largest providers of pre-employment assessment study tools. Starting at the low price of $39.99, job seekers get access to "PrepPacks" tailored for tests created by every major assessment provider.
For talent acquisition departments making screening decisions based on multiple choice or closed-ended assessments, sites like these are a problem.
The best candidates could be locked out behind a paywall.
With few exceptions, talent assessments present candidates with a series of closed-ended, multiple choice questions.
We’ve looked at the value of pre-employment testing before: it takes around 70 multiple choice questions to establish a correlation with job performance. The shortest assessment of this type takes a little less than 30 minutes to complete.
A lengthy multiple-choice test might not be the best experience for the candidate, but it makes identifying top performers much easier for a busy recruiting staff. Since assessment scores are predictive of performance, candidates who score the highest are more likely to be high performing.
That’s the theory, anyway. But if you’re looking to create a fairer hiring process, a multiple choice test might not be the way to go.
Standardized tests are deeply embedded in the American education system. Every state has its own elementary school testing protocol, and nearly every American college requires applicants to report an SAT or ACT score. America is not unique in this regard: standardized tests are even more ingrained in many European and Asian countries.
Around the world, standardized tests have one thing in common: they’re mostly multiple choice. It would be impossible to grade every child’s test in a timely manner otherwise. And like most multiple choice tests, they have right answers and wrong answers.
Now consider how most talent assessments are presented. Most will tell candidates that there are “no right or wrong answers.”
Based on their years of taking standardized tests, in school or otherwise, citizens of testing cultures know this is a lie.
Consider this common talent assessment question:
My former employer would say that I get along well with others.
Obviously, the “Strongly Agree” response is correct. While there may be jobs that do not require much social interaction, there is not a single job where disliking others is an advantage.
But is “Strongly Agree” really the correct answer? Will the assessment think you’re lying if you provide too many of the “obviously correct” answers, and downgrade your score?
Candidates don’t know. So they do what they’ve always done to remedy their knowledge gaps and achieve a high score on multiple choice tests: they study.
Most assessment providers will tell you that “pre-employment assessment prep kits” won’t actually prepare candidates for their assessment (a quick Google search returns dozens of these preparation sites). They argue that these sites are a scam, and that the “right” answers never leak.
They could be right. Then again, they might not be. Game of Thrones episodes leak every year from one of the most profitable producers of original content, so it’s not unthinkable that the same could happen to an assessment provider. Either way, it’s a subpar situation for anyone who relies on the assessment to make screening decisions.
Let’s consider these two possible scenarios:
If the study materials sold by test prep sites are flawed, the problem still remains: your candidates are not answering honestly.
Maybe their flawed study materials actually made them perform worse on the assessment. Great: some of your best candidates (those who do third-party research and put in the effort to study) selected themselves into the worst scoring bracket.
If the study materials sold by test prep sites are accurate, you’ve got an entirely different problem on your hands: you’re giving preference to candidates who study, rather than those who answer honestly.
Which, if the testimonials on assessment prep sites are to be believed, might be happening more often than you think:
“Not only did I get the position I wanted, the testing coordinator said I achieved the best score on the personality test she had ever seen. The JobTestPrep guide was clear and well explained. It included several different approaches to cover all the personality test type variations and formats, and allows you to practice the strategy laid out in the guide.” - A Satisfied Assessment Prep Customer

“I wanted to work at a certain software company, so I got the [assessment name removed] pack. Did all 5 tests and went through all the tips. I took the [assessment name removed] calm, prepared, and confident. I was hired.” - Another Satisfied Assessment Prep Customer
People around the world have been trained from a young age to study for multiple choice, closed-ended tests. You can’t blame them for doing what they’ve always done to stay competitive.
Candidates who study should not be penalized for making the most of the materials that are out there. But candidates who don’t should also be given a fair shot. When it comes to assessing talent, it’s critical that everyone is on a level playing field where their true abilities can shine.
You shouldn’t blame the assessment either. Pre-hire assessments have a decades-long history, and are backed by solid science - so long as the test takers are answering authentically. Unfortunately, the internet has made it easy to build a business around leaked exam materials.
Don’t blame the candidate, and don’t blame the assessment: blame the delivery mechanism.
The closed-ended, multiple choice questionnaire (which, to its credit, has worked for decades) looks like an odd, stagnant artifact in our rapidly evolving technological landscape.
We need to bring this tried-and-true screening tool into the modern era.
Artificial intelligence (AI) is powerful because it can draw sophisticated inferences (like identifying cancer) from terabytes of historical data. It has also given us the unprecedented ability to make sense of open-ended, unstructured information.
Consider the following example. To describe herself to a computer (or online application), Jane would need to fill out a form for the program to recognize her:
Of course, we recognize her in an entirely different way:
We know her name because she just told us. We can see that her hair color is blonde. We can tell she’s happy because she’s smiling. All the data we subconsciously parse is unstructured: a computer can’t make heads or tails of it.
That is, until AI and machine learning. Below you can see how different artificial intelligence tools (examples in parentheses) can give structure to "unstructured" data like speech, hair color, and affect.
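To make the idea concrete, here is a minimal sketch (in Python, using scikit-learn) of how a machine learning model can turn an unstructured, open-ended sentence into a structured label. The trait names, training sentences, and model choice are illustrative assumptions, not a description of how any particular assessment provider actually scores responses.

```python
# A minimal sketch: turning unstructured text into a structured attribute.
# The trait labels and training sentences are invented for illustration;
# real assessment models are trained on far larger, validated datasets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: short self-descriptions labeled with a single trait.
texts = [
    "I love collaborating with teammates and sharing ideas",
    "I enjoy meeting new people and building relationships",
    "I prefer working through problems quietly on my own",
    "I focus best when I can work independently without interruptions",
]
labels = ["sociable", "sociable", "independent", "independent"]

# TF-IDF turns free text into a numeric feature matrix;
# logistic regression maps those features to a trait label.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# An unstructured, open-ended response...
response = "My name is Jane and I really enjoy working with other people."

# ...becomes a structured prediction a program can act on.
print(model.predict([response])[0])
print(model.predict_proba([response]).round(2))
```

The same pattern applies to other kinds of unstructured data: a speech-to-text model gives structure to audio, and a computer vision model gives structure to images, each producing features a downstream model can reason about.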
With the ability to make sense of unstructured data (like keystrokes in games or responses in recorded video interviews), asking candidates a series of closed-ended questions is no longer necessary to gather the data you need to predict job performance.
Instead, you can ask them to record open-ended responses to questions you (or a team of I-O Psychologists) have identified as predictive. These most often include:
A traditional closed-ended assessment already asks questions like these, but candidates are locked into choosing one of several responses. And if the size of the job test prep industry is any indication, many candidates feel trapped, forced to choose an answer that doesn’t accurately represent their thinking.
Put visually, you can see how a traditional assessment requires closed responses to generate a structured input:
Now compare that to an assessment that can make sense of unstructured video data:
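To put the same comparison into code, here is a small, purely illustrative Python sketch. The closed-ended response is structured by construction; the open-ended transcript only becomes structured after a model scores it. The `score_teamwork` function below is a hypothetical placeholder for a trained model, not a real scoring algorithm.

```python
# Closed-ended assessment: the candidate is forced into one of a few
# pre-defined options, so the structure is imposed up front.
closed_response = {
    "question": "My former employer would say that I get along well with others.",
    "answer": "Strongly Agree",  # one of a handful of fixed Likert options
}

# Open-ended assessment: the candidate answers freely (here, a transcript of
# a recorded video answer), and structure is derived afterwards by a model.
open_response = (
    "At my last job I organized a weekly knowledge-sharing session, "
    "and my manager often asked me to onboard new hires because "
    "people said they felt comfortable asking me questions."
)

def score_teamwork(transcript: str) -> float:
    """Stand-in for a trained model: returns a teamwork score in [0, 1].

    A real system would use validated NLP models; this keyword count is
    only a placeholder to show where the model slots into the pipeline.
    """
    signals = ["onboard", "sharing", "team", "comfortable", "together"]
    hits = sum(word in transcript.lower() for word in signals)
    return min(1.0, hits / len(signals))

print(score_teamwork(open_response))  # a structured score, e.g. 0.6
```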
This isn't just a profoundly better experience for the candidate. It also makes "gaming" the assessment nearly impossible.
There are some interesting things going on with gamification in the assessments space as well. Games designed around a psychological framework can use machine learning to draw conclusions about candidates’ job-relevant attributes based on their performance and keystrokes.
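As a rough illustration of what that could look like under the hood, the sketch below reduces raw gameplay telemetry (keystroke timestamps and puzzle outcomes) to a handful of structured features. The feature definitions and numbers are invented for the example; they are not a description of any vendor's actual scoring model.

```python
# A minimal sketch: reducing raw gameplay telemetry to structured features.
from statistics import mean, stdev

# Keystroke timestamps (in seconds) captured during a timed puzzle round,
# plus simple performance counts. These values are made up for illustration.
keystroke_times = [0.4, 0.9, 1.1, 1.6, 2.0, 2.3, 3.1, 3.4]
puzzles_attempted = 6
puzzles_solved = 5

# Derive structured features from the unstructured event stream.
gaps = [b - a for a, b in zip(keystroke_times, keystroke_times[1:])]
features = {
    "mean_gap": mean(gaps),        # overall pace
    "gap_stdev": stdev(gaps),      # consistency under time pressure
    "accuracy": puzzles_solved / puzzles_attempted,
}

print(features)
# In practice, features like these would feed a model trained against
# job-relevant criteria; no single feature is read as a score on its own.
```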
As with HireVue Assessments, gamified assessments are nearly impossible to cheat because they rely on open-ended responses.
Candidates are often frustrated by the limited choices they are given in a traditional-style assessment. For some, the experience is so bad they feel compelled to spend more than $50 on assessment study tools! Talk about reflecting negatively on an employer's brand.
Prior to machine learning, this was just the price of doing business. The alternative - manually screening everyone - was too time consuming and required too much headcount.
With the latest advancements in AI and machine learning, there is no longer any reason to impose a lengthy closed-ended assessment on your candidates.