This is the story of how we at Toggl use automated skill-based testing to hire top developers with little effort. Our mobile technical lead writes about the key elements of his process and gives useful tips on how to build workable tests with Toggl Hire.
Usually I like to write about technical topics, cool upcoming features, or entirely new apps I am working on. But today I would like to focus on my second big responsibility here at Toggl: building a team of amazing programmers to make all the other great things possible.
Specifically, I would like to talk about how over the last year I almost singlehandedly hired six great developers with very little effort.
I started out with very little experience in hiring and while I still would not consider myself an expert, I learned a lot over the months that I would like to share here.
The process
While we are very flexible at how we get things done at Toggl, most of our hiring processes look something like the following:
- Our initial selection is made using the automated online testing tool Toggl Hire
- Candidates passing the online test get invited to one or two video call interviews
- The most promising interviewees are invited to work with us for three to five days, just as if they were part of the team
- After a short final interview, those that did well in their test-week get an offer and often start working with us within a few weeks
Depending on your background, this process might on the surface seem very straightforward. Or maybe you think we are crazy for trusting automated tools, never even asking for a candidate’s CV and not bothering to meet people face-to-face before hiring them.
I would like to convince you that this process is not only viable, but very efficient, if done right.
Automated testing with Toggl Hire
While not the only or maybe even the most important part of our hiring process, Toggl Hire's automated skill-based testing is the key to the efficiency of Toggl's hiring.
It is the first filter applied to the large pool of potential candidates, and being almost entirely hands-off it can do work equivalent to checking hundreds or thousands of CVs in the blink of an eye.
In fact, I think the kind of skill-based testing Toggl Hire allows us to do is even more valuable than reading CVs, for two reasons:
1. CVs are written to present the applicant in the most positive light possible
They focus on the strengths of the candidate, and might leave out important weaknesses. They also require you to trust in the honesty of the applicant, which at this stage in the process is not a given.
An online skill test, evaluated automatically, however, is much harder to fool. You cannot simply exaggerate your skills and experience. Either you get the answers right or you do not. In addition, since you are the one creating the test, you can make sure to ask questions testing the skills and knowledge important to you and the position.
2. You will find candidates who may not have even been looking for a new job
Writing and submitting a CV can be a significant hurdle when applying for a position, and those not actively looking for a new job will rarely bother applying at all.
By offering an online skill test, we not only advertise an open position but also pique people's curiosity with a quick challenge, plus a free T-shirt for those who do well on the test.
This significantly improves your reach by including excellent candidates who might not yet realise that they want to work for you.
Certainly most of the people I hired took the initial test for fun, for the free shirt or simply out of curiosity.
For these reasons, when it comes to efficiency, I consider automated skill-based tests our most powerful hiring tool. In fact, given the above, Toggl Hire replaces not only CVs, but even initial phone screenings, all while finding more candidates than a regular application form ever could.
The hiccup of automated testing
However, all the above hinges on one thing: the quality of your test.
An automated test will always be exactly as good as its questions. Get the questions right, and you will find who you are looking for.
Unfortunately, writing a good test is not easy and requires an initial investment before you can reap the benefits. I hope the following tips, learned over many months while continuously refining and updating our tests to the best of our understanding, will help you get a head start.
How to create a good automated skill-based test
The key aspect of a good skill-based test is in the name: you want to test skills, and you want to be sure you are testing the right ones, which is not always as easy as it seems (you can read more about that here).
1. Know what you are testing for and make sure it’s what you need
It is easy to come up with a bunch of questions related to, for example, programming. Just because a question relates to programming, however, does not mean it tests the applicant's skill as a programmer.
You need to think through each question and how you would or could solve it. If answering it requires the experience and skills you want in a new hire, the question is good.
2. Avoid trivia questions
How tall is the Eiffel Tower? What does KPI mean? What version of iOS runs on the iPhone 5?
What these questions have in common is that the only skill they require is opening a second tab and copying them into Google. Alternatively, Siri gives straight answers for the first two, and a useful link for the third.
Unless “looking up things” is a key skill for the position, this kind of question may not be what you want.
3. Ask questions that test understanding
With the availability of search engines, testing pure knowledge is almost impossible. What is possible however is to test understanding.
It is likely that the position you are hiring for requires dealing with and understanding some complex system or another. For programmers these are pieces of code, larger scale architecture, and the ecosystem an app might be part of. For someone in marketing, this might be a basic understanding of psychology, or knowing how different marketing channels can interact or counteract each other.
As an example, I ask candidates to give me the results of several pieces of code, usually just a few lines each. This shows whether they can read code and step through it in their head while keeping track of the values of the various variables, either mentally or written out. That is exactly the kind of skill required for any programming position.
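To illustrate, here is a hypothetical snippet of roughly the kind I have in mind (not one of our actual test questions): the candidate has to trace the loop and report what gets printed, which requires tracking variable state rather than looking anything up.

```python
# A made-up "what does this print?" question: the candidate must
# step through the loop and track the value of `total`.
values = [3, 1, 4, 1, 5]
total = 0
for v in values:
    if v % 2 == 1:  # only odd values contribute
        total += v
print(total)  # a correct candidate answers: 10
```

A question like this cannot be answered by searching; it can only be answered by reading the code carefully.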
4. Avoid “gotcha questions” and riddles
You want to test the skills the candidate will have to use on a regular basis.
Most gotcha questions and riddles do not test analytical thinking so much as whether the candidate has heard the riddle before, or happens to guess the answer intuitively.
Instead, you want to ask questions that require basic knowledge of the relevant field and the kind of thinking things through that will actually be required on the job.
5. Ask questions that test analytical thinking
While the importance of soft skills differs between positions, analytical thinking and general intelligence are a boon to any line of work.
You could give the candidate a small logical puzzle with a clear solution that can be arrived at by thinking it through attentively, or ask them to choose which of a given list of arguments or facts contradicts a stated position or line of reasoning.
No matter how you do it, you want to _pick the brain_ of the applicants to see if they can understand and solve problems and arrive at the correct or best solution.
6. Keep it short
A guideline I follow myself is that each question should be answerable in less than two minutes, and that the whole test should take no longer than 20 minutes. I err on the side of shorter tests.
One reason for this is that a shorter test will be taken by more people, especially if you advertise the short duration ahead of time.
Secondly, I would rather ask a few well-thought-out questions that tell me a lot about a candidate than cast my net too wide with too many questions, or with questions people can get stuck on for a long time.
7. Think carefully about the test’s passing grade
I have both seen and tried several approaches when determining a passing percentage, or the number of correct answers to advance to the next round.
The first approach is to set the passing grade at 100%, only allowing those who answer all questions correctly to pass. While this may seem harsh and unreasonable, it can work well for shorter tests that check the basic understanding and experience any candidate needs to even be considered. Those who answer all such questions correctly are often good candidates.
The second approach is to set a relatively low passing grade (I have gone as low as 50%). To make this approach work, the questions have to be harder. Applicants are not expected to answer all questions correctly. Instead it is more a question of how many they manage to answer well in the limited time given.
If your test does not quite achieve the right outcome, whether it lets through too many or too few candidates, consider adjusting the difficulty and passing grade of the test. But always do so with an understanding of which of the two approaches above you are going for, or come up with your own.
8. Open-ended questions
While much of the above applies generally, there is an inherent problem with open-ended questions: they cannot be graded automatically.
For my case of looking for developers this is not a big problem, since the technical skills required here can be tested by objective questions.
However, when looking for a designer, marketing stunt creator, or HR ninja, the relevant skills may be harder to test for. Even in those cases, however, I would encourage you to test for critical thinking and general understanding. You could also test candidates' views on aspects of work culture to find someone with values similar to yours or your company's, though you may lose otherwise good candidates over small differences in opinion.
Even for more technical positions, however, there can be a lot of value in the right kind of open question. For example, I ask every applicant to solve a relatively easy programming problem that requires just a couple of lines of code but shows whether the candidate is comfortable writing code. A good candidate is usually able to implement a decent solution in just a minute or two.
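Our actual problem is not public, so as a stand-in, a problem of roughly that difficulty might look like this: a one-function task with an obvious, short solution that still reveals whether someone writes code fluently.

```python
# Hypothetical example problem (not our actual question):
# "Write a function that returns the second-largest distinct
#  value in a non-empty list of numbers."
def second_largest(numbers):
    # Deduplicate, sort descending, take the second element.
    unique = sorted(set(numbers), reverse=True)
    return unique[1]
```

For example, `second_largest([3, 1, 4, 1, 5])` should return `4`. A comfortable candidate produces something like this in a minute or two; a struggling one gets tangled up even at this scale.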
While that kind of question cannot be graded automatically, I use it as another manual filtering step: only those that pass the other questions and also submit good code – something I can usually determine at a glance – will receive an interview invitation.
I also used to include an open question where candidates could leave a link to their website, portfolio, GitHub account, or a similar place to show off their experience and previous work, but I found that this information rarely helped me make a decision, while taking far more time to review than quickly scanning a piece of code.
However, I can see more value in such open questions for less technical positions, like designers where a good portfolio is key.
9. Test your test
Lastly, but maybe most importantly: make sure you test your test. Show it to people in similar positions to the one you are trying to fill, see how well they do on it, and ask them what they think.
I even ask everyone I interview how they liked the online test, and incorporate their feedback into future versions to make the test and Toggl Hire do as much of my work for me as possible.
How to get the most out of automated testing
The real key to unlocking the vast potential of automated skill-based testing, like we do with Toggl Hire, is asking the right questions.
Investing a few hours to create a good Toggl Hire test can save hundreds of hours of looking for and sorting through candidates.
Using Toggl Hire and the processes and tips above I was able to single-handedly, and by investing just the bare minimum of time, find an entire team of great developers that I am happy to now work with every day. And we continue to hire using the same process with confidence and no intention of making big changes anytime soon. More about that here.
It took me months before I developed my current understanding of what makes for a good skill-based test. I hope that with this post you can do the same in weeks or days.
If you have questions or other ideas on how to create great automated skill-based tests, please feel free to get in touch or drop a comment below.
Until then, happy hiring!
Paul started at Toggl Track as an Engineer after quitting college (he thought "real work" was more fun) and leading a game development team. Now, as CTO, Paul works closely with other department heads and engineering managers to maintain a technical vision for the company. With a focus on leadership and strategy, Paul coordinates the efforts of Toggl Track's engineering teams to follow this vision and achieve sustainable technical solutions that support company goals. In his spare time, Paul continues to follow his passion for writing clean code through various open-source projects and runs a "hella exciting" [sic] game of Dungeons & Dragons.