Interviewing Programmers
This blog has been a bit quiet of late. I've recently finished a contract in which I was doing little development work but lots of interviewing to fill technical positions on several teams. The reason I mention interviewing is that I recently read this surprising article about how to prepare for an interview at Microsoft and Google. Don't get me wrong. It's probably great advice if you want to work for one of those companies. What surprised me was the following:
Practice using the same medium (e.g. paper and pencil) and time limits (e.g. 30 minutes) as the real interview.
Google and Microsoft both use whiteboard coding questions, yet often candidates practice by coding alone at home on a computer with a compiler. ...
A key lesson I learned doing technical interviews is that the only way to judge a programmer's ability is to sit down and program with them at a computer. Any programming test that requires only paper or a whiteboard is too trivial to give you any useful information about the candidate's abilities as a programmer or their experience with current technology and techniques.
The best way I've found to interview candidate developers is to pair program with them to do real work on the actual code of the project that is hiring. However, that's not always possible: you might be trying to hire for a greenfield project, or the organisation may not allow you to show interviewees sensitive data or proprietary code. In these cases, a well-designed programming exercise is the next best option. I had to use an exercise for the interviews I conducted. Here are some lessons that I learned about designing exercises for programming interviews:
The exercise must involve working with current development tools. A programmer should know how to use the tools of their trade. For mainstream languages that means a decent IDE with code navigation, analysis and refactoring support. A junior developer might not have had a chance to work with the best commercial IDEs, but they should pick up the features quickly. A senior developer should be aware of the state of the art and at least have tried an evaluation copy of the best tools on the market. Look out for programmers who say in screening interviews that they use the latest tools but who, in practice, use nothing more than the text editor.
The exercise must require the candidate to apply important features of the language. Every candidate I interviewed was able to give textbook descriptions of polymorphism, abstract interfaces, exceptions and, in C#, delegates and events. Surprisingly few were able to actually use those language features to solve a simple problem.
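For example, this is roughly the level of delegate and event use I hoped to see in C#. The names here are invented for this post, not taken from the real exercise, but the shape is the point: declare a delegate type, expose an event, raise it and subscribe to it.

    using System;

    // A deliberately small sketch with invented names: a delegate type, an event
    // of that type, code that raises it, and a subscriber that handles it.
    public delegate void PriceChangedHandler(decimal newPrice);

    public class StockTicker
    {
        public event PriceChangedHandler PriceChanged;

        public void Update(decimal newPrice)
        {
            // Raise the event only if anyone has subscribed.
            PriceChangedHandler handler = PriceChanged;
            if (handler != null)
            {
                handler(newPrice);
            }
        }
    }

    public static class Program
    {
        public static void Main()
        {
            StockTicker ticker = new StockTicker();
            ticker.PriceChanged += delegate(decimal price)
            {
                Console.WriteLine("Price changed to " + price);
            };
            ticker.Update(42.50m);
        }
    }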
The exercise must involve understanding and working with existing code. Any programmer can write code. Good programmers can read code. Very good programmers instinctively write code that follows the same style as the rest of the codebase.
The exercise must involve error handling. Any programmer can write the happy path for a problem. Good programmers think about the error handling at an early stage.
The exercise must involve testing. Good programmers understand how to write automated tests, how to test for boundary conditions and corner cases and how to test error handling code.
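To make that concrete, here is the flavour of test I hope to see, written here with NUnit against a made-up discount rule (both the rule and the names are invented for this post): one case just below the boundary, one exactly on it, and one for the error path.

    using System;
    using NUnit.Framework;

    // A made-up piece of production code for the tests below to exercise:
    // 10% off orders of 100 or more, and negative totals are a caller error.
    public static class Discount
    {
        public static decimal For(decimal orderTotal)
        {
            if (orderTotal < 0m)
            {
                throw new ArgumentOutOfRangeException("orderTotal");
            }
            return orderTotal >= 100m ? orderTotal * 0.1m : 0m;
        }
    }

    [TestFixture]
    public class DiscountTests
    {
        [Test]
        public void NoDiscountJustBelowTheThreshold()
        {
            Assert.AreEqual(0m, Discount.For(99.99m));
        }

        [Test]
        public void DiscountAppliesExactlyAtTheThreshold()
        {
            Assert.AreEqual(10m, Discount.For(100m));
        }

        [Test]
        public void NegativeTotalsAreRejected()
        {
            Assert.Throws<ArgumentOutOfRangeException>(delegate { Discount.For(-1m); });
        }
    }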
The exercise must offer scope for domain modelling. Good programmers choose names that reflect the problem domain. Bad programmers choose names like "MyInterviewExercise".
The exercise must involve asking questions. Include some ambiguity in your description of the task. A good programmer will ask you to clarify your requirements. A poor programmer will make an arbitrary decision without consulting you or, worse, not notice the ambiguity.
The exercise must be realistic. You don't want to get sidetracked explaining that some aspects of the solution can be ignored because it's just an interview exercise. The entire exercise must be complete and consistent. For example, if a candidate would rightly want to log a significant system event, there should be an API in the exercise codebase for logging that event or announcing it to a system monitor.
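For instance, the codebase handed to the candidate could already contain a small monitoring API along these lines. The interface below is a hypothetical sketch for this post, not from any real exercise.

    // A monitoring API of the kind the exercise codebase should already provide,
    // so that a candidate who wants to report a significant event has a realistic
    // place to send it.
    public interface ISystemMonitor
    {
        void EventOccurred(string eventName, string details);
    }

A candidate who rightly wants to announce, say, a failed operation then has an obvious collaborator to call, and you get to see whether they use it.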
The exercise must involve changing requirements. Introduce requirements bit by bit. How does the candidate cope? Do they refactor their code to adapt to the change, or just hack the new behaviour in?
The exercise must offer the candidate enough rope to hang themselves. The exercise should not have an obvious path to the solution. The candidate should have ample opportunity to write bad code, use bad names, ignore error handling, rewrite logic that already exists in abstract classes and write long, messy methods. One thing I look out for is unnecessary "cargo cult" code: code written parrot-fashion, without any understanding of why it is not needed here or where it would actually be applicable. Unnecessary getters and setters for member variables are the most common kind of cargo cult programming I've seen.
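To show what I mean by that last point, here is the pattern in miniature, with invented names: a getter/setter pair added to a field out of habit, next to the version that simply says what it needs to say.

    // The cargo cult version: an accessor pair for the field, whether or not
    // any caller actually needs it.
    public class Invoice
    {
        private decimal total;

        public decimal GetTotal() { return total; }
        public void SetTotal(decimal value) { total = value; }
    }

    // If the value genuinely needs to be exposed, a C# property does the job;
    // if it doesn't, the field should simply stay private.
    public class SimplerInvoice
    {
        public decimal Total { get; set; }
    }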
The exercise must test many different skills and practices. If the candidate gets stuck on one part of the exercise, help them through the problem so that you can see how well they do in other parts of the test.
A good interview programming exercise takes at least an hour. That means it's still worth doing telephone and face-to-face interviews first, to screen out unsuitable candidates before investing that time. I'll describe the lessons I learned about that another time.