Interviewing skills are at the core of user research, but how do you know whether you are a good interviewer? Are you getting the most from your time with research participants? Are you asking the right questions in the right way? You want to be unbiased, but how can you be sure?
In their recently published book Moderating Usability Tests, Dumas and Loring outline a number of best practices for usability test interviews. Even with all of their experience, though, they find that no interview session is ever perfect, and there is always room to improve technique.
What We Did
Our organization took this suggestion to heart. Our goal was to give members of our experience design team a framework and systematic approach for improving interview technique. We developed the interviewing guidelines worksheet that follows this article.
The worksheet started as a collection of tips and guidelines combining best practices published in the literature with observations from our years of conducting user research. Originally, it was an unstructured document that we added to periodically and asked the team to reference when they had the opportunity. We then converted the document into a structured worksheet that we could use as a coaching and improvement tool.
The worksheet is now a condensed, one-page sheet of guidelines and examples organized into four key sections. The first section, Welcoming, lists several key elements for setting proper expectations and minimizing the participant's hesitation or nervousness. The Questioning section includes a series of guidelines for asking unbiased, productive questions, along with examples of good and bad questions. Within the Interacting section, we list guidelines and suggestions for appropriate communication with participants beyond the wording of specific questions. The worksheet concludes with guidelines for making the most of the Closing of the interview. While other elements may come into play in certain situations, we decided to keep the list to a manageable number.
How It Works
When a note taker can be present during the interview, that person refers to the worksheet and, whenever the interviewer deviates from a guideline, documents the code for that guideline and adds a short comment. If a note taker is not available, we strongly encourage interviewers to videotape their sessions and then use the sheet to evaluate their own technique. In either case, the worksheet helps interviewers see patterns in their interviewing style and identify opportunities for improvement. There is no quantitative score at the end of a review; we are not looking to give the interviewer a grade. Rather, we are trying to provide a mechanism for constructive feedback and a forum for discussing potential improvements.
Conclusion
By developing an interview technique review worksheet, our team emphasized continual improvement and provided an easy way for team members to evaluate a user research interview session. As with our interview technique, we will continue to refine the worksheet and our review process. While we may never reach it, we will keep pursuing the “perfect” user research interview.