Discovering the Right Tasks Using an Interview-based Approach

Jared Spool

April 30th, 2012

The other day, I wrote about how choosing the right words in your tasks makes a critical difference to the outcome of your user research. Mike Pauley wrote a comment, asking how to make sure you’ve got the right words:

Great timing on this, as I am dealing with the same issue with a test I’m currently running. Do you have any guidelines for how to approach question wording? Other than running the same test multiple times with different wording, how do you know when task failure is the fault of your IA or the way the question is worded?

A great technique is to adopt an interview-based task design approach. You start by interviewing your participants to ask them about their previous experiences and current needs.

For example, if we were testing the IKEA site for its navigation, we'd recruit people who either have purchased IKEA furniture or are likely to do so in the near future. Then, during the first 15 to 25 minutes of the session, we'd interview them about their furniture-buying process and desires.

In that interview, we'd get them to use all their own terms and define their own tasks. If it's something they've done in the past, we ask them to re-enact it for us. If it's something they're planning in the near future, we ask them to show us how they think they'll do it.

The trick is that we let them tell us the words they use. After we've done this with a few participants, it's likely we'll home in on some generic ways to formulate the tasks that don't influence the outcomes as directly.

Back in 2006, I wrote about interview-based tasks and recorded a podcast about how to do them. Interview-based tasks have become a very important part of our usability toolbox, specifically to deal with this problem.

3 Responses to “Discovering the Right Tasks Using an Interview-based Approach”

  1. Samantha LeVan Says:

    Not only do you have an opportunity to learn how to phrase test tasks, but you have taken the time to put the participant at ease. After they’ve shared their experiences, the usability test session shouldn’t feel so “test-like”. Great idea!

  2. Fred Beecher Says:

    Jared, I can easily see how that would work when testing an existing, fully functional system. Where I struggle with this technique is when it comes to prototype testing. Since prototypes have limited functionality, that limits what we can test. Our designs are based on what we’ve observed in research, of course, but the spectrum of human experience is such that it is likely that even a prototype well-grounded in research wouldn’t be able to cover the situations test participants tell us about. I’d love to hear your thoughts on this.

  3. Mike Pauley Says:

    I remember you or someone else saying something like this before at a UIE Conf – “have the audience help you write the questions.”

    When interviewing and discovering the top tasks of my audience, I was focusing on the broad subjects they were deeming important (very complex subject matter) and paying less attention to the intricacies of their sentence structure and nomenclature.

    Thanks for the reminder.

