Avoiding leading questions in user interviews

Question design for user interviews is an art.

We interview users because we want to gain insights by learning from their experiences. Ideally we want to get them talking, and keep them talking, as they traverse a card sort, a prototype, or a series of questions about how they work.

Here’s the thing: the best insights can come from users, but if we’re not careful in how we ask them questions, we can load them up with our biases before they have a chance to share their own.

Psychologist Adam Phillips has said that psychoanalysis is about examining someone’s side effects, or, “what falls out of their pockets once they start speaking”. This is a great analogy for interviewing: we want to ask users just enough to get them talking, build some momentum, and observe what falls out of their pockets as they talk. When we’re caught up in proving our biases, asking if they want red or blue, we’re not present enough to gain the insight we seek.

Starve a user, feed a backlog

Artful questions begin with feeding users as little information as possible, as a way of getting them to tell you what they think and feel. To ask non-leading questions you must be comfortable with being vague in how you phrase them. To stay nebulous, use as few words as possible to get the user talking; once they get going, they’ll usually continue without much prompting. Here are some examples:

  • “How would you use this?”

  • “What would you do with this information?”

  • “What would you do next?” or my favorite, “What’s next?”

  • “How do you feel about what you’re doing there?”

  • Point out a part of the screen, or the whole screen, and ask: “Go clockwise around this area and tell me what each piece means”

  • “Expand on [that thing] you just mentioned”

  • When a user is about to click on something, typically while still mid-thought but clearly indicating they’re going to click, ask: “What do you expect to see (or do) when you click [that button/link/etc.]?”

The key with these types of questions is not asking them with a specific outcome in mind; it’s asking in the hope that you’ll tease things out. Remember: while interviews can help us confirm how things work, there’s always the potential to learn something new. The more specific our questions, the less likely it is we’ll learn it.

To get even more utility from the questions you ask, it’s important to become comfortable with long pauses. After the user finishes a thought, allow a moment or two of silence to linger so they can process their thoughts and add anything else that comes to mind. We often find that the most interesting insights come from those silences.

Conversely, when we ask questions that are too leading, we put too much information in the mind of the user.

Here are some bad ones:

  • “What’s missing?” (Too quiz-like, leads the user to come up with something)

  • “Do you want this to be blue, or red?” (What if neither works, but we box them into choices they don’t want?)

  • “Where should we move this to?” (Why not let users organically suggest a move instead of prompting them?)

  • “Do you want this to be like [insert name of another product] does things?” (Ugh, such a rookie move).

  • “Do you think you’d like [insert your idea] to happen?” (Just plain leading, and looking for confirmation of your biases)

  • “If I made this do [insert dreamed up solution here] would you use it?” (Of course they would, they want to perform well and it requires no work to say yes to your imaginary solution).

  • “Would you use this if I made this change [insert change here]?” (This comes off as desperate, and the idea would be much more powerful if they voiced it of their own volition)

These types of questions put ideas in users’ heads, and lead them toward ideas they might not have come up with themselves. Keep in mind: we want to observe users in as clean a frame of reference as possible. As soon as we verbalize solutions or directions for them to follow, we’ve lost our opportunity for them to guide us as they meander through their thoughts and feelings around a given scenario. Give users space and silence for best results.

Bringing teammates into the interviews

It’s one thing to have a designer or two conducting interviews; it’s another to bring Product Owners or Managers into the mix. Honestly, PMs and POs can sometimes be too close to a product to keep bias from slipping into the conversation. That said, it’s definitely something the team can practice.

In a recent project, we live-streamed the interview sessions over Zoom.us for the Product Owners and Managers to watch as we conducted them. This gave us designers time to work out how the interviews would be conducted, while the product team could learn, observe, and see what to expect.

When you start bringing extended team members into the interviews, agree up front that simply observing and listening at first is totally OK. “You aren’t learning anything when you’re talking,” as Lyndon Johnson used to say. As I mentioned above, not every moment needs to be filled with you talking; it can be more awkward to fill dead air than to let it be.

Trusting insights derived from non-leading questions

Interviewing users is a great way to reduce project risk and assumptions, but it all starts with how you craft and ask questions. What are some other questions we could ask that remove bias from the equation? Could this be a team activity before testing? Take some time to come up with 10 or 12 questions geared toward teasing insights out without letting your bias in. If you’re doing it right, the questions are vague but the user divulges more than you expected. Now you have purer, more trustworthy insights for designing your product.

(Special thanks to Courtney Bregar at Pivotal Labs for her input and edits)
