
7 common user testing mistakes and how to avoid them

May 12th, 2023

8 min read


Written by

Team Tiny




“It’s not what you look at that matters. It’s what you see.”
– Thoreau

User testing is hard, and it takes years to master the skills to consistently deliver the kind of user tests that produce breakthrough insights. The good news, however, is that the skills you need can be learned, refined and polished over time.

In this post, I look at seven of the most common mistakes made in user testing, and how to avoid them. Every mistake on this list can be, and has been, made by novices and experts alike (myself included). Despite years of conducting user testing sessions, I still keep this list on hand. I hope you find it useful too. Share your thoughts and tips in the comments below.

Mistake #1: Making the task too complicated or vague

The right structure can make or break a user test: ask too much, and you will lose the participant; ask too little, and you risk not collecting the evidence you need to make decisions or arrive at interesting insights.

NN/g (the Nielsen Norman Group) recommends thinking in terms of task scenarios rather than “tests”. Task scenarios consist of three key parts:

  1. A sentence setting the context for the task.
  2. A sentence describing the activity the participant should perform.
  3. A couple of targeted follow-up questions, asked after completion of the primary activity. (Optional)

For example, in our image crop UX test, the task scenario was stated as: “You are editing a text document and want to crop an image in the story. Crop the image to show only the girl and the balloon.”

If you are conducting moderated UX testing, you can ask follow-up questions during the session. In our image cropping example, participants were asked a couple of questions after they completed the primary activity. This kept the main task simple and focused, and it prevented us from biasing participants towards a particular methodology or outcome.

If your tests are unmoderated, I recommend packaging the tasks into smaller chunks and running them separately.

Mistake #2: Asking leading questions

As any journalist will confirm, effective interview techniques take years to master. In the context of user testing, the success of your session is directly affected by your skill in this area. At a minimum, it’s important to remember three things when conducting qualitative interviews:

  1. Ask questions that do not plant an answer in the participant’s mind.
  2. Ask questions that show you are interested: don’t judge, correct or criticize.
  3. Let the participant lead the direction of your questions.

But these principles are quite theoretical, which is why I developed a set of relatively straightforward questions that are now part of my interview toolkit. The hard part is turning them into a habit (for which there is no shortcut to Gladwell’s 10,000 hours). My top three questions are:

  • “Right, yes … what makes you say that?”
  • “Yes, yes … why do you think…?”
  • “I see, that makes sense … and then what…?”

Note that all of the above questions begin by confirming or echoing what the participant said. If you are new to UX testing, this may seem like an odd or inefficient thing to do. In practice, however, these “small talk” phrases make a huge difference to making your participant feel at ease—which directly impacts the quality of insights that you can hope to get from the interview.

Mistake #3: Talking

As important as interview techniques are, it is equally important to recognize that the role of the test facilitator is primarily not to talk. The goal of user testing is to observe (with awareness and gentle guidance) how your target audience uses your product. For the most part, this should be a silent “active listening” process.

In international hostage negotiations, a team of half a dozen listeners supports the lead negotiator. In Chris Voss’ book, Never Split the Difference, his anecdotes highlight that the insights that led to breakthroughs often came from something the listening team picked up on: an inconsistency, a hesitation, an unexpected change of tone. The same is true in user testing sessions – staying alert enough to notice a momentary hesitation, and probing further, can lead to UX gold.

In a UX testing session, we recommend asking for the participant’s permission to audio record the session. Processing these audio files is time-consuming, but invaluable. Audio also provides objective support for any design change recommendations you make as a result of your tests.

Mistake #4: Making the UX test a Test

The problem with running a UX “test” is that it implicitly sets the expectation that there is a right answer. As a consequence, participants will want to “get it right” to please you, and will feel stupid if they perceive that they got it wrong.

The easiest way I have found to overcome this is to address it openly at the beginning of the session. In just about every interview, observation session, or online test, I start the process by stating:

“… this is not a test, there are no right or wrong answers. We are just really interested to hear your thoughts on …”

If a participant is stuck on a task during the session – that is, they are failing at the set activity – I will remind them of the above, and reassure them that they are like everyone else…

“… I can see that you are struggling to get [the task] finished, and that’s totally fine… we are finding that most people are struggling on this, which is really valuable information for us as now we know what we need to change/fix to make our product better …”

I have found that this simple technique of acknowledging that there is no wrong or bad outcome visibly and instantly puts participants at ease and keeps the session moving forward.

Mistake #5: Defending your design choices

Think back to a past testing session that went south, and you will probably find this mistake was the culprit. It is hard to train yourself not to answer questions about design choices, even when asked directly by the participant. It is also hard to resist engaging in conversations about why design choices were made. These discussions belong in your debrief session, not in your user testing session.

Keep an eye out for this one creeping in if you have co-facilitators, especially if they are from the development or engineering team. It can be difficult for an engineer to refrain from defending their code-product-offspring. This sets an unproductive tone for the testing session, all but ensuring that no constructive insights will be gleaned.

The best way I have found to avoid falling into this trap is to breathe and repeat this mantra before every session:

“We are testing a rough first draft, and looking forward to finding ways to make our product better.”

Seriously, I actually do that. As a designer, it’s critical to develop the awareness that it’s possible to fall in love with your own designs, and to take conscious measures to remind yourself that no matter how amazing the current design is, it is still a “first draft”.

Another handy tip is to distance yourself from the design being evaluated, as this gives permission for the participant to provide honest feedback without the pressure to please the interviewer:

"The design team put this design together [fake it if you have to], and are interested in finding where the sticky points are"

Mistake #6: Looking to confirm that your design is good

“Any readers who like your poems, doubt their judgment.”

– Wendell Berry

This one is counter-intuitive, right? Most people conduct user testing to justify a particular design direction. To prove that the design that they have sweated over, and fallen in love with, is The Right One. This, sadly, is a mistake of the tester’s ego.

In practice, it is far more constructive to approach user testing with the attitude that you are looking for ways in which your coveted design will fail. It is only through the process of systematically eliminating all likely points of failure that you gain confidence that the design holds up. Unfortunately, our human tendency is to sweep the possible failures under the carpet, and seek out data to confirm what we want to hear.

The best technique I have for addressing this mistake is time, and life, and running marathons. Developing the ability to depersonalize, becoming an aware observer, and nurturing an empathetic relationship with failure are investments that will make you not only a bad-ass UX tester, but also a pretty awesome human being. For me, this is definitely a “work in progress”!

Mistake #7: Forgetting to close the loop

You are probably high-fiving yourself for having nailed the above list, and in that enthusiasm feeling the urge to rush off and put all that newly learned goodness into practice. And that is great, but before you do I want to leave you with a final thought.

Remember to close the loop with your participants after the testing is finished. In my experience, most participants take part in research or user testing because they want to make a difference. Letting your participants know that you appreciate their time (by way of a monetary reward, token gift, or simply an acknowledgment) is both polite and expected. Taking the time to report how their input helped shape the product will set you apart. Think of it as an investment in the collective future experience of UX testing.

The report to the user does not need to be complicated: an email or a quick brief covering the three key things you discovered, and how those insights were used to change the product design, is all you need.

Do you have techniques that you use in your UX testing sessions? We’d love to hear your tips in the comments below… we all learn that way!


© Copyright 2024 Tiny Technologies Inc.

TinyMCE® and Tiny® are registered trademarks of Tiny Technologies, Inc.