What to do when user research isn't giving actionable results

If you work in UX, I’m sure you’ve had times when your research just doesn’t seem to be delivering anything useful. If you’re on a deadline and working to a budget, those actionable results are crucial. This can be a stressful experience - believe me, I’ve been there!

Don’t panic! Here are my tips, drawn from over 13 years of working in this industry, on what to do when user research isn’t giving actionable results.

Take a step back and assess the project from a wider viewpoint. The problem could be down to any number of reasons. Here are the most likely:

You’re not covering all your user research objectives

This could be down to a few reasons. Firstly, are your objectives spread too thinly? Try to focus on no more than three clear (and testable) objectives; otherwise, you’re likely to find yourself pulled in too many directions and unable to cover each of them sufficiently during user testing. One way to combat this is to start your project with a research planning workshop with all the key stakeholders present. There you can determine their different business needs and objectives for the project and align everyone on a few key objectives, which then form the backbone of the research.

Perhaps your participants aren’t relevant to your objectives. Where and how did you recruit them? Are you using the correct screening criteria? We recommend using a specialist recruitment partner like People for Research or Roots to find the right participants (there are plenty of other good firms who can do this too). It’ll ensure you’re reaching the right audience and that your sample isn’t biased in any way.

Make sure you’re spending enough time with each participant to cover all of your objectives. We recommend sessions of no more than 45 minutes to keep concentration and engagement levels high. If your sessions are running past the 1-hour mark, you could well have too many objectives, and chances are your participants are losing the will to live and not saying anything useful by that point anyway.

I’ve talked so far about one-on-one usability testing, but regardless of the method you’ve chosen, if you’re not getting actionable results the issue could be that you’re using the wrong method for the assumptions you’re testing. If you’re looking for quantitative data on whether something works or not, try click testing. Or, to gauge opinion on a new feature, run a large-sample survey.

You’re finding out something different from every participant

One problem could be your sample size. When it comes to quantitative data, there are plenty of resources online explaining sampling and margins of error. Take a look at this easy-to-digest piece from InfoSurv for a good overview.
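
To get a quick feel for the numbers, here’s a minimal TypeScript sketch of the standard margin-of-error formula for a survey proportion. It assumes simple random sampling and a 95% confidence level, and the example figures are made up:

```typescript
// Margin of error for a survey proportion.
// z = 1.96 corresponds to a 95% confidence level.
function marginOfError(p: number, n: number, z = 1.96): number {
  return z * Math.sqrt((p * (1 - p)) / n);
}

// Example: 120 of 200 respondents (60%) completed the task.
const moe = marginOfError(0.6, 200);
console.log(`±${(moe * 100).toFixed(1)}%`); // ±6.8%
```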

When it comes to collecting qualitative feedback, the rules are a little different, but roughly speaking we’d recommend running interviews with no fewer than 10 participants at a time.

If you’ve not been able to determine themes and key issues from your research, it could be due to low participant numbers. You can fix this by taking what you’ve learned so far from the participants and running a survey with a large enough sample to be statistically meaningful. Alternatively, select a more focussed, precise audience for the next round of research, and define strict screening criteria to ensure they’re exactly right for the subject you’re working on.
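
For planning that follow-up survey, the earlier formula can be inverted to estimate how many responses you’d need for a given margin of error. Again, this is only a sketch, assuming simple random sampling, 95% confidence and the worst-case 50/50 split:

```typescript
// Rough sample size needed to hit a target margin of error.
// p = 0.5 is the worst case and maximises the required sample.
function requiredSampleSize(targetMoe: number, z = 1.96, p = 0.5): number {
  return Math.ceil((z * z * p * (1 - p)) / (targetMoe * targetMoe));
}

console.log(requiredSampleSize(0.05)); // 385 responses for ±5%
console.log(requiredSampleSize(0.1));  // 97 responses for ±10%
```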

You’re getting a lot of subjective feedback

It might be that your user research method needs to be refined. As I mentioned earlier, your objectives might be better suited to a survey or click testing than to depth interviews.

If you’re sure that your research method is correct, consider the questions you’re asking and whether you could frame them differently to uncover what you’re trying to find out. For example, instead of asking for feedback on something, provide the participant with some stimulus. This could be as simple as a sheet of paper or as complex as a high-fidelity prototype. The idea is to encourage them to accomplish tasks using it; you’re more likely to get actionable results this way because the feedback that matters is driven out by observing real behaviour.

People are saying X but doing Y in real life

In my experience, this comes up relatively frequently, especially when conducting user research on a financial or commerce-led product or website, and particularly on pricing questions like “How much would you pay for this?”

It’s tricky and often a mistake to try to get people to predict their future behaviour. Their answers tend to be inaccurate at best, wildly misleading at worst.

If this issue sounds familiar, consider running a smoke test where you simulate the availability of a feature or show a mock pricing page to real users. Use a tool like Hotjar to track whether they click the “Buy now” button next to the price you’re looking to charge, or whether they click a link to use the feature.
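
If you’d rather wire up the instrumentation yourself, here’s a minimal sketch of browser-side tracking for a smoke test. The #buy-now selector, the /api/smoke-test endpoint and the event name are all hypothetical placeholders for your own setup:

```typescript
// Record clicks on the mock "Buy now" button during a smoke test.
// navigator.sendBeacon posts the event without blocking the page.
function recordSmokeTestEvent(event: string, detail: Record<string, string>): void {
  const payload = JSON.stringify({ event, detail, ts: Date.now() });
  navigator.sendBeacon("/api/smoke-test", payload); // hypothetical endpoint
}

document.querySelector<HTMLButtonElement>("#buy-now")?.addEventListener("click", () => {
  recordSmokeTestEvent("buy_now_click", { price: "£9.99/month" });
  // Be upfront with participants afterwards: the feature isn't actually available yet.
});
```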

You’re not getting the actionable results you hoped for

As a researcher, you are not immune to assumptions. Sometimes you might think you know the answer, but your participants keep dispelling those initial assumptions. This could be down to a few reasons.

Firstly, perhaps your assumptions weren’t far off when they were formed (for example, if this is a follow-up or revised piece of research based on existing evidence). In this case, user needs may simply have changed since then.

Secondly, it could be that assumptions have been allowed to guide the business previously. You, your boss or whoever came up with those assumptions in the first place may just be... well... wrong. This is why evidence should always trump assumptions.

If neither of the above applies and you’re still not getting the results you hoped for, consider running another round of research to make doubly sure before you take any business-changing decisions. You could also blend your research methods: if you’ve been using just one form of research, try mixing two methods (for example, depth interviews alongside surveys) to check whether your findings hold up with a larger sample.

Back to my intro: if user research isn’t giving actionable results, do not panic! It’s likely to be something that can be easily fixed by working your way through these points until one resonates. If none does and you’re still stuck, get in touch and we can tackle the challenge together.