Fundamentally special: why research is an indispensable specialisation in UX

Designers can make good researchers, but ideally design and research should be done by separate people.

Recently, I’ve seen a few UX job listings that suggest designers need to lead research studies to inform their designs. These job ads want designers to have prior experience recruiting participants, conducting usability tests, writing reports, and handling analytics. Is it fair (or even possible) for UX designers to do all these things well in a fast-paced environment?

Having worked as a product designer and then a design researcher, I’m deeply convinced that research skills shouldn’t be shoved into a product designer’s job description. And this isn’t just because designers already have lots on their plate. 

The main benefit of having a specialist UX researcher on your team is having someone who applies both technical and soft skills critically to reduce bias during discovery and strategy formation.

Researchers are less susceptible to unconscious bias

Should the person who’s responsible for solving the problem also be the one who’s defining the problem?

While this might seem more efficient, you risk hearing only convenient truths during the discovery phase. Designers and product managers are often much more familiar with the technical domain and constraints than researchers, so they’re more likely to have solutions in mind even before talking to the first research participant. If we want to hear user needs clearly and objectively, we need to minimise unconscious bias.

This is why the research should be done by, well… researchers.

We have no stake in solution delivery and implementation. As a researcher, I actually don’t care how customer problems will be solved and what impact that has on timelines or the design system. This allows my brain and heart to be fully focussed on listening to and empathising with the participant. I can record accurate observations more easily because I’m not evaluating their responses against design concepts and build effort. These practical realities tend to enter the researcher’s world only at the end of the analysis phase, when we’re trying to translate the customers’ needs and perspectives into the business context.

The value of a research mindset

Eliciting, recording, and interpreting data properly is what happens when you apply a research mindset. But the benefits are broader than that. A research mindset helps us make evidence-based decisions – rather than relying on our gut feelings.

When we’re analysing and synthesising data, it’s important to know how those responses were generated or interpreted. This is especially true when we’re looking at data that another researcher produced. We ask questions like:

  • What were participants responding to when they gave these answers?
    • Were participants asked leading questions? 
  • What’s the sample size of this study?
  • How were these statistics or analytics calculated?
    • Is this the best way of expressing how participants responded?
  • How were these participants recruited?

Taking the time to ask these types of questions lets you gauge how much confidence you should have in the dataset. People whose roles are motivated by moving into delivery as soon as possible might be tempted to skip this critical questioning phase.

It takes time

Developing a solid research mindset doesn’t happen after running a few interviews or reading some articles. I see a research mindset as a combination of technical and soft skills that are sharpened by constantly handling practical and theoretical details that are specific to the field of research. So it makes sense to use a dedicated expert if you want the benefits of a research mindset.

A major barrier to developing a research mindset is a lack of time to gain mastery over these bias-busting skills. I found this to be especially true in the world of consulting. 

In my first strategy job, I had a superficial understanding of qualitative research. I didn’t really know why we did things the way we did. I didn’t know how to check if I had introduced problematic biases into our recruitment process, interviewing methods, or analysis of data. I was just doing what felt convenient and efficient. Even though I wasn’t the designer who would go on to implement anything, I still fell into the trap of making heavily biased decisions because:

  • I lacked guidance from a research specialist
  • the time-based agency model disincentivised asking difficult foundational questions

Different pillars, one foundation

Given that I moved into the research world partly due to my strategy background, I wanted to finish this article by reflecting on how my understanding of ‘strategy’ has expanded over the last few years and how research connects with it. 

I used to think strategy was merely about articulating creative concepts succinctly. Once I started working as a strategist, my understanding of ‘strategy’ grew beyond brand and advertising strategy to include:

  • experience and innovation strategy (high-level UX and product work)
  • media strategy (what channels to advertise on, what keywords to use)
  • content strategy (structuring and governing digital content, deciding what to publish and when/where/for whom)

And since leaving my first strategy role, my world grew bigger still as I encountered people from fields such as: 

  • corporate strategy (which informs management models and major commercial decisions like goals for the financial year, mergers, and acquisitions)
  • product strategy (which is concerned with market sizing, where to play, and how to win at a more granular level)
  • marketing strategy (which takes into account pricing and product-market fit, among other factors)
  • … and strategy in other areas less relevant to UX, such as environmental sustainability, political parties, industry-specific regulations, and urban design

Practitioners in these spaces often need to communicate complex things in simple ways, so they’ll use artefacts like massive spreadsheets, journey maps, and a plethora of frameworks (e.g. PEST analysis, MECE). But what they all rely on is accurate, relevant information. 

And this is where research fits in. Strategy with substance always needs to be based on evidence. Brand, media, product, and corporate strategy may look like discrete pillars, but there is one foundation they all need to stand on – research.

Persuasively articulating the what, why, and how of goals, plans, and recommendations requires evidence. Even the most beautiful story will fall flat if the data isn’t robust or has been interpreted with bias. And for commercial contexts, you’re going to need stories with scale; quantitative data needs to be woven into the narrative just as carefully as qualitative data.

Looking back on where I started, I’m once again convinced I made the right decision to leave my first strategy job when I did. It was a place where research as a craft wasn’t taken seriously. And the more I encountered other types of strategy, the more I outgrew their brand-centric view of the world. Believing that proficiency in only one area of strategy is enough to solve your client’s most pressing problems is like calling yourself a professional swimmer when you’ve only ever waded in the kiddy pool. That was fun for a moment – but I quickly realised I needed more depth. 

It’s a complex world out there

I’d like to end by affirming that researchers don’t have an exclusive claim on providing strategic value. I’ve worked with some excellent service designers, product managers, and design leads who’ve shown they’re capable of undertaking systematic inquiry and generating insight. The traditional double diamond is getting fuzzier and fuzzier because, in the real world, discovery and delivery aren’t discrete independent phases.

But it’s important to have specialist UX researchers in the picture because bringing delivery-oriented thinking into the process too early introduces a lot of dangerous bias. Yes, it’s possible to ship products without dedicated researchers. But even a test-and-learn approach will require the team to systematically collect and analyse the feedback that comes in after release.

I’m not asking for researchers to get a seat at the table. But I am asking for people to look underneath the proverbial table. There, you’ll see that researchers are the legs that allow the table to stand grounded in reality – not bias.

Disclaimer: The opinions expressed here are my own and do not reflect the views of my employer.
