Analysis & Methods

How to triangulate data from multiple sources in user research

The more the merrier: de-risk decision-making by leaning on multiple data sources for your insights.

6 minute read

Triangulating data was something I learned in academia. It was always essential to have data from multiple sources to confirm findings. Just interviews or survey data wasn’t enough for my academic research. 

However, when I transitioned to user research, I didn’t consider this concept much, especially at the beginning of my career. I learned new methods and ways of working, and triangulation, honestly, fell by the wayside. I conducted and presented usability test results or insights from qualitative interviews. While I battled the “only five participants?!” question, I didn’t think to include other data sources to back up my results or mitigate this concern. I believed a larger sample size was the answer.

But, there was a limit. I couldn’t always speak to more people, especially when teams needed results quickly. And, for a while, I felt stuck. 

One day, I was grappling with a terrible experience in one of the products I worked on, but we didn’t have clear quantitative data to show any issues. So I decided to take a peek at our reviews, and I hit gold: many of the negative comments were about the feature in question. That moment reminded me of the power of multiple data sources.

Types of triangulation in user research

There are a few ways to triangulate data:

  1. Data triangulation: Gathering data from different times, spaces, and people, such as a longitudinal study or comparing data from different locations.

  2. Investigator triangulation: Including multiple researchers in collecting and analyzing data and comparing code sheets or findings.

  3. Theory triangulation: Involving different theoretical frameworks, such as testing a variety of hypotheses behind motivation.

  4. Methodological triangulation: Using varying methods to approach the same topic, such as a survey and interviews in the same study.

We generally focus on methodological and data triangulation for user research. However, I have also seen investigator triangulation on larger teams and theory triangulation at hypothesis-driven companies. Nevertheless, methodological and data triangulation are the easiest and most common ways to approach triangulation in user research.


When to use triangulation

Once I remembered the power of triangulation, I wanted to use it all the time. Soon, I was trying to triangulate data for every single study, much like my early-career urge to test everything with research. But I couldn’t triangulate everything, and it wasn’t necessary to.

I then shifted my mindset to understand when triangulation was necessary and when I could let it go. For a while, I based the decision on time: did I have enough time and resources to triangulate my research? But I slowly moved away from resources and toward risk, asking myself how significant the risk would be if we made decisions based solely on one method or data source.

Finally, I used triangulation as a way to mitigate risk. After some time, I realized that triangulation was helpful in the following situations:

  1. To see a complete picture of the research problem or users, gaining multiple perspectives into how or why someone is thinking, feeling, or acting in a certain way.

  2. To enhance the validity of my study by combining complementary methods and mitigating biases or limitations from using only one method.

  3. To give insights more credibility by cross-checking other sources to see if the data lines up.

These situations allow me to reduce risky decision-making, especially if the results are confusing or fuzzy.

How to triangulate data in user research

There are quite a few ways to use triangulation in user research, and most of these sources are available to any researcher. I mainly triangulate by checking sources like:

  • Customer support tickets

  • Reviews of the product 

  • Analytics/product usage data 

  • Stakeholder interviews, such as speaking with account managers

  • Mixed methods approach, such as using a survey to follow up on qualitative interviews

  • Using multiple metrics to assess usability (e.g. time on task, task success, number of errors)

  • Intercept surveys (e.g. SUS)
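
The SUS mentioned in the list above has a standard scoring formula worth automating: each odd-numbered item contributes (rating − 1), each even-numbered item contributes (5 − rating), and the total is multiplied by 2.5 to land on a 0–100 scale. Here’s a minimal Python sketch — the function name and sample ratings are mine, not from the article:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 ratings.

    Odd-numbered items (positively worded) contribute (rating - 1);
    even-numbered items (negatively worded) contribute (5 - rating).
    The summed contributions are scaled by 2.5 to a 0-100 range.
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten ratings, each between 1 and 5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # items 1,3,5,... sit at even indices
        for i, r in enumerate(responses)
    )
    return total * 2.5

# One participant's ten ratings (illustrative data)
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # prints 85.0
```

Averaging these per-participant scores gives the platform-level number you would then triangulate against other sources.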

Let’s look at some concrete examples of implementing triangulation into your next study for each situation.

To see a complete picture 

About a week ago, we put the System Usability Scale up on our platform to assess the platform’s overall usability and satisfaction. Unfortunately, we received a few low scores regarding the platform’s ease of use. However, we had no idea why users rated it so poorly. So we spoke to account managers to understand any issues from their perspective, and they gave us some areas of concern that several users had brought to their attention.

We decided to run a usability test to better assess the underperforming areas of the platform. With this test, we uncovered the root of the issues and why users faced them. This approach helped us discover the what and the why, giving us a more holistic picture of the problem.

To enhance the validity

We ran a larger-scale persona study where we interviewed 15 people from one of our segments. There was an influx of insights, ranging from needs to pain points, and we had a lot of qualitative data. However, we had no idea how this data generalized to the larger population or how to prioritize it.

To increase the study’s validity, we sent a follow-up opportunity gap survey to assess the importance of each insight and users’ current satisfaction with it. This survey helped us determine which insights mattered most to our users and where they were least satisfied. It also allowed us to better prioritize the qualitative data as we shaped it into a persona.

We also combined this with data analytics to confirm areas that users cited as pain points.
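
The article doesn’t spell out how the opportunity gap survey was scored, but a common formulation (from outcome-driven innovation) is importance plus the shortfall of satisfaction below importance, floored at zero. A sketch with illustrative ratings — the insight names and numbers below are made up, not from the study:

```python
def opportunity_score(importance, satisfaction):
    """Opportunity = importance + max(importance - satisfaction, 0).

    Both inputs are mean ratings on a 1-10 scale; higher scores flag
    insights that matter a lot but currently satisfy users poorly.
    """
    return importance + max(importance - satisfaction, 0)

# Illustrative (importance, satisfaction) ratings per insight
insights = {
    "faster report exports": (8.6, 4.1),
    "clearer onboarding": (7.2, 6.8),
    "dark mode": (4.5, 3.0),
}

# Rank insights from biggest opportunity to smallest
ranked = sorted(
    insights.items(),
    key=lambda kv: opportunity_score(*kv[1]),
    reverse=True,
)
for name, (imp, sat) in ranked:
    print(f"{name}: {opportunity_score(imp, sat):.1f}")
```

The ranked list gives a defensible ordering to carry into persona prioritization, which you can then sanity-check against analytics as described above.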

To give insights more credibility 

We focused on usability testing a newly developed feature for this study. Unfortunately, we didn’t have much time, so we chose a segment to focus on and tested with seven participants from that segment. We found that four out of seven participants struggled with a particular task. However, this data did not feel conclusive enough to justify any significant changes.

We decided to look at the analytics data gathered during the few weeks the feature had been available. Through this data, we found a steep drop-off at the same point in the experience where we had seen participants struggle in the usability test. In addition, we reached out to customer support and learned they had recently received a high influx of tickets complaining about this feature.
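
A drop-off check like this is just step-to-step division over funnel counts. A minimal sketch, with made-up counts standing in for the real analytics:

```python
# Users reaching each step of the feature flow (illustrative counts)
funnel = [
    ("open feature", 1200),
    ("configure options", 1050),
    ("review summary", 980),
    ("confirm", 310),  # hypothetical step matching where 4/7 test participants struggled
]

# Drop-off into each step = 1 - (users reaching it / users at the previous step)
drop_offs = {}
for (step, users), (next_step, next_users) in zip(funnel, funnel[1:]):
    drop_offs[next_step] = 1 - next_users / users
    print(f"{step} -> {next_step}: {drop_offs[next_step]:.0%} drop-off")
```

A single step losing well over half its users, at the same point the qualitative test flagged, is exactly the kind of converging evidence that makes a finding credible.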

There are infinite ways to use multiple sources and methods to triangulate your data. I highly recommend trying it in your next study, especially if you are in one of the above situations and trying to mitigate risk!

About the contributors
Nikki Anderson
User Research Lead & Instructor
Human problem detective and dog-petter.
Curated in Sydney and consumed globally: Method in Madness explores the craft and challenges of understanding people while working in the madness of organizations worldwide.
© Dovetail Research Pty. Ltd.