Cognitive Biases: A Guide to Identifying and Overcoming Biases in UX Research
If you work in UX, you’re probably surrounded by cognitive biases without even realizing it. You may even be unintentionally building them into your designs or research. This guide explores common cognitive biases in UX, and provides actionable tips to overcome them.
Chapter 1
UX cognitive biases: Why product teams should care
About the author
This guide was written by Melanie Buset, User Research Manager at Spotify. You can connect with Melanie on LinkedIn.
What are cognitive biases in UX?
Cognitive biases are your brain’s way of taking mental shortcuts to ease cognitive load and make quicker inferences and faster decisions. They show up in all facets of our lives, and we have little-to-no conscious awareness that they’re even there.
While cognitive biases usually start as a well-intentioned hack that lets your brain process an abundance of information quickly, they can lead to miscalculations about the world, resulting in assumptions, inflexible beliefs, and blind spots.
The most challenging aspect of tackling our own cognitive biases is that they can be so ingrained we don’t even realize we have them, or recognize when we’re acting on them.
Cognitive biases can pop up all over the place, including in the product world. This is why it’s so important to remain objective when conducting UX research or designing a product, and consider all sides of every story—even the sides that are opposite to yours.
One common example we often see play out in user research is confirmation bias, where people only take in information that confirms their existing beliefs. This might look like only listening to and sharing information that aligns with your assumptions, and ignoring any insight that may contradict them. Cognitive biases like this are, by their very nature, subconscious actions or beliefs—but they can have a huge impact on how we research, design, and build a product.
The origin of cognitive bias
Psychologists Amos Tversky and Daniel Kahneman were the first to introduce the term ‘cognitive bias’, defining it as “people’s systematic but purportedly flawed patterns or responses to judgment and decision problems”.
Tversky and Kahneman arrived at this definition through research showing that, because human beings face an immense amount of perceptual stimuli, information, and decisions, we form patterns in thinking and reasoning to speed up how we understand the world around us—albeit not always accurately. Simply put, cognitive biases are our brain’s way of taking shortcuts to make decisions and understand things faster, and those shortcuts can introduce inaccuracies.
Why are cognitive biases important in the UX research process?
As researchers and product creators, it’s crucial to stay objective in the UX research process: your key goal is to understand user behavior and bring truth to your team, by evaluating ideas and challenging assumptions.
To hold an objective mindset, it’s important not only to note your own biases, but also to watch for when others may be influenced by theirs. Otherwise, you could be baking assumptions into the process from the very beginning—as you set a research plan, collaborate with stakeholders, and define goals and research questions.
For example, if someone believes, without any evidence, that people from a specific market behave or present in a certain way, it may lead you to only recruit those specific people for user research. This is an example of sampling bias: recruiting or focusing on a subset of individuals while disregarding everyone who doesn’t fit that unconscious assumption.
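One practical guard against sampling bias is to recruit deliberately from every segment of your user base, rather than over-sampling the group you assume matters most. Here’s a minimal, hypothetical sketch of stratified recruitment from a participant pool; the markets, pool sizes, and quota are all invented for illustration:

```python
import random

random.seed(7)  # reproducible for the example

# Hypothetical participant pool: (participant_id, market) pairs.
# Note the pool itself is skewed toward one market.
pool = (
    [(i, "US") for i in range(60)]
    + [(i, "BR") for i in range(60, 80)]
    + [(i, "JP") for i in range(80, 100)]
)

def stratified_sample(pool, per_market):
    """Recruit a fixed number of participants from every market,
    instead of whichever group is easiest (or assumed) to reach."""
    by_market = {}
    for pid, market in pool:
        by_market.setdefault(market, []).append(pid)
    sample = []
    for market, pids in sorted(by_market.items()):
        sample.extend(random.sample(pids, per_market))
    return sample

participants = stratified_sample(pool, per_market=5)
print(len(participants))  # 15 participants: 5 per market, regardless of pool skew
```

The point of the sketch is that the quota is set per segment up front, so the skew in who is easiest to recruit never reaches the study.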
You may also ask the wrong questions if the team's assumptions aren’t properly documented:
If your recruitment plan and research questions are misinformed, your recommendations and takeaways will be too.
This trickles down and heavily impacts the final product: if you provide unsuitable next steps, the product will be designed and built within those guidelines, ultimately creating a sub-optimal product that could result in low engagement and dissatisfaction from users.
The purpose of user research is to identify what your audience wants and needs from a product, and to suggest, based on insights, the best way your team can deliver that. If cognitive biases impact any part of this process, they create a domino effect, causing misinformed decisions and mistakes from the very beginning through to the final product.
What is the impact of cognitive bias in user research?
The ultimate impact of cognitive bias in UX research is that it can lead your team to build the wrong things. When that happens, users become dissatisfied with your products, usage decreases, churn increases, and revenue drops. Let’s break this down.
Prioritizing the wrong problems
From the outset of a project, it’s important to understand the problem you’re trying to solve. Trying to understand the problem will often involve meeting with your stakeholders to learn what information you currently have, and what you need to uncover through research.
While anecdotal evidence has its place as qualitative data, you need to tread carefully. This can be one of the first instances where cognitive biases emerge—look for how stakeholders frame their rationale, and where they get the information for their conclusions. If someone is unable to reference past insights and identify why they hold the beliefs they do, then these might actually be assumptions, and should be challenged.
Remember 💙
We all have unconscious biases, so it’s important to approach everyone with respect and openness. These conversations are about identifying obstacles to overcome, not shaming or placing blame.
Here’s how this may show up: if someone recently read a report or had a conversation about a specific topic, it’ll be top of mind for them, so they might believe it should be prioritized over other research. Due to the clustering illusion, we can become fixated on something that was recently shared, rather than looking at the big picture.
The overarching objectives set out in your research strategy can help with problem prioritization—you may already have specific issues in mind, or you can utilize a traffic-light system to prioritize problems. Another way to address this bias is to determine the opportunity size that working on a problem could yield.
In other words, try to quantify how many people you could impact positively if you were to focus on this problem. Speak with your data experts to understand how to go about this, or dig up past insights to determine how often a problem arises and who it impacts.
By prioritizing the wrong problems, your product team can end up fixing things that users don’t think are broken, or releasing new features which won’t add any additional value.
Faulty research planning
One potential impact of cognitive biases is the knock-on effect they have on wider research plans. If biases mislead product teams into addressing the wrong problem, or implementing a feature that doesn’t meet a user need, you risk wasting time, money, and resources that could be used more impactfully.
In practical terms, this might mean asking your users the wrong questions or recruiting the wrong participants. When designing a research plan, you should always include information on:
- Objectives
- Methodology
- Assumptions & hypotheses
- Research questions
- Participant recruitment
- Reporting
- Study guide or script
- Success metrics (should also be noted in your hypothesis)
If, at this stage, your plan is influenced by cognitive biases (such as selection bias, which could lead you to suggest the wrong participants or recruitment methods), it will have a direct knock-on effect on your research and skew results.
Just as it’s important to run usability tests with all areas of your user base and use inclusive research principles, it’s important to ensure your cognitive biases don’t impact recruitment or planning. This avoids certain user groups being excluded from the conversation, and therefore being left out of any considerations for product developments.
As you create your study guide or testing script, you want to be mindful of the tasks and questions you ask, and how you ask them. Asking the right research questions really is an art form, and that’s partially because biases can easily affect how we ask them. Some types of cognitive biases to look out for when designing questions are the framing effect, confirmation bias, and anchoring bias—we’ll get into these more (and how to combat these biases) in the next chapters.
Tip 💡
Check out our guide to asking unbiased questions for tips and tricks to combat biases like this
A research plan is crucial when designing a research study, as it brings the team together to air out and bring awareness to any biases from the onset. It also provides an anchor to refer back to at any point in the process where biases might try to creep back in (such as during recruitment or when analyzing results). Without an objective, accurate research plan, biases can end up derailing the goals of the research and impacting the results, which can ultimately lead to misinformed product decisions.
Misinformed conclusions
How product teams report insights can impact how those insights are received and interpreted; reporting and delivery is what carries decisions and research forward. Poorly-presented information can lead to misinterpretations about participants, the health of the product, and even the wider business. Despite best efforts, it’s always possible for findings to be misinterpreted—you can’t control others’ interpretations, but this is all the more reason to be on the lookout for biases, and be conscious in how insights are delivered.
Sidenote: Remember that even the best-reported insights will be inherently flawed if the wrong problems were focused on in the first place (as discussed above).
Common biases that occur when presenting results include confirmation bias, the framing effect, and the clustering illusion. Confirmation bias can play out when certain behaviors observed during testing are ignored and go unreported because they didn’t fit the researchers’ expectations or assumed narrative. This bias can be challenging to detach from for product managers, designers, founders, and anyone heavily invested in the product.
The framing effect can also show up in how findings are cataloged—for example, you may write “40% of users were unable to find what they were looking for when landing on our homepage”. At first glance, 40% sounds like a staggering statistic, and you’d conclude the homepage should definitely be redesigned. However, if you rephrased this as “60% of users had no issues finding what they needed on the homepage”, it would likely receive a different reaction, with people celebrating that over half the participants experienced no problems.
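It’s worth noticing that both statements above describe the exact same data, just framed from opposite directions. A tiny sketch with hypothetical counts makes that explicit:

```python
# The same usability-test result, framed two ways.
# Counts are hypothetical, chosen to match the 40%/60% example.
participants = 20
failed = 8  # couldn't find what they were looking for on the homepage

negative_frame = failed / participants                    # 0.40
positive_frame = (participants - failed) / participants   # 0.60

print(f"{negative_frame:.0%} of users were unable to find what they needed")
print(f"{positive_frame:.0%} of users had no issues finding what they needed")
```

Because the two percentages always sum to 100%, the choice of which one to lead with is purely a framing decision, which is exactly why it deserves conscious attention when reporting.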
The language used and how a narrative is crafted can dramatically influence how data is interpreted. You always need to be aware of how your wording might be subtly impacting the way something is read. In chapter three, we’ll get into how you can do this.
Common cognitive biases
It’s estimated there are over 180 types of cognitive biases. Which ones show up can vary depending on the context: who you’re working with, what your product is, and even where in the world you are. But when working in UX, here are some of the most common cognitive biases to look out for.
- False consensus bias: This bias is responsible for the belief that your behaviors or opinions are more common (and likely to be shared) than they might actually be. It’s especially likely to rear its head if you work on a product that you also use frequently—you can start to think or believe the issues you face are those of the general population as well.
- Serial-position effect: This is a psychological phenomenon that occurs because we are often more likely to remember information presented at the beginning or end of an experience. It’s especially prevalent during analysis, when the majority of what is remembered is from a session’s beginning or end.
- Frequency bias: The belief that something is more common than it actually is, because you’ve recently started to notice or pay attention to it. This might occur after a new strategy or company goal is presented: you become more aware of insights or problems related to that strategy or goal, and think the problem is bigger than it is, simply because it’s receiving more attention than before.
- Social desirability bias: The tendency to respond or behave in a way you believe will be favorable to someone. This often happens during qualitative UX research, both in unmoderated research, such as survey responses, and in moderated research, such as interviews. The participant might begin responding to questions in a way they believe would make them more likable. This can result in inaccurate data, and often leads to an exaggerated amount of positive feedback on the product you’re investigating.
- The peak-end rule: The tendency to judge an experience largely by how it felt at its most intense point (the peak) and at its end, rather than recalling the entire experience end-to-end. This bias often presents in user interviews, or during an insights shareout presentation where certain parts are recollected in more detail. It also compounds with the serial-position effect, making it more likely that the middle of an experience is mostly forgotten.
- Framing bias: The tendency for responses to differ depending on whether information was presented in a positive or negative light. This effect is particularly prominent when asking questions in interviews and surveys, or when presenting information such as research findings.
- Confirmation bias: This bias is in action when we seek out and assign value to information that confirms our already-formulated beliefs. When conducting or reading research, it can be easy to craft a narrative by subconsciously cherry-picking data to match existing beliefs.
Recognize and control cognitive bias
Cognitive biases can show up in all areas of UX, but they’re particularly important to call out during research. Remember to watch out for the ways biases can creep in while planning or reporting results, and hold yourself and your colleagues accountable. Consider things such as framing questions objectively, running unmoderated studies to avoid researcher influence, or using a reliable research tool, like Maze, to mitigate bias—more on this in chapter three.
Ensuring research is as honest and accurate as possible is the only way we can truly depend on results to move our products forward in the right direction. Now, let’s learn more about different cognitive biases and how to combat them in the following chapters.
Frequently asked questions about cognitive biases in user research
What is cognitive bias in UX?
Cognitive biases are your brain’s way of taking shortcuts to make quicker inferences and decisions. They show up in all facets of our lives, and we have little-to-no conscious awareness they’re even there. While cognitive biases usually start as a well-intentioned hack that lets your brain process an abundance of information quickly, they can lead to miscalculations about the world, resulting in assumptions, inflexible beliefs, and blind spots.
How can cognitive biases influence the UX design process?
Cognitive biases can influence the UX design and research process in many ways: by making us prioritize the wrong problem, recruit the wrong research participants, or focus on the wrong data. Ultimately, cognitive bias in UX research can lead your team to build the wrong things, leaving users dissatisfied with your products and resulting in decreased usage, increased churn, and reduced revenue.
What is an example of cognitive bias in UX?
One common bias we often see play out is confirmation bias, where people only take in information that confirms their beliefs. This might look like only listening to and sharing information that aligns with their assumptions, and ignoring any insight that may contradict them. For example, the person conducting user research may only report findings that match their personal experience, or the narrative they’re trying to push forward. Cognitive biases like this are, by their very nature, subconscious actions or beliefs.