The way we understand the world is hugely influenced by factors that we don't even think about.
As Daniel Kahneman explains in his best-selling book, Thinking, Fast and Slow, many of our decisions are formed quickly based on automatic but unnoticed impressions and feelings. This process helps us make the instinctive decisions we often need to survive.
But it can also hurt us when we need to make calculated decisions that will have long-term effects. In product development, this slow, long-term thinking is critical.
Adjusting the way we think requires that we first acknowledge and understand our automatic—but inexact—way of viewing the world. Below, we'll explain the two systems of thought that Kahneman describes in his book and show how we can hack our own psychology and use scenario planning and process mapping to make smarter and more successful decisions.
The two systems of thinking that we use every day
According to Kahneman, humans are “of two minds”: one fast and one slow.
You're already familiar with both fast and slow thinking because you experience both every day. To illustrate this, look at this photograph:
In an instant, you could probably tell that this baby is happy. It's a general conclusion, but it's accurate. It was as easy for you to determine this as it was for you to realize the towel over the baby's shoulders is yellow. Figuring it out didn't feel like work. This is what Kahneman describes as fast thinking.
On the other hand, take a look at the math problem below:
You could make an immediate guess, but it would probably be wrong. The precise answer doesn't easily jump to your mind. To find the exact answer, you'd have to take measurements and perform calculations. You'd have to choose whether or not to engage in figuring out the answer. This is what Kahneman calls slow thinking.
To further distinguish between these two modes of fast thinking and slow thinking, Kahneman defines them as “System 1” and “System 2.” These systems are distinct and often conflicting. The automatic thinking of System 1, which recognizes a baby's happiness on first glance, is instinctual and helps us survive. But the slow, analytical mode of System 2 is what we need in order to work through complex problems and make long-term, rational decisions.
For System 1 to operate quickly and automatically, it must rely on heuristics, or mental frameworks. These are often shaped by our psychological biases, feelings, and impressions, not by facts or reasoning.
Engaging System 2 is a decision, and it takes significantly more effort than using System 1. This is why we often default back to System 1.
Relying on System 1's heuristics to make decisions often yields an imprecise conclusion that works only as a quick, short-term fix. But by understanding these automatic heuristics, you can take steps to engage System 2 and make slower, more logical decisions with more fruitful long-term results.
Below, we'll walk through three of the most pertinent heuristics that Kahneman describes in Thinking, Fast and Slow. We'll explain how staying in System 1's mode of thinking can lead to unsustainable decisions, and offer solutions to counter this way of thinking and steer towards System 2.
Heuristic #1: Priming
One of the key ways in which System 1 thinking affects our decisions is cognitive priming: the shaping of our judgments by ideas our brains have recently been exposed to.
Priming affects the way we think and act because it encodes a stereotype into our thinking. Once that's in place, it's hard to break our thinking away from that stereotype. For example, people who were primed with the idea of a professor before a test showed higher confidence in their knowledge and performed better on the test. People who were primed with the idea of an athlete before an athletic activity showed higher physical persistence.
Priming is a result of System 1 thinking. We already have material in our recent memory that helps us fill in the blanks and ease uncertainty in how we should think and feel.
Imagine you've spent the weekend reading about minimalistic UI design. When you come into work on Monday, your team meets to lay out your quarterly product roadmap. You might suggest making changes to simplify your UI, even though that may not be the most pressing issue with your product.
With the understanding that we never make decisions in a vacuum, you can take steps to control System 1 and engage System 2 to gain objectivity:
- Recognize the influences that are affecting your thoughts and decisions. These include recent conversations and experiences, books you're reading, shows you're watching, et cetera. Identifying these influences makes it easier to recognize their contributions to a decision.
- Try to explain your line of reasoning to someone else. This will help you realize if there are points in your reasoning that can be attributed to bias or impression. Then consider whether your decision might change if that bias were eliminated.
- Consider the counter position when making an important decision. Forcing yourself to consider an alternate decision, imagining the outcomes, and reflecting on the pros and cons exposes your brain to new ideas so that you don't make a final decision by default.
Making this effort helps you recognize the effects of priming and, when necessary, counter these effects to ensure you're making thoughtful and rational decisions.
Heuristic #2: Substitution
Another way in which we automatically use System 1 to make our lives easier is by substituting difficult questions with simpler ones.
Typically, we don't grapple with big-picture questions on a daily basis. We substitute the more important but more difficult questions with less important, small-scale questions, which helps us find quicker and more concrete answers.
This can be a good thing because fine-grained questions and answers help us to move quickly through tasks throughout the day.
The trouble with our tendency to make these substitutions is that we do it even when the harder question should be answered.
For example, consistently high abandonment on a checkout page might cause your team to ask, “Where should we move the checkout button to increase engagement?” This will prompt an answer like, “move the checkout button to the top center.” This question allows for a solution that can be completed and checked off, so you can move on with your day.
But the more difficult question might be, “What is confusing users about the process of checking out?” or “How can we streamline the checkout flow to make it easier for users?” These questions will require more thorough analysis of user behavior, but address the issues at the heart of cart abandonment.
Seeking out the more difficult questions will engage System 2, meaning it takes more effort but has a higher payoff. You can start with these steps:
- Recognize where you're making dangerous substitutions: According to Kahneman, one of the defining attributes of this heuristic is that we swap questions “usually without noticing the substitution.” To notice the substitution, you have to be willing to accept complexity. Once you've identified a complex question, you can break it down to solve it—but recognize that these easier questions are all just pieces of a larger problem.
- Use the Five Whys method: Sakichi Toyoda, the founder of Toyota Industries, developed this problem-solving technique. When you're confronted with a problem, ask "why" five times, letting each answer become the subject of the next "why." This helps you identify the root problem, even if you don't start by asking the most difficult question.
- Ask for several perspectives on a problem: The “easier” question that you're asking yourself is shaped by your specific biases. But if you ask for a teammate's perspective on the problem, you'll get an easier question shaped by their biases. Combining several easier questions can help you identify and work towards solving the underlying, more difficult one.
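To make the Five Whys method concrete, here is a minimal sketch in Python. The chain of causes below is hypothetical, invented to echo this article's checkout-abandonment example; the function simply walks from a symptom toward a candidate root cause by repeatedly asking "why."

```python
def five_whys(problem, explain, depth=5):
    """Repeatedly ask 'why' to walk from a symptom toward a root cause.

    `explain` maps each statement to its immediate cause, returning
    None when no deeper cause is known.
    """
    chain = [problem]
    for _ in range(depth):
        cause = explain(chain[-1])
        if cause is None:
            break
        chain.append(cause)
    return chain  # the last entry is the candidate root problem

# Hypothetical cause chain for the checkout example in this article.
causes = {
    "Users abandon the checkout page": "The checkout form feels long",
    "The checkout form feels long": "It asks for shipping and billing twice",
    "It asks for shipping and billing twice": "The form was never updated after adding saved addresses",
}

chain = five_whys("Users abandon the checkout page", causes.get)
```

Note that the loop stops early once no deeper cause is known, mirroring how the method works in practice: five is a guideline, not a quota.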
Constantly avoiding the more difficult question leads you to treat the symptoms rather than the disease. Recognize where you're making the swap and take the time to dig for the more difficult question to break this habit.
Heuristic #3: Theory-induced blindness
Our brain's tendency to cling to mental models holds huge potential for error in making decisions. Kahneman writes, “Once you have accepted a theory and used it as a tool in your thinking, it is extraordinarily difficult to notice its flaws.”
After we've accepted a theory and come across something that doesn't fit, we don't assume the theory is wrong. Instead, we assume there's a connection (that we just aren't seeing) that will make the observation fit the theory. Consider this scenario: you've believed for your entire life that all flowers are red, and one day you come across a blue flower.
You're more likely to assume that something is wrong with the blue flower than to assume that something is wrong with your underlying theory.
This is especially true on teams. The organizational complexity that comes along with operating within a team requires that the team adopt universal frameworks. Trouble arises when the paradigms outlive their validity—like when your perceptions of customers' needs are based on a market that has since evolved, or when your understanding of user behavior is based on your early-access beta users.
Once you take off the blinders of an outdated theory, new observations become obvious. Here are some steps to get out of this mode of System 1 thinking:
- Identify the mental models you and your team use to make decisions: These include decisions about design, use-cases, and product management. Think about when you established these theories and the conditions under which you established them. For example, were they set up hurriedly at the start of your company?
- Determine how those conditions have changed: Changing conditions include an increased team size, increased competitors, or developments in customers' needs due to new technology. When conditions have significantly changed, your theories are most likely outdated.
- Create a system to constantly reevaluate your team's mental models: Models will continue to become outdated as your company evolves. You need to create a system of checks so that you never end up clinging to old paradigms. Try building a quarterly evaluation of your models into your board meetings to make sure you maintain awareness.
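As an illustration of that quarterly check, the reevaluation system could start as nothing more than a dated register of assumptions. The models and dates below are hypothetical, not drawn from the article:

```python
from datetime import date, timedelta

# Hypothetical register of team assumptions with last-reviewed dates.
assumptions = [
    {"model": "Customers buy primarily on mobile", "last_reviewed": date(2023, 1, 15)},
    {"model": "Beta users represent our whole market", "last_reviewed": date(2024, 6, 1)},
]

def due_for_review(register, today, interval=timedelta(days=90)):
    """Return the models that haven't been reviewed within the interval."""
    return [a["model"] for a in register if today - a["last_reviewed"] > interval]

stale = due_for_review(assumptions, today=date(2024, 7, 1))
```

Even a sketch this simple forces the useful question: when did we last check whether this assumption still holds?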
Recognizing the theories you rely on in your day-to-day and your big-picture work will make some of their flaws immediately obvious. As with many of Kahneman's heuristics, awareness of these theories is the first step in overcoming our biases.
We are blind to our blindness
Kahneman writes that one of the biggest failures of our reliance on System 1 thinking is that "we can be blind to the obvious, and we are also blind to our blindness."
But slowing down and stopping to consider that we might be wrong runs contrary to our ideas of efficiency and productivity, especially in a fast-paced company environment.
Making more rational decisions means understanding the benefits of arduous effort and accepting less immediate outcomes. In a world of faster and easier, get comfortable with thinking slowly.