
Cognitive biases that impact your team’s performance
By Eva Andrijcic, Assistant Professor of Engineering Management
Psychologist Daniel Kahneman’s 2011 book “Thinking, Fast and Slow” has become a favorite of many of us who are interested in how cognitive biases impact our ability to make decisions, especially under constrained and uncertain conditions [1, 2]. According to Kahneman, cognitive heuristics are mental shortcuts that allow us to simplify complex thought processes and reach a decision faster and with less mental investment. Cognitive biases are the resulting effects of taking such shortcuts. It may be surprising to learn that cognitive biases are not always bad, but they often lead to poor decision making, and research has shown that both laypeople and experts are prone to them. Why do we use them, you may wonder, when many of us have been trained to carefully process data and information, clearly state and test our assumptions, and consider problems from a holistic perspective?
Kahneman (who won the Nobel Memorial Prize in Economics) and his late collaborator, psychologist Amos Tversky, hypothesized and demonstrated that people unconsciously employ a variety of cognitive heuristics in specific situations:
- When they have too much information to process, or conflicting information;
- When they lack meaning/context;
- When they have to operate under time constraints;
- When they lack the mental capacity to process all of the available information.
Consequently, humans remember “representative” examples and extremes; they generalize, make assumptions, and create patterns, all of which allows them to make decisions faster. While this ability to react quickly was, from an evolutionary perspective, necessary for the human species to survive, it often causes us to make sub-optimal decisions that can have significant strategic implications.
When people learn about cognitive biases, their typical response is, “Yes, I know others are prone to these biases, but I am not, or not to the same degree,” which is a bias in itself (the bias blind spot). In fact, most biases operate subconsciously, so people don’t even know they are using them unless they have been trained to notice them.
A few months ago, while teaching a professional development seminar on cognitive biases to graduate students and working professionals, I decided to illustrate the degree to which we all (even experts!) fall prey to these cognitive biases.
I gave all of the seminar participants a 10-minute quiz containing 13 questions based on some of the original questions developed by Kahneman and Tversky. The results stunned the participants, especially the experienced ones; they were shocked by how easily I could focus them on a specific value. For example, among the questions they were asked were:
- Is the percentage of African nations among members of the United Nations larger or smaller than X? What is your best guess for the percentage? (For one group X = 10, for the other X = 65)
- Was Gandhi more than Y years old when he died? What is your best guess for how old Gandhi was when he died? (For one group Y = 144, for the other Y = 35).
Obviously, few people other than serious trivia connoisseurs would know the correct answers to these questions, so participants had to guess. What they didn’t realize was how strongly the initial anchor value I gave them (10 or 65, and 144 or 35) would influence their guesses. The average guesses for the first question were 18% (for those who saw the value of 10 in the question) and 46% (for those who saw the value of 65). Similarly, the average guesses for the second question were 80 (for the group that saw the 144 reference value) and 70 (for the group that saw the 35 reference value). Given that participants were randomly assigned to the two groups, these results are quite striking!
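If you want to quantify this effect with your own group, one simple approach is to split respondents by the anchor they saw and compare the average guesses. Here is a minimal Python sketch of that analysis; the response values below are hypothetical placeholders, not the actual seminar data.

```python
# Minimal sketch: compare average guesses between two anchor groups.
# The guesses below are hypothetical placeholders, not real seminar data.
from statistics import mean

# Each entry: (anchor shown to the respondent, respondent's guess in %)
responses = [
    (10, 15), (10, 20), (10, 12), (10, 25),   # low-anchor group
    (65, 40), (65, 55), (65, 45), (65, 50),   # high-anchor group
]

for anchor in (10, 65):
    guesses = [g for a, g in responses if a == anchor]
    print(f"Anchor {anchor}: mean guess = {mean(guesses):.1f}% (n = {len(guesses)})")

# A large gap between the two group means, given random assignment,
# is the signature of the anchoring and adjusting bias.
```

Because respondents are randomly assigned, any sizable difference between the two means can be attributed to the anchor itself rather than to differences in what the groups know.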
This is a nice example of what Kahneman and Tversky called the anchoring and adjusting bias: we get stuck on the anchor (reference value) and fail to adjust away from it adequately to account for the relevant contextual information. The anchoring and adjusting bias can significantly impact teamwork, since a reference point or idea suggested at the start of a project can limit the scope of options considered later in the development process.
Several other important cognitive biases can impact how a team functions and the direction of any team meeting and project. Consider the following, and how they may determine your team’s direction and focus:
- Sunk cost fallacy, where people stay (or become more) committed to a project in which they have invested time, money, or effort, without realizing that any previous investment is unrecoverable and should therefore not impact future commitment;
- Champion bias, where team members evaluate a proposed idea based on the prior successes of the person championing the proposal, rather than based on any project-related facts;
- Sunflower bias, where lower-ranking team members might be less likely to speak against a dominant idea if a more senior person speaks first (especially one that they might report to);
- Shared information bias, where team members spend far more time discussing information everyone already shares than sharing information that only they have access to and that might advance the project further (this is often caused by a perceived lack of psychological safety in the group);
- Groupthink, often accompanied by an illusion of unanimity, where the team wants more than anything to maintain group unity, so dissenting opinions go unvoiced and the majority view is assumed to be unanimous, even when it is not;
- Confirmation bias, where individuals and teams focus primarily on information that reinforces their preconceived notions and fail to adequately account for information that might challenge those notions.
This is just a small sampling of the many biases that might impact a team’s performance, from selecting team members, to pitching ideas to larger groups, to coming up with initial estimates or directions (for a more comprehensive list see [1]).
So, the next time you meet with your team, try asking your teammates to answer a few simple questions (for examples, see [1, 3]), without the use of calculators or any references, and see to what extent they fall prey to certain biases. This awareness might just be the key to your team’s long-term success. Recognizing the biases that are present is the first, and instrumental, step in reducing their impact on your decision-making process. It can be very empowering to recognize that everyone in the room (regardless of professional or life experience) is subject to largely the same cognitive biases, and this recognition can lead to more structured approaches to decision making in a team environment. There are many existing debiasing techniques that aim to minimize the impact of cognitive biases on team decision-making processes, but we’ll leave those as a topic for a future post.
References:
1. Kahneman, D. (2011). Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.
2. Holt, J. (2011). Two Brains Running. The New York Times Sunday Book Review. Available at: http://www.nytimes.com/2011/11/27/books/review/thinking-fast-and-slow-by-daniel-kahneman-book-review.html.
3. Lalinde, J. (2011). The Quiz Daniel Kahneman Wants You to Fail. Vanity Fair. Available at: https://www.vanityfair.com/news/2011/12/kahneman-quiz-201112.