Cognitive Biases in Critical Thinking
The purpose of this article is to discuss several key cognitive biases, their effects on decision making within strategic innovation management, and how to minimize those effects so that team members can contribute optimally to the fuzzy innovation process. Understanding and managing these biases is essential to ensuring your innovation outputs suit the challenges and problems you have identified, rather than being decided by instantaneous emotional instinct without objective reflection.
By the end of this article, you will learn:
- how to identify key innovation related cognitive biases
- how to challenge them
- how to make better decisions for innovation outcomes
Cognitive biases are not all bad
Cognitive biases are mental shortcuts (known as heuristics), and they actually make a lot of sense: they evolved to help us survive as hunter-gatherers.
Our brains evolved over two hundred thousand years and operate in much the same way today, despite our enormously different and fast-changing environment. The world is vastly complex, and humans have never before been bombarded by so much information on a daily basis. We cannot process everything around us, so we resort to mental shortcuts to make decisions quickly and efficiently.
Great for species survival, not for innovation
Biases can often produce accurate thinking, but they also make us prone to errors with significant impacts on overall innovation performance. In the modern knowledge economy they get in the way, restricting ideation, creativity, and thinking for innovation outcomes.
Our prior experiences and expertise cause ‘errors’ that limit our ability to think divergently and generate new ideas at a subconscious level. Nobel prize-winning research by Daniel Kahneman and Amos Tversky popularized the term ‘anchoring’, which refers to these deeply held biases and the irrational economic decisions they produce. Within innovation, the result is less creative thoughts and decisions: we jump to less-than-optimal outcomes because our brains have evolved to instinctively reduce uncertainty and keep us on the ‘safe path’ wherever possible. Great for species survival, not so great for innovation.
How do they limit creative and innovative thinking in particular?
Known broadly as the ‘curse of knowledge’ (or the effect of knowing), biases rely on our past experiences and habitual ways of applying prior knowledge, particularly in decision making. The more success you have had applying that knowledge, the harder it is to imagine alternatives, which helps explain why more experienced team members often struggle most to think divergently. Most decision making is instinctively guided and controlled by these mental shortcuts without our conscious awareness, and the less you practice challenging them, the harder it becomes. The result can be a negative impact on creative and innovative thinking (especially in the divergent ideation and conceptualization phases), where key decisions about what to take forward are made. Failing to keep biases in check can also mean solving the wrong problems while ignoring critical flaws, only to repeat the same patterns in future projects.
Most of the decisions we believe we make with a clear mind are actually controlled by mental shortcuts known as cognitive biases, and it is important to learn how to minimize their negative impacts on innovation.
Information processing vs. emotional biases
Broadly speaking, cognitive biases can be split into two types: information processing biases and emotional biases. Information processing biases are statistical, quantitative errors of judgment that are relatively easy to fix with new information. Emotional biases are much harder to change or fix because they are based on attitudes and feelings, both conscious and unconscious. Both types have implications when assessing new, potentially innovative concepts to iterate and develop, because they operate to keep you within the comfort zone of what is already de-risked and known, on the underlying belief that less uncertainty and risk means more safety, security, and comfort. In our innovation consulting we strive to do the opposite: getting people outside their day-to-day frames of reference, including their environment, organizational thinking routines, and comfort zones, and into the ‘adjacent possible’ where the unlocked, unrestricted creative magic really happens.
Not everyone experiences biases in the same way or to the same extent, but even a mixture of just a few can distort creative and critical thinking and optimal decision making. The result may serve not the interests of the firm, but personal biases, subconscious egos, and agendas.
⚠️ Given this scenario, here’s a 3-step process to de-bias innovation within your organization:
1. Spot the biases
First, you need to know when a bias is having an impact on the process. There are some key moments to watch out for when biases can be most influential:
- When carrying out selective research on existing innovations
- During ideation rounds
- When discussing most important features to develop for customers
- When discussing and critically assessing final ideas to develop into concepts
- When storyboarding your prototypes to build and test
- When deciding on critical assumptions to test
- During business model canvas sessions
- When developing your pitch content
25 sentences that should alert you!
Here’s a list of comments drawn from some of our innovation workshops demonstrating hints of cognitive biases at play:
- “That’s the way we’ve always done it”
- “We know what our customers want”
- “Millennials are just too demanding”
- “We should know what to make, not our customers!”
- “What’s the KPI for this innovation project?”
- “Middle management won’t let that fly”
- “The CEO needs to validate it first”
- “It’s too uncertain, we need a spreadsheet”
- “That’s too disruptive”
- “How do we know it would even work?”
- “Our development cycles are too long for that”
- “Let me check with my N+1”
- “That idea is too crazy”
- “I can’t think creatively”
- “It’s already been done”
- “Nobody would buy it”
- “I have too many meetings anyway”
- “We’ve already tested something like that”
- “I’m not a creative person”
- “We can look at that next year”
- “I’m too logical a thinker for that”
- “Not everybody believes in innovation”
- “There’s no budget for this risky stuff”
- “Let’s just do a survey”
- “There are too many silos for that to work”
2. Know & conquer: 16 key innovation-specific cognitive biases
We next need to become consciously aware of the specific biases at work so we can identify them ourselves as they occur. Here are 16 cognitive biases to look out for that impact creativity and the innovation process. They can originate from personal tendencies, group dynamics, politics, and more, and they particularly affect divergent and creative thinking in working groups:
- Confirmation bias: we believe what we want to believe by favoring information that confirms preexisting beliefs or preconceptions. This results in looking for creative solutions that confirm our beliefs rather than challenge them, making us closed to new possibilities.
- Conformity bias: the choices of mass populations influence how we think, even against our independent personal judgment. This can result in poor decision making and lead to groupthink, which is particularly detrimental to creativity: outside opinions become suppressed, leading to self-censorship and loss of independent thought.
- Authority bias: favoring the opinions and ideas of authority figures within innovation teams. This means that ideas coming from senior team members trump all others, even when other concepts, ideas, and inputs could be more creative and relevant to solving the problem.
- Loss-aversion bias: once a decision has been made, sticking to it rather than taking risks, due to fear of losing what you gained in starting something and a wish to see it finished. We also attach more value to something once we have made an emotional investment in it. As a consequence of the effort, time, and energy put into creative thinking, team members can become emotionally attached to their outcomes. To remedy this, remember the 11th commandment: “thou shalt not fall in love with thy solutions”.
- False causality bias: citing sequential events as evidence that the first caused the second. This can occur within the Design Thinking empathize phase, where intentionally seeking confirmation of causality between what people say and what they do can lead to taking the wrong problems or needs forward to solve.
- Action bias: when faced with ambiguity (the creative fuzzy front-end), favoring doing something, anything, without any prior analysis, even if it is counterproductive: “I have to do something, even if I don’t know what to do”. Team members can feel they need to take action regardless of whether it is a good idea. This can be an issue under time pressure, in strict design sprint workshops for example.
- Self-serving bias: favoring decisions that enhance self-esteem. This results in attributing positive events to one’s own actions while blaming negative events on external factors. Within innovation workshops, this can mean decisions are loaded with personal agendas rather than customer and business logic for the company.
- Framing bias: being influenced by the way information is presented rather than by the information itself. We see this one all the time, particularly when developing prototypes for pitching and when presenting polished slides. People tend to avoid risk when options are framed as gains and to seek risk when they are framed as losses, meaning decision-making logic can easily be skewed.
- Ambiguity bias: favoring options whose outcomes are more knowable over those whose outcomes are not. This bias has dire impacts on innovation outcomes because innovation is a fundamentally risky and unknown process. If team members subconsciously favor known knowns, you will most likely follow previously trodden paths.
- Strategic misrepresentation: knowingly understating costs and overstating benefits. When developing innovative concepts, ballpark figures, and business model prototypes, teams are prone to understating true costs and overstating likely benefits in order to get a project approved (which happens all the time in large governmental contracting). That over-optimism is then spotted and challenged by managers assessing how truly innovative team outcomes are.
- Bandwagon bias: a commonly known bias favoring ideas already adopted by others. This is especially influential when linked to authority bias, and the bandwagon effect is a common occurrence in our workshops. The rate and speed at which ideas are adopted by others (through discussion, the pace of silent dot voting, etc.) can significantly influence the likelihood of those ideas and concepts being selected by the group and taken forward.
- Projection bias: from behavioral economics, over-predicting that future tastes or preferences will match current ones. This bias has particular influence because new innovations are conceived in the present but enter markets in the future, resulting in overvaluing current consumer preferences when projecting demand.
- Pro-innovation bias: the assumption that new innovations should be adopted by all members of society (regardless of wider needs) and pushed out and accepted regardless. Novelty and ‘newness’ are seen as inherently good, whatever the potential negative impacts (inequality, elitism, environmental damage, etc.), resulting in new ideas and concepts being judged through somewhat rose-tinted spectacles.
- Anchoring bias: being influenced by information that is already known or shown first. This causes pre-loaded, predetermined tunnel vision and influences final decision making. We deliberately manipulate team members’ minds by ‘pre-loading’ them in one of our warm-up exercises to demonstrate this bias at play; its impact on creative thinking and outcomes is highly significant.
- Status-quo bias: favoring and maintaining the current situation due to loss aversion (fear of losing it), and doing nothing as a result. This subtle, emotional-level bias makes us reduce risk and prefer what is familiar, or “the way we do things around here” as it is known. It has severe consequences when seeking out new ways to creatively solve needs and problems.
- Feature positive effect (closely linked to optimism bias): due to limited time or resources, people tend to focus on the ‘good’ benefits while ignoring negative effects, even when those negative effects are significant. This is influential when deep-diving into specific new feature sets for new concepts (especially when coupled with loss-aversion bias), because teams will overlook missing information, especially outside their expertise, resulting in taking ideas forward with critical flaws.
“The opposite of courage is not cowardice, it is conformity. Even a dead fish can go with the flow” – Jim Hightower
3. How to overcome cognitive biases for innovation
Third, you need to become aware that your decision making and selection criteria can be, and are being, affected by your subconscious biases, and then understand that those biases may be keeping you within irrational judgment and your existing frames of reference. To break this, think about the way you and your team are thinking and challenge each other. Like any new skill, this takes continuous practice and time, but the brain is highly plastic, with the ability to change continually throughout life.
Although there is no magic bullet solution to prevent us from being affected by our own cognitive biases, it is possible to minimize their effects as mentioned, by consciously understanding and spotting key moments in which they operate. This results in minimizing their influence and allowing increased likelihood for objective (logical and creative) reasoning for decision making to take place.
It requires disciplined practice, but over time you will become more and more aware of your own perceptual habits that trigger your biases, and more critically you will be able to identify them in others. It is about challenging your instincts versus more rational thinking and having the assertiveness to speak-up.
Here are some solutions you can experiment with:
- Master lateral thinking methods. The good news is there is a vast number of innovation tools available to challenge our biases through lateral thinking: Opposite Thinking, Analogy Thinking, Six Thinking Hats, and Brain Writing, to name just a few. (Flattening hierarchy to counter authority bias is also one of the many good reasons for using post-its in workshops.) These methods are designed to break our biases intentionally and consciously by restricting our instinctive mental shortcuts so we can pursue more creative ideas and ultimately more innovative outcomes. They allow us to diverge into free-associative thinking, less loaded with bias, in a structured way, generating large numbers of ideas to trigger more creative concepts to feed into the innovation pipeline.
- Pay particular attention during tiring sessions (e.g. the fuzzy front-end). After generating vast numbers of ideas on post-its to refine and develop, the challenge is to assess and select the final ideas to pursue. Cognitive biases are sneaky culprits that play a key influencing role here as well. It is vital to understand their impacts during the fuzzy front-end of the innovation process, such as after intensive ideation rounds or towards the end of workshops, when team fatigue kicks in and amplifies bias effects.
- Ask external facilitators. The common problem with all cognitive biases is that they are subconscious, instinctive behaviors. Trained, skilled facilitators from outside your industry and team are key to identifying biases in action, and they will actively challenge participants’ ways of thinking. If needed, hire an innovation facilitator.
✅ Now that we are aware of these biases (and understand how they influence us), we can choose to interject consciously and challenge ourselves and others to break them, in order to make better decisions for innovation outcomes.
To sum up: 3 steps to de-bias you and your teams
- Spot the biases. Identify specific biases affecting you or your team at key moments by listing them individually.
- Know & conquer your enemy. Reflect and challenge biases identified by openly discussing impacts on current decision making at key decision-making points.
- Overcome cognitive biases. Flip, reverse, remove biases identified by asking questions like:
- What if x, y, z bias did not exist at this moment?
- What if the opposite of this bias were true at this point?
- Would you individually (or as a group) make the same decision in light of new awareness?
Cognitive biases are particularly challenging for innovation because they have a profound impact on creative, right-brain thinking, which is critical for divergent ideas that lead to disruptive concepts. Research has shown that biases push decision making off course more when teams are using intuitive, creative, right-brained thinking, which is essential during the innovation process, than when using more rational left-brain thinking. Right-brain thinking is riskier and more prone to biases because it deals with abstract unknowns, whereas left-brain thinking deals with more logical, concrete knowns.
As an ongoing best practice, allocate regular time for ‘bias reflection moments’ at key decision points in your innovation process; this should take no more than 10-15 minutes if you follow the 3 steps above. Also ensure you have continued outsider perspectives to challenge team members and potential decisions before they are made, either through skilled facilitators or by assigning different personas to individual team members (disruptor, optimist, pessimist, creative, feelings, etc.).
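The persona rotation above can be kept fair and fresh with a small script. This is a hypothetical helper, not an official tool from this article; the persona names are the ones listed above, and `assign_personas` is an illustrative function name:

```python
import random

# Challenge personas named in this article (assumed list, extend as needed).
PERSONAS = ["disruptor", "optimist", "pessimist", "creative", "feelings"]

def assign_personas(team, seed=None):
    """Randomly pair each team member with a challenge persona.

    Personas are reused if the team is larger than the persona list;
    pass a seed for a reproducible assignment.
    """
    rng = random.Random(seed)
    # Repeat the persona list until it covers the whole team, then shuffle.
    pool = (PERSONAS * (len(team) // len(PERSONAS) + 1))[: len(team)]
    rng.shuffle(pool)
    return dict(zip(team, pool))

print(assign_personas(["Ana", "Ben", "Chloe", "Dev"], seed=7))
```

Changing (or omitting) the seed between sessions rotates the roles, so no one settles permanently into a single critical stance.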
Essentially, it is about modifying the instinctive reactions, reasoning, and mental models of the world that we have developed through prior experience, while consciously challenging the way we project our creative predictions about the future problems we are trying to solve: actively thinking about the way you think, and challenging it for better innovation outcomes.
Thanks a million for reading.
I’m Mike Pinder, Innovation Consultant @ Board of Innovation. Spreading innovation culture is in our DNA – if you liked the read, contribute to our mission by sharing this article.
- Kahneman, D. and A. Tversky (1979). “Prospect Theory: An Analysis of Decision under Risk.” Econometrica 47(2): 263-291.
- Kahneman, D. (2013). Thinking, Fast and Slow. Farrar, Straus and Giroux.
- Kirsch, M. (2014). “The Bias Against Innovation.” from https://www.wired.com/insights/2014/03/bias-innovation/.
- Lebowitz, S. (2015). “20 cognitive biases that screw up your decisions.” from http://uk.businessinsider.com/cognitive-biases-that-affect-decisions-2015-8.
- Website Magazine (2013). “Does Cognitive Bias Kill Creativity?” from https://www.websitemagazine.com/blog/does-cognitive-bias-kill-creativity
- Mumford, M., et al. (2006). “Errors in Creative Thought? Cognitive Biases in a Complex Processing Activity.” The Journal of Creative Behavior 40(2): 75-145.
- Mueller, J., et al. (2010). “The Bias Against Creativity: Why People Desire But Reject Creative Ideas.”
- Scaltsas, T. (2016). “A Cognitive Trick for Solving Problems Creatively.” from https://hbr.org/2016/05/a-cognitive-trick-for-solving-problems-creatively.
- Siniki, A. (2013). “How This One Cognitive Bias Is Damaging Your Creativity and Relationships.” from http://www.healthguidance.org/entry/17072/1/How-This-One-Cognitive-Bias-Is-Damaging-Your-Creativity-and-Relationships.html.
- Tversky, A. and D. Kahneman (1974). Judgment under Uncertainty: Heuristics and Biases. Theory and Decision Library.
- Zynga, A. (2013). “The Cognitive Bias Keeping Us from Innovating.” from https://hbr.org/2013/06/the-cognitive-bias-keeping-us-from.
- Zynga, A. (2013). “The Innovator Who Knew Too Much.” from https://hbr.org/2013/04/the-innovator-who-knew-too-muc.
Quick decisions save time and energy, but sometimes those knee-jerk reactions lead to bad choices. That’s because biases impact our thinking every day, but few of us even know they exist, says Norma Montague, assistant professor of accounting at Wake Forest University in Winston-Salem, North Carolina.
“The word bias has a negative connotation, but it’s most often unintentional and a result of heuristics–mental shortcuts that allow people to make quick, efficient decisions,” she says. “Good decisions are often the result, but not always.”
Biases work well because they’re often systematic and predictable, but problems arise when individuals habitually rely on this method of decision making, excluding or ignoring additional information. Montague, whose research on the topic has been published in the Journal of Accountancy, gives the example of someone who lives in New York City: “There are a lot of one-way streets, and natives accustomed to the traffic flow are being efficient if they look only to the right for oncoming traffic,” she says. “If we were to take that New Yorker to London where streets run in the opposite direction, their mental shortcut could have a bad outcome.”
While Montague’s research focuses on bias in accounting, her findings apply to any profession. She shares five biases that unknowingly influence your thinking, and how you can avoid making a bad decision as a result:
If you rely on information that is the most readily available to make a decision, you might be missing out on facts or opinions that could make a difference, says Montague.
“Individuals have a tendency to make decisions based on whatever information is easily retrievable to them,” she says. “This can be problematic when making decisions that involve other people, as their information or perspective may differ.”
Availability bias is especially misleading when information is subjective. If you’re asked to evaluate your own performance relative to the performance of others, for example, most people will rate their own contribution to be higher, because that is the information they have most available, says Montague. Avoid this bias by routinely asking for feedback from others before making a decision.
If you’re assessing a situation and you’ve been given an “anchor” fact, you could come to an incorrect conclusion by relying on it. Montague tested this bias by giving half her class the arbitrary number 300 and the other half the number 3,000. She then asked students to estimate the length of the Mississippi River. The average response from students who had been given the anchor of ‘300’ was 800 miles, while the students who had been given ‘3,000’ gave an average response of 2,800 miles.
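The strength of the anchoring effect in an experiment like this can be summarized with the anchoring index popularized by Jacowitz and Kahneman: the share of the gap between the anchors that reappears in the gap between the mean estimates. A minimal sketch using the classroom figures reported above (the function name is illustrative, not from Montague's study):

```python
def anchoring_index(low_anchor, high_anchor, low_mean, high_mean):
    """Share of the anchor gap reflected in the estimate gap.

    0.0 means the anchors had no effect; 1.0 means the mean
    estimates moved as far apart as the anchors themselves.
    """
    return (high_mean - low_mean) / (high_anchor - low_anchor)

# Figures from the classroom experiment described above:
# anchors of 300 vs. 3,000 produced mean estimates of
# 800 vs. 2,800 miles for the length of the Mississippi River.
index = anchoring_index(300, 3000, 800, 2800)
print(f"anchoring index = {index:.2f}")  # prints: anchoring index = 0.74
```

An index around 0.74, as here, indicates a strong pull toward the anchors; values around 0.5 are commonly reported in the literature for uncertain quantities.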
Anchors are a popular tactic in sales, says Montague. “When you buy a car, for example, salespeople deliberately throw out an anchor number, because they know the general population will insufficiently adjust from there,” she says.
Avoid this bias by verifying facts you’re given. And if you are going into a negotiation, Montague says, it’s to your advantage to be the first one to throw out the anchor.
While overconfidence is a personality trait often seen in top executives, it can produce a bias that leads to bad decisions, such as overpromising, says Montague.
“Decision makers can overestimate their own abilities to do a task,” she says. “If you’re overconfident and don’t perform, you will let down your team or your company. Interestingly, some say this is a good bias.”
While this bias is more difficult to avoid, it can be helpful to slow down your decision and consult with others on your team to make sure what you’re promising is realistic.
People who only seek evidence that supports their beliefs or expectations will make decisions that are affected by confirmation bias.
“This bias is often used when you’re in a debate and you need facts to support your desired outcome,” says Montague. “The problem comes when disconfirming evidence surprises and weakens your position.”
Avoid confirmation bias by applying professional skepticism to your decisions: “Consider the opposite or explain why your initial assessment could be incorrect,” she says. “This exercise forces you to take the time and mental effort to thoughtfully consider the limitations of your chosen solution.”
The strong desire to make a quick decision can lead to a rush-to-solve bias: people in a hurry often fail to consider all of the possible data before making their decision. Montague says environmental factors like time and budgetary constraints often push people into a rush to solve.
“If you’re in a hurry, you’re also more likely to fall prey to other biases,” she says. Avoid this bias by slowing down decisions whenever possible. “Awareness is the first step to improving quality of judgment,” she says.