Think you make decisions rationally? Pride yourself on your control of logic? Think again. Science is against you. Decades of research support the view that humans, including you, are biased towards irrationality. Some scientists even argue that evolution favors irrationality.
How can leaders make the best decisions, given they have logical blind spots?
A fine article by Sharon Begley recently reminded me of my favorite course in business school. Organizational Behavior was a requirement for graduation, but one I expected to be a waste of time. I avoided it until my last quarter, when my several attempts to argue my way out of it failed.
What a surprise! I was fascinated by Professor Hogarth’s research into our inherent biases, and I believe that learning to be suspicious of logical argumentation (especially my own) made me a better manager. More recently, Nassim Nicholas Taleb has built a substantial second career summarizing the research into our tendency to underweight the likelihood and impact of improbable events.
Every decision maker would do well to remember that the human mind is prone to many biases and errors, including:
- Confirmation bias – seeing and recalling only evidence that supports our beliefs, or discounting evidence that argues against them
- Misplaced trust – valuing preconceptions and emotions more than empirical data
- Preference for winning an argument over finding truth
- Sunk cost fallacy – “think what we’ll have wasted if we don’t continue!”
- Tendency to evaluate facts within specific contexts (e.g., thinking one way in the classroom and another on the job about exactly the same data, or treating $5 saved on the price of a car as less valuable than $5 saved on a book)
- Tendency to be overly influenced by the first or last in a series of data
- Confusing absence of evidence with evidence of absence
- Generalizing from the specific (i.e., overweighting the importance of individual data points)
- Greater comfort with order than randomness, and thus trying to see order where it does not exist
- Believing our stories about situations (we remember our most recent telling of an event more than the event itself, and the telling changes with time)
- Assuming the absence of nonsense is the same as the presence of sense
- Believing associations (e.g., the politician talks about God, and God is good, so I trust the politician)
- Tendency for our confidence in our knowledge to exceed our actual knowledge (honest! proven time and again)
Ms. Begley’s article presents arguments that there may have been an evolutionary reason for our innately poor reasoning abilities: illogic keeps us focused, easy to understand, and able to harness the power of emotion. Think of Hitler’s rhetoric. Or the schoolyard bully who humiliated anyone who tried to reason.
You get the idea. You have reason not to trust your mind. So what can you do about it?
- Stay humble. Always assume someone else may understand something you don’t.
- Stay open. Always be willing to reassess decisions in the light of new data.
- Balance reason and emotion. Sometimes your gut knows what the data do not. After all, many business disasters have been fact-based.
- Examine your biases. For example, I know I tend to anchor on early data and discount new information, and that I tend to trust intuition over data. Awareness helps me manage my tendencies.
- Stay sharp. Work logic puzzles like sudoku or games like chess to keep from falling into fuzzy thinking.
In the end, our biases and fallacies aren’t good or bad. They’re just the way we are. Cultivate awareness of them and you will make better decisions.