We often believe we are right when, in fact, we are wrong.
Believing we are right when we are wrong underlies many bad decisions. If we knew and admitted that we did not know the answer, we would at least put more effort into exploring options and limiting risks when making important decisions.
For my upcoming book, Big Decisions: Why we make decisions that matter so poorly. How we can make them better, I have identified and categorized nearly 350 psychological, perception, memory, logic, physical and social errors, biases, shortcuts, traps, fallacies and effects that lead us into making bad decisions.
Now, I have been investigating cases where "believing we are right" appears to have led to bad outcomes for leaders, teams and organizations. In the many high-profile current and widely known historical situations I have examined, indications abound that the bad results were produced by mental errors and traps. I can't know for certain every factor that led to "right belief" that was wrong, but the mental traps and errors are often obvious, and in every case there are hints as to what led to the bad outcome.
This begins a series of posts in which I unpack and diagnose situations and outcomes, along with some insight on what may have led to the undesirable results. (I cite several mental traps and errors in each case, but, in fact, I have identified many more likely causal traps and errors for each situation.)
Let's start the series with "the elephant in the room," a person who offers so many examples of "wrong belief." In this case, his wrong belief led to a ban, a demand for a wall and other unprecedented Presidential actions.
TRUMPED UP
President Donald Trump believes that thousands of Muslim immigrants in New Jersey celebrated the 9/11 attacks on the World Trade Center.
Although all credible fact checkers concluded that there is no evidence any such mass celebrations occurred, President Trump used this "alternative fact" to help justify what he saw as the need for an immigration ban from some majority-Muslim countries.
Here are several of the many mental traps and errors that may have led to Trump believing he was right when he clearly was not:
- Trump may have relied on Anecdotal Evidence (generalizing from a few firsthand stories, which may or may not have been true, rather than accumulating evidence from various reliable sources) to reach an erroneous conclusion.
- Or he may have been misled by False Memory Reconstruction (having positive, seemingly definitive memories of events that did not actually happen).
- What he thought he knew and then said had the Illusion of Truth (people are more likely to believe statements they have previously heard, even if they can't remember having heard them, whether or not they are actually true).
- Whether Trump made it up, knitted together anecdotal evidence or had false memories, he exhibited Expectation Bias (only perceiving what he wanted to see and setting aside or ignoring other perceptions or viewpoints) and the Semmelweis reflex (having a knee-jerk tendency to reject new evidence that contradicted his existing beliefs and world view).
Trump continues to peddle "alternative facts," which are often easy for others to debunk. But false belief can be pernicious precisely because it is rarely as obvious as it is in Trump's case: when we harbor false beliefs, we do not recognize them, and they can dramatically color our decision making.
I expect to publish my new book, Big Decisions: Why we make decisions that matter so poorly. How we can make them better, later this year. It will be an antidote to bad individual and organizational decision making. You can help me get it published and into the hands of decision makers whose decisions affect not only their own lives but all of ours.
Learn more about Big Decisions: Why we make decisions that matter so poorly. How we can make them better and my special half-price pre-publication offer. Thank you!