
Everybody Plays the Fool

Man's greatest trick is the one he plays on himself

Illustration: A person holding a briefcase stands before two doors: one green, one blue
istock.com/Drawlab19

There is a near-universal tendency among us fallible humans to avoid thinking we have not made the best of all possible decisions, and we do it automatically.

As a marketing student, I once saw this phenomenon only in a relatively harmless context. I was wrong, as we will see later. But discussing it in that hypothetical commercial situation nevertheless makes the process easy to understand.

Chevys and Our Choices — Like a Rock

A consumer purchasing an automobile might be faced with a choice between a Ford and a Chevy. Each appears to offer equal advantages to the buyer, who could decide either way. Once he decides on the Chevy, however, that consumer will want to see arguments to reinforce that decision. Much advertising, particularly for automobiles, is designed for that very purpose.

Suppose the purchaser chose a Chevy and then came across new information indicating that a Ford would have been a smarter choice. This would cause a jarring conflict between two realizations, or cognitions. The buyer had chosen the Chevy, but that was an unwise decision. It is almost impossible for him to entertain those two thoughts at the same time. Because we humans have a strong, built-in desire to be correct, the resulting cognitive dissonance would diminish the individual’s self-esteem. His unconscious mind will work overtime to prevent that.

To protect his opinions of himself and his abilities, the consumer has a program running in the background that screens out unfavorable information about the Chevy and favorable information about the Ford, so he does not need to think he made the wrong choice. To further prove to himself that he made the correct decision, if he has the means, he might even double down on his choice and buy his kid a Chevy too. After all, everybody knows they are the best!

The Unconscious Program

The theory of cognitive dissonance is now some six decades old and has been the subject of numerous studies since social psychologist Leon Festinger first explored the concept in his groundbreaking 1957 book A Theory of Cognitive Dissonance. Indeed, the string of academic investigations continues.

Most decisions in life are what we might call free choices — between two or more brands of automobiles, for example. Before we have decided one way or the other, the options might seem equally desirable. After the decision is made, they do not.

Cognitive dissonance can occur under other circumstances, however, if an action taken by an individual violates deeply felt beliefs about its morality, its wisdom, or its suitability for the person he believes himself to be.

Studies have shown that an individual might take some action that violates his feelings about right and wrong when faced with an undesirable consequence if he does not take the action — in other words, forced compliance: “You fire good ol’ Fred this week, or I will do it and then fire you!” Even though firing Fred goes against the individual’s concept of right and wrong, he will likely feel less guilty about it because he can take refuge in the fact that he was essentially forced to do so.

In the absence of forced compliance, however, an individual who commits an act that violates previously held beliefs about right and wrong — telling a lie, for instance — is likely to suffer mental discomfort; that is, cognitive dissonance. Our desire to avoid this discomfort can make us hold to decisions when we should not or to continue previously avoided behaviors that will be harmful to our long-term best interests.

For important decisions, it would seem we should be ready to search for new evidence and reevaluate, but our minds do not work that way. The more important a decision, the more likely we are to stick with it and ignore new evidence. That need to be correct becomes even stronger: We could not possibly have been wrong about something so important!

The unconscious program is always there, running in the background. New information might come along that strongly indicates we should change our original decision. Unless we realize what is happening, we almost always ignore it. This can keep us from changing a poor decision to one that will enhance our future well-being and happiness. For a free-choice decision, such as what product to buy, it is almost impossible for us to give new information the weight it deserves.

Sliding Down the Pyramid of Choice

In their highly readable 2007 book Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, social psychologists Carol Tavris and Elliot Aronson provide a clear explanation of the mental mechanisms that lead to cognitive dissonance, confirmation bias, and other cognitive biases. What the two authors call the “pyramid of choice” works like this:

Line drawing: A person sliding down a pyramid
After making a decision, the perceived benefits of continuing on course may seem to outweigh those of reversing it, due in part to the sunk-cost fallacy. istock.com/Drawlab19

In a free-choice decision, to start with, it is as if we are sitting at the peak of the pyramid. We can think of what seem like good reasons to decide one way and what seem like equally good reasons to decide the other way, and the options’ respective benefits seem relatively close. Once we make a decision, however, we start sliding down the pyramid. Without realizing it, we begin to perceive only things that justify our original decision. We forget factors that pointed the other way. By the time we reach the wide bottom of the pyramid, the two choices that appeared to balance earlier are now far apart, and there is no doubt in our minds that we made the correct choice.

While we are sliding down the decision-justification pyramid, more complete information might become available that would tell us, if we were not programmed to ignore it, “No, wait! The other way makes more sense!” But unless we are aware of this unconscious pull to be correct at all costs, it is too late. We devalue or ignore that new information.

Why do job counselors say the first 30 seconds of an interview are critically important? Why might you never get a second chance to make a good first impression? Because anyone quick to form an opinion will unconsciously be on the lookout for arguments or evidence that his or her initial assessment was correct. Unless significant, contradicting new information comes to light, people are unlikely to change their first opinions.

Tavris and Aronson describe a study that analyzed college students’ choice of fraternity. One fraternity had a severe pledging process; the other’s was much less so. The students who chose the social club with the severe initiation process later valued membership more highly. Why? They had made a deliberate choice to go through that arduous process to gain membership, so they could not have been wrong! It must have been worth it.

More Than Just Fords and Chevys

Imagine an individual who undergoes a surgical procedure. The patient’s doctor makes a choice about follow-up treatment. Now suppose the doctor comes across research indicating that a different course of action would be superior. Would cognitive dissonance influence the physician to devalue the new information? We would hope not, but such a tendency to stick with decisions that are not in the patient’s best interests could very well exist.

Consider another hypothetical. A person mistreats someone and comes away from the situation feeling guilty. Not wanting to identify as unfeeling or cruel, that person goes on to explain the behavior away to himself or herself: It actually wasn’t that bad. Or, that person deserved such treatment. Or, that person is inferior and unworthy of empathy. Whatever the justification, the behavior is liable to become normalized and the perpetrator more prone to repeating the mistreatment in the future.

It is easy to see how alcohol, drugs, or other mind-altering substances that impair an individual’s judgment could lead to a pattern of behavior the individual previously considered beyond the pale, with deleterious effects on the individual and society. We can face up to behavioral failings and resolve never to repeat them, or we can change our belief system to fit what we have done. The first is difficult; the second is relatively easy.

When faced with the results of a free-choice decision, how do we keep this automatic pilot from making us cling to a decision that is not in our own best interest? Or if we are trying to cope with having done something we feel is wrong or out of keeping with our own self-image, how do we resolve the resulting cognitive dissonance? The key is to be aware of this tendency that exists in all of us and to consciously combat it.

As Tavris and Aronson put it: “People can go to confession, religiously or publicly, and admit they did a bad thing and they are sorry, but it won’t make a dime’s worth of difference if they don’t get what that was and get that they are not going to do it again.” It’s popular to value self-esteem, but Tavris and Aronson say maturity requires “an active, self-reflective struggle to accept the dissonance we feel about hopes we did not realize, opportunities we let slide by, mistakes we made, challenges we could not meet, all of which changed our lives in ways we could not anticipate.”

The psychologist authors remind us that our integrity does not depend on being error-free but on what we do after committing the error, a sentiment they summarize by quoting from poet Stephen Mitchell’s version of the Taoist classic Tao Te Ching:

A great nation is like a great man:
When he makes a mistake, he realizes it.
Having realized it, he admits it.
Having admitted it, he corrects it.
He considers those who point out his faults as his most benevolent teachers.


Leonard earned a bachelor’s in engineering from Texas Tech and an MBA and Ph.D. from The Ohio State University. He had a 32-year career as an industrial engineer and operations research analyst and a second career in academia, retiring as a tenured full professor. He believes our decisions can be improved if we are aware of an innate tendency to resist admitting we are wrong, to resist admitting an initial decision might be incorrect. The more important the decision (lives, careers), the stronger the resistance.
Cincinnati Area Mensa | Joined 2010