Cognitive Dissonance

The interwebz have no shortage of bizarre beliefs: the Earth is flat, the COVID vaccines were a microchipping scheme, the moon is hollow. Are these claims true? Frankly, there is a better chance that Jason Momoa and Scarlett Johansson will invite the wifey and me to a free couples retreat in the Bahamas to “grab a few drinks and see where the night takes us.”*1 Yet what happens when we present committed believers with evidence that these claims are demonstrably false (or at least woefully unlikely)? All too often, the believers perform an Olympic-level mental gymnastics routine to rationalize their cherished ideas. But…why? How? A classic theory in social psychology called cognitive dissonance theory can help us understand. Before diving into that, you may be interested in the bizarre backstory of how this theory was developed.

How this theory started

Cognitive dissonance theory was originally proposed by the psychologist Leon Festinger (1957) and his colleagues to explain how people preserve (or change) their beliefs, attitudes, or behaviors when confronted with contradictory information. How did he develop this theory? He studied an apocalyptic cult based in the Chicago area. Of course.*2

In the 1950s, there was a UFO rapturist-like cult called The Seekers in the American Midwest. Dorothy Martin, known as “Sister Thedra,” headed the chapter in the Chicago area. Martin claimed to be in telepathic communication with various spirits via a process known as “automatic writing,” in which one lets some other entity use one’s body to write messages. Martin’s abilities allegedly began with her deceased father, whom she believed resided on an “astral” plane along with other spirits. Later, she believed she was receiving messages from alien spiritual teachers known as “The Guardians.” I couldn’t find anything explicitly saying this extraterrestrial cadre included Rocket from Guardians of the Galaxy, but I’m guessing not. The aliens did, however, have Sananda, whom the Seekers believed was a manifestation of none other than Jesus Christ. And how were these enlightened missionaries able to contact Sister Thedra and use her hand to write messages? Did they ride their bikes to Earth and knock on her door, saying, “Excuse me, do you have a moment to talk about space Jesus?” No, the aliens used atmospheric changes (in some kind of astral vibrations?) caused by the testing of nuclear weapons. Of course.

Martin later announced in a local newspaper that the Jesus aliens wanted to warn humanity that on December 21st of 1954, enormous tidal waves were going to flood huge swaths of Earth. But, of course, The Guardians would whisk away true believers in a flying saucer. Some committed believers gave up their homes, possessions, and jobs in preparation for their salvation during this Jesus-ET version of a Noah’s Ark rapture. Yikes.

Of course, when Festinger and some fellow researchers heard of this cult, they said, “What the [bleep] are these [bleeps] thinking?” Probably. Okay, I wasn’t there, but that is what I would mutter to myself. Festinger was leading a team that was working on hypotheses about beliefs. So, as psychologists, they were very interested in how the cult members would react when the prediction failed to come true. (To be fair, they were assuming these events would not happen, but that was a reasonable assumption.) So, to test their hypotheses, they infiltrated the cult like James Bond. Okay, infiltrating the cult was not a hard nut to crack, but you get the point.

Unsurprisingly, neither a flood nor a flying saucer came, and the boring reality of the Cold War continued as usual without any sign of an apocalypse (sarcasm). So, how did the members react? It was a mixed bag. Some members acknowledged they had made a rather embarrassing mistake. But some very committed cult members re-interpreted the facts to claim their faithfulness had saved the Earth. Some even proselytized this reinterpretation to the public. It went something like, “Hey y’all, good news. The aliens were super happy with the pure faith of our small group and graciously decided to save us all! Please feel free to thank us for saving your life and join our super awesome club.” I wish I could say I am surprised, but having surpassed level 40 in the game of human existence, I have to say I am very much not surprised.

Though strange, these events allowed Festinger and his colleagues to test their ideas about cognitive dissonance theory (Festinger, 1957). For more information on these events, see Festinger (2013) and Tumminia (2005).

Cognitive Dissonance

Since Festinger’s original work, cognitive dissonance has been researched extensively and has evolved into various theoretical frameworks, each with somewhat different foci, explanations, and paradigms (see Harmon-Jones & Mills, 2019, for a more complete review). The one I will use here is the “belief-disconfirmation paradigm.” In this view, when someone’s belief is contradicted by conflicting information, the conflict elicits a feeling of discomfort (i.e., dissonance) that the person is motivated to resolve. How? This can be done by various means, as explained by Harmon-Jones & Mills (2019, p. 6) and outlined below.

How dissonance is relieved

1) Changing one’s mind. Other theoretical frameworks, like social judgment theory, show us this can be hard, especially if the person is heavily invested in the belief. But, as we saw with some cult members, people can change their minds—even if it is embarrassing. Though it is preferable not to fall for bovine scat in the first place, we are imperfect primates stumbling through life. Yes, we need to sharpen our skeptical inquiry and critical thinking skills to inoculate ourselves against making these mistakes in the first place. But changing our minds, even if embarrassing, is the next best thing. So, cheers to those who do change their minds.

2) Finding some way to reject or downplay the new information (even if it is true). If you have been on the internet, I’m sure you’ve seen this one. For example: “Okay, you presented a lot of evidence that GMOs are safe and thus not linked to cancer. But GMOs are still unnatural products made by corporations, and you can’t prove to me 100% that they don’t cause cancer. I may not have much empirical evidence from elitist science, but my gut tells me they are poison.” Here, the person downplays the significance of robust empirical evidence in order to preserve their preferred perception.

By Katie of Beatrice the Biologist, posted 19 October 2011.

3) Finding some way to reconcile the two conflicting pieces of information. This is often done by seeking a false (or at least unsupported) middle ground via the middle ground fallacy. For example, suppose persons A and B are discussing vaccines. Person A claims vaccines are bad because they don’t work and are dangerous. Then, person B shows a plethora of evidence as to why vaccines are safe and effective (with uncommon exceptions). Person A can relieve the dissonance (and save face) by falling back to a false middle ground that acknowledges vaccines can prevent diseases but still views vaccines as “bad” by refusing to acknowledge just how safe they truly are—i.e., “Fine, vaccines technically work, but they are not safe, so there is no point.” See my post on vaccines for more information.

4) Getting social support from the community that shares their belief. For better or worse, we are social critters, and we often rely on social cues about what we should believe or how we should act. Instead of wrestling with the belief on their own, the person relieves the discomfort by turning to their social network to affirm that they should maintain the old belief (i.e., “These facts are uncomfortable; I’m going to consult my echo chamber”). Again, this is a common urge, so don’t fall for the “I would never do that” bias, because that is a good recipe for falling for this coping strategy. Granted, this support may be an intermediate step that leads to rejecting, changing, or reconciling. But, for now, the person attempts to relieve the dissonance through social support. The individual may have to use another strategy if this doesn’t work.

Credit: Tom Fonder of HappyJar comics.*3

5) Attempting to persuade others their belief is correct. Here the person seeks a role as an opinion leader to build or grow their belief’s bandwagon. This can be an attempt to bolster their confidence and may be done in conjunction with other options, like rejecting, reconciling, and seeking support. As we saw earlier, some cult members used this technique to proselytize their reinterpretation of why the flood never happened. I wonder if Wes Goodman (an Ohio state representative who was vocal about his anti-LGBT views) used this coping strategy before he was caught having an extra-marital sexual encounter with another man in his own office back in 2017. I mean, none of us are perfect, but come on, bro. (Okay, poor choice of words.) To be fair, though, that example is about behavior, not misinformation. A misinformation example might be an anti-vaxxer who has doubts but tries to persuade others in order to bolster their own confidence.

Critiques

Of course, other paradigms of this theory focus on other aspects, such as behaviors, decisions, et cetera (again, see Harmon-Jones & Mills, 2019). So, the theory is broadly applied and, unsurprisingly, has its fair share of critiques. I am unaware of any perfect theoretical framework, and demanding such perfection from theories is counterproductive. Theories are not exact replicas of reality (if there is even such a thing as an exact replica). They are evidence-based explanations for how a part of reality works under certain conditions and, as a result, are not perfect. Still, there are valid critiques of this theory. These include ambiguities in defining what constitutes dissonance and a lack of standards for measuring it, both of which can make it hard to compare results (see Vaidis & Bran, 2019).

Final Thoughts

Despite some flaws, in my opinion (which is worth exactly what you pay for), cognitive dissonance theory is a very useful tool for understanding how people might preserve a demonstrably false belief. More than that, it underscores how normal it is to feel uncomfortable when our deeply held views are contradicted. It’s okay to be human. So when we feel that dissonance, we can recall cognitive dissonance theory. With such a reminder, we can acknowledge that the discomfort is normal and remind ourselves of the importance of intellectual humility. We can then turn to testing our idea. Instead of seeking reasons why our idea is correct (and comfortable), we can look for reasons why it might be wrong. Sure, you may not change your mind in one day. (Again, we are human.) But using (and continuously improving) our critical thinking skills is crucial. None of us are immune to being misled, and nobody is better equipped to mislead us than ourselves. And now, hopefully, I have helped you understand this theory a little more…or maybe I just prompted you to reject it entirely.

Footnotes

*1. Yes, I know I’ve written several iterations of that joke. I could say something trite like, “If I put it out there in the universe enough times, maybe it would happen.” But honestly, the wifey and I would settle for a gift card to have a date night at Chili’s sans celebrity eye candy. We’re easy. So to speak.

*2. Much of this article uses verbiage from my Facebook posts on this topic (such as this one). But instead of continuing to post long headers explaining this theory, I wanted to make a blog article that I could just simply link in future posts. Besides, an article allows me to be more creative and (more importantly) add more links to topics within the article.

*3. I like to try to give comic authors the credit they deserve. But the Happy Jar website is not working (at least when I posted this draft). Moreover, it appears the comic has become a meme template with several iterations. I am unsure what the original looked like, nor can I link Fonder’s Happy Jar work. However, I did find that Fonder has created a new series called The Adventures of Business Cat. Feel free to give him some love.

Citations

Festinger, L. (1957). A theory of cognitive dissonance. Evanston, IL: Row, Peterson.

Festinger, L., Riecken, H. W., & Schachter, S. (2013). When prophecy fails. Pinter and Martin. (Original work published 1956)

Harmon-Jones, E., & Mills, J. (2019). An introduction to cognitive dissonance theory and an overview of current perspectives on the theory. In E. Harmon-Jones (Ed.), Cognitive dissonance: Reexamining a pivotal theory in psychology (pp. 3–24). American Psychological Association. https://psycnet.apa.org/doi/10.1037/0000135-001

Tumminia, D. G. (2005). When prophecy never fails: Myth and reality in a flying-saucer group. Oxford University Press.

Vaidis, D. C., & Bran, A. (2019). Respectable challenges to respectable theory: Cognitive dissonance theory requires conceptualization clarification and operational tools. Frontiers in Psychology, 10. https://doi.org/10.3389/fpsyg.2019.01189