
I want to first give credit to the authors of “Mistakes Were Made (but not by me)” – Carol Tavris and Elliot Aronson. Their discussion of cognitive dissonance and their metaphor of the ‘pyramid of choice’ have inspired my comments below. Although the ideas in this book have obvious ramifications for psychology, psychotherapy, political science, behavioural economics, marketing, education, conflict resolution and so on, I will use them in this post to make a few points, and raise a few questions, about philosophy specifically. There is much more to be said than what I’ve put here, but I’ll keep it brief and try not to spoil too much for those who haven’t yet read the book.
The picture below is of course my own, since it is quite terrible. It, along with the description that follows, merely represents my own personal interpretation of the core ideas in “Mistakes Were Made.”
There are two main points/questions I want to raise, but first I’ll offer some context for this picture. Cognitive Dissonance (CD) is a real thing, in the sense that it is robustly supported by empirical evidence: it is the mental stress a person experiences when she perceives an inconsistency between her beliefs, or between a set of beliefs and her actions. This stress reliably leads humans to act in various ways that we would normally consider quite irrational, but which CD theory correctly predicts as attempts to reduce the dissonance.
For example, CD theory correctly predicts that people will like or dislike complete strangers significantly more after they have been asked to say either nice things or nasty things to those same people. Their actions (saying good or bad things to a complete stranger) must be reconciled with their beliefs about themselves as decent people, even when the action came first and was without any justification whatsoever. The way to reconcile the two is to feel that the person must somehow deserve the action they received (because I’m a good person, and a good person wouldn’t hurt someone who doesn’t deserve it).
This is a little scary: I may very well hate you just because I did something bad to you (which could then lead to further mistreatment of you by me). On the flipside, however, it means I will like people I do good things for, and this will then be reinforced by more good deeds. The practical upshot of this, which I found totally mind-blowing, is that an effective way to make someone like you is to have that person do a big favour for you. They will typically reconcile their actions by changing their beliefs about you: “I wouldn’t do anything nice for a person who didn’t deserve it, so they must actually be a good person.”
Powerful internal forces can reduce dissonance in various ways, unbeknownst to the subject: by actively shaping their beliefs (as above), by making them ignore any and all contrary evidence, and even by altering their memories (the examples in this book, done with control groups no less, will absolutely shock you). An inability to reduce dissonance is not necessarily a good thing, as it is linked to depression and, in extreme cases, PTSD. However, when allowed to run unchecked, these forces lead to an escalation of sorts, as we witness so often in disputes between countries, between companies, and in our personal lives (think of the animosity between many couples getting a divorce). Enter: the metaphor of the pyramid.
The pyramid applies to cases where people are initially somewhat ambivalent about a particular decision, and its steps represent the first and subsequent actions taken in a particular direction. While at first any particular person could go left or right, as soon as they make a decision or take that first step, a wedge or polarization tends to occur. CD tends to make people “forget” that at one point they actually could have gone either way. People self-justify their initial course of action, altering their beliefs about themselves and their actions; they then take the next steps and repeat the cycle through further self-justification and confirmation bias (ignoring counter-evidence), which pushes them down one side of the pyramid, farther and farther apart from those on the other side. In the end you can have two people finding themselves completely at odds, and in the worst of cases, ready to go to war over a topic that was initially met with ambivalence!
Once someone moves down one side of the pyramid, moving to the other side becomes almost impossible because of the cognitive dissonance that would result, especially when the person has incurred great costs to get there. To move to the other side would essentially mean admitting that their effort, money, actions, decisions and beliefs have been wrong-headed, and this is (almost) impossible for someone to recognize and accept. Although the authors don’t say it, perhaps this is the reason we have diametrically opposed “isms”, camps, and especially activist groups of any kind – all of which may contain people whom we consider to be quite reasonable, intelligent, and self-reflective.
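For readers who like mechanisms made concrete, here is a tiny toy simulation of the pyramid dynamic. To be clear, this is entirely my own sketch and not anything from the book: the update rule and every parameter in it (the self-justification strength, the confirmation-bias discount, the number of steps) are illustrative assumptions I invented, chosen only to show how a near-zero initial lean plus repeated self-justification can end in full polarization.

```python
import random

def walk_down_pyramid(initial_lean, steps=20, justify=0.15, bias=0.5, seed=None):
    """Track one agent's belief (-1.0 to +1.0) over repeated decisions.

    initial_lean: tiny starting preference (e.g. +0.01 or -0.01).
    justify: how much each action shifts belief toward the side chosen.
    bias: fraction by which contrary evidence is discounted (confirmation bias).
    """
    rng = random.Random(seed)
    belief = initial_lean
    for _ in range(steps):
        # Take a step down whichever side the current lean favours.
        action = 1 if belief >= 0 else -1
        # Self-justification: reduce dissonance by aligning belief with the action.
        belief += justify * action
        # Mixed evidence arrives; evidence against the current belief is discounted.
        evidence = rng.uniform(-0.1, 0.1)
        if evidence * belief < 0:
            evidence *= (1 - bias)
        belief = max(-1.0, min(1.0, belief + evidence))
    return belief

# Two people who start almost indistinguishably ambivalent...
left = walk_down_pyramid(-0.01, seed=1)
right = walk_down_pyramid(+0.01, seed=2)
print(f"started at -0.01 and +0.01; ended at {left:+.2f} and {right:+.2f}")
```

Run it a few times with different seeds: two agents whose starting leans differ by a mere two hundredths reliably end up at (or near) opposite poles – the pyramid in miniature.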
That’s my quick description; now onto my first point, which I’ll frame as a potential criticism of philosophy. Although confirmation bias is certainly discussed in the philosophy of science (the logic of confirmation / induction), why is the notion of cognitive dissonance not also a prominent part of the discussions about science, science versus pseudo-science, and how scientific theories change? I’m not saying that no philosopher has ever mentioned the concept (I haven’t read everything), but it was certainly not a major discussion item when we were learning about Popper, Lakatos, Kuhn, etc. And I think it is extremely relevant: of course science will not just throw out a theory when new (contradictory) evidence emerges – for science is made up of scientists, and scientists are people. And people have strong psychological mechanisms for reducing cognitive dissonance, including the ability to explain away contradictory evidence! In my opinion this would add an interesting dimension to the whole discussion, especially for the prospects of developing an adequate theory of demarcation.
I’m worried that this shows that philosophy is too narrow, too inward looking, and is not branching out to learn or synthesize ideas from other fields even when those ideas would add value. I am not the first to raise this concern about philosophy and academic disciplines in general, but perhaps CD is actually not the best example of this – your comments on whether I’ve missed the boat here are welcome.
But now to perhaps a bigger worry for the more practical aim of philosophy. As anyone viewing the picture above might guess, it is very difficult to persuade anyone who has already made many decisions, exerted a lot of effort, or otherwise committed to a particular side (and hence “walked down the pyramid”) that they have made a mistake, or are wrong in their conclusions. In fact, evidence in the book shows that trying to convince people already in this state using facts or arguments tends to further solidify their resolve that they are correct.
As an aside, nowhere is this more obvious right now than in the current division between Trump supporters and condemners amongst US citizens. I have acquaintances on both sides, and it is truly astonishing just how polarized they have become over time – and the news provides more material on a daily basis to further divide the camps. Some now simply ignore any and all evidence that contradicts their beliefs, and “fake news” has become an all-pervasive meme. I of course have my own opinions on Trump, but I’ve found this polarizing phenomenon to be much more interesting…
…which leads me back to my worry: CD seems to imply that arguing with, debating, or trying to persuade people who would have to deal with significant cognitive dissonance in order to change their beliefs is actually quite pointless. Philosophers do no good here; worse, they actually do harm, because they further entrench the very beliefs they are trying to change.
However, it is not all bad news. Philosophy, if it catches people early during formal education, exposes them to the concept of confirmation bias, encourages the principle of total evidence, and enhances critical thinking in general, might prevent people from sliding down the pyramid too quickly (or at all). Dogmatic entrenchment can be prevented, even if it is not curable…? Philosophy does not uniquely play this role, of course; indeed, the authors noted above point to ‘proper scientific training’ as a good antidote to such entrenchment. But philosophy can certainly do a great job in this area, and it can and should tag-team with other academic disciplines to help combat this phenomenon.
I’m curious what readers think about the ramifications of this for philosophy as a discipline, and as always, thanks for reading!
–Mike
Clare Flourish
April 6, 2017
Speaking personally, my lack of desire to see myself as a good person might make dissonance less effective at deluding me – with other costs. I can change my opinion, but find it more difficult to take steps to achieve goals.
Tyler L.
April 7, 2017
I think the issue of cognitive dissonance is a good one to learn about and discuss. When people become so entrenched in their personal beliefs, so to speak, it can be almost confounding. There can be so much evidence for new information or science that disproves an old existing belief, one strongly held for decades. I would agree that, for some, it can be hard to sway their own set of beliefs towards another side when they are all the way down one side of the pyramid. But I think that it may have to do with a lack of self-awareness in the individual, which makes the change of mind so difficult. We can become so attached to our routines, views, and informational culture that we can’t look deeply into our personal psyche to see whether this is a set of values that resonates with us. It seems that when people take the time to do self-evaluations and self-reflections, they are more apt to detach from old belief structures. It can still take time, but knowing a deeper meaning to your persona in turn creates a deeper understanding of all people and their meanings and perceptions, creating cultures that can move with the wind of understanding and knowledge and making it easier for us as humans to restructure beliefs when old ones are proved wrong.
Great read, thank you!
Vanes
April 8, 2017
The philosophy of Bruno Latour, who does effectively an anthropology of science, touches on the problem of demarcation, stating that the rules of method determine whether something is scientific or not – the method, of course, being determined by the community.
“We are all familiar with the notion of rules of method which have been devised for ‘scientific’ experiments. … Even though these rules might not be enough to certify that interesting results will be obtained, they have been found useful nonetheless in establishing the state of the art. Equipped with those rules, it is possible, according to their promoters, to say why some argument, behaviour, discipline, or colleague is or is not scientific enough”
In: Latour, Bruno. ‘Which protocol for the new collective experiments’. Experimental Cultures
Alison K McConwell
April 20, 2017
Great post Mike! And timely. Given what you’ve said here, cognitive dissonance is alive and well in the breed-specific legislation (BSL) debate. BSL is a nice example of the activist point you made above. In my experience, people who are hyperactively supportive of special-interest-group sites in favour of BSL, such as dogsbite, animals24/7, and NPBVA, often refuse to accept anything that speaks against their beliefs about BSL – they declare it illegitimate, necessarily biased, or ignorant of the emotional impact of bite incidents, regardless of the source and without acknowledging the content of the claim. Those against BSL are guilty of something of the same; however, the scholarly work seems to be on their side at the moment, so the presence of any CD in that case doesn’t seem to be as problematic. This may not necessarily be a point about the ramifications for philosophy per se, though I think that philosophy has a potential corrective role concerning the cognitive dissonance of those groups, such as bringing together findings from epidemiologists, animal behaviourists, and biologists, and interpreting and explaining the science in a publicly accessible way. I’ve wondered whether the combative strategy often taken by philosophers in criticizing ideas is what does the harm when engaging in debate with persons who would have to deal with significant CD. Instead, maybe the situation calls for more of a traditional Socratic role, helping them make their own way towards the potential limitations of their own view and any positive features of the opposition. What’s interesting is the potential role philosophers have here when it comes to mitigating the harmful effects of cognitive dissonance in the public. I’d say there’s something about problem navigation and the delivery of information that can either exacerbate or alleviate CD.
Mike Steiner
April 21, 2017
Exactly, Alison. Just coincidentally, there is an article circulating on social media about how the more liberal / Democrat-leaning comedy shows across many of the US networks (think Stephen Colbert, John Oliver, etc.) may actually have helped bring about Trump’s victory – essentially by alienating anyone who held more conservative views and making them eschew much of what the associated networks were presenting in their news programs. In short, mocking and arguing with people who have already picked sides may have exactly the opposite of the effect one desires… a much gentler approach to dialogue might be required (or something completely different altogether).
Anushka K.C.
April 29, 2017
Cognitive dissonance explains a lot of things for me. Sometimes I tend to act irrationally, and I have always wondered why I act the way I do even when I am aware of my actions, even when I know I am being irrational and potentially hurting someone. Cognitive dissonance is the answer. Now that I know what it is, I can work on it.