When I was in college, I had a social psychology text by Elliot Aronson that just floored me. I even thought about ditching ministry for research just so I could do more with social psychology. Well, Aronson has a new book out, co-written with Carol Tavris, and it is just as fantastic.
The title is "Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts." It's a cute title, but the book is a systematic and comprehensive look at cognitive dissonance, something that has fascinated me all my adult life. C teases me about my compulsive need to be "consistent," which is directly related to the fact that I see cognitive dissonance everywhere but can't do anything about it except in my own life.
Basically, cognitive dissonance is what we experience when we hold two conflicting beliefs and feel internal pressure to resolve the conflict by making one of them go away. A few examples: Suppose the fact is that I had an affair. I'm confronted with two beliefs about myself that are in conflict: "I am a good person." "A good person wouldn't do what I did." Now begins the justifying process. Either I can decide that I'm not really a very good person (which rarely happens) OR I can think of reasons why a good person would do what I did (and that usually involves blaming my spouse).
Here's another example: You're the prosecutor in a rape case that convicts a man and sends him to prison. Twenty years later, you learn that DNA evidence exonerates him of the crime. You have two conflicting beliefs: "I am a good and competent prosecutor who would never send an innocent man to prison for 20 years." "The DNA evidence indicates that this man did not commit this crime." You will try to lessen this conflict (or dissonance) by deciding that the man may not be guilty of this crime but is clearly a bad guy who needed to be kept off the streets for 20 years, or by deciding that he obviously wore a condom and the semen found on the victim is inexplicable, or . . . whatever you have to do to hold on to your belief that you are a good and competent prosecutor. You may even decide that DNA evidence isn't all that reliable (even though you frequently use it when it will convict a suspect), never noticing your own inconsistency.
One of my favorite experiments has to do with cheating. Two people who believe that cheating is wrong are faced with the opportunity to cheat in order to pass a very important class, and they make opposite decisions (one decides to cheat and one decides not to). Although before the decision they had very similar responses to hypothetical questions about cheating, stress, honesty, etc., afterward their answers will be very different. The one who decided to cheat will now believe that honesty is overrated, that most people cheat, that it's a necessary evil. The one who decided not to cheat is now even more firmly convinced that cheating is wrong and has even less empathy for the stress that might motivate someone to cheat. They start out very close together, but that one decision drives them far apart. You might think the cheater would respect the guy who resisted temptation, but he is more likely to feel contempt. Likewise, you might think the noncheater would have empathy for the guy who gave in to temptation, having been tempted himself, but he too is far more likely to feel contempt.
Here's one more and then I'll quit: If you take a group of Israelis, say, and present them with a peace plan that is actually the plan favored by Palestinians but tell them that Palestinians oppose it, they will like it very much. Same with Republicans and Democrats, or with people of opposite religious views. We like ideas that come from people we think agree with us and dislike ideas from people we think don't, even when the labels are reversed. We can't handle the dissonance caused when someone we dislike has good ideas.
This is a wonderfully fascinating and challenging book. The frustrating part is that you will suddenly see it everywhere (like psychics see ghosts) but you won't be able to do anything about it because people are mostly incapable of seeing their own self-justifying thinking. If Rush Limbaugh reads this book, he will either think it's bogus OR he'll think it applies to everyone except him OR his head will explode. That would be interesting.
3 comments:
It would seem to me that some tolerance of cognitive dissonance is an essential element of change. It is that painful liminal space where previous "knowns" loosen before a new and deeper truth is planted, when what my daughter calls little "t" truth surrenders a bit more, or perhaps becomes enfolded within big "T" Truth.
It is . . . for the rare person who allows that process to happen. What is more common, though, is that our "knowns" become more rigid and almost impossible to dislodge, even in the face of unequivocal evidence.