It’s December 20, 1954, and a group of people is gathered in Dorothy Martin’s living room. They sit and chat anxiously as they wait for the clock to strike midnight. They aren’t strange people on the surface: they are mostly upper-middle class, intelligent, and well-spoken, with largely mainstream views, though they have abandoned good jobs and sold nice homes. They also believe that the entire world will be destroyed by a flood the next day, December 21, and that at midnight they will be rescued by an alien spaceship from the planet Clarion.
Among them is Leon Festinger, a psychologist who has long wondered what happens to a religious group after its prophecy fails. To find out, he has infiltrated Dorothy Martin’s group, the Seekers, to document what happens when the alien spaceship doesn’t arrive. The group believes that they have been receiving messages from aliens on the planet Clarion by way of automatic writing through Dorothy Martin. As the clock creeps closer to midnight, they all remove metal from their bodies: loose change, watches, zippers, bra straps. Apparently the metal interferes with the alien spaceship.
They watch the clock edge closer and closer to midnight. The hour arrives, but nothing happens. The group waits for about five minutes, the tension growing. One of the group members notices that another clock in the room still reads 11:55, and the group releases the tension with relieved laughter. But the second clock passes midnight too, and still nothing happens. The group is stunned and terrified. Did their saviors decide not to save them after all? Were they deemed unworthy? Or had they simply been wrong?
By 4:00 am, the group is in tears. The flood is supposed to happen in just three hours, and no one knows what to do. Some have tried to offer explanations, but nothing helps. The moment that Leon Festinger has waited for arrives. This is when he will find out what the human brain does when a shattered worldview puts it into extreme shock. He would later coin the term “cognitive dissonance” to describe the effect, and this story has become one of the most famous in psychology.
What would you do? One of the things that had confused Festinger at first was that these people weren’t crazy cult fanatics; they were rational and intelligent people. Festinger assumed that this shock would lead them to change their beliefs, to become different people, and to move on rationally with their lives. The result was so surprising to him that he would go on to run many scientific experiments to understand what happened, opening up an entire branch of psychology.
At 4:15, Dorothy Martin produces another piece of automatic writing. It says that this group, the Seekers, has shone so much light into the world that God has decided to change his mind and not destroy the world after all. The group erupts in cheers. Instead of changing their worldview, they go out and start lining up interviews with newspapers and radio shows to share the news of how they saved the world. Festinger was shocked. The prophecy that the group had been so certain about had just failed, but instead of changing, they had doubled down on their beliefs.
Understanding the way that human beings react to cognitive dissonance opens up a lot of doors to increased potential, and it’s something that I think could benefit a lot of Magic players. Everyone wants to think of themselves as a purely rational being. We like to believe that if we were with the Seekers, we would be able to see the farce for what it was, but the truth is that we are human beings, susceptible to human nature just like anybody else.
As I’ve been writing columns over the past few years, I’ve encountered this phenomenon over and over. I’ll sit down and watch a modest sample of games in a format, record the results, and then talk about what they mean. The responses follow a similar pattern. There are some people who read the article and say that it changed the way they looked at the format, but those people are few and far between. A lot of people tell me that they liked the article because it fits their own experience of the format. Others will read the article, see that it doesn’t fit their experience, and start rationalizing all sorts of reasons why they are still right, even though the numbers don’t back them up. And then there are the people who are incredibly upset that my numbers don’t fit their worldview. That’s when the personal attacks start.
One of my favorite comments was when a player told me that my analysis had dramatically improved because the article finally matched up with his experiences. I don’t mean any disrespect to that player, because I get it, but it made me laugh. I hadn’t really changed my process or my analysis at that point; I hadn’t suddenly become a better player. But this player had revealed how difficult it had been for him to revise his beliefs when he was presented with cognitive dissonance.
Responding to Dissonance
Cognitive dissonance has since been defined in two different ways. The first definition is the uncomfortable feeling you get when you encounter information that clashes with your current worldview or system of thought (schema); this is also often called cognitive disequilibrium. For example, you believe that RW is the best deck in the format, but then a source you trust tells you that UB is the best deck. The second definition describes it as the gap between the way you act and the things you believe. For example, you may believe that cheating is wrong, but when you realize that you drew an extra card off of a Divination, you decide not to call a judge and just hope that your opponent doesn’t notice. Festinger’s research on the way people react in the second case is fascinating, but I want to focus on what the research says human responses tend to be for the first definition.
When you encounter information that clashes with your current schema, you basically have four ways to react. Let’s imagine that you currently believe that RW is the best deck in a draft format and that UB is terrible. This is your schema. Then a source that you trust gives you strong evidence that UB is actually very good, and that it is the best deck in the format. This makes you uncomfortable, and it causes cognitive dissonance. Here are the ways that humans tend to react:
- You throw out your old beliefs completely and adhere to the new idea. While you thought that RW was the best deck, you read an article that says UB is the best, so you just accept that as fact. This reaction is inherently problematic, and people will warn you against it over and over, but the good news is that it happens less often than the other responses. It is very rare for people to accept a completely new schema at face value.
- You revise your schema to accommodate the new information. When you hear that UB is the best deck, you accept that this is probably true since you trust the source, but you also generate an idea that RW can be very strong when it is underdrafted or when it is drafted correctly. You still hold the belief that RW is a strong deck in the right circumstances, but your behavior will probably change to value UB more highly than RW.
- You rationalize the new data in accordance with your previous schema. You decide that RW is still the best deck, but perhaps not by as much as you originally thought. You also think that UB is still bad, but possibly not terrible. It’s not that you think the article is incorrect; you just think that it comes with qualifiers like “UB is the best when it is underdrafted, or when you are in the right seat, or when you are lucky enough to get these cards.” Your schema doesn’t really change, but you mentally add qualifiers to the information being presented.
- You double down and reject the data being presented. Although you’ve been presented with new data, you refuse to accept that your own experiences could be wrong. You reinforce your position that RW is the best deck and that UB is terrible. You say that the writer’s opinions are incorrect because they do not conform with your own limited experience in the format. You’ll have to generate all kinds of reasons and theories about why the new information is invalid, which might include personal attacks against the writer. You’ll also intentionally seek out information that validates your own viewpoint or rejects the opposing viewpoint, a habit commonly called confirmation bias.
Recognizing Your Choices
So you have adherence, revision, rationalization, and rejection. The first one, adherence, is very uncommon, but it can happen when a person is not very attached to their own opinion or when they hold a special reverence for the source of the new schema. Human beings tend to value their schemas very highly, and they don’t like to change them completely because a complete change creates the most shock to their psychological system. This is probably a good thing; if people changed their beliefs completely whenever they were presented with differing information, the species would probably have died out very quickly.
The middle two responses, revision and rationalization, are entirely rational and valid approaches to new information. However, it’s important to remember that people have a natural tendency towards rationalization, even when the data is against them. Rationalization is much more common than revision; people don’t tend to revise their schemas unless they encounter very strong cognitive dissonance. In other words, they won’t revise their opinions unless the evidence is very strong or the information comes from a very trusted source. This is unfortunate, because revision is also one of the best ways to improve quickly. I like to call this openness to revision “sponge mode,” because you are absorbing other people’s ideas and giving them a chance.
Rationalization is the most common approach to cognitive dissonance. This is why most people judge articles or opinions based on how well they match their own experiences. It is the most comfortable response to cognitive dissonance, and it also requires the least amount of work. Basically, the right choice between revision and rationalization depends on how strong the evidence for the new information is. If the evidence is strong, revision is the better response; if the evidence is weaker, rationalization is reasonable.
Doubling down is possibly the most dangerous of the four responses, and it severely limits a player’s potential for growth. This is the response that Festinger encountered with the Seekers in 1954. It’s not the most common response, because most people who are presented with strong evidence that their schemas are wrong will change in some way, but Festinger did outline several factors that make doubling down more likely. The belief needs to be a conviction that a person uses to guide their behavior, and it needs to be a schema into which the person has invested a substantial amount of resources. So someone who drafts a lot of RW and has written a lot of articles in support of RW, but then encounters strong evidence that UB is better, will be more likely to double down on their belief.
The biggest leaps in learning occur when a person is able to encounter cognitive dissonance and revise their systems of belief to support the new data. It is precisely in this moment of discomfort that we get our best opportunity to improve, but it is hard work, and it requires people to fight against their natural tendencies. Studies have shown that this tendency is persistent regardless of intelligence level. Smart people are just as prone to confirmation bias as anyone else, and it’s even more dangerous for them because they are more capable of finding information that supports their own position.
Accepting new information is a constant fight. It means continually seeking out new ideas and deliberately courting cognitive dissonance. I know that I have run into this problem many times in my life, but it is possible to find new opportunities to grow by embracing cognitive dissonance: actively seek out viewpoints that don’t align with your own, give new information a solid chance, experiment with it to test whether it holds up, and then revise your systems of belief to accommodate it. And at all costs, avoid doubling down on something just because you want it to be true.