To Heal the West, We Must Heal Ourselves (Part 2)
One step toward reducing affective polarization is to know ourselves more deeply.
This is the second installment in my series To Heal the West, We Must Heal Ourselves. My hypothesis is that a lot of us (myself sometimes included) contribute to the problem of affective polarization without realizing it. But when we focus on healing the broken parts of our own psychology and spirituality, we can start to see our own blind spots. We can start to identify places where we're part of the problem, and then change our actions to become part of the solution instead.
Think of our political discourse like a choir in which we're all singing. Right now, it's a cacophony. Everything is off-key because so many of us are fighting with our fellow singers rather than harmonizing. But when we focus on taking the log out of our own eye, we can get back in key and change the tone of the choir just a little bit towards euphony instead.
One of the most surprising things I've learned while studying affective polarization and the science of belief is that we decide what we want to believe, and then we look for evidence to justify that belief. As social psychologist Jonathan Haidt puts it in The Righteous Mind: "intuitions come first, strategic reasoning second."
Haidt discovered this insight by running a series of experiments. He asked participants to read a story in which someone acted in a way that runs contrary to our moral intuitions (for instance, a man had consensual, protected sex with his sister), and then asked whether the behavior was moral or immoral. Participants instantly judged the behavior immoral. But when pressed to justify their reasoning, they really struggled. Many argued that the brother and sister might be harming someone (for instance, what if they got pregnant and the child had birth defects?). When Haidt pointed out why that reasoning wasn't valid (for instance, by gently reminding participants that the couple in the story used 100% effective birth control), they still clung to their beliefs. People decided what they wanted to believe first, and then reached for evidence to justify those beliefs.
As Haidt describes it, our intellectual reasoning process is like a "press secretary." The job of a presidential press secretary isn't to tell their boss what to do. Instead, their job is to justify whatever their boss actually does. In the same way, our reasoning doesn't tell us what to believe. Instead, it primarily exists to justify our beliefs.
In his book How Minds Change, David McRaney describes the process like this: "reasons, justifications, and explanations for maintaining one’s existing opinion can be endless, spawning like heads of a hydra." "If you cut away one," he cautions, "two more would appear to take its place."
Most of us like to think of ourselves as open-minded. But that's not quite right. Mostly, we find things that we want to believe and then look for ways to justify those beliefs.
But what if you're certain you arrived at all of your conclusions via deep and open-minded reasoning, with no cognitive biases or emotional attachments to cloud your judgment? Psychology has a term for that conviction: "naive realism." McRaney summarizes it this way: naive realism is "the belief that you perceive the world as it truly is, free from assumption, interpretation, bias, or the limitations of your senses."
"The late psychologist Lee Ross, who helped popularize the term," McRaney continues, "told me that it leads many of us to believe we arrived at our beliefs, attitudes, and values after careful, rational analysis through unmediated thoughts and perceptions."
Deciding what we want to believe and then looking for evidence to support it carries two dangers. The first is that we're often not as open-minded as we think we are. The second is that we can get pretty fiery when someone challenges our beliefs, because there's so much emotion tied up in them. When our moral intuitions tell us X and someone on the other side tells us not-X, we can react with anger and defensiveness because we really want X to be true.
I spent most of my 20s more or less embodying both of these problems.
I'm pretty libertarian. I think small government is generally good, war is generally bad, and taxes and regulations often do more harm than good. But in my 20s I was a devout libertarian. I told myself that I was open-minded, but I would find elaborate ways to dismiss any argument that even the smallest government intervention could be good. And, because I was so devout, I could get pretty obnoxious whenever anyone suggested to me that a certain government intervention might be beneficial. I was, of course, blind to both of these tendencies in myself.
The problem was that I didn't arrive at libertarianism through pure reason. Oh, I had plenty of intellectual justifications for my beliefs; I mainlined books and scholarly papers by libertarian economists like Tyler Cowen and Steve Horwitz. I could quote a dozen arguments against any particular act of government by very smart people who studied these topics for a living.
But those arguments weren't why I was libertarian. The reason I was libertarian was deeper, and it was emotional. In my childhood I had some bad experiences with authority. As a result I grew up thinking that authority was always toxic and self-interested, and I wanted nothing to do with it. When I discovered government, I engaged in what psychologists call generalization. I transposed the toxic authority figures I met as a child onto the ultimate human authority of government, and decided that government must be as toxic and as self-interested as the figures I met when I was younger.
From there, I decided two things. First, that I wanted nothing to do with the government; it would only hurt me. Second, that government must be an unmitigated evil, and so I had a moral obligation to protect other people from it.
And from there, I became a libertarian pundit.
I couldn't seriously consider arguments against libertarianism, because "government = authority = bad" was so deeply ingrained in my psyche. And, because this belief was an emotional one formed at an early age, I made it part of my identity. When people didn't see the wisdom in libertarianism, I thought I had a moral obligation to argue them out of their mistaken beliefs. This could be pretty obnoxious.
Eventually I was able to "unbundle" (to use Alexandra Hudson's excellent term) the toxic experiences of my childhood from government itself. At that point I could start to genuinely see the value of certain acts of government, for instance fighting in World War II or banning lead in gasoline. I became just a little bit more open-minded. I also became a more peaceful conversation partner, because I no longer felt threatened by disagreement.*
Haidt and McRaney would say that a lot of us come to our political beliefs this way. We develop an emotional attachment to X. Then we come up with intellectual justifications for believing X. Along the way, because our belief in X is emotional at its root, we fight with a lot of people who don't believe X.
So what's the solution? I think the way out is to better know ourselves. When we truly know ourselves, then we can see our emotional attachments and how they influence our thinking. That doesn't make them go away; but it does give us some distance from them. It lets us pick them up and set them down in the course of a conversation, rather than them running the show and determining what ideas we're open to and how we interact with other people.
Here's how Dr. Alok Kanojia, Instructor in Psychiatry at Harvard Medical School, puts it: "You can separate a part of your mind aside and choose to act independently from it, but you can't do that without the separation."
The goal isn't to stop believing X. The goal is to get better at metacognition, at understanding ourselves and why we believe X. It's not to change our reasoning, but rather to see the emotional attachments that underlie that reasoning.
One of the best ways to get to know ourselves is through meditation. When we sit in silence and do nothing, with no distractions, we start to see more deeply who we are. We become aware of things that we didn't see before and didn't know about ourselves, things that we can't see when we're endlessly distracted by our phones and by social media and by constant conversation.
So here's our action item as a community of practice. Pick a belief that you have. Sit in silence for twenty minutes and meditate on this question: "Why is it important to me that X be true?"
And then let us know what you discovered.
*Of course, I'm still not perfect. I have fairly emotional ideas around, among other things, Christian Universalism and the importance of free speech. But I'm better than I was; and as I continue to work through those emotional attachments, I hope to become better still.
Heal the West is 100% reader-supported. If you enjoyed this article, please consider upgrading to a paid subscription or becoming a founding member. Your support is greatly appreciated.