Confirmation Bias: How It Impacts What I Think I Know
Recently I was listening to a lecture on how we make decisions. The speaker identified several decision-making traps that lead us into poor, sometimes catastrophic, decisions.
One trap in particular piqued my curiosity. It’s a cognitive process known as “confirmation bias.” This refers to our “tendency to gather and rely on information that confirms our existing views and to avoid or downplay information that disconfirms our preexisting hypothesis” (Michael Roberto, Bryant University).
It causes us to gather or remember information selectively. Furthermore, we interpret the information in a way that confirms what we already believe to be true. Researchers note that this effect is particularly strong for emotionally charged and deeply entrenched beliefs.
Confirmation bias has been cited as a contributing factor in a variety of issues, ranging from beliefs and laws concerning the death penalty to the events leading to the Columbia shuttle disaster in 2003. People, including lawmakers and supposedly unbiased scientists, look at data through a skewed lens. They filter the evidence to confirm what they assume is correct and disregard information that would conflict with their opinions.
Research indicates that we take it even further. We interpret data, information and arguments to fit what we want to be true, regardless of objective observations to the contrary. It was found that NASA officials wanted to believe the Columbia was in safe operating order to the point they disregarded data that pointed to real danger.
It seems to me that confirmation bias is well entrenched in the church and theology worlds as well. Note how we continually buy books that we are certain will confirm what we already believe. We keep going to hear speakers who affirm what we assume to be true.
We can pretty well predict what these authors and speakers are going to say ahead of time. We like it this way. It creates emotional and psychological safety nets for us.
This isn’t all bad, I’m sure. I know I’m as susceptible to confirmation bias as anyone. But it does raise questions for me about how we do church leadership, Bible interpretation and theology. If the experts are right in saying confirmation bias is particularly strong for emotionally charged and deeply entrenched beliefs, it’s safe to assume it runs wild in the Christian community.
Take how we manage church conflict. Left unchecked, confirmation bias will distort the “facts” that are presented by both sides. We can quickly react and make decisions out of that distortion, all the while convinced we are “right.” After all, we have the “facts” right in front of us.
The more we can slow the process down and become aware of our own biases, emotionally held positions, and our need to self-protect, the more objective we can be. The more open we are to genuinely consider the other side, the less we will be governed by confirmation bias. A good adage is to strive to be calm, nonreactive and self-aware in those moments.
We also need to hold the tendency of confirmation bias in one hand while we interpret our Bibles, evaluate our doctrine and define our theology with the other. Honesty about our own propensity to allow confirmation bias to define our beliefs will go a long way.
I notice a couple of tendencies in this area. One, when we discuss theology, doctrine and Bible interpretation, we tend to lean toward monologue rather than dialogue. In monologue (where I do the talking and I talk at rather than with), I control the direction of the conversation. I dictate the ideas and content. It’s my way of ensuring that what I already believe will be reinforced and protected.
Dialogue is more difficult. In dialogue, I quiet down and let you share your perspective and define your position. I stay open and consider the validity of your views. I weigh them and give them respectful, careful consideration. Dialogue is risky because I choose to allow you to influence me. I may conclude that my emotionally held beliefs aren’t as accurate as I thought (and wished) they were.
How many Christian conversations are really two people (or two factions, two churches, two theological traditions) trying to “out-monologue” each other rather than engage in true dialogue? What does that cost us in terms of valuing each other, valuing relationships, and actually learning and growing?
A second tendency is that we become rigid in our theology, doctrine and Biblical interpretation. We approach the Scripture with our theology and exegesis already established. We force our meaning upon the text and make it agree with us.
Bradley Jersak speaks to this: “Dare we let Scripture say what it says without reinterpreting what ‘it really means’ into the margins of our Study Bibles?”
We all make our own canon in some manner. We hold to our predisposed interpretations over what the text may actually say. Again, the more honest and self-aware we are of our propensity to do this, the better we can manage it.
As I close, I can’t help but wonder: 1) How has my confirmation bias influenced what I just wrote? And 2) how is your confirmation bias influencing how you’re reading this blog?
Confirmation bias is one of the trickiest cognitive biases to combat, because it helps us deal with a fundamental problem of life: how to act on incomplete information.
Let me use a rather harmless example. Confirmation bias has often been the driver of the persistent myth that giving candy/sugar to children leads to hyperactivity. While another fallacy is at play at the outset (correlation is not causation), the perpetuation of this idea is caused by the simple fact that when people give candy to children, they expect an increase in activity. When they perceive an increase, they assign causality to the act of giving the candy.
The question, then, is how do we fight it? In this case, an experiment was conducted (several, actually; it’s important to repeat any experiment to show its validity). There are at least four groups of people in the experiment: the children (of course), a person/group giving out candy, a person/group measuring activity levels, and then someone to analyze the results. The absolutely vital part is that the person giving out the candy is not the person measuring activity.
It begins with a measurement of base activity. After that, the person taking the measurement leaves the room. Then someone comes in to hand out candy, giving it only to a subset of the group and keeping careful records of which children received candy and which did not. They leave the room. Then the measurer returns to record activity, and finally the results are analyzed. That’s the basic setup, though I skipped some details.
Is there any difference between the activity levels of those children who had candy and those who did not? The answer comes back: no statistical difference. Then they repeat. Maybe their data was an anomaly. Other people in other places do the experiment as well. Slowly, over much time and care, the picture emerges: candy doesn’t actually cause an increase in the activity of children.
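The blinded setup described above can be sketched as a toy simulation. Everything here is invented for illustration (the group size, the activity scale, and the assumption that candy has no true effect are all made up); the point is only to show why separating assignment from measurement, and repeating the trial, exposes a null result that casual observation hides.

```python
import random
import statistics

random.seed(42)  # fixed seed so the simulation is repeatable

def run_trial(n_children=40):
    """One blinded trial: compare the change in activity between
    children who got candy and those who did not."""
    # Measurer records baseline activity (arbitrary units).
    baseline = [random.gauss(50, 10) for _ in range(n_children)]
    # A different person randomly assigns candy to half the group,
    # recording who got it; the measurer never sees this list.
    got_candy = [i < n_children // 2 for i in range(n_children)]
    random.shuffle(got_candy)
    # Post-candy activity: only random noise, no real candy effect
    # (that's the hypothesis this simulation assumes).
    post = [b + random.gauss(0, 5) for b in baseline]
    candy = [post[i] - baseline[i] for i in range(n_children) if got_candy[i]]
    plain = [post[i] - baseline[i] for i in range(n_children) if not got_candy[i]]
    # Difference in mean activity change between the two groups.
    return statistics.mean(candy) - statistics.mean(plain)

# Repeat the trial many times, as the essay describes: a single
# run can look like an effect, but the average difference across
# repetitions settles near zero.
diffs = [run_trial() for _ in range(200)]
print(round(statistics.mean(diffs), 2))  # expected to hover near 0
```

A single trial can easily show a spurious gap between the groups; only the repetition, with assignment hidden from the measurer, reveals that the gap averages out to nothing.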
That’s a lot of time and effort to invest in order to answer a simple question. And I left many steps and details out. In practical terms, the average person doesn’t have the time and resources to do that. They don’t have the time and resources even to look up the information. It probably wouldn’t occur to them to try. They see what they expect to see, and that’s “good enough.” So the myth continues.
Fighting cognitive bias is a constant battle that we lose most of the time, if we’re even aware of the battle. Luckily, most of the time, there aren’t any serious consequences. If we try to check every single thing, we’ll never get anywhere, and so, in order to get things done, we end up going with what makes sense to us. It’s understandable, I think. The real trick is having the wisdom to know which times really matter.