On Confirmation Bias
In Gore Vidal’s “Burr: A Novel,” the narrator is asked to dig up dirt on his companion, Aaron Burr. In the process he comes to admire Burr and decides that, in addition to digging for dirt, it would be most interesting to help Burr craft his memoirs. When the narrator tells the newspaper editor his plans, he finds the editor unconvinced:
You will be favorable to Burr and so must fail, because the American reader cannot bear a surprise. He knows that this is the greatest country on earth, Washington the greatest man that ever lived, Burr the wickedest, and evidence to the contrary is not admissible. That means no inconvenient facts, no new information. If you really want the reader’s attention you must flatter him, make his prejudices your own, tell him things he already knows–he will love your soundness.
Confirmation bias is one of the characteristics that most impede our ability to change our minds, even in those instances when we should. We invest an enormous amount of energy in constructing personal narratives: who we are in the world, how we fit into the larger society, and how the world operates. When presented with ideas that contradict these narratives, we find it easier to refuse to believe them than to revise the narratives. One example is how organized religion has responded to ideas and evidence challenging scripture. Some believers try to twist dinosaur fossils to fit their young-earth theory. Others claim that scripture was written as a set of moral parables and was never meant to be interpreted literally. Still others rather cleverly treat every new discovery that on its surface conflicts with religious belief as further evidence of the majesty of God and the universe he’s created “for us.”
Confirmation bias in politics is no different, as anyone who has spent time reading political publications or watching the political opinionators on cable news channels can attest. Behind them lie enormous, solidified constructs of narrative and political philosophy that act like an information-processing unit. New information comes in, enters the processing unit, and is spit out colored and biased to become just another piece of evidence supporting whatever ideology was already there.
The ironic thing is that most of us prefer it this way. Information that supports our worldview is easier to absorb and makes us feel smart because what we hear agrees with what we believe. Evidence to the contrary is dissonant, challenges our self-esteem, and has negative psychosomatic effects. Only when a person is confronted with shocking, incontrovertible, and unavoidable new information does the seed of a revised personal narrative begin to grow.
If one day aliens land on earth and tell us that they’re the reason God created the universe, how do you think we will react?