Wednesday, July 12, 2006

Confirmation Bias

Michael Shermer has an interesting piece at Scientific American on Confirmation Bias. He writes that:

The human understanding when it has once adopted an opinion ... draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises ... in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate. --Francis Bacon, Novum Organum, 1620

Pace Will Rogers, I am not a member of any organized political party. I am a libertarian. As a fiscal conservative and social liberal, I have found at least something to like about each Republican or Democrat I have met. I have close friends in both camps, in which I have observed the following: no matter the issue under discussion, both sides are equally convinced that the evidence overwhelmingly supports their position.

This surety is called the confirmation bias, whereby we seek and find confirmatory evidence in support of already existing beliefs and ignore or reinterpret disconfirmatory evidence. Now a functional magnetic resonance imaging (fMRI) study shows where in the brain the confirmation bias arises and how it is unconscious and driven by emotions. Psychologist Drew Westen led the study, conducted at Emory University, and the team presented the results at the 2006 annual conference of the Society for Personality and Social Psychology.

During the run-up to the 2004 presidential election, while undergoing an fMRI brain scan, 30 men--half self-described as "strong" Republicans and half as "strong" Democrats--were tasked with assessing statements by both George W. Bush and John Kerry in which the candidates clearly contradicted themselves. Not surprisingly, in their assessments Republican subjects were as critical of Kerry as Democratic subjects were of Bush, yet both let their own candidate off the hook.

The neuroimaging results, however, revealed that the part of the brain most associated with reasoning--the dorsolateral prefrontal cortex--was quiescent. Most active were the orbital frontal cortex, which is involved in the processing of emotions; the anterior cingulate, which is associated with conflict resolution; the posterior cingulate, which is concerned with making judgments about moral accountability; and--once subjects had arrived at a conclusion that made them emotionally comfortable--the ventral striatum, which is related to reward and pleasure.

"We did not see any increased activation of the parts of the brain normally engaged during reasoning," Westen is quoted as saying in an Emory University press release. "What we saw instead was a network of emotion circuits lighting up, including circuits hypothesized to be involved in regulating emotion, and circuits known to be involved in resolving conflicts." Interestingly, neural circuits engaged in rewarding selective behaviors were activated. "Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones," Westen said.

The implications of the findings reach far beyond politics. A jury assessing evidence against a defendant, a CEO evaluating information about a company or a scientist weighing data in favor of a theory will undergo the same cognitive process. What can we do about it?

In science we have built-in self-correcting machinery. Strict double-blind controls are required in experiments, in which neither the subjects nor the experimenters know the experimental conditions during the data-collection phase. Results are vetted at professional conferences and in peer-reviewed journals. Research must be replicated in other laboratories unaffiliated with the original researcher. Disconfirmatory evidence, as well as contradictory interpretations of the data, must be included in the paper. Colleagues are rewarded for being skeptical. Extraordinary claims require extraordinary evidence.

Skepticism is the antidote for the confirmation bias.

This affirms what people have long known intuitively: we tend to see or believe whatever we most want to be the case, especially about God. Take the classical cosmological argument for the existence of God, for example. A person who wants there to be a God will find this argument very persuasive even while recognizing that it's not a proof. He will be driven toward agreeing with the conclusion that there must be some necessary, self-existent underlying cause of all the contingent entities which comprise the universe. Things like galaxies and planets don't just pop into existence uncaused. On the other hand, a person not inclined to accept the existence of God will take refuge in the possibility that the principle of sufficient reason doesn't apply to the universe, or that perhaps the universe as a whole is the necessary entity upon which all else is contingent.

Which of these responses appeals most to a person will depend upon his prior inclinations and whether or not he wants there to be a God. This is one reason why so few people are persuaded to believe (or disbelieve) in God through arguments. The most important step in conversion from one's former state of belief or non-belief is acquiring a desire either to find God or to reject Him. Once the desire is present, reasons to support one's embrace or rejection will prove easy enough to come by.

Another way to put this is to say that belief is more a matter of the heart than it is of the mind.

Telic Thoughts has some thoughts on Shermer's piece as well.