Confirmation biases contribute to overconfidence in personal beliefs and can maintain or strengthen beliefs in the face of contrary evidence. Poor decisions driven by these biases have been documented in both political and organizational contexts. One proposed explanation is that people show confirmation bias because they are weighing the costs of being wrong, rather than investigating in a neutral, scientific way.
In psychology and cognitive science, confirmation bias is the tendency to search for or interpret information in a way that confirms one's preconceptions, leading to perceptual errors. Confirmation bias underlies our knack for stereotyping. We do this for a simple reason: our brains cannot instantly process, or immediately access, vast amounts of information and draw a statistically valid conclusion in seconds, so they take stereotypical shortcuts to reach a conclusion or make a decision quickly. We do this, and have done so over the course of evolution, in order to survive. Confirmation bias is not necessarily a bad thing, as long as you are aware of this evolutionary and biochemical predisposition.
Confirmation bias is a phenomenon wherein people have been shown to actively seek out and assign more weight to evidence that confirms their hypothesis, and ignore or underweigh evidence that could disconfirm their hypothesis.
The English philosopher Francis Bacon was laying the foundations of science and the Enlightenment when he addressed this confirmation bias in his Novum Organum of 1620:
"The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects; in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate. . . . And such is the way of all superstition, whether in astrology, dreams, omens, divine judgments, or the like; wherein men, having a delight in such vanities, mark the events where they are fulfilled, but where they fail, though this happen much oftener, neglect and pass them by."
Charles Darwin, who published his theory of evolution in On the Origin of Species in 1859, was aware of his own confirmation bias, as he noted in his autobiography:
“I had also, during many years, followed a golden rule, namely, that whenever a published fact, a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favourable ones. Owing to this habit, very few objections were raised against my views which I had not at least noticed and attempted to answer.”
As Wason named it, and as Bacon and Darwin described it long before the term was coined, we should be aware of our predisposition to confirmation bias.
Further reading:
Peter Wason and Confirmation Bias
Why People Ignore Facts, in Psychology Today
Confirmation Bias: A Ubiquitous Phenomenon in Many Guises
What is Confirmation Bias?
Confirmation bias, also called confirmatory bias or myside bias, is the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses. It is a type of cognitive bias and a systematic error of inductive reasoning. People display this bias when they gather or remember information selectively, or when they interpret it in a biased way. The effect is stronger for emotionally charged issues and for deeply entrenched beliefs, such as politics and religion. People also tend to interpret ambiguous evidence as supporting their existing position.

Biased search, interpretation, and memory have been invoked to explain attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence), belief perseverance (when beliefs persist after the evidence for them is shown to be false), the irrational primacy effect (a greater reliance on information encountered early in a series), and illusory correlation (when people falsely perceive an association between two events or situations), as in conspiracy theories.
A series of experiments by Peter Wason in the 1960s suggested that people are biased toward confirming their existing beliefs. He coined the term "confirmation bias" to describe the tendency to immediately favor information that validates one's preconceptions, hypotheses, and personal beliefs, regardless of whether they are true. Explanations for the observed biases include wishful thinking and the limited human capacity to process large amounts of information - we are not computers, but computers cannot evaluate information, draw conclusions, and create an understanding of the world the way the human brain can.