confirmation bias, people’s tendency to seek out, or interpret, information in ways that are consistent with their existing beliefs. This biased approach to decision making is largely unintentional, and it often results in a person ignoring information that is inconsistent with those beliefs. Such beliefs can include a person’s expectations in a given situation and their predictions about a particular outcome. People are especially likely to process information in a belief-confirming way when an issue is highly important or self-relevant.

Background

Confirmation bias is one example of how humans sometimes process information in an illogical, biased manner. The way a person knows and understands the world is often shaped by factors of which that person is simply unaware. Philosophers note that, once people have developed an opinion about an issue, they have difficulty processing information about it in a rational, unbiased manner. Humans are better able to process information rationally, giving equal weight to multiple viewpoints, when they are emotionally distant from an issue (although a low level of confirmation bias can still occur when an individual has no vested interests).

One explanation for why people are susceptible to confirmation bias is that it is an efficient way to process information. Humans are incessantly bombarded with information and cannot possibly take the time to carefully evaluate each piece of it to form an unbiased conclusion. Human decision making and information processing are often biased because people are limited to interpreting information from their own viewpoint, and they need to process it quickly to protect themselves from harm. It is adaptive for humans to rely on instinctive, automatic behaviours that keep them out of danger.

Another reason why people show confirmation bias is to protect their self-esteem. People like to feel good about themselves, and discovering that a belief they value highly is incorrect makes them feel bad. Therefore, people seek information that supports their existing beliefs. A closely related motive is the desire to be correct. People want to feel that they are intelligent, and information suggesting that they are wrong or made a poor decision implies a lack of intelligence; confirmation bias thus encourages them to disregard such information.

Evidence

Research has shown that confirmation bias is strong and widespread and that it occurs in several contexts. In the context of decision making, once an individual makes a decision, they will look for information that supports it. Information that conflicts with the decision may cause discomfort, and the person will therefore ignore it or give it little consideration. People give special treatment to information that supports their personal beliefs. In studies examining my-side bias, people were able to generate and remember more reasons supporting their side of a controversial issue than the opposing side. Only when a researcher directly asked people to generate arguments against their own beliefs were they able to do so. It is not that people are incapable of generating arguments counter to their beliefs; rather, they are not motivated to do so.

Confirmation bias also surfaces in people’s tendency to look for positive instances. When seeking information to support their hypotheses or expectations, people tend to look for evidence that would confirm a hypothesis rather than for evidence that could show it to be false.

Confirmation bias also operates in impression formation. If people are told what to expect from a person they are about to meet, such as that the person is warm, friendly, and outgoing, they will look for information that supports those expectations. When interacting with people who they believe have certain personalities, perceivers ask questions that are biased toward confirming those beliefs. For example, if Maria expects her roommate to be friendly and outgoing, Maria may ask her if she likes to go to parties rather than asking if she often studies in the library.

Importance

Confirmation bias is important because it may lead people to hold strongly to false beliefs or to give more weight to information that supports their beliefs than the evidence warrants. People may be overconfident in their beliefs because they have accumulated evidence that supports them, when in reality they have overlooked or ignored a great deal of evidence refuting those beliefs, evidence which, had they considered it, would have led them to question what they believe. These factors can encourage risky decision making and cause people to overlook warning signs and other important information. In this way, confirmation bias is often a component of black swan events, which are high-impact events that are unexpected but, in retrospect, appear to have been inevitable.

Implications

Confirmation bias has important implications in the real world, including in medicine, law, and interpersonal relationships. Research has shown that medical doctors are just as likely as everyone else to have confirmation biases. Doctors often form a preliminary hunch about a diagnosis early in the treatment process, and that hunch can interfere with their ability to assess information indicating that an alternative diagnosis is more likely. Confirmation bias also shapes how patients react to diagnoses: patients are more likely to accept a diagnosis that supports their preferred outcome than one that goes against it. Both of these examples demonstrate that confirmation bias has implications for individuals’ health and well-being.

In the context of law, judges and jurors sometimes form an opinion about a defendant’s guilt or innocence before all of the evidence is known. Once a judge or juror forms an opinion, confirmation bias will interfere with their ability to process new information that emerges during a trial, which may lead to unjust verdicts.

In interpersonal relations, confirmation bias can be problematic because it may lead a person to form inaccurate and biased impressions of others. This may result in miscommunication and conflict in intergroup settings. In addition, when someone treats a person according to their expectations, that person may unintentionally change their behavior to conform to the other person’s expectations, thereby providing further support for the perceiver’s confirmation bias.

Bettina J. Casad
J.E. Luebering

cognitive bias, systematic errors in the way individuals reason about the world due to subjective perception of reality. Cognitive biases are predictable patterns of error in how the human brain functions and therefore are widespread. Because cognitive biases affect how people understand and even perceive reality, they are difficult for individuals to avoid and in fact can lead different individuals to subjectively different interpretations of objective facts. It is therefore vital for scientists, researchers, and decision makers who rely on rationality and factuality to interrogate cognitive bias when making decisions or interpreting facts. Cognitive biases are often seen as flaws in the rational choice theory of human behaviour, which asserts that people make rational choices based on their preferences.

Although cognitive biases can lead to irrational decisions, they are generally thought to be a result of mental shortcuts, or heuristics, that often convey an evolutionary benefit. The human brain is constantly bombarded with information, and the ability to quickly detect patterns, assign significance, and filter out unnecessary data is crucial to making decisions, especially quick decisions. Heuristics often are applied automatically and subconsciously, so individuals are often unaware of the biases that result from their simplified perception of reality. These unconscious biases can be just as significant as conscious biases—the average person makes thousands of decisions each day, and the vast majority of these are unconscious decisions rooted in heuristics.

One prominent model for how humans make decisions is the two-system model advanced by Israeli-born psychologist Daniel Kahneman. Kahneman’s model describes two parallel systems of thought that perform different functions. System 1 is the quick, automated cognition that covers general observations and unconscious information processing; this system can lead to making decisions effortlessly, without conscious thought. System 2 is the conscious, deliberate thinking that can override system 1 but that demands time and effort. System 1 processing can lead to cognitive biases that affect our decisions, but, with self-reflection, careful system 2 thinking may be able to account for those biases and correct ill-made decisions.

One common heuristic that the human brain uses is cognitive stereotyping. This is the process of assigning things to categories and then using those categories, often unconsciously, to fill in missing information about the thing in question. For example, if an individual sees a cat from the front, they may assume that the cat has a tail, because the animal fits into the category “cat” and that category supplies the missing information that cats have tails. Filling in missing information in this way is frequently useful. However, cognitive stereotyping can cause problems when applied to people. Consciously or subconsciously placing people into categories often leads one to overestimate the homogeneity of groups, sometimes producing serious misperceptions of the individuals in those groups. Cognitive biases that affect how individuals perceive another person’s social characteristics, such as gender and race, are described as implicit bias.

Cognitive biases are of particular concern in medicine and the sciences. Implicit bias has been shown to affect the decisions of doctors and surgeons in ways that are harmful to patients. Further, interpretation of evidence is often affected by confirmation bias, which is a tendency to process new information in a way that reinforces existing beliefs and ignores contradictory evidence. Similar to other cognitive biases, confirmation bias is usually unintentional but nevertheless results in a variety of errors. Individuals who make decisions will tend to seek out information that supports their decisions and ignore other information. Researchers who propose a hypothesis may be motivated to look for evidence in support of that hypothesis, paying less attention to evidence that opposes it. People can also be primed in their expectations. For example, if someone is told that a book they are reading is “great,” they will often look for reasons to confirm that opinion while reading.

Other examples of cognitive bias include anchoring, the tendency to focus on one’s initial impression and give less weight to later information—for example, browsing for T-shirts, coming across a very cheap T-shirt first, and subsequently thinking that all the other shirts are overpriced. The halo effect is the tendency for a single positive trait to shape a person’s overall impression of someone—for example, assuming, without evidence, that an attractive or confident person is also smarter, funnier, or kinder than others. Hindsight bias is the tendency to see events as having been more predictable than they were—for example, looking back at a particularly successful investment and attributing the success to skill rather than chance. Overgeneralization is a form of cognitive bias in which individuals draw broad conclusions from little evidence; an example is encountering a very friendly Dalmatian dog and consequently assuming that all Dalmatians are very friendly.

Cognitive biases are sometimes confused with logical fallacies. Although logical fallacies are also common ways in which humans make mistakes in reasoning, they are not caused by errors in an individual’s perception of reality; rather, they result from flaws in the logic of an argument itself.

Stephen Eldridge