Subjective Validation (Austin Cline)

Seeing Patterns & Connections That Aren’t Really There

Subjective validation is also sometimes called the “personal validation effect” because it refers to the process by which people accept a claim or phenomenon as valid based solely on a few personal experiences and/or subjective perceptions. In practice, the error occurs when a person perceives two independent events as having some sort of deeper, hidden relationship because of that person’s prior beliefs, expectations, or hypotheses about the world.

According to the premises from which this person interprets the world, such a relationship must necessarily exist, and so the person will find a way to explain the data in terms of the assumed relationship. This subjective validation is often accompanied by confirmation bias, whereby the person weighs supporting data much more heavily than information which might cast doubt upon their beliefs.

This subjective validation is generally at the heart of people’s reports of the experience of paranormal phenomena. For example, when it comes to readings by astrologers or psychics, a person will quickly focus on and remember the “hits” or accurate statements, but forget and ignore the misses, or inaccurate statements. In this manner, the person has subjectively validated their preconception that there exists some sort of astrological or psychic connection between things in the universe.
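
To make the arithmetic concrete, here is a minimal sketch in Python (an illustration added to this post, not something from Cline’s article). It assumes, purely for the sake of example, that a vague statement happens to “fit” the listener about 30% of the time by chance and that only one miss in ten is remembered.

    # A minimal sketch of how remembering only the "hits" inflates the
    # apparent accuracy of vague readings. The 30% chance-fit rate and the
    # "recall 1 in 10 misses" figure are illustrative assumptions, not data.
    import random

    random.seed(42)

    NUM_STATEMENTS = 1000
    CHANCE_OF_FIT = 0.30  # assumed probability a vague statement fits by luck

    # Which statements happen to match the listener's life purely by chance?
    hits = sum(1 for _ in range(NUM_STATEMENTS) if random.random() < CHANCE_OF_FIT)
    misses = NUM_STATEMENTS - hits

    # True accuracy counts both hits and misses.
    true_accuracy = hits / NUM_STATEMENTS

    # A listener practicing subjective validation remembers every hit vividly
    # but recalls only about one miss in ten (again, an assumed figure).
    remembered_misses = misses // 10
    perceived_accuracy = hits / (hits + remembered_misses)

    print(f"Actual accuracy:    {true_accuracy:.0%}")
    print(f"Perceived accuracy: {perceived_accuracy:.0%}")

Even though the reading is no better than chance, the remembered record looks impressively accurate, which is exactly the kind of selective bookkeeping described above.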

Subjective validation is also sometimes used to describe how people can become overconfident about their prejudices and pet ideas. Essentially, we talk ourselves into believing that we are right even when the evidence at hand should convince us that we are wrong — or at least that the case for our position isn’t very sound. It could be said that we “know” better, but our desires are so powerful that they override our better sense.

This, in turn, can lead us into all sorts of problems when it comes time to actually defend our position in the face of challenges and questions posed by others who are not emotionally or psychologically wedded to the idea that our claims must be true. We might become economical with the truth, we might avoid certain questions, and we might even engage in general rationalization of our position.

Another common name for subjective validation is The Forer Effect, named after psychologist B.R. Forer. He discovered in experiments with his undergraduate students in 1948 that a person can be quite willing to accept some general or vague description of their personality as being unique to them, even though the exact same description would apply equally well (or equally badly) to everyone.

In his experiment, Forer gave a personality test to his students and then, without even bothering to read their answers, gave back a general personality analysis — the exact same one to each student, taken from a newspaper astrology column. He asked his students to rate the analysis and received an overwhelmingly positive response — his students were convinced that he could “read” their personalities. The same or similar experiments have been performed repeatedly through the decades in a variety of contexts, and the results continue to be the same.

Why does the Forer Effect operate? Various explanations have been offered, from human gullibility to ignorance to plain wishful thinking. It does, however, seem to provide a basis for understanding people’s acceptance of things like astrology, graphology, divination and other pseudosciences.

The best way to deal with someone whose claims rely upon subjective validation is to point out that what they really need is independent validation and confirmation. Independent evidence from some source that doesn’t have a stake in the outcome would be particularly useful. An experiment which could disconfirm the belief would also be very good. If such things cannot be provided, then it is reasonable to point out that the belief isn’t very rational.



Confirmation Bias (Austin Cline)

Selective Use of Evidence to Support Our Beliefs


Confirmation bias occurs when we selectively notice or focus upon evidence which tends to support the things we already believe or want to be true while ignoring that evidence which would serve to disconfirm those beliefs or ideas. This bias plays a stronger role when it comes to those beliefs which are based upon prejudice, faith, or tradition rather than on empirical evidence.

For example, if we already believe or want to believe that someone can speak to our deceased relatives, then we will notice when they say things which are accurate or pleasant but forget how often that person says things which are simply incorrect. Another good example would be how people notice when they get a phone call from a person they were just thinking about but don’t remember how often they didn’t get such a call when thinking about a person.

The confirmation bias is simply a natural aspect of our personal biases; its appearance is not a sign that a person is dumb. As Michael Shermer stated in the September 2002 issue of Scientific American:

Smart people believe weird things because they are skilled at defending beliefs they arrived at for nonsmart reasons.

Our biases are some of the non-smart reasons we have for arriving at beliefs; the confirmation bias is perhaps worse than most because it actively keeps us from arriving at the truth and allows us to wallow in comforting falsehood and nonsense. This bias also tends to work closely with other biases and prejudices: the more emotionally involved we are with a belief, the more likely it is that we will manage to ignore whatever facts or arguments might tend to undermine it.

Why does this sort of bias exist? Well, it’s certainly true that people don’t like to be wrong and that anything which shows them to be wrong will be harder to accept. Also, emotional beliefs which are involved with our self-image are much more likely to be defended selectively. For example, the belief that we are superior to someone else because of racial differences can be difficult to abandon because that entails not only admitting that the others are not inferior, but also that we are not superior.

However, the reasons for confirmation bias aren’t all negative. It also seems likely that data which supports our beliefs is simply easier to deal with on a cognitive level: we can see and understand how it fits into the world as we understand it, while contradictory information that just doesn’t fit can be set aside for later.

It is precisely because of the strength, pervasiveness, and perniciousness of this kind of bias that science incorporates the principle of independent confirmation and testing of one’s ideas and experiments. It is the hallmark of science that a claim should be supported independent of personal bias, but it is a hallmark of pseudoscience that only true believers will discover the evidence which supports their claims. That is why Konrad Lorenz wrote in his famous book On Aggression:

It is a good morning exercise for a research scientist to discard a pet hypothesis every day before breakfast. It keeps him young.

Of course, just because scientists are supposed to construct experiments designed specifically to disprove their theories, that doesn’t mean that they always do. Even here the confirmation bias operates to keep researchers focused on that which tends to support rather than that which might serve to refute. This is why there is such a vital role in science for what often seems like antagonistic competition between scientists: even if we can’t assume that one person will work hard to refute her own theories, we can generally assume that her rivals will.

Understanding that this is a part of our psychological makeup is a necessary step if we are to have any chance at correcting it, just as the acknowledgment that we all have prejudices is necessary in order to overcome those prejudices. When we realize that we have an unconscious inclination to weigh evidence selectively, we will have a better chance at recognizing and utilizing the material we might have overlooked, or that others have overlooked in their attempts to convince us of something.
