Psychological Problems in Thinking (Michael Shermer, 1997)

NOTE: The following is taken from the book Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time, pp. 58-61. It is the last section of the third chapter, "How Thinking Goes Wrong: 25 Fallacies That Lead Us to Believe Weird Things."

  1. Effort Inadequacies and the Need for Certainty, Control, and Simplicity

Most of us, most of the time, want certainty, want to control our environment, and want nice, neat, simple explanations. All this may have some evolutionary basis, but in a multifarious society with complex problems, these characteristics can radically oversimplify reality and interfere with critical thinking and problem solving. For example, I believe that paranormal beliefs and pseudoscientific claims flourish in market economies in part because of the uncertainty of the marketplace. According to James Randi, after communism collapsed in Russia there was a significant increase in such beliefs. Not only are the people now freer to try to swindle each other with scams and rackets, but many truly believe they have discovered something concrete and significant about the nature of the world. Capitalism is a lot less stable a social structure than communism. Such uncertainties lead the mind to look for explanations for the vagaries and contingencies of the market (and life in general), and the mind often takes a turn toward the supernatural and paranormal.

[Image: Michael Shermer quote — "Anecdotal thinking comes naturally; science requires training."]

Scientific and critical thinking does not come naturally. It takes training, experience, and effort, as Alfred Mander explained in his Logic for the Millions: “Thinking is skilled work. It is not true that we are naturally endowed with the ability to think clearly and logically—without learning how, or without practicing. People with untrained minds should no more expect to think clearly and logically than people who have never learned and never practiced can expect to find themselves good carpenters, golfers, bridge players, or pianists” (1947, p. vii). We must always work to suppress our need to be absolutely certain and in total control and our tendency to seek the simple and effortless solution to a problem. Now and then the solutions may be simple, but usually they are not.

https://archive.org/details/logicformillions00mand
  2. Problem-Solving Inadequacies

All critical and scientific thinking is, in a fashion, problem solving. There are numerous psychological disruptions that cause inadequacies in problem solving. Psychologist Barry Singer has demonstrated that when people are given the task of selecting the right answer to a problem after being told whether particular guesses are right or wrong, they:

  1. Immediately form a hypothesis and look only for examples to confirm it.
  2. Do not seek evidence to disprove the hypothesis.
  3. Are very slow to change the hypothesis even when it is obviously wrong.
  4. If the information is too complex, adopt overly simple hypotheses or strategies for solutions.
  5. If there is no solution, if the problem is a trick and "right" and "wrong" is given at random, form hypotheses about coincidental relationships they observed. Causality is always found. (Singer and Abell 1981, p. 18)

If this is the case with humans in general, then we all must make the effort to overcome these inadequacies in solving the problems of science and of life.


  3. Ideological Immunity, or the Planck Problem

In day-to-day life, as in science, we all resist fundamental paradigm change. Social scientist Jay Stuart Snelson calls this resistance an ideological immune system: “educated, intelligent, and successful adults rarely change their most fundamental presuppositions” (1993, p. 54). According to Snelson, the more knowledge individuals have accumulated, and the more well-founded their theories have become (and remember, we all tend to look for and remember confirmatory evidence, not counterevidence), the greater the confidence in their ideologies. The consequence of this, however, is that we build up an “immunity” against new ideas that do not corroborate previous ones. Historians of science call this the Planck Problem, after physicist Max Planck, who made this observation on what must happen for innovation to occur in science: “An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out and that the growing generation is familiarized with the idea from the beginning” (1936, p. 97).


Psychologist David Perkins conducted an interesting correlational study in which he found a strong positive correlation between intelligence (measured by a standard IQ test) and the ability to give reasons for taking a point of view and defending that position; he also found a strong negative correlation between intelligence and the ability to consider other alternatives. That is, the higher the IQ, the greater the potential for ideological immunity. Ideological immunity is built into the scientific enterprise, where it functions as a filter against potentially overwhelming novelty. As historian of science I. B. Cohen explained, “New and revolutionary systems of science tend to be resisted rather than welcomed with open arms, because every successful scientist has a vested intellectual, social, and even financial interest in maintaining the status quo. If every revolutionary new idea were welcomed with open arms, utter chaos would be the result” (1985, p. 35).


In the end, history rewards those who are “right” (at least provisionally). Change does occur. In astronomy, the Ptolemaic geocentric universe was slowly displaced by Copernicus’s heliocentric system. In geology, George Cuvier’s catastrophism was gradually wedged out by the more soundly supported uniformitarianism of James Hutton and Charles Lyell. In biology, Darwin’s theory of evolution superseded creationist belief in the immutability of species. In Earth history, Alfred Wegener’s idea of continental drift took nearly a half century to overcome the received dogma of fixed and stable continents. Ideological immunity can be overcome in science and in daily life, but it takes time and corroboration.


Spinoza’s Dictum

Skeptics have the very human tendency to relish debunking what we already believe to be nonsense. It is fun to recognize other people’s fallacious reasoning, but that’s not the whole point. As skeptics and critical thinkers, we must move beyond our emotional responses because by understanding how others have gone wrong and how science is subject to social control and cultural influences, we can improve our understanding of how the world works. It is for this reason that it is so important for us to understand the history of both science and pseudoscience. If we see the larger picture of how these movements evolve and figure out how their thinking went wrong, we won’t make the same mistakes. The seventeenth-century Dutch philosopher Baruch Spinoza said it best: “I have made a ceaseless effort not to ridicule, not to bewail, not to scorn human actions, but to understand them.”

