This is the second half of the article begun in the last NZ Skeptic.

Representativeness

Seemingly unusual events must be considered for their representativeness of that class of phenomena. In the case of the “Bermuda Triangle”, where ships and planes “mysteriously” disappear, there is an immediate assumption that something strange or alien is at work. But we must consider how “representative” the event is in that area. There are far more shipping lanes in the so-called “Bermuda Triangle” than in surrounding areas, so accidents and mishaps are more likely to happen there. (As it turns out, there are actually fewer accidents in the Bermuda Triangle, relative to the volume of traffic, than in surrounding areas, so those areas should perhaps be called the “Non-Bermuda Triangle”. See Kusche, 1975, for a full explanation of this solved mystery.)
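To see the base-rate reasoning in miniature, here is a sketch in Python using purely invented figures (the transit and accident counts below are illustrative assumptions, not Kusche’s data): a region can rack up more accidents in raw count simply because far more vessels pass through it, while its rate per transit is actually lower.

    # Hypothetical numbers for illustration only -- not real data.
    regions = {
        "Bermuda Triangle": {"transits": 50_000, "accidents": 50},
        "Surrounding seas": {"transits": 10_000, "accidents": 20},
    }

    for name, r in regions.items():
        rate = r["accidents"] / r["transits"]
        print(f"{name}: {r['accidents']} accidents, {rate:.2%} per transit")

    # Raw counts make the Triangle look more dangerous (50 vs 20),
    # yet per transit it is the safer region (0.10% vs 0.20%).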

Similarly, in investigating haunted houses we must have a baseline representative measure of noises, creaks, and other events before we can say that an occurrence is unusual (and therefore “mysterious”). I used to hear rapping sounds in the walls of my house. Ghosts? Nope. Bad plumbing. I occasionally hear scratching sounds in my basement. Poltergeists? Nope. Rats. One would be well-advised to first rule out worldly explanations before turning to other-worldly ones.

Failures are Rationalised

In science the value of negative findings – failures – cannot be overemphasised. Usually they are not wanted, and often they are not published, but most of the time failures are how we get closer to the truth. Honest scientists will admit error because they know their fellow scientists will publicise it anyway – they have had their own share as well. Not so with pseudoscientists: their failures are ignored or, more often, rationalised, especially when exposed.

If they are actually caught cheating – not a frequent occurrence – they claim that their powers usually work but not always, so when pressured to perform on TV or in a laboratory they resort to cheating. If they simply fail to perform, they offer any number of creative excuses: too many controls in the experiment cause negative results; the powers do not work in the presence of skeptics; the powers do not work in the presence of electrical equipment; or the powers come and go, and this is one of the times they went. Finally, they claim that if skeptics cannot explain everything, there must be something paranormal – falling into the “unexplained is not inexplicable” fallacy. It is rare for any of us to say “I was wrong.” Rationalisation is less painful to the ego.

Remember Hits, Ignore Misses

This fallacy is a classic among psychics, prophets, and soothsayers, who make hundreds of predictions on January 1 and then tally up the handful of “hits” at the end of the year (mostly generalised, sure-bet types like “there will be a major [not defined] earthquake in Southern California” or “I see trouble for the Royal Family”). The next year they publish their hits and ignore the misses, and hope no skeptics bothered to keep track.

When a psychic makes statements about a person, they usually do so as a barrage of guesses framed as questions, trusting the subject to remember the occasional hit and forget the many misses. But this fallacy has a subtler effect on us all. We are startled when we go to the phone to call a friend and it rings with a call from that very friend, because we have forgotten how many times the friend did not call under those circumstances, or someone else called, or the friend called when we were not thinking of him or her, and so on.
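The phone-call coincidence yields to the same accounting. Here is a toy simulation – every probability in it is an invented assumption, chosen only for illustration – showing that even rare conjunctions of “thinking of a friend” and “that friend calling” will happen a few times a decade by chance alone:

    import random

    random.seed(42)
    DAYS = 365 * 10       # ten years of opportunities
    P_THINK = 0.05        # assumed daily chance you think of this friend
    P_CALL = 0.02         # assumed daily chance the friend happens to call

    hits = sum(
        1 for _ in range(DAYS)
        if random.random() < P_THINK and random.random() < P_CALL
    )
    print(f"'Uncanny' same-day coincidences in ten years: {hits}")

    # Expected value is 3650 * 0.05 * 0.02 = 3.65: a few seemingly
    # impossible coincidences per decade, with no psychic link at all.

The point is not these particular numbers, which are made up, but that the uncounted misses vastly outnumber the remembered hits.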

As Aristotle said, “the sum of the coincidences equals certainty”. We forget most of the insignificant coincidences and remember only the meaningful ones. We must always be vigilant in recalling the larger context in which a seemingly unusual event occurred.

Burden of Proof

Who has the burden to prove what to whom? Herein lies the social nature of science and knowledge. The person making the extraordinary claim has the burden of proving to the experts and to the community at large that his or her belief has more validity than the one almost everyone else accepts.

It works a bit like a democracy. You have to lobby for your opinion to be heard. Then you have to marshal experts on your side so you can convince the majority to “vote” for your claim over the one for which they have always voted. Finally, when you are in the majority, the burden of proof switches to the outsider who wants to challenge you with his or her unusual claim.

The burden of proof is on the creationists to show why the theory of evolution is wrong and why creationism is right, not on evolutionists to defend themselves. (Evolutionists had the burden of proof for half a century after Darwin, and now enjoy the reversed roles.) The burden of proof is on the Holocaust revisionists to prove the Holocaust did not happen, not on Holocaust historians to prove that it did. The burden of proof is on Eric Lerner to prove the Big Bang never happened, not on cosmologists to prove that it did (though this shift has occurred only very recently).

This is the price you pay for being an outsider, regardless of whether you are right or wrong.

Logical Problems in Thinking

Emotive Words and False Analogies

Emotive words are used to provoke emotion and obscure rationality. They can be positive emotive words – motherhood, America, integrity, honesty. Or they can be negative emotive words – rape, cancer, evil, communist. Politicians are masters at this fallacy, talking about inflation as “the cancer of society”, or industry “raping the environment”. Similarly, metaphors and analogies can be powerful tools of language, but they can also be misleading when they redirect thinking into emotions or down an irrelevant path.

In his 1992 Democratic nomination speech, for example, Al Gore constructed an elaborate analogy around the story of his sick son: holding him in his arms as the boy hovered on the brink of death, then nursing him back to health. He made constant references to the sick country, America, hovering on the brink of death after 12 years of Reagan/Bush, now to be nurtured back to health under the new administration. It is a powerful tool of language that can cut both ways, for or against.

Ad Ignorantiam

This is an appeal to ignorance or lack of knowledge, and is related to the burden of proof and the “unexplained is not inexplicable” fallacies: someone argues that if you cannot disprove a claim, it must be true. For example, if you cannot prove that there is no psychic power, then there must be.

The absurdity of this argument becomes clear if one argues that since you cannot disprove the existence of Santa Claus, he must exist. The opposite can be argued in the same manner: if you cannot prove the existence of Santa Claus, then he does not exist. Proof comes from positive evidence in support of a claim, not from a lack of evidence for or against it. In either case, an appeal to ignorance gets us no closer to the truth.

Ad Hominem and Tu Quoque

Literally “to the man” and “you are another”, these fallacies redirect thinking from the idea to the person holding it (or, in a defensive posture, turn the accusation back on the accuser). The goal is to discredit the claimant in the hope that this will discredit the claim. Calling someone an atheist, a communist, a child abuser, or a neo-Nazi does not in any way answer the specific challenge.

It might be useful to know whether someone belongs to a particular religion or holds a particular ideology, in case this has in some way biased their research, but refuting claims must be done directly, not indirectly. If a Holocaust revisionist, for example, is a neo-Nazi or an anti-Semite, it would be good to know, because this would certainly bias their selection of historical events to emphasise or ignore. But if they claim, for example, that Hitler did not have a master plan for the extermination of European Jewry, to say “Oh, he is only saying that because he is a neo-Nazi” does not refute the argument. Either Hitler had a master plan or he did not, and this question can be settled historically.

Similarly with tu quoque – if someone accuses you of cheating on your taxes, to answer “well, so do you” is not an explanation, although it might be construed as a reasonable defence against an ad hominem attack. (Try that at your next audit!)

Hasty Generalisation

In logic, the hasty generalisation is a form of improper induction. In life it is called prejudice. In either case, conclusions are drawn before the facts warrant them. Because our brains evolved to be constantly on the alert for connections between events and underlying causes of phenomena (to help us survive), this fallacy is one of the most common of all. A couple of bad teachers are generalised into an unworthy school. A few bad cars lead to the inference that an entire brand of automobile is unreliable. A handful of members of a group are used to judge the entire group.

In science, we must gather as much information as possible before announcing our conclusions. This is why Alfred Kinsey collected data on over 10,000 men and women before releasing his startling conclusions about human sexual behaviour. Kinsey has been accused of many things, but hasty generalisation is not one of them.

After-the-Fact Reasoning

Also known as post hoc, ergo propter hoc – literally, “after this, therefore because of this” – this fallacy is related to the “coincidences are not causation” fallacy. At its basest level it is a form of superstition: the baseball player does not shave and hits two home runs; the gambler wears his lucky shoes because he has won with them in the past.

More subtly, scientific studies can fall prey to this fallacy. In 1993 a study found that breast-fed children have higher IQs, and there was much clamour over what in mother’s milk could increase intelligence. Mothers who bottle-fed their babies were made to feel guilty. But soon after, researchers began to wonder whether breast-fed babies are attended to differently – whether nursing mothers spend more time with their babies, so that motherly vigilance, rather than the milk, is the cause of the higher intelligence. As David Hume correctly taught us, the fact that two events follow each other in sequence does not mean they are causally connected. Correlation does not mean causation.
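A minimal simulation can show how a lurking third variable produces exactly this pattern. Everything here is invented: “attention” stands in for the hypothesised motherly vigilance, and by construction the milk itself has no effect on IQ:

    import random

    random.seed(0)
    samples = []
    for _ in range(10_000):
        attention = random.gauss(0, 1)                   # the confounder
        breast_fed = attention + random.gauss(0, 1) > 0  # attention raises the odds of breast-feeding
        iq = 100 + 5 * attention + random.gauss(0, 5)    # attention, not milk, raises IQ
        samples.append((breast_fed, iq))

    fed = [iq for b, iq in samples if b]
    bottle = [iq for b, iq in samples if not b]
    print(f"mean IQ, breast-fed: {sum(fed) / len(fed):.1f}")
    print(f"mean IQ, bottle-fed: {sum(bottle) / len(bottle):.1f}")

    # Breast-fed children score several points higher in this model,
    # even though feeding has no causal effect: the correlation is
    # manufactured entirely by the confounding variable.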

Opposition Fallacy

If the opposition is for it, we should be against it, because they are wrong about other things. Skeptics are particularly susceptible to this fallacy, because we tend to think that people who believe in the paranormal are incapable of right thinking in other areas. This may be true in general, but it certainly is not in particular. Many good scientists, for example, have been easily duped by clever magicians or flim-flam artists into believing any number of wacky claims. Alfred Russel Wallace, who co-discovered natural selection as the prime mechanism of evolutionary change, also believed in spirits, ghosts, and the afterworld. If we were to discount all of his thinking because of these beliefs, we would be missing a lot of good thoughts.

Genetic Fallacy

This is an appeal to the genesis or source of an idea to support or destroy it, and it goes in two directions: (1) the source of an idea is a recognised expert; (2) the source of an idea is a recognised quack. In other words, who is making the claim makes all the difference. If it is a Nobel laureate making the claim, we take note because he or she has been right in a big way before. If it is a discredited scam artist, we give a loud guffaw because he or she has been wrong in a big way before. While this is a useful screening tool for separating the wheat from the chaff, it is dangerous in that we might either (1) accept a wrong idea just because it was supported by someone we respect (false positive), or (2) reject a right idea just because it was supported by someone we disrespect (false negative). How do you know which is which? Examine the evidence.

Either-Or

Also known as the fallacy of negation or false dilemma, this is the tendency to dichotomise the world so that when you discredit the one, the observer is forced to accept the other. This is a favourite tactic of the creationists, who claim that life was either divinely created or evolved. Then they spend the majority of their time discrediting the theory of evolution, concluding that since evolution is wrong, creationism must be right.

In scientific revolutions and paradigm shifts, however, it is not enough to just discredit a theory. You must also replace it with one that explains both the “normal” data and the “anomalous” data not explained by the old theory. In other words, it must be a superior model, which requires that you present evidence in favour of it, not just against the opposition. The problem with either-or thinking was expressed with levity by an unknown poet:

In matters controversial,
My perception’s rather fine.
I always see both points of view,
The one that’s wrong, and mine.

Circular Reasoning

Also known as the fallacy of redundancy, begging the question, or tautology, this is when the conclusion or claim is merely a restatement of one of the premises. Christian apologetics (theological defences) are filled with tautologies: Is there a God? Yes. How do you know? Because the Bible says so. How do you know the Bible is correct? Because it was inspired by God – i.e., God is because God is.

Science also has its share of redundancies: What is gravity? The tendency for objects to be attracted to one another. Why are objects attracted to one another? Gravity. In other words, gravity is because gravity is. The problem lies in definitions, which are difficult to frame without being tautological: Why does Mother Teresa do such good work for others? Because she is moral. What does it mean to be moral? To do good works for others. Difficult as it is, we must try to construct operational definitions that can be tested, falsified, and refuted.

Reductio ad Absurdum and the Slippery Slope

Reductio ad absurdum is the refutation of an argument by carrying it to its logical end and showing that the conclusion is absurd: if the consequences are absurd, the statement must be false. This does not necessarily follow, though the exercise can be useful in critical thinking, because it is often a way to discover whether a claim has validity – especially if the experiment (the actual reduction) can be run to find out.

Similarly with the slippery slope fallacy, where one thing is said to lead ultimately to another so extreme that the first step should never be taken. For example: eating Ben & Jerry’s ice cream will cause you to put on weight; putting on weight will make you overweight; soon you will weigh 350 pounds and die of heart disease; therefore, eating Ben & Jerry’s ice cream leads to death, so don’t even try it. Certainly eating Ben & Jerry’s ice cream may lead to weight gain, and could possibly, in very rare cases, cause someone to balloon up to 350 pounds. But this is quite unlikely; the consequence does not necessarily follow from the premise.

Psychological Problems in Thinking

Effort Inadequacies and the Need for Certainty, Control, and Simplicity

Most of us, most of the time, have a desire for certainty, a need to control our environment, and a preference for simplicity. This, no doubt, stems from our evolutionary background in the quest to better understand and change the environment for the purpose of survival. (Those who were most successful in understanding and controlling their environment left behind the most offspring, who in turn were more successful than their ancestors, and thus left behind the most offspring, and so on to us.) Thus, the need for certainty, control, and simplicity, and the desire to expend the least effort for the greatest return, is probably biologically wired and good for the species. But good for the species is not always good for the individual. In a multifarious society with complex problems, these characteristics can interfere with critical thinking and problem solving.

Scientific and critical thinking does not come naturally. It takes training, experience, and effort, as Alfred Mander explained in his Logic for the Millions (1947, p. vii):

Thinking is skilled work. It is not true that we are naturally endowed with the ability to think clearly and logically – without learning how, or without practising. People with untrained minds should no more expect to think clearly and logically than people who have never learned and never practised can expect to find themselves good carpenters, golfers, bridge players, or pianists.

We must always work to suppress the need to be absolutely certain and in total control, and the urge to seek the simple and effortless solution to every problem. A solution may, of course, turn out to be simple and easy to derive, but this is not usually the case. We are well advised to keep this component of our psyche in abeyance.

Over-reliance on Authorities

Similar to the genetic fallacy (but broader in scope), we tend to rely heavily on authorities in our culture, especially if they are considered to be highly intelligent. The IQ score has taken on nearly mystical proportions of power in the last half century, but as James Randi notes: “Possession of a ‘high IQ’ often has little to do with one’s ability to function as a rational human being.”

As an example, Randi notes that belief in the paranormal is not uncommon among Mensa members, representing the top two percent of the population, with some arguing that their “Psi-Q” is also superior. The problem, says biochemist and Skeptic editorial board member Elie Shneour, “is the low estate in our society of individuals able or willing to think for themselves. At almost any time there are pervasive, almost all-encompassing, pressures to direct what we are to think.”

Randi is also fond of lampooning authorities in the form of PhDs who, he says, once granted the degree, find it almost impossible to say two things: “I don’t know” and “I was wrong.” Authorities, by virtue of their expertise in a field, may have a greater probability of being right in that field, but it is certainly not guaranteed, and their expertise does not necessarily qualify them to jump to conclusions in other areas.

Problem-Solving Inadequacies

All critical and scientific thinking is, in a fashion, problem-solving. There are numerous psychological disruptions that cause problem-solving inadequacies. Psychologist Barry Singer has demonstrated that when people are given the task of selecting the right answer to a problem by being told whether particular guesses are right or wrong, they do the following (1981, p. 18):

  • Immediately form a hypothesis and look only for examples to confirm it;
  • Do not seek evidence to disprove the hypothesis;
  • Are very slow to change the hypothesis even when it is obviously wrong;
  • If the information is too complex, adopt overly-simple hypotheses or strategies for solutions;
  • If there is no solution, if the problem is a trick and “right” and “wrong” are assigned at random, form hypotheses about coincidental relationships they have observed. Causality is always found.

If this is the case with humans in general, then we all must be vigilant in our efforts to overcome these inadequacies in solving the problems of science and of life.

Ideological Immunity, or The Planck Problem

In his now classic book, The Structure of Scientific Revolutions (1962), Thomas Kuhn described the essence of revolutions as “paradigm shifts”. When enough members of the scientific community (particularly those in positions of scientific hegemony) are willing to abandon the old orthodoxy in favour of the (formerly) radical new theory, then, and only then, can the paradigm shift occur. This generalisation about change in science is usually made about the paradigm as a system, but we must recognise, of course, that the paradigm is a mental model in the minds of individuals.

We can thus consider the problem of resistance to change to be a psychological as well as a sociological one. My friend Jay Stuart Snelson has identified this obstinacy in individuals as an ideological immune system, whereby “educated, intelligent, and successful adults rarely change their most fundamental presuppositions” (1993, p. 54). According to Snelson, the more knowledge individuals have accumulated, and the better founded their theories have become, the greater their confidence in their ideologies. The consequence, however, is that they build up an “immunity” against new ideas that do not corroborate previous ones.

Historians of science call this the Planck Problem, after Max Planck, who made this observation of what must happen for innovative progress to occur in science (p. 97):

An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out and that the growing generation is familiarised with the idea from the beginning.

Psychologist David Perkins conducted an interesting correlational study in which he found a high positive correlation between intelligence (measured on a standard IQ test) and the ability to give reasons for taking a point of view and defending that position; he also found a high negative correlation between intelligence and the ability to consider other alternatives. That is, the more intelligent the individual, the greater the potential for ideological immunity.

On one level, however, ideological immunity is purposefully built into the scientific enterprise as a way of maintaining the status quo long enough to test the validity of various claims. Historian of science I.B. Cohen explains:

New and revolutionary systems of science tend to be resisted rather than welcomed with open arms, because every successful scientist has a vested intellectual, social, and even financial interest in maintaining the status quo. If every revolutionary new idea were welcomed with open arms, utter chaos would be the result (1985, p. 35).

In the end, history rewards those who are “right” (at least provisionally). Change does occur. In astronomy, the Ptolemaic geocentric universe was slowly displaced by Copernicus’s heliocentric system. In geology, Cuvier’s catastrophism was gradually edged out by the more soundly supported uniformitarianism of Hutton and Lyell. In biology, Darwin’s evolution superseded the creationist belief in the immutability of species. In Earth history, Alfred Wegener’s idea of continental drift took nearly half a century to win acceptance against the received dogma of fixed and stable continents. Because science is progressive, such immunity is eventually overcome.

Transcendental Temptation

There is one final psychological component to consider in the disruption of critical thinking, and that is what the philosopher Paul Kurtz calls the transcendental temptation, discussed at length in his book of this title (1986). In essence, it affects all human beings who have thoughtfully considered the ultimate end of our being – death and the possibility of life after death. The temptation, says Kurtz, touches every soul for the simple reason that none of us is thrilled by the prospect of a finality to life:

The transcendental temptation lurks deep within the human breast. It is ever-present, tempting humans by the lure of transcendental realities, subverting the power of their critical intelligence, enabling them to accept unproven and unfounded myth systems (p. 477).

Specifically, Kurtz argues, myths, religions, pseudosciences, and claims of the paranormal are lures tempting us beyond rational, critical, and scientific thinking, for the very reason that they touch something in all of us that is so sacred and important – life and immortality:

It is apparent that the quest for transcendence expresses a passionate desire within the human breast for immortality and permanence. This impulse is so strong that it has inspired the great religions and paranormal movements of the past and the present and goaded otherwise sensible men and women to swallow patently false myths and to repeat them constantly as articles of faith (p. 417).

One would be hard pressed, of course, to find a gene or trait for “transcendental temptation”, so we must consider what is behind the construction of these beliefs that are so appealing to our emotions and desires. Kurtz claims it is the “creative imagination” that is the driving force behind the transcendental temptation (p. 459):

There is a constant battle in the human heart between our fictionalised images and the actual truth. We fabricate ideal poetic, artistic, and religious visions of what might have been in the past or could be in the future. But whether these idealised worlds are true is another matter. There is a constant tension between the scientist and the poet, the philosopher and the artist, the practical man and the visionary. The scientist, philosopher, and practical man wish to interpret the universe and understand it for what it really is; the others are inspired by what it might become. Scientists wish to test their hypothetical constructs; dreamers live by them. All too often what people crave is faith and conviction, not tested knowledge. Belief far outstrips truth as it soars on the wings of imagination.

This is the price we pay for being humans and not automata.

Spinoza’s Dictum

In considering how thinking goes wrong, especially in the context of conducting skeptical investigations of pseudoscience and paranormal claims, we might ask ourselves, do we enjoy the process because it “debunks” what we believe to be nonsense? In part, I confess, there is some pleasure in seeing someone else’s bizarre claim harpooned. This pleasure, I suspect, is normal and can be found in most members of most groups when confronting others who think differently. But as rational skeptics and critical thinkers we must move beyond our emotional responses and realise that through understanding how others have gone wrong, and how all science is subject to social control and cultural influences, we can improve our understanding of how the world works. It is for this reason that it is so important for us to understand the history of both science and pseudoscience, so we understand the larger picture of how these movements evolve. This is why the Skeptics Society has adopted as its motto the belief of the 17th-century Dutch philosopher Baruch Spinoza:

I have made a ceaseless effort not to ridicule, not to bewail, not to scorn human actions, but to understand them.
