NOTE: The following article is taken from the 5th chapter of Misunderstanding Cults: Searching for Objectivity in a Controversial Field, entitled, Towards a Demystified and Disinterested Scientific Theory of Brainwashing.
Nobody likes to lose a customer, but religions get more touchy than most when faced with the risk of losing devotees they have come to define as their own. Historically, many religions have gone to great lengths to prevent apostasy, believing virtually any means justified to prevent wavering parishioners from defecting and thus losing hope of eternal salvation. In recent centuries, religion in our society has evolved from a system of territorially based near-monopolies into a vigorous and highly competitive faith marketplace in which many churches, denominations, sects, and cults vie with one another for the allegiance of ‘customers’ who are free to pick and choose among competing faiths. Under such circumstances, we should expect to find that some of the more tight-knit and fanatical religions in this rough-and-tumble marketplace will have developed sophisticated persuasive techniques that are known in the literature by the controversial term ‘brainwashing.’ This chapter is devoted to a search for a scientific definition of brainwashing and an examination of the evidence for the existence of brainwashing in cults. I believe that research on this neglected subject is important for a fuller understanding of religious market dynamics.1 And, ultimately, research on this subject may yield a wider dividend as well, assisting us in our quest for a fuller understanding of mass charismatic movements such as Fascism, Nazism, Stalinism, and Maoism.
Do We Need to Know Whether Cults Engage in Brainwashing?
The question of why people obey the sometimes bizarrely insane commands of charismatic leaders, even unto death, is one of the big unsolved mysteries of history and the social sciences. If there are deliberate techniques that charismatic leaders (and charismatically led organizations) use to induce high levels of uncritical loyalty and obedience in their followers, we should try to understand what these techniques are and under what circumstances and how well they work.
This chapter is about nothing other than the process of inducing ideological obedience in charismatic groups. Many people call this process brainwashing, but the label is unimportant. What is important is that those of us who want to understand cults develop models that recognize the importance that some cults give to strenuous techniques of socialization designed to induce uncritical obedience to ideological imperatives regardless of the cost to the individual.
The systematic study of obedience has slowed down considerably within the behavioural sciences. Early laboratory studies of obedience-inducing mechanisms got off to a promising start in the 1960s and 1970s, but were correctly criticized by human rights advocates for putting laboratory subjects under unacceptable levels of stress (Kelman and Hamilton 1989; Milgram 1975; Zimbardo 1973). Permission to do obedience-inducing experiments on naive experimental subjects became almost impossible to obtain, and these sorts of laboratory experiments virtually ceased. However, large numbers of charismatic cultic movements appeared on the scene just in time to fill the vacuum left by the abandoned laboratory studies. Being naturally occurring social ‘experiments,’ obedience-induction in such groups could be studied ethnographically without raising the ethical objections that had been raised concerning laboratory studies.
Social theorists are well aware that an extremely high degree of obedience to authority is a reliably recurring feature of charismatic cult organizations (Lindholm 1990; Oakes 1997). But most social scientists interested in religion declined this opportunity. For reasons having more to do with political correctness than scientific curiosity, most of them refused to design research focused on obedience-induction. Many even deny that deliberate programs of obedience-induction ever occur in cults.
The existence of a highly atypical form of obedience to the dictates of charismatic leaders is not in question. Group suicides at the behest of a charismatic leader are probably the most puzzling of such acts of obedience (Hall 2000; Lalich 1999; Weightman 1983), but murder, incest, child abuse, and child molestation constitute other puzzling examples for which credible evidence is available (Bugliosi and Gentry 1974; Lifton 1999; Rochford 1998). However, agreement on these facts is not matched, as we shall see, by agreement on the causes of the obedience, its pervasiveness among cult populations, or the rate at which it decays after the influence stimuli are removed.
But given the fact that only a small proportion of the human population ever join cults, why should we care? The answer is that the sociological importance of cults extends far beyond their numerical significance. Many cults are harmless and fully deserving of protection of their religious and civil liberties. However, events of recent years have shown that some cults are capable of producing far more social harm than one might expect from the minuscule number of their adherents. The U.S. State Department’s annual report on terrorism for the year 2000 concludes that ‘while Americans were once threatened primarily by terrorism sponsored by states, today they face greater threats from loose networks of groups and individuals motivated more by religion or ideology than by politics’ (Miller 2000: 1).
In his recent study of a Japanese apocalyptic cult, Robert Jay Lifton (1999: 343) has emphasized this point in the following terms:
‘Consider Asahara’s experience with ultimate weapons…With a mad guru and a few hundred close followers, it is much easier to see how the very engagement with omnicidal weapons, once started upon, takes on a psychological momentum likely to lead either to self-implosion or to world explosion…Asahara and Aum have changed the world, and not for the better. A threshold has been crossed. Thanks to this guru, Aum stepped over a line that few had even known was there. Its members can claim the distinction of being the first group in history to combine ultimate fanaticism with ultimate weapons in a project to destroy the world. Fortunately, they were not up to the immodest task they assigned themselves. But whatever their bungling, they did cross that line, and the world will never quite be the same because, like it or not, they took the rest of us with them.’
Potentially fruitful scientific research on obedience in cultic settings has been stymied by the well-intentioned meddling of two bitterly opposed, but far from disinterested, scholarly factions. On the one hand, there has been an uncompromising outcry of fastidious naysaying by a tight-knit faction of pro-religion scholars. Out of a fear that evidence of powerful techniques for inducing obedience might be used by religion’s enemies to suppress the free expression of unpopular religions, the pro-religion faction has refused to notice the obvious and has engaged in a concerted (at times almost hysterical) effort to sweep under the rug any cultic-obedience studies not meeting impossibly rigorous controlled experimental standards (Zablocki 1997). On the other hand, those scholars who hate or fear cults have not been blameless in the pathetic enactment of this scientific farce. Some of them have tried their best to mystically transmute the obedience-inducing process that goes on in some cults from a severe and concentrated form of ordinary social influence into a magic spell that somehow allows gurus to snap the minds and enslave the wills of any innocent bystander unlucky enough to come into eye contact. By so doing, they have marginalized themselves academically and provided a perfect foil for the gibes of pro-religion scholars.
Brainwashing is the most commonly used word for the process whereby a charismatic group systematically induces high levels of ideological obedience. It would be naively reductionistic to try to explain cultic obedience entirely in terms of brainwashing. Other factors, such as simple conformity and ritual, induce cultic obedience as well. But it would be an equally serious specification error to leave deliberate cultic manipulation of personal convictions out of any model linking charismatic authority to ideological obedience.
However, the current climate of opinion, especially within the sociology of new religious movements, is not receptive to rational discussion of the concept of brainwashing, and still less to research in this area. Brainwashing has for too long been a mystified concept, and one that has been the subject of tendentious writing (thinly disguised as theory testing) by both its friends and enemies. My aim in this chapter is to rescue for social science a concept of brainwashing freed from both mystification and tendentiousness. I believe it is important and long overdue to restore some detachment and objectivity to this field of study.
The goal of achieving demystification will require some analysis of the concept’s highly freighted cultural connotations, with particular regard to how the very word brainwash became a shibboleth in the cult wars. It is easy to understand how frightening it may be to imagine that there exists some force that can influence one down to the core level of basic beliefs, values, and worldview. Movies like The Manchurian Candidate have established in the popular imagination the idea that there exists some mysterious technique, known only to a few, that confers such power. Actually, as we will see, the real process of brainwashing involves only well-understood processes of social influence orchestrated in a particularly intense way. It still is, and should be, frightening in its intensity and capacity for extreme mischief, but there is no excuse for refusing to study something simply because it is frightening.
The goal of establishing scientific disinterest will require the repositioning of the concept more fully in the domain of behavioural and social science rather than its present domain, which is largely that of civil and criminal legal proceedings. It is in this latter domain that the concept has been held hostage and much abused for more than two decades. The maxim of scholarly disinterest requires the researcher to be professionally indifferent as to whether confidence in any given theory (always tentative at best) is increased or decreased by research. But many scholarly writers on this subject have become involved as expert witnesses, on one side or the other, in various law cases involving allegations against cult leaders or members (where witnesses are paid to debate in an arena in which the only possible outcomes are victory or defeat). This has made it increasingly difficult for these paid experts to cling to a disinterested theoretical perspective.
In my opinion, the litigational needs of these court cases have come, over the years, to drive the scientific debate to an alarming degree. There is a long and not especially honourable history of interest groups that are better armed with lawyers than with scientific evidence, and that use the law to place unreasonable demands on science. One need only think of the school segregationists’ unreasonable demands, fifty years ago, that science prove that any specific child was harmed in a measurable way by a segregated classroom; or the tobacco companies’ demands, forty years ago, that science demonstrate the exact process at the molecular level by which tobacco causes lung cancer. Science can serve the technical needs of litigation, but, when litigation strategies set the agenda for science, both science and the law are poorer for it.
My own thirty-six years of experience doing research on new religious movements have convinced me beyond any doubt that brainwashing is practised by some cults some of the time on some of their members with some degree of success. Even though the number of times I have used the vague term some in the previous sentence gives testimony to the fact that there remain many still-unanswered questions about this phenomenon, I do not personally have any doubt about brainwashing’s existence. But I have also observed many cults that do not practise brainwashing, and I have never observed a cult in which brainwashing could be reasonably described as the only force holding the group together. My research (Zablocki 1971; 1991; 1996; Zablocki and Aidala 1991) has been ethnographic, comparative, and longitudinal. I have lived among these people and watched the brainwashing process with my own eyes. I have also interviewed people who participated in the process (both as perpetrators and subjects). I have interviewed many of these respondents not just once but repeatedly over a course of many years. My selection of both cults and individuals to interview has been determined by scientific sampling methods (Zablocki 1980: app. A), neither guided by convenience nor dictated by the conclusions I hoped to find. Indeed, I have never had an axe to grind in this field of inquiry. I didn’t begin to investigate cults in the hope of finding brainwashing. I was surprised when I first discovered it. I insist on attempting to demonstrate its existence not because I am either for or against cults but only because it seems to me to be an incontrovertible, empirical fact.
Although my own ethnographic experience leads me to believe that there is overwhelming evidence that brainwashing is practised in some cults, my goal in this chapter is not to ‘prove’ that brainwashing exists, but simply to rescue it from the world of bogus ideas to which it has been banished unfairly, and to reinstate it as a legitimate topic of social science inquiry. My attempt to do so in this chapter will involve three steps. First, I will analyse the cultural misunderstandings that have made brainwashing a bone of contention rather than a topic of inquiry. Second, I will reconstruct the concept in a scientifically useful and empirically testable form within the framework of social influence theory. Third, I will summarize the current state of evidence (which seems to me to be quite compelling) that some cults do in fact engage in brainwashing with some degree of success.
To be continued…
1. Most of the examples in this chapter will be drawn from studies of religious cults because these are the ones with which I am most familiar through my research. But it should be noted that cults need not be religious, and that there are plenty of examples of brainwashing in political and psychotherapeutic cults as well.