Brainwashing as a Scientific Concept (Benjamin Zablocki, 2001)

NOTE: The following article is taken from the fifth chapter of Misunderstanding Cults: Searching for Objectivity in a Controversial Field, entitled ‘Towards a Demystified and Disinterested Scientific Theory of Brainwashing.’


What I am presenting here is not a ‘new’ theory of brainwashing but a conceptual model of the foundational theory developed in the mid-twentieth century by Lifton, Schein, and Sargant as it applies to charismatic collectivities. Because its scientific stature has been so frequently questioned, I will err on the side of formality by presenting a structured exposition of brainwashing theory in terms of eight definitions and twelve hypotheses. Each definition includes an operationalized form by which the trait may be observed. If either of the first two hypotheses is disconfirmed, we must conclude that brainwashing is not being attempted in the cult under investigation. If any of the twelve hypotheses is disconfirmed, we must conclude that brainwashing is not successful in meeting its goals within that cult.
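
To make this decision logic concrete, the following minimal sketch (not part of the original text) encodes the two disconfirmation rules just stated; the hypothesis labels H1 through H12 and the boolean ‘confirmed’ results are hypothetical placeholders, not findings from any actual study.

```python
# Illustrative sketch only: how the disconfirmation rules above could be encoded.
# The test results passed in are hypothetical placeholders.

def interpret_tests(confirmed: dict) -> str:
    """confirmed maps 'H1'..'H12' to whether that hypothesis survived empirical testing."""
    if not (confirmed["H1"] and confirmed["H2"]):
        return "brainwashing is not being attempted in this cult"
    if not all(confirmed[f"H{i}"] for i in range(1, 13)):
        return "brainwashing is attempted but not successful in meeting its goals"
    return "the brainwashing model is consistent with observations of this cult"

# Example: H1 and H2 hold, but H10 (increased deployability) is disconfirmed.
results = {f"H{i}": True for i in range(1, 13)}
results["H10"] = False
print(interpret_tests(results))
```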

I do not pretend that the model outlined here is easy to test empirically, particularly for those researchers who either cannot or will not spend time immersing themselves in the daily lives of cults, or for those who are not willing, alternatively, to use as data the detailed retrospective accounts of ex-members. However, it should be clear that the model being proposed here stays grounded in what is empirically testable and does not involve mystical notions such as loss of free will or information disease (Conway and Siegelman 1978) that have characterized many of the extreme ‘anti-cult’ models.

Nor do I pretend that this model represents the final and definitive treatment of this subject. Charismatic influence is still a poorly understood subject on which much additional research is needed. With few exceptions, sociology has treated it as if it were what engineers call a ‘black box,’ with charismatic inputs coming in one end and obedience outputs going out the other. What we have here is a theory that assists in the process of opening this black box to see what is inside. It is an inductive theory, formed largely from the empirical generalizations of ethnographers and interviewers. The model itself presents an ideal-type image of brainwashing that does not attempt to convey the great variation among specific obedience-inducing processes that occur across the broad range of existing cults. Much additional refinement in both depth and breadth will certainly be needed.

Definitions


D1. Charisma is defined, using the classical Weberian formula, as a condition of ‘devotion to the specific and exceptional sanctity, heroism, or exemplary character of an individual person, of the normative patterns or order revealed or ordained by him’ (Weber 1947: 328). Being defined this way, as a condition of devotion, leads us to recognize that charisma is not to be understood simply in terms of the characteristics of the leader, as it has come to be in popular usage, but requires an understanding of the relationship between leader and followers. In other words, charisma is a relational variable. It is defined operationally as a network of relationships in which authority is justified (for both superordinates and subordinates) in terms of the special characteristics discussed above.

D2. Ideological Totalism is a sociocultural system that places high valuation on total control over all aspects of the outer and inner lives of participants for the purpose of achieving the goals of an ideology defined as all important. Individual rights either do not exist under ideological totalism or they are clearly subordinated to the needs of the collectivity whenever the two come into conflict. Ideological totalism has been operationalized in terms of eight observable characteristics: milieu control, mystical manipulation, the demand for purity, the cult of confession, ‘sacred science,’ loading the language, doctrine over person, and the dispensing of existence (Lifton 1989: chap. 22).1

D3. Surveillance is defined as keeping watch over a person’s behaviour, and, perhaps, attitudes. As Hechter (1987) has shown, the need for surveillance is the greatest obstacle to goal achievement among ideological collectivities organized around the production of public goods. Surveillance is not only costly, it is also impractical for many activities in which agents of the collectivity may have to travel and act autonomously at a distance. It follows from this that all collectivities pursuing public goals will be motivated to find ways to decrease the need for surveillance. Resources used for surveillance are wasted in the sense that they are unavailable for the achievement of collective goals.

D4. A deployable agent is one who is uncritically obedient to directives perceived as charismatically legitimate (Selznick 1960). A deployable agent can be relied on to continue to carry out the wishes of the collectivity regardless of his own hedonic interests and in the absence of any external controls. Deployability can be operationalized as the likelihood that the individual will continue to comply with hitherto ego-dystonic demands of the collectivity (e.g., mending, ironing, mowing the lawn, smuggling, rape, child abuse, murder) when not under surveillance.

D5. Brainwashing is an observable set of transactions between a charismatically structured collectivity and an isolated agent of the collectivity, with the goal of transforming the agent into a deployable agent. Brainwashing is thus a process of ideological resocialization carried out within a structure of charismatic authority.

The brainwashing process may be operationalized as a sequence of well-defined and potentially observable phases. These hypothesized phases are (1) identity stripping, (2) identification, and (3) symbolic death/rebirth. The operational definition of brainwashing refers to the specific activities attempted, whether or not they are successful, as they are either observed directly by the ethnographer or reported in official or unofficial accounts by members or ex-members. Although the exact order of phases and specific steps within phases may vary from group to group, we should always expect to see the following features, or their functional equivalents, in any brainwashing system: (1) the constant fluctuation between assault and leniency; and (2) the seemingly endless process of confession, re-education, and refinement of confession.

D6. Hyper-credulity is defined as a disposition to accept uncritically all charismatically ordained beliefs. All lovers of literature and poetry are familiar with ‘that willing suspension of disbelief for the moment, which constitutes poetic faith’ (Coleridge 1970: 147). Hyper-credulity occurs when this state of mind, which in most of us is occasional and transitory, is transformed into a stable disposition. Hyper-credulity falls between hyper-suggestibility on the one hand and stable conversion of belief on the other.2 Its operational hallmark is plasticity in the assumption of deeply held convictions at the behest of an external authority. This is an other-directed form of what Robert Lifton (1968) has called the protean identity state.

D7. Relational Enmeshment is a state of being in which self-esteem depends upon belonging to a particular collectivity (Bion 1959; Bowen 1972; Sirkin and Wynne 1990). It may be operationalized as immersion in a relational network with the following characteristics: exclusivity (high ratio of in-group to out-group bonds), interchangeability (low level of differentiation in affective ties between one alter and another), and dependency (reluctance to sever or weaken ties for any reason). In a developmental context, something similar to this has been referred to by Bowlby (1969) as anxious attachment.
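
As a purely illustrative aid, the sketch below shows one way the first two operational markers of enmeshment might be scored from interview or survey data; the data format, scale, and variable names are hypothetical assumptions, and dependency (reluctance to sever ties) would require a separate self-report measure not shown here.

```python
# Illustrative sketch only: scoring exclusivity and interchangeability of a
# member's relational network from hypothetical interview data.
from statistics import pstdev

def enmeshment_scores(in_group_ties, out_group_ties, affect_strengths):
    """
    in_group_ties / out_group_ties: counts of reported bonds inside and outside
    the collectivity; affect_strengths: reported closeness of each in-group tie
    on a common scale (e.g., 1-5). Dependency is omitted here.
    """
    exclusivity = in_group_ties / max(out_group_ties, 1)     # high ratio = exclusive network
    interchangeability = 1 / (1 + pstdev(affect_strengths))  # near 1 = little differentiation among ties
    return {"exclusivity": exclusivity, "interchangeability": interchangeability}

print(enmeshment_scores(in_group_ties=18, out_group_ties=2,
                        affect_strengths=[4, 4, 5, 4, 4, 5]))
```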

D8. Exit Costs are the subjective costs experienced by an individual who is contemplating leaving a collectivity. Obviously, the higher the perceived exit costs, the greater will be the reluctance to leave. Exit costs may be operationalized as the magnitude of the bribe necessary to overcome them. A person who is willing to leave if we pay him $1,000 experiences lower exit costs than one who is not willing to leave for any payment less than $1,000,000. With regard to cults, the exit costs are most often spiritual and emotional rather than material, which makes measurement in this way more difficult but not impossible.
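
The bribe-magnitude operationalization can be made concrete with a small sketch; the offer ladder and the member responses below are hypothetical, and in real cult settings the relevant ‘offers’ would more often be spiritual or emotional than monetary.

```python
# Illustrative sketch only: bracketing a member's subjective exit cost by the
# smallest hypothetical offer he or she reports being willing to leave for.

def exit_cost(member_would_leave_for, offers=(1_000, 10_000, 100_000, 1_000_000)):
    """Return the smallest accepted offer; the true exit cost lies between this
    value and the largest offer the member rejected."""
    for offer in sorted(offers):
        if member_would_leave_for(offer):
            return offer
    return None  # exit cost exceeds every offer considered

# Member A would leave for $1,000; member B only for $1,000,000.
print(exit_cost(lambda x: x >= 1_000))       # 1000    -> low exit costs
print(exit_cost(lambda x: x >= 1_000_000))   # 1000000 -> high exit costs
```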

Hypotheses

Not all charismatic organizations engage in brainwashing. We therefore need a set of hypotheses that will allow us to test empirically whether any particular charismatic system attempts to practise brainwashing and with what effect. The brainwashing model asserts twelve hypotheses concerning the role of brainwashing in the production of uncritical obedience. These hypotheses are all empirically testable. A schematic diagram of the model I propose may be found in Figure 1.

This model begins with an assumption that charismatic leaders are capable of creating organizations that are easy and attractive to enter (even though they may later turn out to be difficult and painful to leave). There are no hypotheses, therefore, to account for how charismatic cults obtain members. It is assumed that an abundant pool of potential recruits to such groups is always available. The model assumes that charismatic leaders, using nothing more than their own intrinsic attractiveness and persuasiveness, are initially able to gather around them a corps of disciples sufficient for the creation of an attractive social movement. Many ethnographies (Lofland 1996; Lucas 1995) have shown how easy it is for such charismatic movement organizations to attract new members from the general pool of anomic ‘seekers’ that can always be found within the population of an urbanized mobile society.


The model does attempt to account for how some percentage of these ordinary members are turned into deployable agents. The initial attractiveness of the group, its vision of the future, and/or its capacity to bestow seemingly limitless amounts of love and esteem on the new member are sufficient inducements in some cases to motivate a new member to voluntarily undergo this difficult and painful process of resocialization.

H1. Ideological totalism is a necessary but not sufficient condition for the brainwashing process. Brainwashing will be attempted only in groups that are structured totalistically. However, not all ideologically totalist groups will attempt to brainwash their members. It should be remembered that brainwashing is merely a mechanism for producing deployable agents. Some cults may not want deployable agents or may have other ways of producing them. Others may want them but feel uncomfortable about using brainwashing methods to obtain them, or they may not have discovered the existence of brainwashing methods.

H2. The exact nature of this resocialization process will differ from group to group, but, in general, will be similar to the resocialization process that Robert Lifton (1989) and Edgar Schein (1961) observed in Communist re-education centres in the 1950s. For whatever reasons, these methods seem to come fairly intuitively to charismatic leaders and their staffs. Although the specific steps and their exact ordering differ from group to group, their common elements involve a stripping away of the vestiges of an old identity, the requirement that repeated confessions be made either orally or in writing, and a somewhat random and ultimately debilitating alternation of the giving and the withholding of ‘unconditional’ love and approval. H2 further states that the maintenance of this program involves the expenditure of a measurable quantity of the collectivity’s resources. This quantity is known as C, where C equals the cost of the program and should be measurable at least at an ordinal level.

The resocialization process has baffled many observers, in my opinion because it proceeds simultaneously along two distinct but parallel tracks, one involving cognitive functioning and the other involving emotional networking. These two tracks lead to the attainment of states of hyper-credulity and relational enmeshment, respectively. The group member learns to accept with suspended critical judgement the often shifting beliefs espoused by the charismatic leader. At the same time, the group member becomes strongly attached to and emotionally dependent upon the charismatic leader and (often especially) the other group members, and cannot bear to be shunned by them.


H3. Those who go through the process will be more likely than those who do not to reach a state of hyper-credulity. This involves the shedding of old convictions and the assumption of a zealous loyalty to the beliefs of the moment, uncritically seized upon, so that all such beliefs become not mere ‘beliefs’ but deeply held convictions.

Under normal circumstances, it is not easy to get people to disown their core convictions. Convictions, once developed, are generally treated not as hypotheses to test empirically but as possessions to value and cherish. There are often substantial subjective costs to the individual in giving them up. Abelson (1986: 230) has provided convincing linguistic evidence that most people treat convictions more as valued possessions than as ways of testing reality. Cognitive dissonance theory predicts with accuracy that, when subjected to frontal attack, attachment to convictions tends to harden (Festinger, Riecken et al. 1956; O’Leary 1994). Therefore, a frontal attack on convictions, without first undermining the self-image foundation of these convictions, is doomed to failure. An indirect approach through brainwashing is often more effective.


When the state of hyper-credulity is achieved, it leaves the individual strongly committed to the charismatic belief of the moment but with little or no critical inclination to resist charismatically approved new or contradictory beliefs in the future and little motivation to attempt to form accurate independent judgments of the consequences of assuming new beliefs. The cognitive track of the resocialization process begins by stripping away the old convictions and associating them with guilt, evil, or befuddlement. Next, there is a traumatic exhaustion of the habit of subjecting right-brain convictions to left-brain rational scrutiny. This goes along with an increase in what Snyder (1974) has called self-monitoring, implying a shift from central route to peripheral route processing of information in which the source rather than the content of the message becomes all important.

H4. As an individual goes through the brainwashing process, there will be an increase in relational enmeshment, with measurable increases occurring at the completion of each of the three stages. The purging of convictions is a painful process, and it is reasonable to ask why anybody would go through it voluntarily. The payoff is the opportunity to feel more connected with the charismatic relational network. These people have also been through it, and only they really understand what you are going through. So cognitive purging leads one to seek relational comfort, and this comfort becomes enmeshing. The credulity process and the enmeshing process depend on each other.

The next three hypotheses are concerned with the fact that each of the three phases of brainwashing achieves plateaus in both of these processes. The stripping phase creates the vulnerability to this sort of transformation. The identification phase creates realignment, and the rebirth phase breaks down the barrier between the two so that convictions can be emotionally energized and held with zeal, while emotional attachments can be sacralized in terms of the charismatic ideology. The full brainwashing model actually provides far more detailed hypotheses concerning the various steps within each phase of the process. Space constraints make it impossible to discuss these here. An adequate technical discussion of the manipulation of language in brainwashing, for example, would require a chapter at least the length of this one. Figure 2 provides a sketch of the steps within each phase. Readers desiring more information about these steps are referred to Lifton (1989: chap. 5).

[Figure 2: The Stages of Brainwashing & Their Effect on Hyper-credulity and Emotional Enmeshment]

H5. The stripping phase. The cognitive goal of the stripping phase is to destroy prior convictions and prior relationships of belonging. The emotional goal of the stripping phase is to create the need for attachments. Overall, at the completion of the stripping phase, the situation is such that the individual is hungry for convictions and attachments and dependent upon the collectivity to supply them. This sort of credulity and attachment behaviour is widespread among prisoners and hospital patients (Goffman 1961).

H6. The identification phase. The cognitive goal of the identification phase is to establish an imitative search for conviction and to bring about the erosion of the habit of incredulity. The emotional goal of the identification phase is to instill the habit of acting out through attachment. Overall, at the completion of the identification phase, the individual has begun the practice of relying on the collectivity for beliefs and for a cyclic emotional pattern of arousal and comfort. But at this point, this reliance is just one highly valued form of existence. It is not yet viewed as an existential necessity.

H7. The symbolic death and rebirth phase. In the death and rebirth phase, the cognitive and emotional tracks come together and mutually support each other. This often gives the individual a sense of having emerged from a tunnel and an experience of spiritual rebirth.3 The cognitive goal of this phase is to establish a sense of ownership of (and pride of ownership in) the new convictions. The emotional goal is to make a full commitment to the new self that is no longer directly dependent upon hope of attachment or fear of separation. Overall, at the completion of the rebirth phase we may say that the person has become a fully deployable agent of the charismatic leader. The brainwashing process is complete.

H8 states that the brainwashing process results in a state of subjectively elevated exit costs. These exit costs cannot, of course, be observed directly. But they can be inferred from the behavioural state of panic or terror that arises in the individual at the possibility of having his or her ties to the group discontinued. The cognitive and emotional states produced by the brainwashing process together bring about a situation in which the perceived exit costs for the individual increase sharply. This closes the trap for all but the most highly motivated individuals, and induces in many a state of uncritical obedience. As soon as exit from a group (or even from its good graces) ceases to be a subjectively palatable option, it makes sense for the individual to comply with almost anything the group demands, even to the point of suicide in some instances. Borrowing from Sartre’s insightful play of that name, I refer to this situation as the ‘no exit’ syndrome. When demands for compliance are particularly harsh, the hyper-credulity aspect of the process sweetens the pill somewhat by allowing the individual to accept uncritically the justifications offered by the charismatic leader and/or charismatic organization for making these demands, however far-fetched these justifications might appear to an outside observer.

H9 states that the brainwashing process results in a state of ideological obedience in which the individual has a strong tendency to comply with any behavioural demands made by the collectivity, especially if motivated by the carrot of approval and the stick of threatened expulsion, no matter how life-threatening these demands may be and no matter how repugnant such demands might have been to the individual in his or her pre-brainwashed state.

H10 states that the brainwashing process results in increased deployability. Deployability extends the range of ideological obedience in the temporal dimension. It states that the response continues after the stimulus is removed. This hypothesis will be disconfirmed in any cult within which members are uncritically obedient only while they are being brainwashed but not thereafter. The effect need not be permanent, but it does need to result in some measurable increase in deployability over time.

H11 states that the ability of the collectivity to rely on obedience without surveillance will result in a measurable decrease in surveillance. Since surveillance involves costs, this decrease will lead to a quantity S, where S equals the savings to the collectivity due to diminished surveillance needs and should be measurable at least at an ordinal level.

H12 states that S will be greater than C. In other words, the savings to the collectivity due to decreased surveillance needs is greater than the cost of maintaining the brainwashing program. Only where S is greater than C does it make sense to maintain a brainwashing program. Cults with initially high surveillance costs, and therefore high potential savings due to decreased surveillance needs [S], will tend to be more likely to brainwash, as will cults structured so that the cost of maintaining the brainwashing system [C] are relatively low.
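
The cost-benefit condition in H12 can be illustrated with a brief sketch; since the text states that S and C are measurable only at an ordinal level, the numeric scores used below are hypothetical stand-ins for ordinal rankings rather than actual measurements.

```python
# Illustrative sketch only: the H12 condition that maintaining a brainwashing
# program is rational for the collectivity only when surveillance savings S
# exceed program cost C. Scores are hypothetical ordinal ranks (1 = low, 5 = high).

def brainwashing_pays_off(S: int, C: int) -> bool:
    """H12: the program makes sense for the collectivity only when S > C."""
    return S > C

# A cult with high potential surveillance savings and a cheap program:
print(brainwashing_pays_off(S=4, C=2))   # True  -> more likely to brainwash
# A cult with low savings and an expensive program:
print(brainwashing_pays_off(S=2, C=4))   # False -> no rational incentive
```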


Characteristics of a Good Theory

There is consensus in the social sciences that a good inductive qualitative theory is one that is falsifiable, internally consistent, concrete, potentially generalizable, and has a well-defined dependent variable (King, Keohane et al. 1994). I think it should be clear from the foregoing that this theory meets all of these conditions according to prevailing standards in the social and behavioural sciences. However, since brainwashing theory has received much unjustified criticism for its lack of falsifiability and its lack of generalizability, I will briefly discuss the theory from these two points of view.

The criterion of falsifiability, as formulated primarily by Popper (1968), is the essence of what separates theory from dogma in science. Every theory must be able to provide an answer to the question of what evidence would falsify it. If the answer is that there is no possible evidence that would lead us to reject a so-called theory, we should conclude that it is not really a theory at all but just a piece of dogma.

Although Dawson (1998) and Richardson (1993) have included the falsifiability problem in their critiques of brainwashing, this criticism is associated mainly with the work of Dick Anthony (1996). Anthony’s claim that brainwashing theory is unfalsifiable is based upon two related misunderstandings. First, he argues that it is impossible to prove that a person is acting with free will, so, to the extent that brainwashing theory rests on the overthrow of free will, no evidence can ever disprove it. Second, he applies Popper’s criterion to cults in a way more appropriate for a highly developed deductive theoretical system. He requires that brainwashing either explain all ego-dystonic behaviour in cults or acknowledge that it can explain none of it. But, as we have seen, brainwashing is part of an inductive multifactorial approach to the study of obedience in cults and should be expected to explain only some of the obedience produced in some cults.

With regard to generalizability, cultic brainwashing is part of an important general class of phenomena whose common element is what Anthony Giddens has called ‘disturbance of ontological security,’ in which habits and routines cease to function as guidelines for survival (Cohen 1989: 53). This class of phenomena includes the battered spouse syndrome (Barnett and LaViolette 1993), the behaviour of concentration camp inmates (Chodoff 1966), the Stockholm Syndrome (Kuleshnyk 1984; Powell 1986), and, most importantly, behaviour within prisoner of war camps and Communist Chinese re-education centres and ‘revolutionary universities’ (Lifton 1989; Sargant 1957; Schein 1961). There exist striking homologies in observed responses across all of these types of events, and it is right that our attention be drawn to trying to understand what common theme underlies them all. As Oliver Wendell Holmes (1891: 325) attempted to teach us more than a century ago, the interest of the scientist should be guided, when applicable, by ‘the plain law of homology which declares that like must be compared with like.’


NOTES

  1. Because of space limitations, I cannot give this important subject the attention it deserves in this chapter. Readers not familiar with the concept are referred to the much fuller discussion of this subject in the book by Robert Lifton as cited.
  2. Students of cults have sometimes been misled into confusing this state of hyper-credulity with either hyper-suggestibility on the one hand or a rigid ‘true belief’ system on the other. But at least one study has shown that neither the hyper-suggestible, easily hypnotized person nor the structural true believer is a good candidate for encapsulation in a totalist cult system (Solomon 1981: 111-112). True believers (often fundamentalists who see in the cult a purer manifestation of their own worldview than they have seen before) do not do well in cults, and neither do dyed-in-the-wool sceptics who are comfortable with their scepticism. Rather, it is those lacking convictions but hungering for them who are the best candidates.
  3. Hopefully, no reader will think that I am affirming the consequent by stating that all experiences of spiritual rebirth must be caused by brainwashing. This model is completely compatible with the assumption that most spiritual rebirth experiences have nothing to do with brainwashing. The reasoning here is identical to that connecting epilepsy with visions of the holy. The empirical finding that seizures can be accompanied by visions of the holy does not in any way imply that such visions are always a sign of epilepsy.