Evidence of Brainwashing in Cults (Benjamin Zablocki, 2001)

NOTE: The following article is taken from the 5th chapter of Misunderstanding Cults: Searching for Objectivity in a Controversial Field, entitled, Towards a Demystified and Disinterested Scientific Theory of Brainwashing.


I have attempted to test the model as much as possible with the limited data that currently exist. I have relied on three sources of evidence. The first and most important of these consists of ethnographic studies of a wide variety of contemporary American charismatic cults conducted by myself and others. The first-hand opportunities I have had to watch (at least the public face of) charismatic resocialization in numerous cult situations have convinced me of the need to theorize about this phenomenon. The second source of data consists of interviews with former leaders of charismatic groups. Although I have only a handful of such interviews, they are particularly valuable for elucidating the process from the perspective of ‘management,’ rather than from the perspective of the subjects. The third source of data consists of reports of ex-members of cults, drawing heavily on scientifically sampled interviews that my students and I have conducted. Most of these respondents were interviewed at least twice over a roughly twenty-five-year period.

Because evidence in this field of study tends to be so bitterly contested, it is perhaps necessary to point out that my own studies in this area were all subject to rigorous and competitive peer review. Five of my studies were reviewed and funded by three organizations — the National Institute of Mental Health (2) and the National Institutes of Health (1) — over a period extending from 1964 to 2001. On all of these I was the principal investigator, and the research designs are in the public record. During this same period, other research of mine in this same field of study was funded by peer-reviewed faculty research grants from all of the universities with which I have been affiliated: the University of California at Berkeley, the California Institute of Technology, Columbia University, and Rutgers University. It is a strange anomaly that this body of work seems to be generally respected throughout the social and behavioural sciences, with the exception of a small field, the sociology of new religious movements, where some try their best to hold it up to ridicule and disesteem.

Ethnographic Accounts

Bainbridge (1997) has argued that most ethnographic studies of cults have failed to find evidence of brainwashing. But it is more accurate to say that ethnographers have been divided on this subject. Lalich, Ofshe, Kent, and I have found such evidence abundantly (Kent and Krebs 1998; Lalich 1993; Ofshe, Eisenberg et al. 1974; Zablocki 1980). Even Barker, Beckford, and Richardson, who are among the most hostile to the brainwashing conjecture, have found evidence of attempted brainwashing, although they have claimed that these attempts are largely or entirely unsuccessful (Barker 1984; Beckford 1985; Richardson, Harder et al. 1972). Still other ethnographers (Balch 1985; Rochford, Purvis et al. 1989) seem ambivalent on the subject and not sure what to make of the evidence. Others such as Palmer (1994) and Hall (1987, 2000) have been fairly clear about the absence of brainwashing in their observations.

Such disparity is to be expected. There is no reason to believe that all cults practise brainwashing any more than that all cults are violent or that all cults make their members wear saffron robes. Most ethnographers who did discover evidence of brainwashing in the cults they investigated were surprised by the finding. The fact that evidence of this sort has been repeatedly discovered by researchers who were not particularly looking for it suggests that the process really exists in some cults. I have observed fully developed brainwashing processes in some cults, partially developed ones in others, and none whatsoever in still others. As ethnographic work on cults continues to accumulate, we should expect to find a similar degree of heterogeneity in published reports. Certainly, there is abundant evidence of uncritically obedient behaviour in charismatic cults (Ayella 1990; Davis 2000; Katchen 1997; Lalich 1999; Lifton 1999; Wallis 1977), and this behaviour needs to be explained. The presence or absence of brainwashing may ultimately turn out to contribute to such an explanation.

When I first studied the Bruderhof thirty-five years ago, using ethnographic methods, I noticed a strong isomorphism between the phases of Bruderhof resocialization and the phases of brainwashing in Chinese re-education centres described by Lifton. Since I could think of no other reason why the Bruderhof would support such a costly and labour-intensive resocialization program if it were not to create deployable agents with long-term loyalty to the community, I hypothesized that something akin to brainwashing must be going on. My observations over the next thirty-five years have only strengthened my confidence in the correctness of this hypothesis. Bruderhof members were never kept from leaving by force or threat of force. But the community put a lot of time and energy into assuring that defections would be rare and difficult by imbuing in its members an uncritical acceptance of the teachings of the community and a terror of life outside the community.1

Some (but not all) of the other cultic groups I have lived with as a participant-observer have shown signs of a brainwashing process at work. Individuals plucked suddenly out of the workday routine of the group, prolonged haggardness from lack of sleep, secretiveness and agitation, and alternating periods of shunning and warm communal embrace all suggest the presence of such a process. Some of these people, years later, having left the cult, have confirmed to me that such a process is what they went through when I observed them under this stress. According to my ethnographic observations, some sort of fully or partially developed brainwashing process has figured in the resocialization of at least half of the cults I have studied, during at least some phases of their history.

Leader Accounts

A second source of evidence may be found in reports given by people who were actually responsible for practising brainwashing on their fellow cult members. Several cult leaders who left their groups have since apologized to other ex-members for having subjected them to brainwashing methods. One such former cult leader put it this way:

‘What you have to understand is that, for us, breaking the spirit … emptying out our ego, is very very important. And any means to that end … well, we would have said it was justified. And over the years we developed [by trial and error] ways of accomplishing this [task]. It was only after I was finished with [the cult] and living in the world again that I did some reading and realized how similar [our techniques] were to what the Communists did – to brainwashing. I think you would have to say that what we did was a kind of brainwashing even if we didn’t mean it to be so.’

In another case I interviewed the widow of a cult leader who had died and whose cult had disbanded soon thereafter. She said the following:

‘Those kinds of things definitely happened [on quite a few occasions]. It’s not like we sat down and said, hey we’re going to brainwash everybody. That would have been crazy. It’s more like we knew how important our mission was and how [vulnerable it was] to treachery. I think we got a little paranoid about being overcome by treachery within, especially after Gabe and Helen left and started saying those things about us. So everybody had to be tested. I had to be tested. Even he [the leader] had to be tested. We all knew it and we all [accepted it]. So we would pull a person out of the routine and put him in solitary for a while. Nobody could talk to him except [my husband] and maybe a few others. I couldn’t even talk to him when I brought him meals. That was usually my job … At first it was just isolation and observation and having deep long talks far into the night about the mission. We didn’t know anything about brainwashing or any of that stuff. But gradually the things you describe got in there too somehow. Especially the written confessions. I had to write a bunch of them towards the end when [X] was sick. Whatever you wrote was not enough. They always wanted more, and you always felt you were holding out on them. Finally your confessions would get crazy, they’d come from your wildest fantasies of what they might want. At the end I confessed that I was killing [my husband] by tampering with his food because I wanted to – I don’t know – be the leader in his place I guess. All of us knew it was bullshit but somehow it satisfied them when I wrote that … And, even though we knew it was bullshit, going through that changed us. I mean I know it changed me. It burned a bridge … [T]here was no going back. You really did feel you changed into being a different person in a weird sort of way.’

Perhaps the closest thing I have found to a smoking gun in this regard has to do with a sociology professor who became a charismatic cult leader. Two of this cult leader’s top lieutenants independently spoke to me on this subject. Both of these respondents described in great detail how they assisted in concerted campaigns to brainwash fellow cult members. Both felt guilty about this and found the memory painful to recount. One of them indicated that the brainwashing attempt was conscious and deliberate:

‘During her years in academia, Baxter became very interested in mass social psychology and group behaviour modification. She studied Robert Jay Lifton’s work on thought reform; she studied and admired ‘total’ communities such as Synanon, and directed methods of change, such as Alcoholics Anonymous. She spoke of these techniques as positive ways to change people.’ (Lalich 1993: 55)

In this cult, which has since disbanded, there seems to be general consensus among both leaders and followers that systematic brainwashing techniques were used on a regular basis and were successful in their aim of producing deployable agents.

Ex-member Accounts

Our third source of evidence is the most controversial. There has been a misguided attempt to deny the validity of negative ex-member accounts as a source of data about cults. They’ve been condemned as ‘atrocity tales’ (Richardson 1998: 172), and Johnson (1998: 118) has dismissed them categorically by alleging that ‘the autobiographical elements of apostate narratives are further shaped by a concern that the targeted religious groups be painted in the worst possible light.’


The apostate role has been defined by Bromley (1997) largely in terms of the content of attitudes towards the former cult. If these attitudes are negative and expressed collectively in solidarity with other negatively disposed ex-members, they constitute evidence that the person must not be an ordinary ex-member but an ‘apostate.’ This is a direct violation of Robert Merton’s (1968) admonition that role sets be defined in terms of shared structural characteristics, not individual attitudes. What if this same logic were used to denigrate abused spouses who choose to be collectively vocal in their complaints? Nevertheless, this perspective on so-called ‘apostate accounts’ has been widely influential among cult scholars.

David Bromley is a sociological theorist of great personal integrity but limited field experience. I think that if Bromley and his followers could just once sit down with a few hundred of these emotionally haunted ex-members whom they blithely label ‘apostates,’ and listen to their stories, and see for themselves how badly most of them would like nothing more than to be able to put the cult experience behind them and get on with their lives, they would be deeply ashamed of the way they have subverted role theory to deny a voice to a whole class of people.

Dawson (1995) has correctly pointed out that there are methodological problems involved in using accounts of any kind as data. We need to be careful not to rely only on ex-member accounts. Triangulation of data sources is essential. But even the reports of professional ethnographers are nothing more than accounts, and thus subject to the same sort of limitations. Ex-member accounts have been shown to have reliability and validity roughly equivalent to the accounts given by current cult members (Zablocki 1996).

Solomon (1981) has provided some empirical support for the argument that those with stormy exits from cults and those with anti-cult movement affiliations are more likely to allege that they have been brainwashed than those with relatively uneventful exits and no such affiliation. ‘Cult apologists’ have made much of the finding that ex-members affiliated with anti-cult organizations are more likely to allege brainwashing than those who are not. Their hatred of the anti-cult movement has blinded them to two important considerations: (1) The causal direction is by no means obvious — it is at least as likely that those who were brainwashed are more likely to seek out anti-cult organizations as support groups as that false memories of brainwashing are implanted by anti-cult groups into those ex-members who fall into their clutches; and (2) Although the percentages may be lower, some ex-members who don’t affiliate with anti-cult groups still allege brainwashing.


Many ex-members of cults find brainwashing the most plausible explanation of their own cult experiences. While some might be deluding themselves to avoid having to take responsibility for their own mistakes, it strains credulity to imagine that all are doing so. Here, just by way of example, are excerpts from interviews done with five ex-members of five different cults. None of these respondents was ever affiliated, even marginally, with an anti-cult organization:

‘They ask you to betray yourself so gradually that you never notice you’re giving up everything that makes you who you are and letting them fill you up with something they think is better and that they’ve taught you to believe is something better.’

‘What hurts most is that I thought these people were my new friends, my new family. It wasn’t until after that I realized how I was manipulated little step by little step. Just like in Lifton; it’s really amazing when you think of it … couldn’t just be a coincidence … I don’t know if you can understand it, but what hurts most is not that they did it but realizing that they planned it out so carefully from the beginning. That was so cold.’

‘I’ve never been able to explain it to people who weren’t there. I don’t really understand it myself. But black was white, night was day, whatever they told us to believe, it was like a test. The more outrageous the idea the greater the victory, when I could wrap my mind around it and really believe it down to my toes. And, most important, be prepared to act on it just like if it was proven fact. That’s the really scary part when I look back on it.’

‘In the frame of mind I was in [at the time], I welcomed the brainwashing. I thought of it like a purge. I needed to purge my old ways, my old self. I hated it and I felt really violent toward it … I wanted to wash it all away and make myself an empty vehicle for [the guru’s] divine plan … [Our] ideal was to be unthinking obedient foot soldiers in God’s holy army.’

Many wax particularly eloquent on this subject when interviewed in the aftermath of media events involving cultic mass suicides or murders. The fifth respondent said the following:

‘It makes me shudder and … thank God that I got out when I did. ‘Cause that could have been me doing that, could have been any of us. [I have] no doubt any one of us would have done that in the condition we all were in — killed ourselves, our kids, any that [the leaders] named enemies.’

I have quoted just five ex-members because of limitations of space. Many more could be found. Thousands of ex-members of various groups (only a small minority of whom have ever been interviewed by me) have complained of being brainwashed. Contrary to the allegations of some ‘cult apologists,’ very few of these are people who had been deprogrammed (and presumably brainwashed into believing that they had been brainwashed). The accounts of these people tend often to agree on the particulars of what happened to them, even though these people may never have talked with one another.

Another striking aspect of these brainwashing accounts by ex-members is that they are held to consistently for many years. I have interviewed many ex-cult members twenty to thirty years after leaving the cult, and I have yet to encounter a single case in which a person who alleged brainwashing immediately after leaving the cult later recanted and said it wasn’t true after all. More than anything else, this consistency over extended periods of time convinces me that ex-member accounts often may be relied on. Even if some of the details have been forgotten or exaggerated with the passage of time, the basic outline of what happened to them is probably pretty accurate. All in all, therefore, I think it is fair to conclude, both from accumulated ethnographic and ex-member data, that brainwashing happens to at least some people in some cults.

Incidence and Consequences

Finally, we come to the aspect of brainwashing theory for which our data are sketchiest, the one most in need of further research. How often does brainwashing actually occur (incidence),2 and how significant are its consequences?

Defining what we mean by incidence is far from a simple matter. In the reporting of brainwashing there are numerous false positives and false negatives, and no consensus as to whether these errors lead to net underestimation or net overestimation. Several factors can produce false positives. Unless the term is precisely defined to respondents, some answers will reflect folk definitions of the term. It might mean little more to them than that they believe they were not treated nicely by their former cults. Other respondents may share our definition of the term, but answer falsely out of a desire to lay claim to the victim role or out of anger towards the cult. False negatives also can occur for several reasons. Most significantly, current members (as well as ex-members who still sympathize with the cult) may deny brainwashing to protect the cult. Others may understand the term differently than do the interviewers, and still others may be embarrassed to admit that they had been brainwashed. These errors can be minimized but hardly eliminated by in-depth interviewing in which respondents are asked not merely to label but to describe the process they went through.
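
To make the interplay of these two error types concrete, a standard misclassification correction from epidemiology (the Rogan-Gladen estimator, which the chapter itself does not invoke) shows how hypothetical false-positive and false-negative rates would shift an observed rate; a minimal sketch, with invented rates:

```python
def corrected_incidence(observed: float, fpr: float, fnr: float) -> float:
    """Rogan-Gladen correction: observed = true*(1 - fnr) + (1 - true)*fpr,
    solved here for the true rate."""
    return (observed - fpr) / (1.0 - fpr - fnr)

# Illustrative, invented rates: 4% of never-brainwashed respondents claim
# brainwashing falsely; 30% of brainwashed respondents deny or obscure it.
print(corrected_incidence(observed=0.10, fpr=0.04, fnr=0.30))  # ~0.091
```

With these invented rates the two biases nearly cancel, which illustrates why the net direction of error cannot be settled in advance.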

There is insufficient space in this chapter to discuss these important methodological issues. I will therefore merely state the criteria upon which I base my own measurement. I treat incidence as a ratio of X to Y. In Y are included all those who were fully committed members of a cult for a year or more, but who are currently no longer affiliated with any cult.3 In X are included those members of the Y set who both claim to have been brainwashed and who are able to give evidence of the particulars of their own brainwashing experience (at least through phase 2) consistent with those discussed in the previous section of this chapter.
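
A minimal sketch of this measurement rule, assuming invented field names for coded interview records (the chapter does not supply an actual coding scheme):

```python
# Each record codes one respondent; the field names are hypothetical.
respondents = [
    {"full_member_one_year": True, "currently_affiliated": False,
     "claims_brainwashing": True, "particulars_through_phase_2": True},
    {"full_member_one_year": True, "currently_affiliated": False,
     "claims_brainwashing": True, "particulars_through_phase_2": False},
    {"full_member_one_year": True, "currently_affiliated": False,
     "claims_brainwashing": False, "particulars_through_phase_2": False},
]

# Y: fully committed members for a year or more, no current affiliation.
Y = [r for r in respondents
     if r["full_member_one_year"] and not r["currently_affiliated"]]
# X: members of Y who both claim brainwashing and can give particulars
# of the process at least through phase 2.
X = [r for r in Y
     if r["claims_brainwashing"] and r["particulars_through_phase_2"]]

incidence = len(X) / len(Y) if Y else float("nan")
print(f"incidence = {incidence:.0%}")  # 33% in this toy sample
```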

In the handful of systematic studies that have been done, estimates of brainwashing incidence seem to cluster around 10% (plus or minus 5%) of former cult members (Katchen 1997; Wright 1987; Zablocki, Hostetler et al., in press). However, there is tremendous variation in the estimates given by people working in this field. Ignoring those scholars who deny that brainwashing is ever attempted or ever successful, I have heard ethnographers offer anecdotal estimates ranging from under 0.1% to as high as 80%.

Stuart Wright’s (1987) data on voluntarily exiting ex-members indicate that 9% say they had been brainwashed. This study is noteworthy because it examined ex-members of a variety of different cults rather than just one. It relied, however, on each respondent’s own definition of what it meant to be brainwashed.

My national longitudinal study (Zablocki 1980) relied primarily on a two-stage sampling procedure in which geographical regions were first selected and groups then sampled within these regions. I have followed 404 cases, most of them surveyed at least twice over intervals extending up to twenty-five years. Of those who were interviewed, 11% meet the criteria for having been brainwashed discussed above. Interestingly, all those in my sample who claim to have been brainwashed stick to their claims even after many years have passed. My own study is the only one that I know of that has repeatedly interviewed members and former members over several decades.

Another issue is whether overall incidence among the ex-member population is the most meaningful statistic to strive for given the heterogeneity among cults and types of cult member. Cults vary in the proportion of their members they attempt to brainwash from 0% to 100%. Since brainwashing significantly increases exit costs (according to hypothesis 8), it follows that examples of brainwashed individuals will be somewhat over-represented among current cult members and somewhat under-represented among ex-members.

The incidence, among ex-members, is higher (24% in my sample) when the relevant population is confined to a cult’s ‘inner circle,’ the core membership surrounding the leader. In an important and neglected article, Wexler (1995) makes the point that it is simplistic to think of a cult as comprising only a leader and a homogeneous mass of followers. Most cults have a third category of membership, a corps of lieutenants surrounding the leader, which Wexler refers to as a ‘decision elite.’ It follows from the hypotheses discussed earlier that we should expect attempts to brainwash to be concentrated among members in this category.

One study suggests that incidence is also higher among adults who grew up in cults (Katchen 1997). My own ethnographic observation supports the last point, and further suggests that cults under extreme stress become more likely to engage in brainwashing or to extend already existing brainwashing programs to a much wider circle of members.

With regard to consequences, we must distinguish between obedience consequences and traumatic consequences. Uncritical obedience is extinguished rapidly, certainly within a year of exiting if not sooner. The popular idea that former cult members can be programmed to carry obedience compulsions for specific acts to be performed long after membership in the cult has ceased is, in my opinion, wholly a myth based largely on a movie, The Manchurian Candidate. I know of nobody who has ever seen even a single successful instance of such programming. However, many brainwashed ex-members report that they would not feel safe visiting the cult, fearing that old habits of obedience might quickly be reinstilled.

There is evidence, in my data set, of persistent post-traumatic effects. The majority of those who claim to have been brainwashed say that they never fully get over the psychological insult, although its impact on their lives diminishes over time. The ability to form significant bonds with others takes a long time to heal, and about a third wind up (as much as a quarter of a century later) living alone with few significant social ties. This is more than double the proportion of controls (cult participants who appeared not to have been brainwashed) who are socially isolated twenty-five years later. Visible effects also linger in the ability to form new belief commitments. In about half there is no new commitment to a belief community after two years. By twenty-five years, this has improved, although close to 25% still have formed no such commitment. Occupationally, they tend to do somewhat better, but often not until having been separated from the cult for five to ten years.


We can conclude from all of the above that those who claim that cultic brainwashing does not exist and those who claim it is pandemic to cults are both wrong. Brainwashing is an administratively costly and not always effective procedure that some cults use on some of their members. A few cults rely heavily on brainwashing and put all their members through it. Other cults do not use the procedure at all. During periods of stressful confrontation, either with external enemies or among internal factions, or in attempts to cope with failed apocalyptic prophecies, it is not uncommon for brainwashing suddenly to come to play a central role in the cult’s attempts to achieve order and social control. At such times, risk of uncritically obedient violent aggression or mass suicide may be heightened.

Hopefully, it will be clear from this chapter that brainwashing has absolutely nothing to do with the overthrow of ‘free will’ or any other such mystical or non-scientific concept. People who have been brainwashed are ‘not free’ only in the sense that all of us, hemmed in on all sides as we are by social and cultural constraints, are not free. The kinds of social constraints involved in brainwashing are much more intense than those involved in socializing many of us to eat with knives and forks rather than our hands. But the constraints involved differ only in magnitude and focus, not in kind. Any brainwashed cult member always retains the ability to leave the cult or defy the cult as long as he or she is willing to pay the mental and emotional price (which may be considerable) that the cult is able to exact for so doing.

As I finish this chapter, a number of European nations are debating the advisability of anti-brainwashing laws, some of which eventually may be used to inhibit freedom of religious expression. In light of this trend a number of colleagues have criticized me, not on the grounds that my facts are incorrect, but that my timing is unfortunate. One socked me with the following, particularly troubling, complaint: ‘Ben, if you had discovered evidence, in 1942, of a higher prevalence among Jews than non-Jews of the Tay-Sachs genetic defect, would you have published your findings in a German biology journal?’ Ultimately, although I respect the sentiments behind my colleagues’ concerns, I must respectfully disagree with their fastidious caution. It never works to refuse to look at frightening facts. They only become larger, more frightening, and more mystically permeated when banished to one’s peripheral vision. A direct, honest acknowledgement of the limited but significant role that brainwashing plays in producing uncritical obedience in some cults will serve, in the long run, to lessen paranoid reactions to ‘the threat of the cults,’ rather than increase them.


  1. Bruderhof members, particularly those in responsible positions, are never fully trusted until they have gone through the ordeal of having been put into the great exclusion (being sent away) and then spiritually fought their way back to the community. Such exclusion serves as the ultimate test of deployability. Is the conversion deep enough to hold even when away from daily reinforcement by participation in community life? The degree to which the Bruderhof stresses the importance of this ideal serves as additional evidence that the creation of deployable agents is a major aim of the socialization process.
  2. A related question is what portion of those a cult attempts to brainwash actually get brainwashed. No data have been collected on this issue to the best of my knowledge.
  3. I do not distinguish between voluntary and involuntary mode of exit in my measure because my sample includes only an insignificant number (less than one-half of one percent) who were deprogrammed out of their cults.

Brainwashing as a Scientific Concept (Benjamin Zablocki, 2001)

NOTE: The following article is taken from the 5th chapter of Misunderstanding Cults: Searching for Objectivity in a Controversial Field, entitled, Towards a Demystified and Disinterested Scientific Theory of Brainwashing.


What I am presenting here is not a ‘new’ theory of brainwashing but a conceptual model of the foundational theory developed in the mid-twentieth century by Lifton, Schein, and Sargant as it applies to charismatic collectivities. Because its scientific stature has been so frequently questioned, I will err on the side of formality by presenting a structured exposition of brainwashing theory in terms of eight definitions and twelve hypotheses. Each definition includes an operationalized form by which the trait may be observed. If either of the first two hypotheses is disconfirmed, we must conclude that brainwashing is not being attempted in the cult under investigation. If any of the twelve hypotheses is disconfirmed, we must conclude that brainwashing is not successful in meeting its goals within that cult.

I do not pretend that the model outlined here is easy to test empirically, particularly for those researchers who either cannot or will not spend time immersing themselves in the daily lives of cults, or for those who are not willing, alternatively, to use as data the detailed retrospective accounts of ex-members. However, it should be clear that the model being proposed here stays grounded in what is empirically testable and does not involve mystical notions such as loss of free will or information disease (Conway and Siegelman 1978) that have characterized many of the extreme ‘anti-cult models.’

Nor do I pretend that this model represents the final and definitive treatment of this subject. Charismatic influence is still a poorly understood subject on which much additional research is needed. With few exceptions, sociology has treated it as if it were what engineers call a ‘black box,’ with charismatic inputs coming in one end and obedience outputs going out the other. What we have here is a theory that assists in the process of opening this black box to see what is inside. It is an inductive theory, formed largely from the empirical generalizations of ethnographers and interviewers. The model itself presents an ideal-type image of brainwashing that does not attempt to convey the great variation among specific obedience-inducing processes that occur across the broad range of existing cults. Much additional refinement in both depth and breadth will certainly be needed.


D1. Charisma is defined, using the classical Weberian formula, as a condition of ‘devotion to the specific and exceptional sanctity, heroism, or exemplary character of an individual person, of the normative patterns or order revealed or ordained by him’ (Weber 1947: 328). Being defined this way, as a condition of devotion, leads us to recognize that charisma is not to be understood simply in terms of the characteristics of the leader, as it has come to be in popular usage, but requires an understanding of the relationship between leader and followers. In other words, charisma is a relational variable. It is defined operationally as a network of relationships in which authority is justified (for both superordinates and subordinates) in terms of the special characteristics discussed above.

D2. Ideological Totalism is a sociocultural system that places high valuation on total control over all aspects of the outer and inner lives of participants for the purpose of achieving the goals of an ideology defined as all important. Individual rights either do not exist under ideological totalism or they are clearly subordinated to the needs of the collectivity whenever the two come into conflict. Ideological totalism has been operationalized in terms of eight observable characteristics: milieu control, mystical manipulation, the demand for purity, the cult of confession, ‘sacred science,’ loading the language, doctrine over person, and the dispensing of existence (Lifton 1989: chap. 22).1

D3. Surveillance is defined as keeping watch over a person’s behaviour, and, perhaps, attitudes. As Hechter (1987) has shown, the need for surveillance is the greatest obstacle to goal achievement among ideological collectivities organized around the production of public goods. Surveillance is not only costly, it is also impractical for many activities in which agents of the collectivity may have to travel to act autonomously and at a distance. It follows from this that all collectivities pursuing public goals will be motivated to find ways to decrease the need for surveillance. Resources used for surveillance are wasted in the sense that they are unavailable for the achievement of collective goals.

D4. A deployable agent is one who is uncritically obedient to directives perceived as charismatically legitimate (Selznick 1960). A deployable agent can be relied on to continue to carry out the wishes of the collectivity regardless of his own hedonic interests and in the absence of any external controls. Deployability can be operationalized as the likelihood that the individual will continue to comply with hitherto ego-dystonic demands of the collectivity (e.g., mending, ironing, mowing the lawn, smuggling, rape, child abuse, murder) when not under surveillance.
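
Read operationally, D4 suggests a simple compliance rate. The directive log below is an invented illustration of how such a measure might be tabulated; nothing in the chapter prescribes this record format:

```python
# Hypothetical directive log for one member: was the demand ego-dystonic,
# was the member under surveillance, and did he or she comply?
directives = [
    {"ego_dystonic": True, "surveilled": False, "complied": True},
    {"ego_dystonic": True, "surveilled": False, "complied": True},
    {"ego_dystonic": True, "surveilled": False, "complied": False},
    {"ego_dystonic": True, "surveilled": True,  "complied": True},  # excluded below
]

# Deployability: share of ego-dystonic demands obeyed with no one watching.
relevant = [d for d in directives if d["ego_dystonic"] and not d["surveilled"]]
deployability = (sum(d["complied"] for d in relevant) / len(relevant)
                 if relevant else float("nan"))
print(f"deployability = {deployability:.2f}")  # 0.67
```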

D5. Brainwashing is an observable set of transactions between a charismatically structured collectivity and an isolated agent of the collectivity, with the goal of transforming the agent into a deployable agent. Brainwashing is thus a process of ideological resocialization carried out within a structure of charismatic authority.

The brainwashing process may be operationalized as a sequence of well-defined and potentially observable phases. These hypothesized phases are (1) identity stripping, (2) identification, and (3) symbolic death/rebirth. The operational definition of brainwashing refers to the specific activities attempted, whether or not they are successful, as they are either observed directly by the ethnographer or reported in official or unofficial accounts by members or ex-members. Although the exact order of phases and specific steps within phases may vary from group to group, we should always expect to see the following features, or their functional equivalents, in any brainwashing system: (1) the constant fluctuation between assault and leniency; and (2) the seemingly endless process of confession, re-education, and refinement of confession.

D6. Hyper-credulity is defined as a disposition to accept uncritically all charismatically ordained beliefs. All lovers of literature and poetry are familiar with ‘that willing suspension of disbelief for the moment, which constitutes poetic faith’ (Coleridge 1970: 147). Hyper-credulity occurs when this state of mind, which in most of us is occasional and transitory, is transformed into a stable disposition. Hyper-credulity falls between hyper-suggestibility on the one hand and stable conversion of belief on the other.2 Its operational hallmark is plasticity in the assumption of deeply held convictions at the behest of an external authority. This is an other-directed form of what Robert Lifton (1968) has called the protean identity state.

D7. Relational Enmeshment is a state of being in which self-esteem depends upon belonging to a particular collectivity (Bion 1959; Bowen 1972; Sirkin and Wynne 1990). It may be operationalized as immersion in a relational network with the following characteristics: exclusivity (high ratio of in-group to out-group bonds), interchangeability (low level of differentiation in affective ties between one alter and another), and dependency (reluctance to sever or weaken ties for any reason). In a developmental context, something similar to this has been referred to by Bowlby (1969) as anxious attachment.
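
Each of the three network characteristics in D7 can be given a rough numeric reading. In this sketch, ‘interchangeability’ is glossed as low variance in tie strength (my reading, not a formula from the chapter), and all values are invented:

```python
from statistics import pvariance

# Invented ego-network for one member: each tie is tagged in-group or
# out-group and carries an affective strength in [0, 1].
ties = [
    {"in_group": True,  "strength": 0.90},
    {"in_group": True,  "strength": 0.85},
    {"in_group": True,  "strength": 0.88},
    {"in_group": False, "strength": 0.20},
]

in_group = [t["strength"] for t in ties if t["in_group"]]
out_group = [t["strength"] for t in ties if not t["in_group"]]

# Exclusivity: ratio of in-group to out-group bonds.
exclusivity = len(in_group) / max(len(out_group), 1)
# Interchangeability: low differentiation among in-group ties.
tie_variance = pvariance(in_group)
# Dependency (reluctance to sever ties) would require behavioural data.

print(f"exclusivity = {exclusivity:.1f}, tie variance = {tie_variance:.4f}")
```
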
D8. Exit Costs are the subjective costs experienced by an individual who is contemplating leaving a collectivity. Obviously, the higher the perceived exit costs, the greater will be the reluctance to leave. Exit costs may be operationalized as the magnitude of the bribe necessary to overcome them. A person who is willing to leave if we pay him $1,000 experiences lower exit costs than one who is not willing to leave for any payment less than $1,000,000. With regard to cults, the exit costs are most often spiritual and emotional rather than material, which makes measurement in this way more difficult but not impossible.
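
The bribe-based operationalization could be coded as simply as the sketch below; the offers and responses are hypothetical, and for the spiritual and emotional exit costs typical of cults such a procedure would at best bracket the value rather than pin it down:

```python
# Hypothetical buyout offers to one member and whether each would be accepted.
offers = [(1_000, False), (10_000, False), (100_000, True), (1_000_000, True)]

accepted = [amount for amount, accepts in offers if accepts]
# The exit cost lies between the largest refused and smallest accepted offer.
upper_bound = min(accepted) if accepted else float("inf")
print(f"exit cost is at most ${upper_bound:,}")  # at most $100,000
```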


Not all charismatic organizations engage in brainwashing. We therefore need a set of hypotheses that will allow us to test empirically whether any particular charismatic system attempts to practise brainwashing and with what effect. The brainwashing model asserts twelve hypotheses concerning the role of brainwashing in the production of uncritical obedience. These hypotheses are all empirically testable. A schematic diagram of the model I propose may be found in Figure 1.

This model begins with an assumption that charismatic leaders are capable of creating organizations that are easy and attractive to enter (even though they may later turn out to be difficult and painful to leave). There are no hypotheses, therefore, to account for how charismatic cults obtain members. It is assumed that an abundant pool of potential recruits to such groups is always available. The model assumes that charismatic leaders, using nothing more than their own intrinsic attractiveness and persuasiveness, are initially able to gather around them a corps of disciples sufficient for the creation of an attractive social movement. Many ethnographies (Lofland 1996; Lucas 1995) have shown how easy it is for such charismatic movement organizations to attract new members from the general pool of anomic ‘seekers’ that can always be found within the population of an urbanized mobile society.

The model does attempt to account for how some percentage of these ordinary members are turned into deployable agents. The initial attractiveness of the group, its vision of the future, and/or its capacity to bestow seemingly limitless amounts of love and esteem on the new member are sufficient inducements in some cases to motivate a new member to voluntarily undergo this difficult and painful process of resocialization.

H1. Ideological totalism is a necessary but not sufficient condition for the brainwashing process. Brainwashing will be attempted only in groups that are structured totalistically. However, not all ideologically totalist groups will attempt to brainwash their members. It should be remembered that brainwashing is merely a mechanism for producing deployable agents. Some cults may not want deployable agents or have other ways of producing them. Others may want them but feel uncomfortable about using brainwashing methods to obtain them, or they may not have discovered the existence of brainwashing methods.

H2. The exact nature of this resocialization process will differ from group to group, but, in general, will be similar to the resocialization process that Robert Lifton (1989) and Edgar Schein (1961) observed in Communist re-education centres in the 1950s. For whatever reasons, these methods seem to come fairly intuitively to charismatic leaders and their staffs. Although the specific steps and their exact ordering differ from group to group, their common elements involve a stripping away of the vestiges of an old identity, the requirement that repeated confessions be made either orally or in writing, and a somewhat random and ultimately debilitating alternation of the giving and the withholding of ‘unconditional’ love and approval. H2 further states that the maintenance of this program involves the expenditure of a measurable quantity of the collectivity’s resources. This quantity is known as C, where C equals the cost of the program and should be measurable at least at an ordinal level.

The resocialization process has baffled many observers, in my opinion because it proceeds simultaneously along two distinct but parallel tracks, one involving cognitive functioning and the other involving emotional networking. These two tracks lead to the attainment of states of hyper-credulity and relational enmeshment, respectively. The group member learns to accept with suspended critical judgement the often shifting beliefs espoused by the charismatic leader. At the same time, the group member becomes strongly attached to and emotionally dependent upon the charismatic leader and (often especially) the other group members, and cannot bear to be shunned by them.

H3. Those who go through the process will be more likely than those who do not to reach a state of hyper-credulity. This involves the shedding of old convictions and the assumption of a zealous loyalty to the beliefs of the moment, uncritically seized upon, so that all such beliefs become not mere ‘beliefs’ but deeply held convictions.

Under normal circumstances, it is not easy to get people to disown their core convictions. Convictions, once developed, are generally treated not as hypotheses to test empirically but as possessions to value and cherish. There are often substantial subjective costs to the individual in giving them up. Abelson (1986: 230) has provided convincing linguistic evidence that most people treat convictions more as valued possessions than as ways of testing reality. Cognitive dissonance theory predicts with accuracy that when subject to frontal attack, attachment to convictions tends to harden (Festinger, Riecken et al. 1956; O’Leary 1994). Therefore, a frontal attack on convictions, without first undermining the self-image foundation of these convictions, is doomed to failure. An indirect approach through brainwashing is often more effective.

When the state of hyper-credulity is achieved, it leaves the individual strongly committed to the charismatic belief of the moment but with little or no critical inclination to resist charismatically approved new or contradictory beliefs in the future and little motivation to attempt to form accurate independent judgments of the consequences of assuming new beliefs. The cognitive track of the resocialization process begins by stripping away the old convictions and associating them with guilt, evil, or befuddlement. Next, there is a traumatic exhaustion of the habit of subjecting right-brain convictions to left-brain rational scrutiny. This goes along with an increase in what Snyder (1974) has called self-monitoring, implying a shift from central route to peripheral route processing of information in which the source rather than the content of the message becomes all important.

H4. As an individual goes through the brainwashing process, there will be an increase in relational enmeshment, with measurable increases occurring at the completion of each of the three stages. The purging of convictions is a painful process, and it is reasonable to ask why anybody would go through it voluntarily. The payoff is the opportunity to feel more connected with the charismatic relational network. These people have also been through it, and only they really understand what you are going through. So cognitive purging leads one to seek relational comfort, and this comfort becomes enmeshing. The credulity process and the enmeshing process depend on each other.

The next three hypotheses are concerned with the fact that each of the three phases of brainwashing achieves plateaus in both of these processes. The stripping phase creates the vulnerability to this sort of transformation. The identification phase creates realignment, and the rebirth phase breaks down the barrier between the two so that convictions can be emotionally energized and held with zeal, while emotional attachments can be sacralized in terms of the charismatic ideology. The full brainwashing model actually provides far more detailed hypotheses concerning the various steps within each phase of the process. Space constraints make it impossible to discuss these here. An adequate technical discussion of the manipulation of language in brainwashing, for example, would require a chapter at least the length of this one. Figure 2 provides a sketch of the steps within each phase. Readers desiring more information about these steps are referred to Lifton (1989: chap. 5).

[Figure 2: The Stages of Brainwashing and Their Effect on Hyper-credulity and Emotional Enmeshment]

H5. The stripping phase. The cognitive goal of the stripping phase is to destroy prior convictions and prior relationships of belonging. The emotional goal of the stripping phase is to create the need for attachments. Overall, at the completion of the stripping phase, the situation is such that the individual is hungry for convictions and attachments and dependent upon the collectivity to supply them. This sort of credulity and attachment behaviour is widespread among prisoners and hospital patients (Goffman 1961).

H6. The identification phase. The cognitive goal of the identification phase is to establish an imitative search for conviction and bring about the erosion of the habit of incredulity. The emotional goal of the identification phase is to instill the habit of acting out through attachment. Overall, at the completion of the identification phase, the individual has begun the practice of relying on the collectivity for beliefs and for a cyclic emotional pattern of arousal and comfort. But at this point this reliance is just one highly valued form of existence. It is not yet viewed as an existential necessity.

H7. The symbolic death and rebirth phase. In the death and rebirth phase, the cognitive and emotional tracks come together and mutually support each other. This often gives the individual a sense of having emerged from a tunnel and an experience of spiritual rebirth.3 The cognitive goal of this phase is to establish a sense of ownership of (and pride of ownership in) the new convictions. The emotional goal is to make a full commitment to the new self that is no longer directly dependent upon hope of attachment or fear of separation. Overall, at the completion of the rebirth phase we may say that the person has become a fully deployable agent of the charismatic leader. The brainwashing process is complete.

H8 states that the brainwashing process results in a state of subjectively elevated exit costs. These exit costs cannot, of course, be observed directly. But they can be inferred from the behavioural state of panic or terror that arises in the individual at the possibility of having his or her ties to the group discontinued. The cognitive and emotional states produced by the brainwashing process together bring about a situation in which the perceived exit costs for the individual increase sharply. This closes the trap for all but the most highly motivated individuals, and induces in many a state of uncritical obedience. As soon as exit from a group (or even from its good graces) ceases to be a subjectively palatable option, it makes sense for the individual to comply with almost anything the group demands – even to the point of suicide in some instances. Borrowing from Sartre’s insightful play of that name, I refer to this situation as the ‘no exit’ syndrome. When demands for compliance are particularly harsh, the hyper-credulity aspect of the process sweetens the pill somewhat by allowing the individual to accept uncritically the justifications offered by the charismatic leader and/or charismatic organization for making these demands, however far-fetched these justifications might appear to an outside observer.

H9 states that the brainwashing process results in a state of ideological obedience in which the individual has a strong tendency to comply with any behavioural demands made by the collectivity, especially if motivated by the carrot of approval and the stick of threatened expulsion, no matter how life-threatening these demands may be and no matter how repugnant such demands might have been to the individual in his or her pre-brainwashed state.

H10 states that the brainwashing process results in increased deployability. Deployability extends the range of ideological obedience in the temporal dimension: the response continues after the stimulus is removed. This hypothesis will be disconfirmed in any cult within which members are uncritically obedient only while they are being brainwashed but not thereafter. The effect need not be permanent, but it does need to result in some measurable increase in deployability over time.

H11 states that the ability of the collectivity to rely on obedience without surveillance will result in a measurable decrease in surveillance. Since surveillance involves costs, this decrease will lead to a quantity S, where S equals the savings to the collectivity due to diminished surveillance needs and should be measurable at least to an ordinal level.

H12 states that S will be greater than C. In other words, the savings to the collectivity due to decreased surveillance needs are greater than the cost of maintaining the brainwashing program. Only where S is greater than C does it make sense to maintain a brainwashing program. Cults with initially high surveillance costs, and therefore high potential savings due to decreased surveillance needs [S], will tend to be more likely to brainwash, as will cults structured so that the cost of maintaining the brainwashing system [C] is relatively low.
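
Stated compactly, H11 and H12 amount to a cost-benefit condition. The ordinal scores below are hypothetical stand-ins for quantities the chapter says should be measurable at least at an ordinal level:

```python
def brainwashing_pays(S: int, C: int) -> bool:
    """H12: maintaining a brainwashing program makes sense only where the
    surveillance savings S exceed the program's maintenance cost C."""
    return S > C

# Hypothetical ordinal scores (1 = low ... 5 = high):
print(brainwashing_pays(S=5, C=2))  # True: costly-to-monitor members, cheap program
print(brainwashing_pays(S=1, C=3))  # False: members are cheap to monitor anyway
```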

Characteristics of a Good Theory

There is consensus in the social sciences that a good inductive qualitative theory is one that is falsifiable, internally consistent, concrete, potentially generalizable, and has a well-defined dependent variable (King, Keohane et al. 1994). I think it should be clear from the foregoing that this theory meets all of these conditions according to prevailing standards in the social and behavioural sciences. However, since brainwashing theory has received much unjustified criticism for its lack of falsifiability and its lack of generalizability, I will briefly discuss the theory from these two points of view.

The criterion of falsifiability, as formulated primarily by Popper (1968), is the essence of what separates theory from dogma in science. Every theory must be able to provide an answer to the question of what evidence would falsify it. If the answer is that there is no possible evidence that would lead us to reject a so-called theory, we should conclude that it is not really a theory at all but just a piece of dogma.

Although Dawson (1998) and Richardson (1993) have included the falsifiability problem in their critiques of brainwashing, this criticism is associated mainly with the work of Dick Anthony (1996). Anthony’s claim that brainwashing theory is unfalsifiable is based upon two related misunderstandings. First, he argues that it is impossible to prove that a person is acting with free will, so, to the extent that brainwashing theory rests on the overthrow of free will, no evidence can ever disprove it. Second, he applies Popper’s criterion to cults in a way more appropriate for a highly developed deductive theoretical system. He requires that either brainwashing explain all ego-dystonic behaviour in cults or acknowledge that it can explain none of it. But, as we have seen, brainwashing is part of an inductive multifactorial approach to the study of obedience in cults and should be expected to explain only some of the obedience produced in some cults.

With regard to generalizability, cultic brainwashing is part of an important general class of phenomena whose common element is what Anthony Giddens has called ‘disturbance of ontological security’ in which habits and routines cease to function as guidelines for survival (Cohen 1989: 53). This class of phenomena includes the battered spouse syndrome (Barnett and LaViolette 1993), the behaviour of concentration camp inmates (Chodoff 1966), the Stockholm Syndrome (Kuleshnyk 1984; Powell 1986), and, most importantly, behaviour within prisoner of war camps and Communist Chinese re-education centres and ‘revolutionary universities’ (Lifton 1989; Sargant 1957; Schein 1961). There exist striking homologies in observed responses across all of these types of events, and it is right that our attention be drawn to trying to understand what common theme underlies them all. As Oliver Wendell Holmes (1891: 325) attempted to teach us more than a century ago, the interest of the scientist should be guided, when applicable, by ‘the plain law of homology which declares that like must be compared with like.’


  1. Because of space limitations, I cannot give this important subject the attention it deserves in this chapter. Readers not familiar with the concept are referred to the much fuller discussion of this subject in the book by Robert Lifton as cited.
  2. Students of cults have sometimes been misled into confusing this state of hyper-credulity with either hyper-suggestibility on the one hand or a rigid ‘true belief’ system on the other. But at least one study has shown that neither the hyper-suggestible, easily hypnotized person nor the structural true believer is a good candidate for encapsulation in a totalist cult system (Solomon 1981: 111-112). True believers (often fundamentalists who see in the cult a purer manifestation of their own worldview than they have seen before) do not do well in cults, and neither do dyed-in-the-wool sceptics who are comfortable with their scepticism. Rather, it is those lacking convictions but hungering for them who are the best candidates.
  3. Hopefully, no reader will think that I am affirming the consequent by stating that all experiences of spiritual rebirth must be caused by brainwashing. This model is completely compatible with the assumption that most spiritual rebirth experiences have nothing to do with brainwashing. The reasoning here is identical to that connecting epilepsy with visions of the holy. The empirical finding that seizures can be accompanied by visions of the holy does not in any way imply that such visions are always a sign of epilepsy.

Cultural Contention over the Concept of Brainwashing (Benjamin Zablocki, 2001)

NOTE: The following article is taken from the 5th chapter of Misunderstanding Cults: Searching for Objectivity in a Controversial Field, entitled, Towards a Demystified and Disinterested Scientific Theory of Brainwashing.

That Word ‘Brainwashing’

The word brainwashing is, in itself, controversial and arouses hostile feelings. Since there is no scientific advantage in using one word rather than another for any concept, it may be reasonable in the future to hunt around for another word that is less polemical. We need a universally recognized term for a concept that stands for a form of influence manifested in a deliberately and systematically applied traumatizing and obedience-producing process of ideological resocialization.


Currently, brainwashing is the generally accepted term for this process, but I see no objection to finding another to take its place. There are in fact other terms, historically, that have been used instead, like ‘thought reform’ and ‘coercive persuasion.’ Ironically, it has been those scholars who complain the most about ‘the B-word’ who have also been the most insistent that none of the alternatives is any better. As long as others in the field insist on treating all possible substitute constructions as nothing more than gussied-up synonyms for a mystified concept of brainwashing (see, for example, Introvigne 1998: 2), there is no point as yet in trying to introduce a more congenial term.

An overly literal reading of the word brainwashing (merely a literal translation of the accepted Chinese term xi nao) could be misleading, as it seems to imply the ability to apply some mysterious biochemical cleanser to people’s brains. However, the word has never been intended as a literal designator but as a metaphor. It would be wise to heed Clifford Geertz’s (1973: 210) warning in this connection, to avoid such a ‘flattened view of other people’s mentalities [that] more complex meanings than [a] literal reading suggests [are] not even considered.’

Thus, please don’t allow yourself to become prejudiced by a visceral reaction to the word instead of attending to the underlying concept. There is a linguistic tendency, as the postmodernist critics have taught us, for the signified to disappear beneath the signifier. But the empirically based social sciences must resist this tendency by defining terms precisely. The influence of media-driven vulgarizations of concepts should be resisted. This chapter argues for the scientific validity of a concept, not a word. If you are interested in whether the concept has value, but you gag on the word, feel free to substitute a different word in its place. I myself have no particular attachment to the word brainwashing.

But if all we are talking about is an extreme form of influence, why do we need a special name for it at all? The name is assigned merely for convenience. This is a common and widely accepted practice in the social sciences. For example, in economics a recession is nothing more than a name we give to two consecutive quarters of economic contraction. There is nothing qualitatively distinctive about two such consecutive quarters as opposed to one or three. The label is assigned arbitrarily at a subjective point at which many economists begin to get seriously worried about economic performance. This label is nevertheless useful as long as we don’t reify it by imagining that it stands for some real ‘thing’ that happens to the economy when it experiences precisely two quarters of decline. Many other examples of useful definitions marking arbitrary points along a continuum could be cited. There is no objective way to determine the exact point at which ideological influence becomes severe and encompassing enough, and its effects long lasting enough, for it to be called brainwashing. Inevitably, there will be marginal instances that could be categorized either way. But despite the fact that the boundary is not precisely defined, it demarcates a class of events worthy of systematic study.
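The recession analogy can be made concrete. The following minimal sketch shows how such a label merely marks an arbitrary threshold on a continuum; the growth figures are invented for illustration:

    # Minimal sketch of labelling an arbitrary point on a continuum.
    # 'Recession' here is just a name for two consecutive quarters of
    # contraction; nothing qualitatively new happens at the second quarter.
    # The growth figures below are hypothetical.

    def is_recession_quarter(growth, i):
        """True if quarter i completes two consecutive quarters of contraction."""
        return i >= 1 and growth[i] < 0 and growth[i - 1] < 0

    quarterly_growth = [0.8, -0.2, -0.5, 0.3]  # hypothetical percentages
    print([i for i in range(len(quarterly_growth))
           if is_recession_quarter(quarterly_growth, i)])  # -> [2]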

The Reciprocal Moral Panic

Study of brainwashing has been hampered by partisanship and tendentious writing on both sides of the conflict. In one camp, there are scholars who very badly don’t want there to be such a thing as brainwashing. Its non-existence, they believe, will help assure religious liberty, which can only be procured by defending the liberty of the most unpopular religions. If only the non-existence of brainwashing can be proved, the public will have to face up to the hard truth that some citizens choose to follow spiritual paths that may lead them in radical directions. This camp has exerted its influence within academia. But, instead of using its academic skills to refute the brainwashing conjecture, it has preferred to attack a caricature of brainwashing supplied by anti-cult groups for litigational rather than scientific purposes.


In the other camp, we find scholars who equally badly do want there to be such a thing as brainwashing. Its existence, they believe, will give them a rationale for opposition to groups they consider dangerous. A typical example of their reasoning can be found in the argument put forth by Margaret Singer that ‘Despite the myth that normal people don’t get sucked into cults, it has become clear over the years that everyone is susceptible to the lure of these master manipulators’ (Singer 1995: 17). Using a form of backward reasoning known as the ecological fallacy, she argues from the known fact that people of all ages, social classes, and ethnic backgrounds can be found in cults to the dubious conclusion that everyone must be susceptible. These scholars must also share some of the blame for tendentious scholarship. Lacking positions of leadership in academia, scholars on this side of the dispute have used their expertise to influence the mass media, and they have been successful because sensational allegations of mystical manipulative influence make good journalistic copy.

It’s funny in a dreary sort of way that both sides in this debate agree that it is a David and Goliath situation, but each side fancies itself to be the David courageously confronting the awesome power of the opposition. Each side makes use of an exaggerated fear of the other’s influence to create the raw materials of a moral panic (Cohen 1972; Goode and Ben-Yehuda 1994). Thus, a disinterested search for truth falls victim to the uncompromising hostility created by each side’s paranoid fear of the power of the other.


The ‘cult apologists’ picture themselves as fighting an underdog battle against hostile lords of the media backed by their armies of ‘cult-bashing’ experts. The ‘cult bashers’ picture themselves as fighting an underdog battle for a voice in academia in which apologists seem to hold all the gatekeeper positions. Each side justifies its rhetorical excesses and hyperbole by reference to the overwhelming advantages held by the opposing side within its own arena. But over the years a peculiar symbiosis has developed between these two camps. They have come to rely on each other to define their positions. Each finds it more convenient to attack the positions of the other than to do the hard work of finding out what is really going on in cults. Thomas Robbins (1988: 74) has noted that the proponents of these two models ‘tend to talk past each other since they employ differing interpretative frameworks, epistemological rules, definitions… and underlying assumptions.’ Most of the literature on the subject has been framed in terms of rhetorical disputes between these two extremist models. Data-based models have been all but crowded out.

Between these two noisy and contentious camps, we find the curious but disinterested scientist who wants to find out if there is such a thing as brainwashing but will be equally satisfied with a positive or negative answer. I believe that there can and should be a moderate position on the subject. Such a position would avoid the absurdity of denying any reality to what thousands of reputable ex-cult members claim to have experienced–turning this denial into a minor cousin of holocaust denial. At the same time, it would avoid the mystical concept of an irresistible and overwhelming force that was developed by the extremist wing of the anti-cult movement.

One of the most shameful aspects of this whole silly affair is the way pro-religion scholars have used their academic authority to foist off the myth that the concept of brainwashing needs no further research because it has already been thoroughly debunked. Misleadingly, it has been argued (Introvigne forthcoming; Melton forthcoming) that the disciplines of psychology and sociology, through their American scholarly associations, have officially declared the concept of brainwashing to be so thoroughly discredited that no further research is needed. Introvigne, by playing fast and loose with terminology, attempts to parlay a rejection of a committee report into a rejection of the brainwashing concept by the American Psychological Association. He argues that ‘To state that a report “lacks scientific rigor” is tantamount to saying that it is not scientific’ (Introvigne 1998: 3), gliding over the question of whether the ‘it’ in question refers to the committee report or the brainwashing concept.2 Conveniently, for Introvigne, the report in question was written by a committee chaired by Margaret Singer, whose involuntarist theory of brainwashing is as much a distortion of the foundational concept as Introvigne’s parody of it.

The truth is that both of these scholarly associations (American Psychological Association and American Sociological Association) were under intense pressure by a consortium of pro-religious scholars (a.k.a. NRM scholars) to sign an amicus curiae brief alleging consensus within their fields that brainwashing theory had been found to be bunk. This was in regard to a case concerning Moonie brainwashing that was before the United States Supreme Court (Molko v Holy Spirit Ass’n., Supreme Court of Calif. SF 25038; Molko v Holy Spirit Ass’n, 762 P.2d 46 [Cal. 1988], cert. denied, 490 U.S. 1084 [1989]). The bottom line is that both of the associations, after bitter debate, recognized that there was no such consensus and refused to get involved. Despite strenuous efforts of the NRM scholars to make it appear otherwise, neither professional association saw an overwhelming preponderance of evidence on either side. Both went on the record with a statement virtually identical to my argument in this chapter: that not nearly enough is known about this subject to be able to render a definitive scientific verdict, and that much more research is needed. A few years later, the Society for the Scientific Study of Religion went on record with a similar statement, affirming ‘the agnostic position’ on this subject and calling for more research (Zablocki 1997: 114).

Although NRM scholars have claimed to be opposed only to the most outrageously sensationalized versions of brainwashing theory, the result, perhaps unintended, of their campaign has been to bring an entire important area of social inquiry to a lengthy halt. Evidence of this can be seen in the fact that during the period of 1962 to 2000, a time when cults flourished, not a single article supportive of brainwashing has been published in the two leading American journals devoted to the sociology of religion, although a significant number of such articles have been submitted to those journals and more than a hundred such articles have appeared in journals marginal to the field (Zablocki 1998: 267).



The erroneous contention that brainwashing theory has been debunked by social science research has been loudly and frequently repeated, and this ‘big lie’ has thus come to influence the thinking of neutral religion scholars. For example, even Winston Davis, in an excellent article on suicidal obedience in Heaven’s Gate, expresses characteristic ambivalence over the brainwashing concept:

‘Scholarship in general no longer accepts the traditional, simplistic theory of brainwashing… While the vernacular theory of brainwashing may no longer be scientifically viable, the general theory of social and psychological conditioning is still rather in good shape… I therefore find nothing objectionable [sic] in Benjamin Zablocki’s revised theory of brainwashing as ‘a set of transactions between a charismatically led collectivity and an isolated agent of the collectivity with the goal of transforming the agent into a deployable agent.’ The tale I have to tell actually fits nicely into several of Robert Lifton’s classical thought reform categories (Davis 2000: 241-2).

The problem with this all too typical way of looking at things is the fact that I am not presenting some new revised theory of brainwashing but simply a restatement of Robert Lifton’s (1989, 1999) careful and rigorous theory in sociological terms.

There are, I believe, six issues standing in the way of our ability to transcend this reciprocal moral panic. Let us look closely at each of these issues with an eye to recognizing that both sides in this conflict may have distorted the scientifically grounded theories of the foundational theorists–Lifton (1989), Sargant (1957), and Schein (1961)–as they apply to cults.

The Influence Continuum

The first issue has to do with the contention that brainwashing is a newly discovered form of social influence involving a hitherto unknown social force. There is nothing about charismatic influence and the obedience it instills that is mysterious or asks us to posit the existence of a new force. On the contrary, everything about brainwashing can be explained entirely in terms of well-understood scientific principles. As Richard Ofshe has argued: ‘Studying the reform process demonstrates that it is no more or less difficult to understand than any other complex social process and produces no results to suggest that something new has been discovered. The only aspect of the reform process that one might suggest is new, is the order in which the influence procedures are assembled and the degree to which the target’s environment is manipulated in the service of social control. This is at most an unusual arrangement of commonplace bits and pieces’ (1992: 221-2).

Would-be debunkers of the brainwashing concept have argued that brainwashing theory is not just a theory of ordinary social influence intensified under structural conditions of ideological totalism, but is rather a ‘special’ kind of influence theory that alleges that free will can be overwhelmed and individuals brought to a state of mind in which they will comply with charismatic directives involuntarily, having surrendered the capability of saying no. Of course, if a theory of brainwashing really did rely upon such an intrinsically untestable notion, it would be reasonable to reject it outright.

The attack on this so-called involuntarist theory of brainwashing figures prominently in the debunking efforts of a number of scholars (Barker 1989; Hexham and Poewe 1997; Melton forthcoming), but is most closely identified with the work of Dick Anthony (1996), for whom it is the linchpin of the debunking argument. Anthony argues, without a shred of evidence that I have been able to discover, that the foundational work of Lifton and Schein, my own more recent theories (1998), and those of Richard Ofshe (1992) and Stephen Kent (Kent and Krebs 1998) are based upon what he calls the ‘involuntarism assumption.’ It is true that a number of prominent legal cases have hinged on the question of whether the plaintiff’s free will had been somehow overthrown (Richardson and Ginsburg 1998). But nowhere in the scientific literature has there been such a claim. Foundational brainwashing theory has not claimed that subjects were robbed of their free will. Neither the presence nor the absence of free will can ever be proved or disproved. The confusion stems from the difference between the word free as it is used in economics as an antonym for costly, and as it is used in philosophy as an antonym for deterministic. When brainwashing theory speaks of individuals losing the ability to freely decide to obey, the word is being used in the economic sense. Brainwashing imposes costs, and when a course of action has costs it is no longer free. The famous statement by Rousseau (1913: 3) that ‘Man is born free, and everywhere he is in chains,’ succinctly expresses the view that socialization can impose severe constraints on human behaviour. Throughout the social sciences, this is accepted almost axiomatically. It is odd that only in the sociology of new religious movements is the importance of socialization’s ability to constrain largely ignored.


Unidirectional versus Bi-directional Influence

The second issue has to do with controversy over whether there are particular personality types drawn to cults and whether members are better perceived as willing and active seekers or as helpless and victimized dupes, as if these were mutually exclusive alternatives. Those who focus on the importance of the particular traits that recruits bring to their cults tend to ignore the resocialization process (Anthony and Robbins 1994).3 Those who focus on the resocialization process often ignore personal predispositions (Singer and Ofshe 1990).

All this reminds me of being back in high school when people used to gossip about girls who ‘got themselves pregnant.’ Since that time, advances in biological theory have taught us to think more realistically of ‘getting pregnant’ as an interactive process involving influence in both directions. Similarly, as our understanding of totalistic influence in cults matures, I think we will abandon unidirectional explanations of cultic obedience in favour of more realistic, interactive ones. When that happens, we will find ourselves able to ask more interesting questions than we do now. Rather than asking whether it is the predisposing trait or a manipulative process that produces high levels of uncritical obedience, we will ask just what predisposing traits of individuals interact with just what manipulative actions by cults to produce this outcome.

A number of the debunking authors use this artificial and incorrect split between resocialization and predisposing traits to create a divide between cult brainwashing theory and foundational brainwashing theory as an explanation for ideological influence in China and Korea in the mid-twentieth century. Dick Anthony attempts to show that the foundational literature really embodied two distinct theories. One, he claims, was a robotic control theory that was mystical and sensationalist. The other was a theory of totalitarian influence that was dependent for its success upon pre-existing totalitarian beliefs of the subject which the program was able to reinvoke (Anthony 1996: i). Anthony claims that even though cultic brainwashing theory is descended from the former, it claims its legitimacy from its ties to the latter.

The problem with this distinction is that it is based upon a misreading of the foundational literature (Lifton 1989; Schein 1961). Lifton devotes chapter 5 of his book to a description of the brainwashing process. In chapter 22 he describes the social structural conditions that have to be present for this process to be effective. Anthony misunderstands this scientific distinction. He interprets it instead as evidence that Lifton’s work embodies two distinct theories: one bad and one good (Anthony and Robbins 1994). The ‘bad’ Lifton, according to Anthony, is the chapter 5 Lifton who describes a brainwashing process that may have gone on in Communist reindoctrination centres, but which, according to Anthony, has no applicability to contemporary cults. The ‘good’ Lifton, on the other hand, describes in chapter 22 a structural situation that Anthony splits off and calls a theory of thought reform. Anthony appears to like this ‘theory’ better because it does not involve anything that the cult actually does to the cult participant (Anthony and Robbins 1995). The cult merely creates a totalistic social structure that individuals with certain predisposing traits may decide that they want to be part of.

Unfortunately for Anthony, there are two problems with such splitting. One is that Lifton himself denies any such split in his theory (Lifton 1995, 1997). The second is that both an influence process and the structural conditions conducive to that process are necessary for any theory of social influence. As Lifton demonstrates in his recent application of his theory to a Japanese terrorist cult (Lifton 1999), process cannot be split off from structure in any study of social influence.


Condemnatory Label versus Contributory Factor

The third issue has to do with whether brainwashing is meant to replace other explanatory variables or work alongside them. Bainbridge (1997) and Richardson (1993) worry about the former, complaining that brainwashing explanations are intrinsically unifactoral, and thus inferior to the multifactoral explanations preferred by modern social science. But brainwashing theory has rarely, if ever, been used scientifically as a unifactoral explanation. Lifton (1999) does not attempt to explain all the obedience generated in Aum Shinrikyo by the brainwashing mechanism. My explanation of the obedience generated by the Bruderhof relies on numerous social mechanisms of which brainwashing is only one (Zablocki 1980). The same can be said for Ofshe’s explanation of social control in Synanon (1976). Far from being unifactoral, brainwashing is merely one essential element in a larger strategy for understanding how charismatic authority is channelled into obedience.

James Thurber once wrote a fable called The Wonderful O (1957), which depicted the cultural collapse of a society that was free to express itself using twenty-five letters of the alphabet but was forbidden to use the letter O for any reason. The intellectual convolutions forced on Thurber’s imaginary society by this ‘slight’ restriction are reminiscent of the intellectual convolutions forced on the NRM scholars by their refusal to include brainwashing in their models. It is not that these scholars don’t often have considerable insight into cult dynamics, but the poor mugs are, nevertheless, constantly getting overwhelmed by events that their theories are unable to predict or explain. You always find them busy playing catch-up as they scramble to account for each new cult crisis as it develops on an ad hoc basis. The inadequacy of their models cries out ‘specification error’ in the sense that a key variable has been left out.

The Thurberian approach just does not work. We have to use the whole alphabet of social influence concepts from Asch to Zimbardo (including the dreaded B-word) to understand cultic obedience. Cults are a complex social ecology of forces involving attenuation effects (Petty 1994), conformity (Asch 1951), crowd behaviour (Coleman 1990), decision elites (Wexler 1995), deindividuation (Festinger, Pepitone et. al. 1952), extended exchange (Stark 1999), groupthink (Janis 1982), ritual (Turner 1969), sacrifice and stigma (Iannaccone 1992), situational pressures (Zimbardo and Anderson 1993), social proof (Cialdini 1993), totalism (Lifton 1989), and many others. Personally, I have never seen a cult that was held together only by brainwashing and not also by other psychological factors, as well as genuine loyalty to ideology and leadership.

Arguments that brainwashing is really a term of moral condemnation masquerading as a scientific concept have emerged as a reaction to the efforts of some anti-cultists (not social scientists) to use brainwashing as a label to condemn cults rather than as a concept to understand them. Bromley (1998) has taken the position that brainwashing is not a variable at all but merely a peremptory label of stigmatization–a trope for an ideological bias, in our individualistic culture, against people who prefer to live and work more collectivistically. Others have focused on the obverse danger of allowing brainwashing to be used as an all-purpose moral excuse (It wasn’t my fault. I was brainwashed!), offering blanket absolution for people who have been cult members–freeing them from the need to take any responsibility for their actions (Bainbridge 1997; Hexham and Poewe 1997; Introvigne forthcoming; Melton forthcoming). While these allegations represent legitimate concerns about potential abuse of the concept, neither is relevant to the scientific issue. A disinterested approach will first determine whether a phenomenon exists before worrying about whether its existence is politically convenient.


Obtaining Members versus Retaining Members

The fourth issue has to do with a confusion over whether brainwashing explains how cults obtain members or how they retain them. Some cults have made use of manipulative practices like love-bombing and sleep deprivation (Galanti 1993), with some degrees of success, in order to obtain new members. A discussion of these manipulative practices for obtaining members is beyond the scope of this chapter. Some of these practices superficially resemble techniques used in the earliest phase of brainwashing. But these practices, themselves, are not brainwashing. This point must be emphasized because a false attribution of brainwashing to newly obtained cult recruits, rather than to those who have already made a substantial commitment to the cult, figures prominently in the ridicule of the concept by NRM scholars. A typical straw man representation of brainwashing as a self-evidently absurd concept is as follows: ‘The new convert is held mentally captive in a state of alternate consciousness due to “trance-induction techniques” such as meditation, chanting, speaking in tongues, self-hypnosis, visualization, and controlled breathing exercises … the cultist is [thus] reduced to performing religious duties in slavish obedience to the whims of the group and its authoritarian or maniacal leader’ (Wright 1998: 98).

Foundational brainwashing theory was not concerned with such Svengalian conceits, but only with ideological influence in the service of the retaining function. Why should the foundational theorists, concerned as they were with coercive state-run institutions like prisons, ‘re-education centres,’ and prisoner-of-war camps, have any interest in explaining how participants were obtained? Participants were obtained at the point of a gun.4 The motive of these state enterprises was to retain the loyalties of these participants after intensive resocialization ceased. As George Orwell showed so well in his novel 1984, the only justification for the costly indoctrination process undergone by Winston Smith was not that Smith love Big Brother while he was in prison, but that Big Brother be able to retain that love after Smith was deployed back into society. Nevertheless, both ‘cult apologists’ and ‘cult bashers’ have found it more convenient to focus on the obtaining function.


If one asks why a cult would be motivated to invest resources in brainwashing, it should be clear that this cannot be to obtain recruits, since these are a dime a dozen in the first place, and, as Barker (1984) has shown, they don’t tend to stick around long enough to repay the investment. Rather, it can only be to retain loyalty, and therefore decrease surveillance costs for valued members who are already committed. In small groups bound together only by normative solidarity, as Hechter (1987) has shown, the cost of surveillance of the individual by the group is one of the chief obstacles to success. Minimizing these surveillance costs is often the most important organizational problem such groups have to solve in order to survive and prosper. Brainwashing makes sense for a collectivity only to the extent that the resources saved through decreased surveillance costs exceed the resources invested in the brainwashing process. For this reason, only high-demand charismatic groups with totalistic social structures are ever in a position to benefit from brainwashing.5
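This cost-benefit condition can be written out explicitly. The sketch below is only a hedged illustration of the reasoning; every variable name and number in it is hypothetical, since none of these costs has been measured directly:

    # Hedged sketch of the cost-benefit condition described above.
    # All names and figures are hypothetical illustrations, not data.

    def brainwashing_pays(members, saving_per_member, subjects, cost_per_subject):
        """A collectivity benefits only if the surveillance costs saved on
        already-committed members exceed the resources invested in the process."""
        saved = members * saving_per_member
        invested = subjects * cost_per_subject
        return saved > invested

    # A high-demand totalistic group with large per-member surveillance
    # savings can satisfy the condition; a loose-knit group cannot.
    print(brainwashing_pays(100, 50.0, 10, 400.0))  # True: 5000 > 4000
    print(brainwashing_pays(100, 5.0, 10, 400.0))   # False: 500 < 4000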

This mistaken ascription of brainwashing to the obtaining function rather than the retaining function is directly responsible for two of the major arguments used by the ‘cult apologists’ in their attempt to debunk brainwashing. One has to do with a misunderstanding of the role of force and the other has to do with the mistaken belief that brainwashing can be studied with data on cult membership turnover.

The widespread belief that force is necessary for brainwashing is based upon a misreading of Lifton (1989) and Schein (1961). A number of authors (Dawson 1998; Melton forthcoming; Richardson 1993) have based their arguments, in part, on the contention that the works of foundational scholarship on brainwashing are irrelevant to the study of cults because the foundational literature studied only subjects who were forcibly incarcerated. However, Lifton and Schein have both gone on public record as explicitly denying that there is anything about their theories that requires the use of physical force or threat of force. Lifton has specifically argued (‘psychological manipulation is the heart of the matter, with or without the use of physical force’ [1995: xi]) that his theories are very much applicable to cults.6 The difference between the state-run institutions that Lifton and Schein studied in the 1950s and 1960s and the cults that Lifton and others study today is in the obtaining function not in the retaining function. In the Chinese and Korean situations, force was used for obtaining and brainwashing was used for retaining. In cults, charismatic appeal is used for obtaining and brainwashing is used, in some instances, for retaining.

A related misconception has to do with what conclusions to draw from the very high rate of turnover among new and prospective recruits to cults. Bainbridge (1997), Barker (1989), Dawson (1998), Introvigne (forthcoming), and Richardson (1993) have correctly pointed out that in totalistic religious organizations very few prospective members go on to become long-term members. They argue that this proves that the resocialization process cannot be irresistible and therefore it cannot be brainwashing. But nothing in the brainwashing model predicts that it will be attempted with all members, let alone successfully attempted. In fact, the efficiency of brainwashing, operationalized as the expected yield of deployable agents7 per 100 members, is an unknown (but discoverable) parameter of any particular cultic system and may often be quite low. For the system to be able to perpetuate itself (Hechter 1987), the yield need only produce enough value for the system to compensate it for the resources required to maintain the brainwashing process.

Moreover, the high turnover rate in cults is more complex than it may seem. While it is true that the membership turnover is very high among recruits and new members, this changes after two or three years of membership when cultic commitment mechanisms begin to kick in. This transition from high to low membership turnover is known as the Bainbridge Shift, after the sociologist who first discovered it (Bainbridge 1997: 141-3). After about three years of membership, the annual rate of turnover sharply declines and begins to fit a commitment model rather than a random model.8
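The shape of this shift is easy to illustrate. The toy model below is a hedged sketch only; the functional forms and rates are invented for exposition, not estimated from any membership data:

    # Toy model of the Bainbridge Shift described above: a high, roughly
    # constant annual leaving probability for recruits and new members
    # ('random model'), shifting after about three years to a probability
    # inversely dependent on tenure ('commitment model'; see footnote 8).
    # All numbers are hypothetical.

    def annual_leaving_probability(years_of_membership):
        if years_of_membership < 3:
            return 0.5                       # high turnover among new members
        return 0.3 / years_of_membership     # declines with accumulated tenure

    for t in range(1, 8):
        print(t, round(annual_leaving_probability(t), 3))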

Membership turnover data is not the right sort of data to tell us whether a particular cult practises brainwashing. The recruitment strategy whereby many are called but few are chosen is a popular one among cults. In several groups in which I have observed the brainwashing process, there was very high turnover among initial recruits. Brainwashing is too expensive to waste on raw recruits: since it is a costly process, it generally will not pay for a group to even attempt to brainwash one of its members until that member has already demonstrated some degree of staying power on her own.9


Psychological Traces

The fifth issue has to do with the question of whether brainwashing leaves any long-lasting measurable psychological traces in those who have experienced it. Before we can ask this question in a systematic way, we have to be clear about what sort of traces we should be looking for. There is an extensive literature on cults and mental health. But whether cult involvement causes psychological problems is a much more general question than whether participation in a traumatic resocialization process leaves any measurable psychological traces.

There has been little consensus on what sort of traces to look for. Richardson and Kilbourne (1983: 30) assume that brainwashing should lead to insanity. Lewis (1983: 30) argues that brainwashing should lead to diminished IQ scores. Nothing in brainwashing theory would lead us to predict either of these outcomes. In fact, Schein points out that ‘The essence of coercive persuasion is to produce ideological and behavioral change in a fully conscious, mentally intact individual’ (1959: 437). Why in the world would brainwashers invest scarce resources to produce insanity and stupidity in their followers? However, these aforementioned authors (and others) have taken the absence of these debilitative effects as ‘proof’ that brainwashing doesn’t happen in cults. At the same time, those who oppose cults have had an interest, driven by litigation rather than science, in making exaggerated claims for mental impairment directly resulting from brainwashing. As Farrell has pointed out, ‘From the beginning, the idea of traumatic neurosis has been accompanied by concerns about compensation’ (1998: 7).

Studies of lingering emotional, cognitive, and physiological effects on ex-members have thus far shown inconsistent results (Katchen 1997; Solomon 1981; Ungerleider and Wellisch 1983). Researchers studying current members of religious groups have found no significant impairment or disorientation. Such results have erroneously been taken as evidence that the members of these groups could, therefore, not possibly have been brainwashed. However, these same researchers found these responses of current members contaminated by elevations on the ‘Lie’ scale, exemplifying ‘an intentional attempt to make a good impression and deny faults’ (Ungerleider and Wellisch 1983: 208). On the other hand, studies of ex-members have tended to show ‘serious mental and emotional dysfunctions that have been directly caused by cultic beliefs and practices’ (Saliba 1993: 106). The sampling methods of these latter studies have been challenged (Lewis and Bromley 1987; Solomon 1981), however, because they have tended to significantly over-sample respondents with anti-cult movement ties. With ingenious logic, this has led Dawson (1998: 121) to suggest in the same breath that cult brainwashing is a myth but that ex-member impairment may be a result of brainwashing done by deprogrammers.

All this controversy is not entirely relevant to our question, however, because there is no reason to assume that a brainwashed person is going to show elevated scores on standard psychiatric distress scales. In fact, for those for whom making choices is stressful, brainwashing may offer psychological relief. Galanter’s research has demonstrated that a cult ‘acts like a psychological pincer, promoting distress while, at the same time, providing relief’ (1989: 93). As we shall see below, the brainwashing model predicts impairment and disorientation only for people during some of the intermediate stages, not at the end state. The popular association of brainwashing with zombie or robot states comes out of a misattribution of the characteristics of people going through the traumatic brainwashing process to people who have completed the process. The former really are, at times, so disoriented that they appear to resemble caricatures of zombies or robots. The glassy eyes, inability to complete sentences, and fixed eerie smiles are characteristics of disoriented people under randomly varying levels of psychological stress. The latter, however, are, if the process was successful, functioning and presentable deployable agents.


Establishing causal direction in the association between cult membership and mental health is extremely tricky, and little progress has been made thus far. In an excellent article reviewing the extensive literature in this area, Saliba (1993: 108) concludes: ‘The study of the relationship between new religious movements and mental health is in its infancy.’ Writing five years later, Dawson (1998: 122) agrees that this is still true, and argues that ‘the inconclusive results of the psychological study of members and ex-members of NRMs cannot conceivably be used to support either the case for or against brainwashing.’ Saliba calls for prospective studies that will establish baseline mental health measurements for individuals before they join cults, followed by repeated measures during and afterward. While this is methodologically sensible, it is impractical because joining a cult is both a rare and unexpected event. This makes the general question of how cults affect mental health very difficult to answer.

Fortunately, examining the specific issue of whether brainwashing leaves psychological traces may be easier. The key is recognizing that brainwashing is a traumatic process, and, therefore, those who have gone through it should experience an increasing likelihood in later years of post-traumatic stress disorder. The classic clinical symptoms of PTSD — avoidance, numbing, and increased arousal (American Psychiatric Association 1994: 427) — have been observed in many ex-cult members regardless of their mode of exit and current movement affiliations (Katchen 1997; Zablocki 1999). However, these soft and somewhat subjective symptoms should be viewed with some caution given recent controversies over the ease with which symptoms such as these can be iatrogenically implanted, as, for example, false memories (Loftus and Ketcham 1994).

In the future, avenues for more precise neurological tracking may become available. Judith Herman (1997: 238) has demonstrated convincingly that ‘traumatic exposure can produce lasting alterations in the endocrine, autonomic, and central nervous systems … and in the function and even the structure of specific areas of the brain.’ It is possible in the future that direct evidence of brainwashing may emerge from brain scanning using positron emission tomography. Some preliminary research in this area has suggested that, during flashbacks, specific areas of the brain involved with language and communication may be inactivated (Herman 1997: 240; Rauch, van der Kolk, et. al. 1996). Another promising area of investigation of this sort would involve testing for what van der Kolk and McFarlane (1996) have clinically identified as ‘the black hole of trauma.’ It should be possible to determine, once measures have been validated, whether such traces appear more often in individuals who claim to have gone through brainwashing than in a sample of controls who have been non-brainwashed members of cults for equivalent periods of time.


Separating the Investigative Steps

The final issue is a procedural one. There are four sequential investigative steps required to resolve controversies like the one we have been discussing. These steps are concerned with attempt, existence, incidence, and consequence. A great deal of confusion comes from nothing more than a failure to recognize that these four steps need to be kept analytically distinct from one another.

To appreciate the importance of this point, apart from the heat of controversy, let us alter the scene for a moment and imagine that the scientific conflict we are trying to resolve is over something relatively innocuous — say, vegetarianism. Let us imagine that on one side we have a community of scholars arguing that vegetarianism is a myth, that nobody would voluntarily choose to live without eating meat and that anyone who tried would quickly succumb to an overpowering carnivorous urge. On the other side, we have another group of scholars arguing that they had actually seen vegetarians and observed their non-meat-eating behaviour over long periods of time, and that, moreover, vegetarianism is a rapidly growing social problem with many new converts each year being seduced by this enervating and debilitating diet.

It should be clear that any attempt to resolve this debate scientifically would have to proceed through the four sequential steps mentioned above. First, we would have to find out if anybody ever deliberately attempts to be a vegetarian. Maybe those observed not eating meat were simply unable to obtain it. If somebody could be found voluntarily attempting to follow a vegetarian diet, we would next have to observe him carefully enough and long enough to find out whether he succeeds in abstaining from meat. If we observe even one person successfully abstaining from meat, we would have to conclude that vegetarianism exists, increasing our confidence in the theory of the second group of researchers. But the first group could still argue, well, maybe you are right that a few eccentric people here and there do practise vegetarianism, but not enough to constitute a social phenomenon worth investigating. So, the next step would be to measure the incidence of vegetarianism in the population. Out of every million people, how many do we find following a vegetarian diet? If it turns out to be very few, we can conclude that, while vegetarianism may exist as a social oddity, it does not rise to the level of being a social phenomenon worthy of our interest. If, however, we find a sizeable number of vegetarians, we still need to ask, ‘So what?’ This is the fourth of our sequential steps. Does the practice of vegetarianism have any physical, psychological, or social consequences? If so, are these consequences worthy of our concern?

Each of these investigative steps requires attention focused on quite distinct sets of substantive evidence. For this reason, it is important that we not confuse them with one another as is so often done in ‘apologist’ writing about brainwashing, where the argument often seems to run as follows: Brainwashing doesn’t exist, or at least it shouldn’t exist, and even if it does the numbers involved are so few, and everybody in modern society gets brainwashed to some extent, and the effects, if any, are impossible to measure. Such arguments jump around, not holding still long enough to allow for orderly and systematic confirmation or disconfirmation of each of the steps.

Once we recognize the importance of keeping the investigative steps methodologically distinct from one another, it becomes apparent that the study of brainwashing is no more problematic (although undoubtedly much more difficult) than the study of an advertising campaign for a new household detergent. It is a straightforward question to ask whether or not some charismatic groups attempt to practise radical techniques of socialization designed to turn members into deployable agents. If the answer is no, we stop because there can be no brainwashing. If the answer is yes, we go on to a second question: Are these techniques at least sometimes effective in producing uncritical obedience? If the answer to this question is yes (even for a single person), we know that brainwashing exists, although it may be so rare as to be nothing more than a sociological oddity. Therefore, we have to take a third step and ask: How frequently is it effective? What proportion of those who live in cults are subjected to brainwashing, and what proportion of these respond by becoming uncritically obedient? And, finally, we need to ask a fourth important question: How long do the effects last? Are the effects transitory, lasting only as long as the stimulus continues to be applied, or are they persistent for a period of time thereafter, and, if so, how long? Let us keep in mind the importance of distinguishing attempt from existence, from incidence, from consequences.
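The same sequence can be summarized as a simple decision procedure. In the sketch below, each argument is a placeholder for a distinct body of empirical evidence to be gathered at that step; nothing in it asserts actual findings:

    # Sketch of the four sequential investigative steps: attempt,
    # existence, incidence, consequence. Each argument stands in for a
    # body of evidence; all values are placeholders, not empirical claims.

    def investigate(attempted, ever_effective, widespread, effects_persist):
        if not attempted:
            return 'stop: no group attempts brainwashing'
        if not ever_effective:
            return 'stop: attempts never produce uncritical obedience'
        if not widespread:
            return 'brainwashing exists, but only as a sociological oddity'
        if not effects_persist:
            return 'effects are transitory, ceasing with the stimulus'
        return 'a persistent, non-trivial phenomenon worthy of study'

    print(investigate(True, True, True, True))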

To be continued…



  1. When I speak of ego dystonic behaviour, I refer to behaviour that was ego dystonic to the person before joining the cult and after leaving the cult.
  2. I have no doubt that Introvigne, who is a European attorney, is sincere in his desire to stifle brainwashing research out of fear that any suggestion that brainwashing might possibly occur in cults will be seized on by semi-authoritarian government committees eager to suppress religious liberty. Personally, I applaud Introvigne’s efforts to protect the fragile tree of religious freedom of choice in the newly emerging democracies of Eastern Europe. But I don’t appreciate his doing so by (perhaps inadvertently) sticking his thumb on the scales upon which social scientists attempt to weigh evidence.
  3. The Anthony and Robbins article cited demonstrates how little we really know about traits that may predispose people to join cults. They say ‘…some traditionally conservative religious groups attract people who score highly on various measures of totalitarianism, e.g., the F scale or Rokeach’s Dogmatism scale… It seems likely that these results upon certain Christian groups would generalize to alternative religious movements or cults, as many of them have theological and social beliefs that seem similar to those in some fundamentalist denominations’ (1994: 470). Perhaps, but perhaps not. No consensus has yet emerged from numerous attempts to find a cult personality type, but this seems like a promising area of research to continue.
  4. Some, it is true, were nominally volunteers into re-education programs. However, the power of the state to make their lives miserable if they did not volunteer cannot be ignored.
  5. Unfortunately, however, uncritical obedience can be wayward and dangerous. It can be useful to a cult leader when the cult is functioning well. But it often has been perverted to serve a destructive or self-destructive agenda in cults that have begun to disintegrate.
  6. Some confusion on this subject has emerged from the fact that Lifton has distanced himself from those attempting to litigate against cults because of alleged brainwashing. He has constantly argued (and I wholeheartedly agree) that brainwashing, in and of itself, where no force is involved, should not be a matter for the law courts.
  7. Formal definitions for this and other technical terms will be presented in the next section of this chapter.
  8. In other words, the probability of a person’s leaving is inversely dependent upon the amount of time he or she has already spent as a member.
  9. The ‘cult-basher’ version of brainwashing theory has played into this misunderstanding by confounding manipulative recruitment techniques (like sleep deprivation and ‘love-bombing’) with actual brainwashing. While there may be some overlap in the actual techniques used, the former is a method for obtaining new members, whereas brainwashing is a method for retaining old members.

Do We Need to Know Whether Cults Engage in Brainwashing? (Benjamin Zablocki, 2001)

NOTE: The following article is taken from the 5th chapter of Misunderstanding Cults: Searching for Objectivity in a Controversial Field, entitled, Towards a Demystified and Disinterested Scientific Theory of Brainwashing.


Nobody likes to lose a customer, but religions get more touchy than most when faced with the risk of losing devotees they have come to define as their own. Historically, many religions have gone to great lengths to prevent apostasy, believing virtually any means justified to prevent wavering parishioners from defecting and thus losing hope of eternal salvation. In recent centuries, religion in our society has evolved from a system of territorially based near-monopolies into a vigorous and highly competitive faith marketplace in which many churches, denominations, sects, and cults vie with one another for the allegiance of ‘customers’ who are free to pick and choose among competing faiths. Under such circumstances, we should expect to find that some of the more tight-knit and fanatical religions in this rough-and-tumble marketplace will have developed the sort of sophisticated persuasive techniques that are known in the literature by the controversial term ‘brainwashing.’ This chapter is devoted to a search for a scientific definition of brainwashing and an examination of the evidence for the existence of brainwashing in cults. I believe that research on this neglected subject is important for a fuller understanding of religious market dynamics.1 And, ultimately, research on this subject may yield a wider dividend as well, assisting us in our quest for a fuller understanding of mass charismatic movements such as Fascism, Nazism, Stalinism, and Maoism.

Do We Need to Know Whether Cults Engage in Brainwashing?

The question of why people obey the sometimes bizarrely insane commands of charismatic leaders, even unto death, is one of the big unsolved mysteries of history and the social sciences. If there are deliberate techniques that charismatic leaders (and charismatically led organizations) use to induce high levels of uncritical loyalty and obedience in their followers, we should try to understand what these techniques are and under what circumstances and how well they work.

This chapter is about nothing other than the process of inducing ideological obedience in charismatic groups. Many people call this process brainwashing, but the label is unimportant. What is important is that those of us who want to understand cults develop models that recognize the importance that some cults give to strenuous techniques of socialization designed to induce uncritical obedience to ideological imperatives regardless of the cost to the individual.

Obedience is Life

The systematic study of obedience has slowed down considerably within the behavioural sciences. Early laboratory studies of obedience-inducing mechanisms got off to a promising start in the 1960s and 1970s, but were correctly criticized by human rights advocates for putting laboratory subjects under unacceptable levels of stress (Kelman and Hamilton 1989; Milgram 1975; Zimbardo 1973). Permission to do obedience-inducing experiments on naive experimental subjects became almost impossible to obtain, and this sort of laboratory experiment virtually ceased. However, large numbers of charismatic cultic movements appeared on the scene just in time to fill the vacuum left by the abandoned laboratory studies. As naturally occurring social ‘experiments,’ such groups allowed obedience-induction to be studied ethnographically without raising the ethical objections that had been raised concerning laboratory studies.

Social theorists are well aware that an extremely high degree of obedience to authority is a reliably recurring feature of charismatic cult organizations (Lindholm 1990; Oakes 1997). But most social scientists interested in religion declined this opportunity. For reasons having more to do with political correctness than scientific curiosity, most of them refused to design research focused on obedience-induction. Many even deny that deliberate programs of obedience-induction ever occur in cults.

The existence of a highly atypical form of obedience to the dictates of charismatic leaders is not in question. Group suicides at the behest of a charismatic leader are probably the most puzzling of such acts of obedience (Hall 2000; Lalich 1999; Weightman 1983), but murder, incest, child abuse, and child molestation constitute other puzzling examples for which credible evidence is available (Bugliosi and Gentry 1974; Lifton 1999; Rochford 1998). However, agreement on these facts is not matched, as we shall see, by agreement on the causes of the obedience, its pervasiveness among cult populations, or the rate at which it decays after the influence stimuli are removed.


But given the fact that only a small proportion of the human population ever join cults, why should we care? The answer is that the sociological importance of cults extends far beyond their numerical significance. Many cults are harmless and fully deserving of protection of their religious and civil liberties. However, events of recent years have shown that some cults are capable of producing far more social harm than one might expect from the minuscule number of their adherents. The U.S. State Department’s annual report on terrorism for the year 2000 concludes that ‘while Americans were once threatened primarily by terrorism sponsored by states, today they face greater threats from loose networks of groups and individuals motivated more by religion or ideology than by politics’ (Miller 2000: 1).

In his recent study of a Japanese apocalyptic cult, Robert Jay Lifton (1999: 343) has emphasized this point in the following terms:   


‘Consider Asahara’s experience with ultimate weapons…With a mad guru and a few hundred close followers, it is much easier to see how the very engagement with omnicidal weapons, once started upon, takes on a psychological momentum likely to lead either to self-implosion or to world explosion…Asahara and Aum have changed the world, and not for the better. A threshold has been crossed. Thanks to this guru, Aum stepped over a line that few had even known was there. Its members can claim the distinction of being the first group in history to combine ultimate fanaticism with ultimate weapons in a project to destroy the world. Fortunately, they were not up to the immodest task they assigned themselves. But whatever their bungling, they did cross that line, and the world will never quite be the same because, like it or not, they took the rest of us with them.’

Potentially fruitful scientific research on obedience in cultic settings has been stymied by the well-intentioned meddling of two bitterly opposed, but far from disinterested, scholarly factions. On the one hand, there has been an uncompromising outcry of fastidious naysaying by a tight-knit faction of pro-religion scholars. Out of a fear that evidence of powerful techniques for inducing obedience might be used by religion’s enemies to suppress the free expression of unpopular religions, the pro-religion faction has refused to notice the obvious and has engaged in a concerted (at times almost hysterical) effort to sweep under the rug any cultic-obedience studies not meeting impossibly rigorous controlled experimental standards (Zablocki 1997). On the other hand, those scholars who hate or fear cults have not been blameless in the pathetic enactment of this scientific farce. Some of them have tried their best to mystically transmute the obedience-inducing process that goes on in some cults from a severe and concentrated form of ordinary social influence into a magic spell that somehow allows gurus to snap the minds and enslave the wills of any innocent bystander unlucky enough to come into eye contact. By so doing, they have marginalized themselves academically and provided a perfect foil for the gibes of pro-religion scholars.

Brainwashing is the most commonly used word for the process whereby a charismatic group systematically induces high levels of ideological obedience. It would be naively reductionistic to try to explain cultic obedience entirely in terms of brainwashing. Other factors, such as simple conformity and ritual, induce cultic obedience as well. But it would be an equally serious specification error to leave deliberate cultic manipulation of personal convictions out of any model linking charismatic authority to ideological obedience.


However, the current climate of opinion, especially within the sociology of new religious movements, is not receptive to rational discussion of the concept of brainwashing, and still less to research in this area. Brainwashing has for too long been a mystified concept, and one that has been the subject of tendentious writing (thinly disguised as theory testing) by both its friends and enemies. My aim in this chapter is to rescue for social science a concept of brainwashing freed from both mystification and tendentiousness. I believe it is important and long overdue to restore some detachment and objectivity to this field of study.

The goal of achieving demystification will require some analysis of the concept’s highly freighted cultural connotations, with particular regard to how the very word brainwash became a shibboleth in the cult wars. It is easy to understand how frightening it may be to imagine that there exists some force that can influence one down to the core level of basic beliefs, values, and worldview. Movies like The Manchurian Candidate have established in the popular imagination the idea that there exists some mysterious technique, known only to a few, that confers such power. Actually, as we will see, the real process of brainwashing involves only well-understood processes of social influence orchestrated in a particularly intense way. It still is, and should be, frightening in its intensity and capacity for extreme mischief, but there is no excuse for refusing to study something simply because it’s frightening.


The goal of establishing scientific disinterest will require the repositioning of the concept more fully in the domain of behavioural and social science rather than its present domain, which is largely that of civil and criminal legal proceedings. It is in this domain that it has been held hostage and much abused for more than two decades. The maxim of scholarly disinterest requires the researcher to be professionally indifferent as to whether our confidence in any given theory (always tentative at best) is increased or decreased by research. But many scholarly writers on this subject have become involved as expert witnesses, on one side or the other, in various law cases involving allegations against cult leaders or members (where witnesses are paid to debate in an arena in which the only possible outcomes are victory or defeat). This has made it increasingly difficult for these paid experts to cling to a disinterested theoretical perspective.

In my opinion, the litigational needs of these court cases have come, over the years, to drive the scientific debate to an alarming degree. There is a long and not especially honourable history of interest groups that are better armed with lawyers than with scientific evidence, and that use the law to place unreasonable demands on science. One need only think of the school segregationists’ unreasonable demands, fifty years ago, that science prove that any specific child was harmed in a measurable way by a segregated classroom; or the tobacco companies’ demands, forty years ago, that science demonstrate the exact process at the molecular level by which tobacco causes lung cancer. Science can serve the technical needs of litigation, but, when litigation strategies set the agenda for science, both science and the law are poorer for it.

My own thirty-six years of experience doing research on new religious movements has convinced me beyond any doubt that brainwashing is practised by some cults some of the time on some of their members with some degree of success. Even though the number of times I have used the vague term some in the previous sentence attests to the many still-unanswered questions about this phenomenon, I do not personally have any doubt about brainwashing’s existence. But I have also observed many cults that do not practise brainwashing, and I have never observed a cult in which brainwashing could be reasonably described as the only force holding the group together. My research (Zablocki 1971; 1991; 1996; Zablocki and Aidala 1991) has been ethnographic, comparative, and longitudinal. I have lived among these people and watched the brainwashing process with my own eyes. I have also interviewed people who participated in the process (both as perpetrators and subjects). I have interviewed many of these respondents not just one time but repeatedly over the course of many years. My selection of both cults and individuals to interview has been determined by scientific sampling methods (Zablocki 1980: app A), not guided by convenience nor dictated by the conclusions I hoped to find. Indeed, I have never had an axe to grind in this field of inquiry. I didn’t begin to investigate cults in the hope of finding brainwashing. I was surprised when I first discovered it. I insist on attempting to demonstrate its existence not because I am either for or against cults but only because it seems to me to be an incontrovertible, empirical fact.

Although my own ethnographic experience leads me to believe that there is overwhelming evidence that brainwashing is practised in some cults, my goal in this chapter is not to ‘prove’ that brainwashing exists, but simply to rescue it from the world of bogus ideas to which it has been banished unfairly, and to reinstate it as a legitimate topic of social science inquiry. My attempt to do so in this chapter will involve three steps. First, I will analyse the cultural misunderstandings that have made brainwashing a bone of contention rather than a topic of inquiry. Second, I will reconstruct the concept in a scientifically useful and empirically testable form within the framework of social influence theory. Third, I will summarize the current state of evidence (which seems to me to be quite compelling) that some cults do in fact engage in brainwashing with some degree of success.

To be continued…


  1. Most of the examples in this chapter will be drawn from studies of religious cults because these are ones with which I am most familiar through my research. But it should be noted that cults need not be religious, and that there are plenty of examples of brainwashing in political and psychotherapeutic cults as well.

“Cuckoo’s Nest”: Grigoriou Monastery on the Holy Mountain (Vasos Vasileiou, 2010)

NOTE: The following article is taken from the Cypriot newspaper “Phileleftheros” (Ό Φιλελεύθερος), December 18, 2010, p. 23. The article contains the accusations of a hieromonk who was ousted from the monastery 22 years after his admittance, and who denounces methods of controlling the monks via the administration of psychiatric drugs. http://www.zougla.gr/page.ashx?pid=80&aid=227195&cid=122

Fr. Christodoulos

The monk, who “was expelled” from the Grigoriou monastery1 on Mount Athos 22 years after his admittance, denounces methods reminiscent of the movie One Flew Over the Cuckoo’s Nest.2 According to his allegations, methods of controlling the monks were applied with the administration of psychiatric drugs. The complaints come from Father Christodoulos3 who also produced a movie clip which shows him tied with leg padlocks to a bed, in a room of the Thessaloniki Hospital, where he was brought for “treatment.”

Fr. Christodoulos maintains he was of sound mind. He cites the opinion of the Cypriot psychiatrist Yiangos Mikelidis,4 who states that he examined Father Christodoulos and that “he is not suffering from any serious mental illness and has no need of treatment.”

News of the monk’s so-called mental illness reached, as he says, “up to the Prefect of Thessaloniki,” whose testimony was invoked to register a complaint against the Abbot of Gregoriou5 Monastery for slanderous libel. Furthermore, he accuses the monastery’s administration of not returning money that he secured from the sale of his own real estate. Father Christodoulos was not the “typical” type of monk: he sought the Abbot’s resignation, he went on hunger strike twice, and, though he was not particularly obedient, he remained an administrator of the monastery.

When they gave him a certificate of discharge and he refused to leave the monastery, the Monastery’s administration called up policemen from Karyes, who accompanied him off the Holy Mountain. The monk’s belongings, 47 boxes of various personal items, were transferred to Karyes but were not delivered to him upon his expulsion.

“They tried to make me crazy”

Fr. Christodoulos (Nicholas Diamantopoulos in the world) spoke to “Φ” about everything he claims happened in Grigoriou Monastery:6

“I joined the Grigoriou Monastery in 1987 at age 30. In 2003 I went on a hunger strike demanding the resignation of the Abbot because he could no longer fully exercise his duties. The Abbot gave me a handwritten letter in which he resigned and asked me to pass it to the elderly congregation (a copy was given to “Φ”).

Geronda George Kapsanis, former Abbot of Grigoriou Monastery (d. June 2014)

“I raised the issue of the resignation before the elderly assembly (composed of seven monks) but was told they did not accept it. I returned to the Abbot and asked to be heard by the whole fraternity, consisting of about 70 monks. I presented my position before them, and they thereupon prepared a document calling on the Public Prosecutor of Thessaloniki to lock me in a mental hospital. With the mobilization of the police, they led me to the mental hospital. The psychiatrist happened to be a fellow student of the Monastery’s doctor, the one who had sent me to the psychiatric hospital. I mentioned to the psychiatrist that I had differences with the Monastery’s administration and explained that the administration wanted to use him to make me out to be crazy.

“I called my brother from a phone booth and explained that they wanted to declare me insane. Until he came, they tied me to a bed with the help of a security guard, using straps and a padlock. When my two brothers came to ask me what had happened, no attention was paid to them. When they saw me tied up, they filmed a clip with a camera and warned those responsible at the hospital that it would be made public if they continued to keep me bound. My brothers said they would take me to another psychiatrist, one not influenced by the monastery. It took three days of contacts and interventions before I was allowed to leave.

Monk Christodoulos strapped with padlocks to a bed in Thessaloniki Hospital.


“I went to another psychiatrist who, after a month of visits, concluded that I suffer from a mixed personality disorder, which has nothing to do with mental illness or any other serious illness. The doctor told me that I could return to the monastery with no problem. I returned to the monastery, where they accepted me. (Last May I went to a psychiatrist, Giagkos Mikellides, who, after examining me, opined in writing that I do not suffer from any serious mental illness and have no need of treatment. A copy of the opinion was made available to “Φ”.)

“In 2004 a priest-monk threatened me, saying they would expel me from the monastery. I started a hunger strike and sought the Abbot’s resignation. An assembly occurred, minus the Abbot, who was then outside Mount Athos, and I was told that either they would deport me from Mount Athos or I would go to a psychiatrist in Patras.

“I told them that I accepted going to a psychiatrist. I went off to my hometown in the Peloponnese without seeing a psychiatrist. When I returned a week later, the Abbot didn’t say anything to me, nor did he ask me what had happened with the psychiatrist. This means that two powers co-exist in the monastery: on the one hand the Abbot, and on the other an elderly congregation that insists on making me out to be a mental patient.

“The elderly congregation had a problem because, when I went out with permission, I travelled abroad instead of only in Greece. For this “indiscipline” I received small chastisements, such as being barred from chanting, etc.

“In 2006 they changed the exit certificate and restricted my travels to Greece only. On one occasion the Abbot obliged me to give him 500 prayer ropes, which I had made, to enable me to go on a pilgrimage to the Patriarchate. Since then, when I go out with permission, I travel abroad without the blessing of the Abbot.


“This year in March I went to the Abbot and asked him to convene the fraternity and invite anyone who had something against me to say it before all. He threatened me with a curse (that he would curse me) because I was asking for things beyond obedience. When he threatened me with a curse, I wrote a curse of my own, noting that if I was right the curse should fall upon the head of the Abbot; if not, it should fall on mine.7 After that, I came to Cyprus, where I spent Pascha, and when I returned I was called to the synaxis and asked to explain my behavior.

“I told them that I cannot respect them to the depth they want, since in 2003 they tried to make me out to be crazy.

“Afterwards, they gave me a certificate for insult and contempt towards the Abbot, but I returned it because it did not have his signature.

“They insisted that I leave. I didn’t leave, so they brought in the police, who escorted me to Karyes.

“The Abbot told the Prefect of Thessaloniki, Mr. Psomiadis, that I’m a mental patient. I then registered a lawsuit against the Abbot for slander, which is pending before the Court.8

Fr. Christodoulos on Mount Athos


To this day, Fr. Christodoulos still speaks out and references the injustices he suffered while living as a monk at Grigoriou Monastery. Here is a recent example, dated January 15, 2016:

“Many who know the details of my monastic life urge me to write an autobiography. If I decide to do such a “crazy thing,” the dead will roll in their graves, as will the bones of those who are alive—the guileful, treacherous rassaphore monks of Grigoriou Monastery, Mt. Athos, who through plots and intrigues that even the Italian Camorra would envy have continually tried to shut my mouth, slander me, humiliate me, and ridicule me, with processes that reach beyond the limits of a murder attempt at my expense.”

“I have evidence and documents stored electronically that would overturn the thrones of Churches (and not just sovereigns) if I were to publish them!!!”


  1. Grigoriou Monastery (Greek: Γρηγορίου) is situated on the southwest side of the Athos Peninsula in northern Greece, between the monasteries of Dionysiou and Simonopetra. Grigoriou originally was dedicated to St. Nicholas but was later renamed in honor of its founder, Gregory. It is ranked seventeenth in the hierarchical order of the twenty monasteries located on the Mount Athos peninsula. Grigoriou is reputed to be one of the most well-organized and strict coenobitic monasteries on the Mount Athos peninsula.
  2. One Flew Over the Cuckoo’s Nest is a 1975 American drama film directed by Miloš Forman, based on the 1962 novel One Flew Over the Cuckoo’s Nest by Ken Kesey. Now considered to be one of the greatest films ever made, One Flew Over the Cuckoo’s Nest is No. 33 on the American Film Institute’s 100 Years… 100 Movies list. In 1993, it was deemed “culturally, historically, or aesthetically significant” by the United States Library of Congress and selected for preservation in the National Film Registry.
  3. A blog exists under the name Χριστόδουλος Μοναχός Γρηγοριάτης (Monk Christodoulos Grigoriatis)—though there is no validation that Monk Christodoulos actually wrote the posts contained therein, especially since he continues to speak out about his experiences at Grigoriou Monastery to this day. A month after Abbot George’s resignation, the following retraction was posted on this blog: “My esteemed Geronda, beloved fathers and brothers, please consider everything I posted on this blog as invalid. I recall all of my posts and have deleted them! I seek forgiveness from all of you, hoping that I will obtain favorable treatment. Pray for my salvation, as I pray for yours! My Metanoia [repentance or prostration] to all of you! Monk Christodoulos.” http://monaxoschristodoulos.blogspot.com/2014/03/blog-post.html
  4. Yiangos passed away in August 2014 at the age of 68. http://www.attacktv.gr/news/Pages/view.aspx?nID=28222
  5. Archimandrite George Kapsanis resigned from his abbacy in February 2014 for reasons unknown. He died on the day of Pentecost later that year (June 8, 2014).
  6. In April 2014, a blog existing under the name “Monk Christodoulos Grigoriatis” posted “My Second Sorry to Grigoriou Monastery.” This “Epistle of Repentance to Geronda George Kapsanis and the Holy Monastery of Grigoriou, Mount Athos” sounds more like a PR campaign contrived by the monastery. To this day, when talking about his experiences at Grigoriou Monastery, Christodoulos speaks quite differently from the content found in this epistle. Here is the epistle in its entirety:


My esteemed Geronda,


Since April 2010, I have written and published on the internet, and announced through whatever mass media (television, radio, newspapers) gave me a platform, unfounded, obscene, and other charges against you and against the brothers of the Monastery.

I recognize fully that both you and the brothers of the monastery are persons above reproach in every respect and that my accusations were untrue. But now I am fully aware of the truth and repent for what I did. I confess that I caused you great grief and emotional pain, and also scandalized many people who did not know the ethos of Grigoriou Monastery. I publicly apologize for this, both to you and the brothers of the monastery and to the people I scandalized.

As a minimum indication of my practical repentance, I have already deleted the website that I maintained with the unjust and false accusations which I addressed to you and the brothers of the monastery, and I have posted two letters of apology online (this one and the preceding one that I sent).

I hope that in this way I can restore, albeit slightly, the harm I caused you.

Because I am a monk and I look forward to my salvation, I offer my metanoia [repentance or prostration] and ask for your blessing.

I wish you a good and blessed Pascha in the love of the Lord!

My repentance towards my former monastery: Fr. George Kapsanis, Elder Fr. Christopher, the Fathers of the Holy Assembly, and all the fathers of the monastery. Evlogeite! Your blessing!!

The signature that follows is genuinely mine.

Monk Christodoulos


  7. Geronda Ephraim teaches that cursing clergymen never works and it always falls back on the curser seven-fold. However, a curse by a clergyman always sticks due to the grace of ordination. In this case, both participants are ordained priests; thus, the curse by whichever hieromonk is in the right would have stuck.
  8. There does not seem to be any information about these proceedings available on the web.



Biderman’s Chart of Coercion

NOTE: This article is based on the writings of Albert D. Biderman, a sociologist who worked for the USAF in the 1950s. Biderman showed how Chinese and Korean interrogators used techniques including sleep deprivation, darkness or bright light, insults, threats, and exposure far more than physical force to break prisoners. A link to the entire pdf can be found at the end of the article.

“Most people who brainwash…use methods similar to those of prison guards who recognize that physical control is never easily accomplished without the cooperation of the prisoner. The most effective way to gain that cooperation is through subversive manipulation of the mind and feelings of the victim, who then becomes a psychological, as well as a physical, prisoner.” — from an Amnesty International publication, “Report on Torture,” which depicts the brainwashing of prisoners of war.



Isolation

  • Deprives individual of social support, effectively rendering him unable to resist
  • Makes individual dependent upon interrogator
  • Develops an intense concern with self

Once a person is away from longstanding emotional support and thus reality checks, it is fairly easy to set the stage for brainwashing. Spiritually abusive groups work to isolate individuals from friends and family, whether directly, by requiring the individuals to forsake friends and family for the sake of the “Kingdom” (group membership), or indirectly, by preaching the necessity of demonstrating one’s love for God by “hating” one’s father, mother, family, and friends.

Abusive groups are not outward-looking but inward-looking, insisting that members find all comfort and support, and a replacement family, within the group. Once recruits are cut off from friends, relatives, and previous relationships, abusive groups surround them and hammer rigid ideologies into their consciousnesses, saturating their senses with the specific doctrines and requirements of the group.

Isolated from everyone but those within the group, recruits become dependent upon group members and leaders and find it difficult if not impossible to offer resistance to group teachings. They become self-interested and hyper-vigilant, very fearful should they incur the disapproval of the group, which now offers the only sanctioned support available to them.

Monks and nuns from the various monasteries under Geronda Ephraim during St. Anthony Monastery’s Feast Day (ca. 2006)

Warning Signs:
The seed of extremism exists wherever a group demands all the free time of a member, insisting he be in church every time the doors are open and calling him to account if he isn’t. Be wary of a group that is critical or disapproving of involvements with friends and family outside the group; that encourages secrecy by asking members not to share what they have seen or heard in meetings or about church affairs with outsiders; that is openly, publicly, and repeatedly critical of other churches or groups (especially if the group claims to be the only one which speaks for God); that is critical when members attend conferences, workshops, or services at other churches; that checks up on members in any way, e.g., to determine that the reason they gave for missing a meeting was valid; or that makes attendance at all church functions mandatory for participating in church ministry or enjoying other benefits of church fellowship.

Once a member stops interacting openly with others, the group’s influence is all that matters. He is bombarded with group values and information, and there is no one outside the group with whom to share thoughts or who will offer reinforcement or affirmation if the member disagrees with or doubts the values of the group. The process of isolation and the self-doubt it creates allow the group and its leaders to gain power over the members. Leaders may criticize major and minor flaws of members, sometimes publicly, or remind them of present or past sins. They may call members names, insult them, or ignore them, or practice a combination of ignoring members at some times and receiving them warmly at others, thus maintaining a position of power (i.e., the leaders call the shots).

The sense of humiliation makes members feel they deserve the poor treatment they are receiving and may lead them to submit to any and all indignities out of gratefulness that people as unworthy as they feel themselves to be are allowed to participate in the group at all. When leaders treat them well occasionally, members accept any and all crumbs gratefully. Eventually, awareness of how dependent they are on the group, and gratitude for the smallest attention, contribute to an increasing sense of shame and degradation on the part of the members, who begin to abuse themselves with “litanies of self-blame,” i.e., “No matter what they do to me, I deserve it, as sinful and wretched as I am. I deserve no better. I have no rights but to go to hell. I should be grateful for everything I receive, even punishment.”

St. Anthony's Monastery Feast Day (early to mid-2000s)
In the monasteries it is taught that the most ideal way for someone to practice Orthodoxy is through blind obedience to a Geronda (or Gerondissa).

Monopolization of Perception

  • Fixes attention upon immediate predicament; fosters introspection
  • Eliminates stimuli competing with those controlled by captor
  • Frustrates all actions not consistent with compliance

Abusive groups insist on compliance with trivial demands related to all facets of life: food, clothing, money, household arrangements, children, conversation. They monitor members’ appearance and criticize their language and childcare practices. They insist on precise schedules and routines, which may change and be contradictory from day to day or moment to moment, depending on the whims of group leaders.

At first, new members may think these expectations are unreasonable and may dispute them, but later, either because they want to be at peace, or because they are afraid, or because everyone else is complying, they attempt to comply. After all, what real difference does it make if a member is not allowed to wear a certain color, or to wear his hair in a certain way, to eat certain foods, to say certain words, to go certain places, to watch certain things, or to associate with certain individuals? In the overall scheme of things, does it really matter? In fact, in the long run, the member begins to reason, it is probably good to learn these disciplines, and after all, as they have frequently been reminded, they are to submit to spiritual authority as unto the Lord. Soon it becomes apparent that the demands will be unending, and increasing time and energy are focused on avoiding group disapproval by doing something “wrong.” There is a feeling of walking on eggs. Everything becomes important in terms of how the group or its leaders will respond, and members’ desires, feelings, and ideas become insignificant. Eventually, members may no longer even know what they want, feel, or think. The group has so monopolized all of the members’ perceptions with trivial demands that members lose their perspective as to the enormity of the situation they are in.

The leaders may also persuade the members that they have the inside track with God and therefore know how everything should be done. When their behavior results in disastrous consequences, as it often does, the members are blamed. Sometimes the leaders may have moments, especially after abusive episodes, when they appear to humble themselves and confess their faults, and the contrast of these moments of vulnerability with their usual pose of being all-powerful endears them to members and gives hope for some open communication.

Threats sometimes accompany all of these methods. Members are told they will be under God’s judgment, under a curse, punished, chastised, chastened if they leave the group or disobey group leaders. Sometimes the leaders, themselves, punish the members, and so members can never be sure when leaders will make good on the threats which they say are God’s idea. The members begin to focus on what they can do to meet any and all group demands and how to preserve peace in the short run. Abusive groups may remove children from their parents, control all the money in the group, arrange marriages, destroy personal items of members or hide personal items.


Warning Signs:
Preoccupation with the trivial demands of daily life: strict compliance with standards of appearance, dress codes, what foods are or are not to be eaten and when, schedules; threats of God’s wrath if group rules are not obeyed; a feeling of being monitored, watched constantly by those in the group or by leaders. In other words, what the church wants, believes, and thinks its members should do becomes everything, and you feel preoccupied with making sure you are meeting the standards. It no longer matters whether you agree that the standards are correct, only that you follow them and thus keep the peace and stay in the good graces of leaders.

TX Synodia
The monks of Holy Archangels Monastery (TX).

Induced Debility and Exhaustion

People subjected to this type of spiritual abuse become worn out by tension, fear, and continual rushing about in an effort to meet group standards. They must often avoid displays of fear, sorrow, or rage, since these may result in ridicule or punishment. Rigid ministry demands and requirements that members attend unreasonable numbers of meetings and events make the exhaustion worse and weaken the ability to resist group pressure.

The Gerondia (Head) Table at St. Nektarios Monastery (NY)

Warning Signs:
Feelings of being overwhelmed by demands, close to tears, guilty if one says no to a request or goes against church standards. Being intimidated or pressured into volunteering for church duties and subjected to scorn or ridicule when one does not “volunteer.” Being rebuked or reproved when family or work responsibilities intrude on church responsibilities.

St. Nektarios Brotherhood at The Russian Synodal Building, NY 2010

Occasional Indulgences

  • Provides motivation for compliance

Leaders of abusive groups often sense when members are making plans to leave and may suddenly offer some kind of indulgence, perhaps just love or affection, attention where there was none before, a note or a gesture of concern. Hope that the situation in the church will change, or self-doubt (“Maybe I’m just imagining it’s this bad”), then replaces fear or despair, and the members decide to stay a while longer. Other groups practice sporadic demonstrations of compassion or affection right in the middle of desperate conflict or abusive episodes. This keeps members off guard and doubting their own perceptions of what is happening.

Some of the brainwashing techniques described here are extreme; some groups use them in a disciplined, regular manner, while others use them more sporadically. But even mild, occasional use of these techniques is effective in gaining power.

Warning Signs:
Be concerned if you have had an ongoing desire to leave a church or group you believe may be abusive, but find yourself repeatedly drawn back in just at the moment you are ready to leave, by a call, a comment, or a moment of compassion. These moments, infrequent as they may be, are enough to keep hope in change alive, and thus you may sacrifice years and years to an abusive group.

Feast Day of St. Thekla, 2013, Canada.

Devaluing the Individual

  • Creates fear of freedom and dependence upon captors
  • Creates feelings of helplessness
  • Develops lack of faith in individual capabilities

Abusive leaders are frequently uncannily able to pick out traits church members are proud of and to use those very traits against the members. Those with natural gifts in the areas of music may be told they are proud or puffed up or “anxious to be up front” if they want to use their talents and denied that opportunity. Those with discernment are called judgmental or critical, the merciful are lacking in holiness or good judgment, the peacemakers are reminded the Lord came to bring a sword, not peace. Sometimes efforts are made to convince members that they really are not gifted teachers or musically talented or prophetically inclined as they believed they were. When members begin to doubt the one or two special gifts they possess which they have always been sure were God-given, they begin to doubt everything else they have ever believed about themselves, to feel dependent upon church leaders and afraid to leave the group. (“If I’ve been wrong about even *that*, how can I ever trust myself to make right decisions ever again?”).

There are 21 nuns residing at Life-Giving Spring Monastery.

Warning Signs:
Unwillingness to allow members to use their gifts. Establishing rigid boot camp-like requirements for the sake of proving commitment to the group before gifts may be exercised. Repeatedly criticizing natural giftedness by reminding members they must die to their natural gifts, that Paul, after all, said, “When I’m weak, I’m strong,” and that they should expect God to use them in areas other than their areas of giftedness. Emphasizing helps or service to the group as a prerequisite to church ministry. This might take the form of requiring that anyone wanting to serve in any way first have the responsibility of cleaning toilets or cleaning the church for a specified time, that anyone wanting to sing in the worship band must first sing to the children in Sunday School, or that before exercising any gifts at all, members must demonstrate loyalty to the group by faithful attendance at all functions and such things as tithing. No consideration is given to the length of time a new member has been a Christian or to his age or station in life or his unique talents or abilities. The rules apply to everyone alike. This has the effect of reducing everyone to some kind of lowest common denominator where no one’s gifts or natural abilities are valued or appreciated, where the individual is not cherished for the unique blessing he or she is to the body of Christ, where what is most highly valued is service, obedience, submission to authority, and performance without regard to gifts or abilities or, for that matter, individual limitations.

Bishop Joseph at St. John the Forerunner Monastery

Biderman Chart

Pitfalls in the Sociological Study of Cults (Janja Lalich, 2001)

NOTE: This article is taken from Misunderstanding Cults: Searching for Objectivity in a Controversial Field, pp. 123-155. It has been condensed.

One of the things that cults do well is the construction of inspiring and exciting alternative worldviews. They do this passionately and with great skill, and the most successful of them are also skilled at creating internally consistent social and cultural contexts to make these worldviews visible and attractive both to their members and to their audiences. Consequently, researchers attempting to study cults are confronted with a set of problems beyond those encountered by ethnographers studying other types of social organizations. Researchers of cults are faced with a kind of hall of mirrors in which they must contend with multiple layers of reality construction. In this chapter I discuss the potential pitfalls inherent in doing research on such groups.

First, let me attempt to define what I mean by cult, that problematic word:

A cult can be either a sharply bounded social group or a diffusely bounded social movement held together through shared commitment to a charismatic leader. It upholds a transcendent ideology (often but not always religious in nature) and requires a high level of personal commitment from its members in words and deeds.

At certain times in its history a cult can be a precisely defined group with clear boundaries separating members from outsiders, and at other times it can take the form of a more amorphous social movement with fuzzy concentric boundaries shading off imperceptibly from totally committed inner-circle members to fellow travellers to vaguely interested spectators.

When one turns the viewing lens on a single cult in order to extract a thick definition of the forces that hold it together, one inevitably sees charismatic relationships and devotion to transcendent ideology as the important defining features.

Howard P. Becker’s church-sect typology, based on Ernst Troeltsch’s original theory, upon which the modern concept of cults, sects, and new religious movements is based.

Cults Try to Prevent You from Coming Backstage after the Show

Often cults are found to be mystical, grandiose, secretive, and multi-layered. Such characteristics have been noted by various researchers.1 There is no way to know how many times researchers have been successfully ‘fooled’ by such groups, in the sense that researchers were shown a version of reality that either differed from the typical daily life or hid from view the negative or controversial aspects. But if we assume that a researcher wants to present a thorough descriptive account, then how best to achieve that goal? Whether doing content analysis of documents, participant observation, or interviews, in addition to abiding by generally accepted standards of research in the social sciences, an important first step would be for the researcher to acknowledge that there might be some distortion going on, meant either to impress or to hide, or both.

Over the years there has been a surprising likeness in reports of systems of control and influence used in cultic groups, which have served to misinform, disinform, or obfuscate in one way or another. Those efforts at information control and impression management might be called the group’s ‘mask of normalcy.’ This mask can serve to keep researchers at arm’s length, impeding an inside look at what really transpires. For that reason a researcher must be methodical, thorough, and grounded, and have a solid but flexible plan or approach.

Wearing the Mask of Normalcy

An initial task involves acquiring basic knowledge of the group in question. Know as much as possible about the group beforehand (its doctrine, practices and rituals, lingo, history, lineage, controversies, and crises); then be ready to entertain various interpretations of findings. A central challenge, of course, is gaining access.

I offer here a glimpse of the various strategies used some of the time by some groups, with examples of the types of occurrences that might derail the researcher…These manipulative strategies pose four categories of problems for the researcher who would not be deceived: (1) tricks and setups; (2) demands, restrictions, and intimidation; (3) informants as spin masters; and (4) researcher susceptibility to the cult’s appeal.2 I will discuss each of them in turn.

Problem 1: Tricks and Setups

Researchers must remember the ease with which a group can trick visitors and outsiders. This can happen through selected interviewees, selected topics of discussion, and staged events.

Selected Interviewees

When visiting a group facility or location, a researcher may believe that she is free to interview or observe whoever is there, and as a result may feel that she has been given free rein. In many cases, however, only trusted members are allowed in those locations during the time the researcher (or, in some cases, the public) will be there. As a result, ‘outsiders’ end up talking to or interviewing only those group members who were preselected by the gatekeeper, or were pre-assigned and trained as spokespersons for the group.3 Another way this type of control occurs is through either overt or indirect censoring of responses and interactions.4

Selected Topics of Discussion

Researchers or journalists who want to interview a group leader may quickly learn that this is not so easy. One evasive strategy has been to ask researchers to submit their questions in writing to the leader, who then (either himself or through his aides) selectively chooses the questions he wants to answer. Sometimes the questions are rewritten so the leader can talk about his own favourite topics.

Staged Events

These events occur for a variety of reasons: to gain credibility for the group, to recruit, to fund-raise, to keep members busy, or simply to put forth a public face.

Beneath the facade there is often a hidden layer—and, in this case, more than one. While the performance itself is sociologically interesting, equally important is what is being hidden: the backstage, the secret nature of the organization, its purpose, and the control of its membership.

Manipulative strategies are devised for a particular context. The purpose is to impress and recruit, and the members’ dedication, commitment, and idealism are taken advantage of, both to put on a good front and to hide certain less desirable aspects.

Appearances are deceptive…

An antidote to the types of tricks and setups that researchers might encounter would be to try to establish beforehand some ways to ensure getting unadulterated data. For example, whenever possible, and, if the setting permits, arrive unannounced or early. Try to visit a group’s various locations, including members’ residences, the leaders’ quarters, and any other special facilities. Also request that you be allowed to randomly select interview subjects, and ask permission to speak with members of different ranks, positions, functions, and lengths of time in the group. If at all possible, conduct your interviews off-site, which may allow members to speak more freely. Naturally, all of this must be done within ethical research standards and in a way that maintains good relations with your subjects. Be sensitive as well to the potential emotional, psychological, or physical risks that may befall your informants—whether they are current or former members. And take that same care yourself.

Problem 2: Demands, Restrictions, and Intimidation

Researchers must be alert to a group’s attempt to put demands on them by restricting visiting times, locations, and access to members, and sometimes even requesting to review and approve the researcher’s results or final reports. If a group so desires and is unable to put its stamp of approval on a report, or if a negative or critical report should surface, harassment of the authors and/or publishers is always a possibility.

Ayella5 advises that researchers be critical of the kind of access they are given. Why is a group allowing you in? Is it looking for a clean bill of health or stamp of approval? Has it been criticized recently and is now seeking outside aid in impression management? Does the group understand and agree with scientific norms of research? And possibly most critical, did the group invite you to do the research? Honest answers to such questions may reveal that a researcher is slated to become an unwitting pawn in someone else’s project, perhaps with a questionable goal, and, potentially, with just as questionable an outcome.

Cultic groups with controversial and secret practices are unlikely to be open to scrutiny. It is not unusual for such groups to work against the airing of information that might be detrimental. Groups much prefer positive puff pieces.6

“The worst thing anybody could have happen to him was to have Dad chastise him publicly. That was the worst fear…that you’d be yelled at.”

Problem 3: Informants as Spin Masters

Researchers may encounter trained behaviour on the part of cult members and adherents. Therefore, while acknowledging this front-stage activity, researchers must also be prepared to seek out backstage behaviours and attitudes. Researchers might also consider ways in which they might evoke a fuller picture of what is going on in the member-informant’s mind. Such investigation requires perseverance, creativity, and critical thinking.

As noted earlier, some cults allow only carefully selected members to speak with the press or outsiders.

There is a type of briefing, grilling, and role-playing that occurs in some groups which is intended to train members to respond in desired ways, rather than as they feel. As Barker so aptly described:7

“Some members of some movements have gone further than concealing the truth—they have denied the truth, blatantly lying to potential converts and ‘outsiders.’ Furthermore, some members of some movements lie to other members of the same movement. It is not unusual for members of certain NRMs not to know what their leaders get up to—how the money is spent, exactly who issues the orders, or what the long-term goals of the leaders are.

“Sometimes members have been instructed to say that they are collecting money, food or other goods for the aged, for young people on drugs, or for poor people in underdeveloped countries. Sometimes these statements are downright lies; at other times, they are twisting the truth; at yet other times, the members may convince themselves that they are telling the truth.”

In researching cultic groups, the text is perhaps not so important as the subtext and nonverbal clues.

Planting members trained to ‘spin’ in the group’s front organizations is a common tactic among cults concerned about their public image.


Problem 4: Researcher Susceptibility to the Cult’s Appeal

Researchers of cultic groups are treading into charismatic environments. Many of these groups have great appeal—through the belief system, the activities and interactions, the members, and, of course, the leaders. Researchers must learn how not to be overly influenced by the charismatic performance of a leader—whether it consists of showing off, or feigned humility, or both. Being dazzled by the leader’s glamour is something many a researcher has experienced, if only momentarily. But the savvy researcher will acknowledge his vulnerabilities and guard against succumbing to this very human foible.

I realize how difficult it is to gain access to a cultic group, let alone remain disinterested. Barker8 spoke of the dangers of getting too close to one’s subject; she cautioned against favouritism in reporting, or researchers acting to protect members from experiencing bad press. Spending time with these groups, attending their services or meetings or events, observing members interact, sitting with them while their leader is lecturing—certainly only someone made of stone would not feel drawn in. How else can the researcher expect to gather data and draw interpretations? A researcher must both yield and hold back—a sometimes tricky mix in what are often extraordinary settings.

Operating with the authority of charisma, some leaders of charismatic groups go so far as to fancy themselves to be above and beyond human.

Many cult leaders are quite gifted at their public performances. It is not surprising, then, that the researcher, a mere mortal, finds herself responding to the charismatic lure. Naturally, the wise cult leader counts on such a response—it is not only self-validating but also likely to achieve some desired outcome. A researcher who finds himself swayed by the prowess, magical powers, wisdom, or flattery of a cult leader is less likely to be ‘objective’ in his recording and reporting. Therefore, the researcher might want to put into place certain safeguards against those automatic emotional responses. The first would be, of course, to admit to one’s susceptibility to that charismatic pull. The next would be to institute checkpoints and outside reminders to help bring you back to earth. For example: (1) ensure that you have sufficient time alone, away from group rituals, practices, and paraphernalia; (2) place regular phone calls home or to colleagues who can give you a reality check; (3) stay on a good diet with plenty of liquids, nourishment, and protein; (4) surround yourself with reminders of your usual life; and (5) engage in regular reviews of research objectives. It is important to remember that the objective is not to get recruited, or even to have a good time, but to collect data and report on your findings.


How Not to Become a Mere Apologist for the Cult You Are Studying

No researcher wants to become a pawn of the group that he or she is studying, but it is all too easy and tempting to fall into just that trap. Cultural anthropologists have long been known to become protective of the tribes they study. The more they come to understand tribes from the inside, the more they realize how vulnerable those distant cultures are to misunderstanding back home. Similarly, with cults, many aspects of their behaviour that seem weird to outsiders are much more understandable within the cult’s own milieu. It is only human nature for the researcher, having worked so hard to become familiar with the cult’s context, to wish to parade this expertise by explaining to the world why certain outrageous cult behaviours really do make sense when looked at from the appropriate perspective. Add to all this the fact that the leaders of totalistic cults are in a position to grant the researcher complete access to the cult with a mere wave of the hand, or just as easily to take it all away, as we have seen. Thus, the urge to ingratiate compounds itself upon the urge to protect the cult. Under such circumstances, it is no wonder that some researchers have found it difficult to resist the pressure to become apologists rather than unbiased observers.

The secrecy that often envelops cultic organizations makes that pressure all the more of an obstacle to objective research. Wilson9 was explicit in advising researchers that the ‘tendency toward secrecy is intensified’ in cults and sects.10 Investigators who are not aware of (or turn a blind eye to) this reality will be doing a disservice to their field and to their research subjects. When researchers don’t make efforts to look at what might be going on behind the scenes or when they accept the front stage at face value, the results will stack up alongside some of the weaker studies and analyses in this field.

A Reassuring Lie

An instance of whitewashing occurred recently when a sociologist of religion was flown out to the West Coast as an expert to do a report on a local controversial group for a lawyer and public relations firm hired by the group (or rather, by a member of the group). After two or three days in the area, the scholar asserted in writing that the group was not a cult and that there was no evidence of brainwashing. His report was sent to both local and national media, as well as to some families of group members who were being affected by the brewing controversy. The report helped to stave off (and water down) media exposés, and put another wedge between some of the families and their relatives in the group. The scholar’s findings, however, were at odds with other evidence from cross-corroborated, first-hand reports from almost a dozen former members and families of members, documentation from a lawsuit filed several years earlier, and extensive research done by a local investigative reporter, which included access to internal group materials and videotapes that gave direct evidence to support some of the allegations. Nevertheless, the scholar’s report, based on a couple of days of visits orchestrated by the group, and supported by a well-paid public relations campaign, as well as legal threats, made it more difficult to shed light on this group’s controversial backstage behaviour.

Some apologists are quick to say that everything from the 913 Jonestown deaths to the allegations of child abuse or sexual improprieties in other groups is nothing but the result of the ‘bigoted and criminal’ anticult movement and a handful of ‘disgruntled and vengeful’ former cult members. Yet, I would argue that such mud merely sullies the waters but does not change the facts. Over the years there has been enough evidence that at least some cultic groups have engaged in illegal and harmful activities, and, on a lesser scale, have created environments held together by intense forms of enforced conformity and rigid methods of controlling and constraining their members. These situations and these aspects of organizational control must not be overlooked if we are to understand these groups and their behaviours, as well as attempt to comprehend the lives and choices of the individual members. We must not be guided by the ‘norms of academia [that] make us reluctant to believe or disseminate negative facets of controversial groups.’11 Rather it is vital to look beyond the surface appeal of any group in order to examine and assess both the individual and societal ramifications and implications. Then, as Barker suggested, one’s interpretation of raw data might ‘become a basis for social action’12 that is preferably positive in nature as opposed to simply being favourable to the group’s perspective.


The work of scholars in this field can and does have real-life implications. Apologetics and whitewashing based on inadequate or biased research may help perpetuate harm by glossing over or covering up questionable practices and activities, and, at worst, a variety of improprieties, abuses, or crimes. A researcher’s job is to produce knowledge and minimize distortion. Data provided by a cultic group should never be taken at face value, and being courted or toured about by the leadership is probably not a reliable avenue to anything other than a superficial view. Every charismatic group has gatekeepers who control the group’s environment, and many groups have vast public relations operations that send out polished views of their corporate world or public face. Similarly, claims by those who appear to be opponents of the group or merely taking a critical stance must also be verified and cross-corroborated; but they should not be ignored or discounted. If Lofland and Lofland’s13 ‘questioning mind-set’ is recommended in everyday research of ‘ordinary’ situations, then it seems only obvious that such an attitude would be all the more necessary when investigating cultic groups.

Use of Whistle-Blowers in Cult Research

Another important issue in researching cultic groups relates to the use of former members as informants and/or as researchers. Generally, in other contexts, a researcher will pursue information and insights from previous participants or other affiliated persons (e.g., consultants, business partners, investors, project collaborators, fellow-travellers, relatives). But in cult research, whether or not to use former members as sources of information has been a subject of much controversy. This controversy has raised such basic questions as the following: Should former member accounts be sought out, ignored, or overtly discounted and discredited? Are such accounts valid and reliable? Why or why not? Do all former cult members express negativity about their experiences, or just the ones who have been ‘deprogrammed’? Do all former cult members who speak critically about their experiences have an unworthy agenda? Why are some scholars adamant about deriding the accounts of those who are critical of their former group?

Central to this discussion is the issue of reliability and validity of so-called apostate accounts. According to Merriam-Webster’s Collegiate Dictionary (10th ed.), apostasy merely means ‘abandonment of a previous loyalty’ or ‘renunciation of a religious faith.’ Yet some scholars appear determined to discredit the testimony of any and all former cult members.14 They label former-member accounts as atrocity tales, and promote the idea that they should never be taken seriously. Instead, such researchers tend to rely on the accounts of leaders and current members, as well as accepting at face value the group’s literature (when it exists) and explanations.

Other scholars, however, do believe in the truthfulness and value of the accounts of former members. There is a risk in doing so—such as being accused of being an anticult-movement sympathizer, not getting published in certain academic journals, not being accepted as a conference participant, being pressured to conform, or, as discussed earlier, being threatened or harassed by the cult in question.

Former-member reports could be regarded as vital to obtaining a more comprehensive picture of certain cults. Especially taking into account the level at which a person functioned while in the group, a former-member informant who was in leadership or had other kinds of access to privileged locations or information is a valued source of information. Wilson noted that the lack of cooperation on the part of leaders or members will influence ‘what can be discovered and how what is discovered is understood.’15 In this vein, Zablocki16 reminds us that ethnographers rarely see anything but front-stage behaviour. The implication of Wilson’s and Zablocki’s comments is that it is even more crucial to gather data from those who have participated in and left a group. Seeing only front-stage behaviour typically means that a researcher will not get to hear members talk about what is really on their minds.


Many researchers in this field insist on the need for triangulation (using multiple sources of varying viewpoints), although few seem to practice what they preach. This lack of thoroughness has been reinforced by those who strive to delegitimate the entire category of former-member accounts. For such researchers there are two types of former members: (1) ‘good’ former members (called leave-takers) who leave the group quietly, and (2) ‘bad’ former members (labeled apostates) who voice discontent about their experiences.17 Here is but one example of this crude typology:

“The apostate is a defector who is aligned with an oppositional coalition in an effort to broaden a dispute, and embraces public claims-making activities to attack his or her former group. Unlike typical leavetakers whose responses range from indifference to quiet disenchantment, the apostate assumes a vituperative or hostile posture and pursues a moral campaign to discredit the group.”

Bromley,19 Wright,20 Lewis,21 and others put forth the notion that so-called career apostates (those bad former members) have won over and influenced the views of journalists, commentators, and, hence, the general public. Yet, I have seen no evidence of any solid effort on the part of those scholars to ascertain, for example, what percentage of former members actually even speak out about their groups, much less in the exaggerated form attributed to them. It is my contention that the image of the vengeful, fabricating apostate has a shabby foundation.

St. Anthony's Feast Day, January 17, 1997.
Many monks & nuns have left Geronda Ephraim’s monasteries, not a few under bad circumstances and “without a blessing.” However, most are afraid to speak about the true nature of what took place behind closed doors.
In the monasteries it is taught that the most ideal way for someone to practice Orthodoxy is through blind obedience to a Geronda (or Gerondissa).
Former monks & nuns who have spoken out are quickly dismissed by the monasteries as deluded, disgruntled ex-monastics who never did obedience.

Former members are reluctant to speak about their experiences or participate in public forums—not because they do not have important experiences and insights to share, but rather because they are self-critical, cautious, stigmatized, and fearful of lawsuits. In many instances, those who have decided to speak publicly or write about their experiences have chosen not to identify the group or name the leader.

In the end, former members have provided invaluable insights into complex phenomena, making important contributions to our understanding of cults and charismatic relationships.22

Monks and nuns from the various monasteries under Geronda Ephraim during St. Anthony Monastery’s Feast Day (ca. 2006)

The Issue of Reflexivity Bias

Reflexivity bias can occur when the researcher is also a member or an ex-member of the cult being studied and therefore sees the cult in part as reflected through his or her own internalized cult worldview and/or memories of cult experiences. Clearly, such researchers have the benefit of having had an inside look, which can provide insight into other similar situations and is a vantage point not often shared by others. Yet, the insider perspective can also colour what the researcher sees and the conclusions she draws. Along with the opportunities afforded by insider status, doing research on a cult with which one had been affiliated poses a set of unique challenges.

First, and most obviously, it is important to openly acknowledge any personal interest in the subject in general, any personal experiences that may influence objectivity, any residual cult or anticult point of view, as well as any bias one may have concerning the group being studied.23


Researchers who have been members of the groups they are studying have the considerable advantage of already knowing the cult’s private ‘language.’ They know what the leader means by his or her often-coded utterances. They know where to look, and what veils to shake loose. They are familiar with the effect of the leader’s words (spoken or written) on the devotees. They have lived with and shared the group’s attitudes towards outsiders. They may even be aware of the false claims, the tricks, the devices used to sway and convince. For example, in one group, words were manipulated to serve the leader’s sexual urges, so that ‘meditating with swami’ meant engaging in sexual activities with him.24 That usage was understood only by those in his inner circle who were expected to participate in the sexual behaviour. Someone from outside listening to an adherent of this particular swami, or observing behaviour around the ashram, probably would not catch on to the subterranean world of words, glances, gestures, relationships, and so on, whereas a former-member researcher stands to grasp more precisely the meanings of statements and actions.

The ability to comprehend a cult’s literature or spoken word is also enhanced by having shared the insider perspective.

Naturally, as with any research, a former member’s memories and perceptions must be corroborated, and triangulation becomes critical. But having been there lends a perspective and provides insights not otherwise possible. Ultimately, I see no problem in former cult members conducting research on their own group or any other so long as ‘experiences prior to entering the field [are] subjected to analytic reflection.’25

How to Get a Peek Backstage When the Cult Doesn’t Want You To

Cults are private organizations and deserve respect for their privacy. It follows that when a cult says ‘No, thank you’ to a request for research access, the ‘no’ should be respected. And, perhaps even more obviously, a researcher does not have the ethical right to infiltrate a cult’s circles by pretending to be a devotee. However, it does not follow that cults have the right to play coy with researchers, showing the pretty side while hiding the ugly. Researchers have rights too; and, once a researcher has been invited in, there is nothing wrong with trying to see behind the masks and the facade. Doing so during any but the briefest stay at a cult is far from impossible, but it does require that the researcher keep her wits about her.

Inside the altar during Divine Liturgy at St. Anthony’s Monastery (AZ)

Various scholars have presented useful suggestions to help pave the way. More than a decade ago, Balch26 offered a comprehensive guide to the kinds of data needed in studies of these groups, in the hope that certain categories would become standard. The categories are (1) demographic characteristics of membership, (2) historical development, (3) structure and content of belief system, (4) leadership, (5) social organization, (6) relationship between members and outsiders, (7) economic system, (8) material culture, (9) patterns of everyday life, (10) talk, (11) sexual relationships, (12) child-rearing, (13) deviance and social control, (14) recruitment strategies, (15) commitments demanded of members, (16) socialization techniques, (17) conversion experiences, and (18) defection. Balch argued that, too often in published studies, many of these topics were ignored or touched on too lightly to be useful for comparative purposes. In agreement with that perspective, I believe that researchers would make a far greater contribution to the study of cults if they kept these categories in mind as they went about their work.

More recently, Balch and Langdon had these suggestions: ‘First, scholars who study alternative religions need to be familiar with the charges against them before they begin collecting data. Second, they should not take members’ claims at face value, however reasonable they seem. Third, they need to interview defectors and other critics to get different viewpoints, although here too they must be aware of hidden agendas. Finally, whatever the source of information, statements presented as fact need to be corroborated and verified with independent evidence.’27 Those four points, in my opinion, could serve as an invaluable guide for researchers of cults and controversial new religious and social movements. To that guidance I would add the following practical suggestions:

  1. Watch how people relate to each other, and especially how they act around the leader.
  2. Ask tough questions—about money, about sex, about decision-making procedures, about time away from the group, about independent thinking. Be ready with specific questions, and don’t let them get deflected or turned back on you. Insist on specific answers, and don’t accept digressions or evasions. Get examples. Consider speaking with former members beforehand, so that you, the researcher, are armed with the types of probes that should generate useful data.
  3. Look carefully at all mechanisms of conformity and control. Study the living quarters, clothing style, and speech and mannerisms to assess the extent of individual expression. Find out about group dynamics, criticism sessions, confessionals, or other means of using group processes to enforce conformity through humiliation, guilt, shame, and various means of influence and peer pressure.
  4. Determine how the group tolerates—or does not tolerate—dissent. Assess how former members are regarded, whether current members have access to former members or critical reports, and how much contact there is with families and other ‘outsiders.’ Also, find out if there is an internal justice system, a mechanism for feedback, and one for appeals.
  5. When evaluating documents, use the same reflexive and critical thinking as in any other project. Be sure to review both external (for public consumption) and internal (for members only) documents; in the latter category, there are likely to be tiers of documents meant for members at ascending levels of commitment or trust. Using questions such as those posed in basic research texts would be a good start: ‘How are documents written? How are they read? Who writes them? Who reads them? For what purposes? On what occasions? With what outcomes? What is recorded? What is omitted? What does the writer seem to take for granted about the reader(s)? What do readers need to know in order to make sense of them? The list can be extended readily, and the exploration of such questions would lead the ethnographer inexorably towards a systematic examination of each and every aspect of everyday life in the setting in question.’28
  6. Fact-check everything you can. Group lore transforms easily into self-perpetuating myths that serve the cult’s image. It is important to look beyond the obvious, and use multiple sources of information and verification, including going outside the confines of the information provided by the cult and/or its archives.

Overall, my advice is to be more like state investigators who drop in on nursing homes unannounced. Assume that things will be hidden or prettied up. Be on the lookout for less-than-obvious findings and nonverbal cues. Charismatic leaders don’t need to hold a gun to their followers’ heads to get them to comply, but charismatic magic does only part of the job. Thus, explore specifically how the system works to bind members to the group and/or leader. Who are the key players, and what are the crucial interactions? Where and when do they take place? How can you, the researcher, gain access to that?


As Lofland and Lofland cautioned, ‘The researcher is bound to doubt and to check and to hold all claims as simply claims. This creates an unavoidable tension between social scientists, group members, and any champions of those members.’29 An objective look, one that does not gloss over what is there, requires being aware of the ways in which cultic groups can cover up or tone down. To reveal or write about these realities is not an attack on religious deviance or non-mainstream behaviour; rather, it offers a more complete look at complex phenomena. Whether or not a researcher takes the next step of also providing a critique of certain social practices is an individual choice.



  1. See Balch, Robert W., What’s Wrong with the Study of New Religions and What Can We Do About It?; Balch & Langdon, How the Problem of Malfeasance Gets Overlooked in Studies of New Religions, 1998; Barker, Eileen, New Religious Movements: A Practical Introduction, 1995; Carter, Lewis F., Charisma and Control in Rajneeshpuram: The Role of Shared Values in the Creation of a Community, 1990; Singer, Margaret Thaler, Cults in our Midst: The Hidden Menace in Our Everyday Lives, 1995; Tobias & Lalich, Captive Hearts, Captive Minds: Freedom and Recovery from Cults and Abusive Relationships, 1994; Wilson, Bryan R., Methodological Perspectives in the Study of Religious Minorities, 1988; and Zablocki, Benjamin D., Distinguishing Front-stage from Back-stage Behavior in the Study of Religious Communities, 1997.
  2. The topic of researcher as potential convert has been adequately covered by Marybeth Ayella in ‘They Must Be Crazy’: Some of the Difficulties in Researching ‘Cults,’ http://abs.sagepub.com/content/33/5/562.extract
  3. Hammersley & Atkinson, Ethnography: Principles in Practice.
  4. Carter, Lewis F., Charisma and Control in Rajneeshpuram: The Role of Shared Values in the Creation of a Community.
  5. Ayella, Marybeth, ‘They Must Be Crazy’: Some of the Difficulties in Researching ‘Cults.’
  6. In some research textbooks the effort at derailing Wallis’ work on Scientology has become a case example of meddling in a researcher’s results and conclusions (see, for example, Hammersley & Atkinson, Ethnography: Principles in Practice, 1996: 283-4). Being the target of one of these campaigns is never fun. Such experiences have been described by Julius H. Rubin in Techniques for Suppressing Information Used by New Religious Groups. According to Rubin, because the leadership was displeased with what they considered to be critical reports of their group, he was characterized as an enemy and sued for defamation, and other attempts were made to discredit him. In another instance, when Kent’s study on the leader of the Children of God (now The Family) was at the page-proof stage, the article was withdrawn from an academic, peer-reviewed annual publication because of aggressive actions and threats towards the publisher (Kent & Krebs, Academic Compromise in the Social Scientific Study of Alternative Religions, 1998). Other incidents of harassment and intimidation of researchers and critics are recounted by Singer. Efforts such as these, whose aim is the control and suppression of information, tend to have a chilling effect on research.
  7. Barker, Eileen, New Religious Movements: A Practical Introduction, 1995, pp. 49-50.
  8. Barker, Eileen, The Scientific Study of Religion? You Must Be Joking!
  9. Wilson, Bryan R., Methodological Perspectives in the Study of Religious Minorities, 1988, p. 238.
  10. In the article Wilson uses ‘religious minorities,’ although it is clear that he is referring to new religious movements, cults, and sects.
  11. Carter, Lewis F., Carriers of Tales: On Assessing Credibility of Apostate and Other Outsider Accounts of Religious Practices, 1998, p. 229.
  12. Barker, Eileen, New Religious Movements: A Practical Introduction, 1995, p. xi.
  13. Lofland & Lofland, Analyzing Social Settings: A Guide to Qualitative Observation and Analysis.
  14. See, for example, Bromley, David G., The Politics of Apostasy: The Role of Apostates in the Transformation of Religious Movements.
  15. Wilson, Bryan R., Methodological Perspectives in the Study of Religious Minorities, 1988, p. 230.
  16. Zablocki, Benjamin D., Reliability and Validity of Apostate Accounts in the Study of Religious Communities.
  17. Zablocki, Benjamin D., A Sociological Theory of Cults.
  18. Wright, Stuart A., Exploring Factors that Shape the Apostate Role, 1998, p. 109.
  19. Bromley, David G., The Social Construction of Contested Exit Roles.
  20. Wright, Stuart A., Another View of the Mt. Carmel Standoff.
  21. Lewis, James R., Self-fulfilling Stereotypes, the Anticult Movement, and the Waco Conflagration.
  22. See, for example, A Collective of Women.
  23. See Burawoy et al., Ethnography Unbound: Power and Resistance in the Modern Metropolis, 1991; Sobo & de Munck, The Forest of Methods, 1998; and Steier, Frederick, Reflexivity, Interpersonal Communication, and Interpersonal Communication Research.
  24. Betz, Katherine E., No Place to Go: Life in a Prison Without Bars.
  25. Hammersley & Atkinson, Ethnography: Principles in Practice.
  26. Balch, Robert W., What’s Wrong with the Study of New Religions and What Can We Do About It?
  27. Balch & Langdon, How the Problem of Malfeasance Gets Overlooked in Studies of New Religions, 1998, p. 207.
  28. Hammersley & Atkinson, Ethnography: Principles in Practice, 1996, pp. 173-74.
  29. Lofland & Lofland, Analyzing Social Settings: A Guide to Qualitative Observation and Analysis, 1995, pp. 154-55.
