Brainwashing as a Scientific Concept (Benjamin Zablocki, 2001)

NOTE: The following article is taken from the fifth chapter of Misunderstanding Cults: Searching for Objectivity in a Controversial Field, entitled 'Towards a Demystified and Disinterested Scientific Theory of Brainwashing.'


What I am presenting here is not a ‘new’ theory of brainwashing but a conceptual model of the foundational theory developed in the mid-twentieth century by Lifton, Schein, and Sargant as it applies to charismatic collectivities. Because its scientific stature has been so frequently questioned, I will err on the side of formality by presenting a structured exposition of brainwashing theory in terms of eight definitions and twelve hypotheses. Each definition includes an operationalized form by which the trait may be observed. If either of the first two hypotheses is disconfirmed, we must conclude that brainwashing is not being attempted in the cult under investigation. If any of the twelve hypotheses is disconfirmed, we must conclude that brainwashing is not successful in meeting its goals within that cult.

I do not pretend that the model outlined here is easy to test empirically, particularly for those researchers who either cannot or will not spend time immersing themselves in the daily lives of cults, or for those who are not willing, alternatively, to use as data the detailed retrospective accounts of ex-members. However, it should be clear that the model being proposed here stays grounded in what is empirically testable and does not involve mystical notions such as loss of free will or information disease (Conway and Siegelman 1978) that have characterized many of the extreme ‘anti-cult models.’

Nor do I pretend that this model represents the final and definitive treatment of this subject. Charismatic influence is still a poorly understood subject on which much additional research is needed. With few exceptions, sociology has treated it as if it were what engineers call a ‘black box,’ with charismatic inputs coming in one end and obedience outputs going out the other. What we have here is a theory that assists in the process of opening this black box to see what is inside. It is an inductive theory, formed largely from the empirical generalizations of ethnographers and interviewers. The model itself presents an ideal-type image of brainwashing that does not attempt to convey the great variation among specific obedience-inducing processes that occur across the broad range of existing cults. Much additional refinement in both depth and breadth will certainly be needed.



D1. Charisma is defined, using the classical Weberian formula, as a condition of ‘devotion to the specific and exceptional sanctity, heroism, or exemplary character of an individual person, of the normative patterns or order revealed or ordained by him’ (Weber 1947: 328). Being defined this way, as a condition of devotion, leads us to recognize that charisma is not to be understood simply in terms of the characteristics of the leader, as it has come to be in popular usage, but requires an understanding of the relationship between leader and followers. In other words, charisma is a relational variable. It is defined operationally as a network of relationships in which authority is justified (for both superordinates and subordinates) in terms of the special characteristics discussed above.

D2. Ideological Totalism is a sociocultural system that places high valuation on total control over all aspects of the outer and inner lives of participants for the purpose of achieving the goals of an ideology defined as all important. Individual rights either do not exist under ideological totalism or they are clearly subordinated to the needs of the collectivity whenever the two come into conflict. Ideological totalism has been operationalized in terms of eight observable characteristics: milieu control, mystical manipulation, the demand for purity, the cult of confession, ‘sacred science,’ loading the language, doctrine over person, and the dispensing of existence (Lifton 1989: chap. 22).1

D3. Surveillance is defined as keeping watch over a person’s behaviour, and, perhaps, attitudes. As Hechter (1987) has shown, the need for surveillance is the greatest obstacle to goal achievement among ideological collectivities organized around the production of public goods. Surveillance is not only costly, it is also impractical for many activities in which agents of the collectivity may have to travel to act autonomously and at a distance. It follows from this that all collectivities pursuing public goals will be motivated to find ways to decrease the need for surveillance. Resources used for surveillance are wasted in the sense that they are unavailable for the achievement of collective goals.

D4. A deployable agent is one who is uncritically obedient to directives perceived as charismatically legitimate (Selznick 1960). A deployable agent can be relied on to continue to carry out the wishes of the collectivity regardless of his own hedonic interests and in the absence of any external controls. Deployability can be operationalized as the likelihood that the individual will continue to comply with hitherto ego-dystonic demands of the collectivity (e.g., mending, ironing, mowing the lawn, smuggling, rape, child abuse, murder) when not under surveillance.

D5. Brainwashing is an observable set of transactions between a charismatically structured collectivity and an isolated agent of the collectivity, with the goal of transforming the agent into a deployable agent. Brainwashing is thus a process of ideological resocialization carried out within a structure of charismatic authority.

The brainwashing process may be operationalized as a sequence of well-defined and potentially observable phases. These hypothesized phases are (1) identity stripping, (2) identification, and (3) symbolic death/rebirth. The operational definition of brainwashing refers to the specific activities attempted, whether or not they are successful, as they are either observed directly by the ethnographer or reported in official or unofficial accounts by members or ex-members. Although the exact order of phases and specific steps within phases may vary from group to group, we should always expect to see the following features, or their functional equivalents, in any brainwashing system: (1) the constant fluctuation between assault and leniency; and (2) the seemingly endless process of confession, re-education, and refinement of confession.

D6. Hyper-credulity is defined as a disposition to accept uncritically all charismatically ordained beliefs. All lovers of literature and poetry are familiar with ‘that willing suspension of disbelief for the moment, which constitutes poetic faith’ (Coleridge 1970: 147). Hyper-credulity occurs when this state of mind, which in most of us is occasional and transitory, is transformed into a stable disposition. Hyper-credulity falls between hyper-suggestibility on the one hand and stable conversion of belief on the other.2 Its operational hallmark is plasticity in the assumption of deeply held convictions at the behest of an external authority. This is an other-directed form of what Robert Lifton (1968) has called the protean identity state.

D7. Relational Enmeshment is a state of being in which self-esteem depends upon belonging to a particular collectivity (Bion 1959; Bowen 1972; Sirkin and Wynne 1990). It may be operationalized as immersion in a relational network with the following characteristics: exclusivity (high ratio of in-group to out-group bonds), interchangeability (low level of differentiation in affective ties between one alter and another), and dependency (reluctance to sever or weaken ties for any reason). In a developmental context, something similar to this has been referred to by Bowlby (1969) as anxious attachment.
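
Purely as an illustration of how D7's three indicators might be scored, here is a minimal sketch. It assumes a hypothetical survey format in which each of a member's affective ties is recorded with an in-group/out-group flag, a subjective strength score, and a willingness-to-sever answer; the function names, the 0-1 strength scale, and the scoring choices are invented for exposition and are not part of the model.

```python
from dataclasses import dataclass
from statistics import mean, pstdev

@dataclass
class Tie:
    alter_in_group: bool   # is the other party a fellow group member?
    strength: float        # subjective closeness, assumed here to be scored 0.0-1.0
    would_sever: bool      # respondent says they would be willing to sever or weaken this tie

def exclusivity(ties):
    """Ratio of in-group to out-group bonds (the 'exclusivity' indicator)."""
    in_group = sum(t.alter_in_group for t in ties)
    out_group = len(ties) - in_group
    return in_group / out_group if out_group else float("inf")

def interchangeability(ties):
    """Low differentiation among affective ties, proxied here (illustratively)
    by the inverse spread of tie strengths: 1.0 means all ties feel alike."""
    strengths = [t.strength for t in ties]
    return 1.0 - pstdev(strengths) if len(strengths) > 1 else 1.0

def dependency(ties):
    """Share of ties the respondent is reluctant to sever or weaken."""
    return mean(0.0 if t.would_sever else 1.0 for t in ties)

# Hypothetical respondent: mostly in-group, uniformly strong, 'unbreakable' ties.
member_ties = [Tie(True, 0.90, False), Tie(True, 0.85, False),
               Tie(True, 0.90, False), Tie(False, 0.30, True)]
print(exclusivity(member_ties), interchangeability(member_ties), dependency(member_ties))
```

On this toy scoring, higher values on all three indicators would be read as deeper relational enmeshment; real measurement would of course require validated network and survey instruments rather than this sketch.
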
D8. Exit Costs are the subjective costs experienced by an individual who is contemplating leaving a collectivity. Obviously, the higher the perceived exit costs, the greater will be the reluctance to leave. Exit costs may be operationalized as the magnitude of the bribe necessary to overcome them. A person who is willing to leave if we pay him $1,000 experiences lower exit costs than one who is not willing to leave for any payment less than $1,000,000. With regard to cults, the exit costs are most often spiritual and emotional rather than material, which makes measurement in this way more difficult but not impossible.


Not all charismatic organizations engage in brainwashing. We therefore need a set of hypotheses that will allow us to test empirically whether any particular charismatic system attempts to practise brainwashing and with what effect. The brainwashing model asserts twelve hypotheses concerning the role of brainwashing in the production of uncritical obedience. These hypotheses are all empirically testable. A schematic diagram of the model I propose may be found in Figure 1.

This model begins with an assumption that charismatic leaders are capable of creating organizations that are easy and attractive to enter (even though they may later turn out to be difficult and painful to leave). There are no hypotheses, therefore, to account for how charismatic cults obtain members. It is assumed that an abundant pool of potential recruits to such groups is always available. The model assumes that charismatic leaders, using nothing more than their own intrinsic attractiveness and persuasiveness, are initially able to gather around them a corps of disciples sufficient for the creation of an attractive social movement. Many ethnographies (Lofland 1996; Lucas 1995) have shown how easy it is for such charismatic movement organizations to attract new members from the general pool of anomic ‘seekers’ that can always be found within the population of an urbanized mobile society.


The model does attempt to account for how some percentage of these ordinary members are turned into deployable agents. The initial attractiveness of the group, its vision of the future, and/or its capacity to bestow seemingly limitless amounts of love and esteem on the new member are sufficient inducements in some cases to motivate a new member to voluntarily undergo this difficult and painful process of resocialization.

H1. Ideological totalism is a necessary but not sufficient condition for the brainwashing process. Brainwashing will be attempted only in groups that are structured totalistically. However, not all ideologically totalist groups will attempt to brainwash their members. It should be remembered that brainwashing is merely a mechanism for producing deployable agents. Some cults may not want deployable agents or have other ways of producing them. Others may want them but feel uncomfortable about using brainwashing methods to obtain them, or they may not have discovered the existence of brainwashing methods.

H2. The exact nature of this resocialization process will differ from group to group, but, in general, will be similar to the resocialization process that Robert Lifton (1989) and Edgar Schein (1961) observed in Communist re-education centres in the 1950s. For whatever reasons, these methods seem to come fairly intuitively to charismatic leaders and their staffs. Although the specific steps and their exact ordering differ from group to group, their common elements involve a stripping away of the vestiges of an old identity, the requirement that repeated confessions be made either orally or in writing, and a somewhat random and ultimately debilitating alternation of the giving and the withholding of ‘unconditional’ love and approval. H2 further states that the maintenance of this program involves the expenditure of a measurable quantity of the collectivity’s resources. This quantity is known as C, where C equals the cost of the program and should be measurable at least at an ordinal level.

The resocialization process has baffled many observers, in my opinion because it proceeds simultaneously along two distinct but parallel tracks, one involving cognitive functioning and the other involving emotional networking. These two tracks lead to the attainment of states of hyper-credulity and relational enmeshment, respectively. The group member learns to accept with suspended critical judgement the often shifting beliefs espoused by the charismatic leader. At the same time, the group member becomes strongly attached to and emotionally dependent upon the charismatic leader and (often especially) the other group members, and cannot bear to be shunned by them.


H3. Those who go through the process will be more likely than those who do not to reach a state of hyper-credulity. This involves the shedding of old convictions and the assumption of a zealous loyalty to the beliefs of the moment, uncritically seized upon, so that all such beliefs become not mere ‘beliefs’ but deeply held convictions.

Under normal circumstances, it is not easy to get people to disown their core convictions. Convictions, once developed, are generally treated not as hypotheses to test empirically but as possessions to value and cherish. There are often substantial subjective costs to the individual in giving them up. Abelson (1986: 230) has provided convincing linguistic evidence that most people treat convictions more as valued possessions than as ways of testing reality. Cognitive dissonance theory predicts with accuracy that when subject to frontal attack, attachment to convictions tends to harden (Festinger, Riecken et al. 1956; O’Leary 1994). Therefore, a frontal attack on convictions, without first undermining the self-image foundation of these convictions, is doomed to failure. An indirect approach through brainwashing is often more effective.


When the state of hyper-credulity is achieved, it leaves the individual strongly committed to the charismatic belief of the moment but with little or no critical inclination to resist charismatically approved new or contradictory beliefs in the future and little motivation to attempt to form accurate independent judgments of the consequences of assuming new beliefs. The cognitive track of the resocialization process begins by stripping away the old convictions and associating them with guilt, evil, or befuddlement. Next, there is a traumatic exhaustion of the habit of subjecting right-brain convictions to left-brain rational scrutiny. This goes along with an increase in what Snyder (1974) has called self-monitoring, implying a shift from central route to peripheral route processing of information in which the source rather than the content of the message becomes all important.

H4. As an individual goes through the brainwashing process, there will be an increase in relational enmeshment with measurable increases occurring at the completion of each of the three stages. The purging of convictions is a painful process and it is reasonable to ask why anybody would go through it voluntarily. The payoff is the opportunity to feel more connected with the charismatic relational network. These people have also been through it, and only they really understand what you are going through. So cognitive purging leads one to seek relational comfort, and this comfort becomes enmeshing. The credulity process and the enmeshing process depend on each other.

The next three hypotheses are concerned with the fact that each of the three phases of brainwashing achieves plateaus in both of these processes. The stripping phase creates the vulnerability to this sort of transformation. The identification phase creates realignment, and the rebirth phase breaks down the barrier between the two so that convictions can be emotionally energized and held with zeal, while emotional attachments can be sacralized in terms of the charismatic ideology. The full brainwashing model actually provides far more detailed hypotheses concerning the various steps within each phase of the process. Space constraints make it impossible to discuss these here. An adequate technical discussion of the manipulation of language in brainwashing, for example, would require a chapter at least the length of this one. Figure 2 provides a sketch of the steps within each phase. Readers desiring more information about these steps are referred to Lifton (1989: chap. 5).

[Figure 2: The Stages of Brainwashing & Their Effect on Hyper-credulity and Emotional Enmeshment]

H5. The stripping phase. The cognitive goal of the stripping phase is to destroy prior convictions and prior relationships of belonging. The emotional goal of the stripping phase is to create the need for attachments. Overall, at the completion of the stripping phase, the situation is such that the individual is hungry for convictions and attachments and dependent upon the collectivity to supply them. This sort of credulity and attachment behaviour is widespread among prisoners and hospital patients (Goffman 1961).

H6. The identification phase. The cognitive goal of the identification phase is to establish an imitative search for conviction and bring about the erosion of the habit of incredulity. The emotional goal of the identification phase is to instill the habit of acting out through attachment. Overall, at the completion of the identification phase, the individual has begun the practice of relying on the collectivity for beliefs and for a cyclic emotional pattern of arousal and comfort. But at this point, this reliance is just one highly valued form of existence. It is not yet viewed as an existential necessity.

H7. The symbolic death and rebirth phase. In the death and rebirth phase, the cognitive and emotional tracks come together and mutually support each other. This often gives the individual a sense of having emerged from a tunnel and an experience of spiritual rebirth.3 The cognitive goal of this phase is to establish a sense of ownership of (and pride of ownership in) the new convictions. The emotional goal is to make a full commitment to the new self that is no longer directly dependent upon hope of attachment or fear of separation. Overall, at the completion of the rebirth phase we may say that the person has become a fully deployable agent of the charismatic leader. The brainwashing process is complete.

H8 states that the brainwashing process results in a state of subjectively elevated exit costs. These exit costs cannot, of course, be observed directly. But they can be inferred from the behavioural state of panic or terror that arises in the individual at the possibility of having his or her ties to the group discontinued. The cognitive and emotional states produced by the brainwashing process together bring about a situation in which the perceived exit costs for the individual increase sharply. This closes the trap for all but the most highly motivated individuals, and induces in many a state of uncritical obedience. As soon as exit from a group (or even from its good graces) ceases to be a subjectively palatable option, it makes sense for the individual to comply with almost anything the group demands–even to the point of suicide in some instances. Borrowing from Sartre’s insightful play of that name, I refer to this situation as the ‘no exit’ syndrome. When demands for compliance are particularly harsh, the hyper-credulity aspect of the process sweetens the pill somewhat by allowing the individual to accept uncritically the justifications offered by the charismatic leader and/or charismatic organization for making these demands, however far-fetched these justifications might appear to an outside observer.

H9 states that the brainwashing process results in a state of ideological obedience in which the individual has a strong tendency to comply with any behavioural demands made by the collectivity, especially if motivated by the carrot of approval and the stick of threatened expulsion, no matter how life-threatening these demands may be and no matter how repugnant such demands might have been to the individual in his or her pre-brainwashed state.

H10 states that the brainwashing process results in increased deployability. Deployability extends the range of ideological obedience in the temporal dimension. It states that the response continues after the stimulus is removed. This hypothesis will be disconfirmed in any cult within which members are uncritically obedient only while they are being brainwashed but not thereafter. The effect need not be permanent, but it does need to result in some measurable increase in deployability over time.

H11 states that the ability of the collectivity to rely on obedience without surveillance will result in a measurable decrease in surveillance. Since surveillance involves costs, this decrease will lead to a quantity S, where S equals the savings to the collectivity due to diminished surveillance needs and should be measurable at least to an ordinal level.

H12 states that S will be greater than C. In other words, the savings to the collectivity due to decreased surveillance needs is greater than the cost of maintaining the brainwashing program. Only where S is greater than C does it make sense to maintain a brainwashing program. Cults with initially high surveillance costs, and therefore high potential savings due to decreased surveillance needs [S], will tend to be more likely to brainwash, as will cults structured so that the cost of maintaining the brainwashing system [C] is relatively low.
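
As a compact restatement of this decision rule only (using the C defined in H2 and the S defined in H11; the net term N is introduced here purely for exposition and is not part of the original model):

\[
N = S - C, \qquad \text{it is rational to maintain the brainwashing program only if } N > 0, \text{ i.e., } S > C.
\]

Since both quantities are said to be measurable only at an ordinal level, the comparison would in practice rest on rank-order judgments (S clearly exceeds C, or clearly does not) rather than on an exact arithmetic difference.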


Characteristics of a Good Theory

There is consensus in the social sciences that a good inductive qualitative theory is one that is falsifiable, internally consistent, concrete, potentially generalizable, and has a well-defined dependent variable (King, Keohane et al. 1994). I think it should be clear from the foregoing that this theory meets all of these conditions according to prevailing standards in the social and behavioural sciences. However, since brainwashing theory has received much unjustified criticism for its lack of falsifiability and its lack of generalizability, I will briefly discuss the theory from these two points of view.

The criterion of falsifiability, as formulated primarily by Popper (1968), is the essence of what separates theory from dogma in science. Every theory must be able to provide an answer to the question of what evidence would falsify it. If the answer is that there is no possible evidence that would lead us to reject a so-called theory, we should conclude that it is not really a theory at all but just a piece of dogma.

Although Dawson (1998) and Richardson (1993) have included the falsifiability problem in their critiques of brainwashing, this criticism is associated mainly with the work of Dick Anthony (1996). Anthony’s claim that brainwashing theory is unfalsifiable is based upon two related misunderstandings. First, he argues that it is impossible to prove that a person is acting with free will, so, to the extent that brainwashing theory rests on the overthrow of free will, no evidence can ever disprove it. Second, he applies Popper’s criterion to cults in a way more appropriate for a highly developed deductive theoretical system. He requires that either brainwashing explain all ego-dystonic behaviour in cults or acknowledge that it can explain none of it. But, as we have seen, brainwashing is part of an inductive multifactorial approach to the study of obedience in cults and should be expected to explain only some of the obedience produced in some cults.

With regard to generalizability, cultic brainwashing is part of an important general class of phenomena whose common element is what Anthony Giddens has called ‘disturbance of ontological security’ in which habits and routines cease to function as guidelines for survival (Cohen 1989: 53). This class of phenomena includes the battered spouse syndrome (Barnett and LaViolette 1993), the behaviour of concentration camp inmates (Chodoff 1966), the Stockholm Syndrome (Kuleshnyk 1984; Powell 1986), and, most importantly, behaviour within prisoner of war camps and Communist Chinese re-education centres and ‘revolutionary universities’ (Lifton 1989; Sargant 1957; Schein 1961). There exist striking homologies in observed responses across all of these types of events, and it is right that our attention be drawn to trying to understand what common theme underlies them all. As Oliver Wendell Holmes (1891: 325) attempted to teach us more than a century ago, the interest of the scientist should be guided, when applicable, by ‘the plain law of homology which declares that like must be compared with like.’



  1. Because of space limitations, I cannot give this important subject the attention it deserves in this chapter. Readers not familiar with the concept are referred to the much fuller discussion of this subject in the book by Robert Lifton as cited.
  2. Students of cults have sometimes been misled into confusing this state of hyper-credulity with either hyper-suggestibility on the one hand or a rigid ‘true belief’ system on the other. But at least one study has shown that neither the hyper-suggestible, easily hypnotized person nor the structural true believer is a good candidate for encapsulation in a totalist cult system (Solomon 1981: 111-112). True believers (often fundamentalists who see in the cult a purer manifestation of their own worldview than they have seen before) do not do well in cults and neither do dyed-in-the-wool sceptics who are comfortable with their scepticism. Rather it is those lacking convictions but hungering for them that are the best candidates.
  3. Hopefully, no reader will think that I am affirming the consequent by stating that all experiences of spiritual rebirth must be caused by brainwashing. This model is completely compatible with the assumption that most spiritual rebirth experiences have nothing to do with brainwashing. The reasoning here is identical to that connecting epilepsy with visions of the holy. The empirical finding that seizures can be accompanied by visions of the holy does not in any way imply that such visions are always a sign of epilepsy.

Conversion Techniques: Changing Minds & Persuasion (Part 17 – Radicalization)


The process of radicalization, including social, ideological and purpose conversion, is of great concern in times when radicals take extreme action. Here are some notes on how a person may become radicalized.


The process of radicalization often starts with some form of transgression by the other side, breaking rules that the person’s side holds as very important.


A common transgressing action is mistreatment, typically by the authorities or military personnel using methods that cause extreme physical pain or mental distress. The mistreatment may be of the person who then becomes radicalized, but often it is of other people, who are then lionized as heroes or martyrs.
For example, extreme methods of interrogation of suspects in Northern Ireland in the 1970s led to those suspects becoming radicalized, and their story led many others to take a strong position. More recently, Abu Ghraib and Guantanamo Bay are clear examples.

Mistreatment can be historical and reasons for radicalization can go back generations. Past wars, massacres, persecutions and so on can fester for hundreds of years.

Mistreatments today such as rape and child abuse are also extreme transgressions that effectively radicalize those who would severely punish the perpetrators. Many of us who think we would never be radicalized still hold extreme views on such topics.


If there is no direct mistreatment then the inherent badness of the other side may be inferred from their transgression of an inviolable law or value.
They may say things or take actions which are shocking and unthinkable, thereby proving their unworthiness. They may have betrayed a trust, defiled a holy object, conducted black rituals, blasphemed or otherwise shown a terrible lack of respect for people or social rules.

Religion has been a source of radicalized conflict for many centuries.


A radical needs a movement, a cause. At some point, the outrage at the transgression is converted into organization for consequent action.


A critical response to transgression is that at least some people are outraged or feel such a strong sense of betrayal that they seek justice, typically the extreme vengeance of retributive justice that lies outside national laws. This may be because the laws are seen as inadequate or because they represent the governments that are the target of the outrage.


At some point, a core organization is set up to drive the ideals and action. This typically happens in two ways. One is where an individual leader starts alone. The other is where the core arises more spontaneously as concerned individuals find one another.

People in the core (often a single leader) may write or use a critical text or otherwise use charismatic oration to establish the central message.
Cores can also be diffuse, for example where they are based on central texts which are interpreted and acted upon in localized core organizations.


After initial development of the core message and core group, the organization starts to develop. This may be done formally or remain relatively informal. Key parts of this are in promoting the message, recruiting new people and driving action.

This focus leads to a need for more people to spread the message and take action. The purpose of the core is then to sustain the focus and drive the rest of the organization.

The organization may be strictly hierarchical, but it may also be very diffuse, with independent cells adopting the ideals and acting on their own.


When the transgression leads to some people seeking revenge then they may seek to organize in some way, recruiting and converting others to the cause.


The call to arms goes through many channels, typically targeting groups where members may already feel the sense of injustice, such as minority religions, the unemployed, low-status women and so on. Other vulnerable people may well also be targeted.

Communication may include preaching, emails, posters, one-to-one calls and so on. While these do not radicalize alone, they often take the first step in communicating urgency or outrage. Later, the volume and intensity of messages create enough tension to trigger action.

Initial communication may be subtle and seemingly about other subjects. Religions can be like this, first creating a desirable place, selling friendship and salvation before radical action.

Sooner or later, the subject of discussion turns to the basic transgression, including the mistreatment or immorality and the consequent sense of outrage. This creates anger and a desire for action.


A key part of the message is to demonize the other side, objectifying the people as subhuman, using amplification, negative stereotypes and simplified schema.

In doing so, the argument is polarized. By showing that the other side is so extreme, the simple conclusion is reached that extreme action is the only possible route forward. The arguments used may well be full of fallacies but the passion and underlying messages are clear.


A critical part of radicalization is often in the way the message is socialized, becoming a central part of everyday conversation outside of the rallying hall.
For socialization to work best, this conversation should be contained, with any contrary messages being kept at bay. Where possible, the people will be isolated to insulate them from external dissuasion. Where this is not possible, inoculation may be used to help them ward off other views.

Groupthink and other social means of ensuring conformance may also be used to keep people on track.


At some point, the need for action is raised and the radicalized person is moved towards proving their passion.


The need to act and the required action may appear through the direction of a group leader, though it may also emerge via less structured groups talking about what they might do. Action can range from protest to acts of terrorism and may start small and escalate either with success or frustration at limited success.

A way that commitment is built is with sequential requests, such as Foot In The Door (FITD), where small initial requests that are easy to comply with are later followed up with larger requests.


Fulfilling the requirement is often linked to a promise of glory, from the admiration of peers to a guaranteed place in heaven.

People who have already taken such action are held up as heroes. They and their actions are glorified, and the radicalized people are made to feel that they are almost in that state of being deeply admired by many. All that is needed is heroic action.
This in par

Conversion Techniques: Changing Minds & Persuasion (Part 16 – Progressive Demands)



Start with small, very reasonable requests. When they comply with these, make requests that require slightly more effort. In this way, gradually increase the requirements on them until they are doing whatever it is you really want them to do.

The requests can also gradually change from optional questions to strong demands. Start with ‘please’ and ‘if you like’, and then move to ‘you must’ and ‘do this now’. Progressive demands may also be used to get people to do things which are increasingly further away from their normal values.


A person in a group is asked, in sequence, to:

– Help carry leaflets to town
– Help hand out leaflets
– Say a few words about the leaflets
– Speak more about the leaflets
– Ask for a donation
– Use a megaphone to talk about the subject
– Use persuasive methods to get donations
– Pursue people aggressively to get them to ‘donate’

At each stage, there is an implied promise of acceptance into the inner circle if just this action is taken.


When you are walking up a convex hill, it often looks like the top of the hill is not far away, but as you walk, the brow just stays out of reach. This is not unlike the experience of progressive demands. Most of the time, you think that you are nearly there.

The overall approach is ‘foot in the door’, where a small request is followed up with a larger request. This works by the consistency principle, whereby having agreed to do something, we believe that we wanted to do it and so start to shift our underlying belief system to align with our actions. Before long, we are effectively brainwashed as we believe that even the extreme actions we have been led to are the right things to do.

This consistency effect may also include a belief that we must continue doing what we are told. When this leads to increasing dependency, we end up blindly following orders.

Conversion Techniques: Changing Minds & Persuasion (Part 15 – Persistence)


Many groups are nothing if not persistent in hauling in the fish of the new member. Once they have decided that you will make a good member and are ripe for conversion, they will not leave you alone.


Almost magically, the recruiting group members seem to be around wherever the target person is. They turn up where the person is studying, working or playing. From conversations and other research they track down their prey, and then they play their strategy.

Whatever works

When they are in the presence of the person, they then play out their strategic plans for reeling them in.

Their strategic persistence is often very adaptive. If you are seeking meaning, they offer deep meaning. If you want a platform they will listen very attentively. If you are lost, they will show the way. They will also seek out your weaknesses along the way and will use whatever it takes to get you to join the group.
They may start with friendship but may also become strident and insistent. They may play guilt games and seek to create an exchange, for example by telling the target person how they have ‘come all this way’ to see them and obliging them to spend time with the cult member.

They may also play games of hurt and rescue, perhaps engineering situations where the target person gets hurt so that they can be rescued. Like a fisherman reeling in the line and then letting it play out again, they steadily pull in their prey towards the net.

Conversion Techniques: Changing Minds & Persuasion (Part 14 – The Love Bomb)


The Love Bomb is a classic method used by groups (and is particularly associated with cults) in the initial attraction of new members.

Find someone starved of love

The first step is in spotting the potential group member as somebody who is seeking love and affection. They may have been rejected by partners, parents, siblings or peers, or have suffered other such developmental setbacks. The common factor is a need for affection that they are unable to find in their current relationships.
The cause of their not finding love may or may not be due to some problem on their part. For example, the person may be so desperate that they chase people too ardently, effectively chasing them away.

They may also be affected by a personal trauma, feeling depressed or detached from what might seem an uncaring world.

Offer them unconditional affection

The group members approach the target person as if they were their best friend. Whatever the person says is considered remarkable and interesting. Quirks of personality are ignored. Attention and affection are showered on them by all members of the group at every opportunity.

They are invited to simple meetings at which the attention continues and they are made to feel special at every opportunity. Through empathetic and concerned listening and probing, the group members also try to determine the ideal type of friend that the person is seeking.

Use love as a reward for correct behavior

When people join the group, love becomes one of the methods of keeping them there. It is not now as constant as it was before and it is certainly not unconditional. Now love is given as a reward and removed when behavior is not what is required.

Thus, when people in groups are asked about why they stay, they will still talk about it being a loving place. Their attention and the preaching within the group may well be about love, but it is now on a diet, and they are taught that affection is a just reward for correct behavior.

Conversion Techniques: Changing Minds & Persuasion (Part 13 – Isolation)


One of the methods by which groups convert and retain members is by separating them from influences that enable or encourage them to think in contrary ways.


One of the first dilemmas for groups seeking to recruit new members is how to get them in one place long enough to apply sufficient persuasion to cause them to convert (or at least take the next step in the right direction).

The weekend session

One of the most effective ways of doing this is to invite them to a ‘weekend in the country’. The event may be framed as getting to know more friends, discussions, education or other attractive purposes. If they cannot attend the whole weekend, they are invited for one day, then coaxed into staying longer, perhaps by promises of revelations the next day and through social pressure of everyone else staying.

Social events

Another method is through shorter-term sessions, perhaps lasting just one evening, where it may appear that there are a number of other recruits who all are persuaded – whilst the truth might be that they are already full members of the group.

Individual relationships

An even slower method is to build one-to-one relationships, which may even be romantic in nature or may just be based on apparent friendship. Over time, the person persuading steadily moves the other person’s thinking until they are ready for something like the weekend session.

Excluding contrary influence

Even if a person has been provided with persuasive arguments, they may be dissuaded from joining the group, or even persuaded to leave, by contrary arguments (particularly if the original arguments are shaky).

Physical isolation

The first stage is to isolate people from external influences by moving the people physically away from them. Hence the weekend session is most effectively done when there is no way for the people to escape (for example they were transported there by group members and it is a long way home).
Isolation may also be within the walls of a building within a town or city (although now it is easier for the person to leave) or even within one room. For single meetings, this is often all that is needed. Even meeting in public places is sufficient if no dissuading messages may be seen or heard.

The bottom line is often the question of how far different from the person’s life the persuasive message is. If they are being told how bad the world is, then meeting in a pleasant restaurant is probably not very effective. Yet it may be a good place to talk about the joys of the group.

Mental isolation

There are many ways that a person can be made to feel alone, and hence seek the attention of whoever is there. If they are told that all they once held to be true is false, then they will start to feel uncertain.

Emotional isolation occurs when they feel that others who they once trusted actually do not care about them. For example if communications from friends and family are blocked, it may be suggested that they have not communicated because they do not care. Fears about others not caring may be amplified in discussions.

Solitary confinement is known as a severe punishment, as full physical isolation leads to full mental isolation. Confined prisoners may hallucinate, lose track of time and become depressed and desperate. Any feeling of being alone leads to seeking any company and any discussion to fill the intellectual and emotional emptiness.

Control of media

Once physical isolation is achieved, a further step is to use information control to ensure that no contrary messages appear by accident. Thus newspapers, television, books etc. may all be removed, censored or controlled. These can then be replaced with confirming and persuading literature and other media.
If all around them the people see messages that point in one direction, then they are more likely to accept the messages as true. Messages from apparently different sources that all say the same thing can be more persuasive than messages from one source alone.

Social confirmation

Perhaps the most persuasive message is one that you are told in the corridor by friends who seem not to have any particular axe to grind. Social confirmation occurs when everyone else confirms the core message. What is not always noticed is that this is also social isolation – those who would contradict the message are being kept away.

Excluding contrary thinking

A final level of persuasion is to isolate thoughts within the heads of the people being persuaded.

Black-and-white thinking

With the use of polarized values and messages, the group are painted as being whiter than white and everyone else as a darker shade of black. Choices are stark, even in thinking. There is no ‘maybe’. You are either with us or against us.


When values are involved, then the choices are not just between agreement or disagreement – they are about good and bad. Any thought that is against group values and rules is framed as bad, which carries a heavy guilt penalty.
People can thus be persuaded and coerced into feeling a strong sense of shame about every bad thought they have. Phobias may even be induced about contrary thinking and even the thought of having a ‘wrong’ thought can induce panic.


Thought-stopping includes various methods of stopping thinking by distraction or dissuasion. For example a group member who meets someone from outside who tries to get them to leave the group will effectively isolate themselves from the argument by retreating back inside their heads and ignoring the dissuasive argument.

Conversion Techniques: Changing Minds & Persuasion (Part 12 – Incremental Conversion)


Conversion to another set of beliefs may be sudden or it may happen slowly over a period of time.

The incremental personality

A person who is converted to a different set of beliefs over time takes a very different approach to the person who converts suddenly.

They have a need for truth more than certainty, gaining their sense of control through the certainty of understanding rather than the certainty of blind belief. They may thus prefer to question arguments rather than accept external authorities. They may be cynical and untrusting, needing to be persuaded by reason rather than by assertion.

They will see the world in shades of gray, rather than black and white, and hence are able to change their beliefs one step at a time. They may be able to hold both beliefs simultaneously, even if they are diametrically opposed, either by compartmentalizing their thoughts or by having a worldview that sees things in terms of possibility rather than certainty.

Evidence and experience

A key trick in converting incrementally is to pile up the evidence one step at a time. Giving a lot of evidence at once is likely to result in the person feeling overloaded and hence ignoring all of the evidence or at least a significant portion.

If attention is paid to the stress people are showing from receiving new information, then the right point at which to stop can be found. This may be determined with experiments in non-important areas.

An effective way of providing evidence is through direct experience. If a person sees, hears or feels something first-hand, then it is far more difficult for them to deny this than if they are told about it second- or third-hand.

Reflection and integration

Even after an experience, the person may not convert or even take a small step. In this case they probably need time in which to reflect on the actual meaning (to them) of what they have heard or have experienced.

Reflection can be enabled by giving the person time to think. It can also be encouraged by open discussion and ‘musings’ that nudge them in the right direction.

When they have made sense of their experience, they still need to integrate this thinking into their current schemas (or ‘mental models’). Again, this needs time, and focused conversation may be used to uncover the relevant schema and subsequently help them fit the new ideas into this framework.