Cultural Contention over the Concept of Brainwashing (Benjamin Zablocki, 2001)

NOTE: The following article is taken from the fifth chapter of Misunderstanding Cults: Searching for Objectivity in a Controversial Field, entitled ‘Towards a Demystified and Disinterested Scientific Theory of Brainwashing.’

That Word ‘Brainwashing’

The word brainwashing is, in itself, controversial and arouses hostile feelings. Since there is no scientific advantage in using one word rather than another for any concept, it may be reasonable in the future to hunt around for another word that is less polemical. We need a universally recognized term for a concept that stands for a form of influence manifested in a deliberately and systematically applied traumatizing and obedience-producing process of ideological resocialization.


Currently, brainwashing is the generally accepted term for this process, but I see no objection to finding another to take its place. There are in fact other terms, historically, that have been used instead, like ‘thought reform’ and ‘coercive persuasion.’ Ironically, it has been those scholars who complain the most about ‘the B-word’ who have also been the most insistent that none of the alternatives is any better. As long as others in the field insist on treating all possible substitute constructions as nothing more than gussied-up synonyms for a mystified concept of brainwashing (see, for example, Introvigne 1998: 2), there is no point as yet in trying to introduce a more congenial term.

An overly literal reading of the word brainwashing (merely a literal translation of the accepted Chinese term hsi nao) could be misleading, as it seems to imply the ability to apply some mysterious biochemical cleanser to people’s brains. However, the word has never been intended as a literal designator but as a metaphor. It would be wise to heed Clifford Geertz’s (1973: 210) warning in this connection, to avoid such a ‘flattened view of other people’s mentalities [that] more complex meanings than [a] literal reading suggests [are] not even considered.’

Thus, please don’t allow yourself to become prejudiced by a visceral reaction to the word instead of attending to the underlying concept. There is a linguistic tendency, as the postmodernist critics have taught us, for the signified to disappear beneath the signifier. But the empirically based social sciences must resist this tendency by defining terms precisely. The influence of media-driven vulgarizations of concepts should be resisted. This chapter argues for the scientific validity of a concept, not a word. If you are interested in whether the concept has value, but you gag on the word, feel free to substitute a different word in its place. I myself have no particular attachment to the word brainwashing.

But if all we are talking about is an extreme form of influence, why do we need a special name for it at all? The name is assigned merely for convenience. This is a common and widely accepted practice in the social sciences. For example, in economics a recession is nothing more than a name we give to two consecutive quarters of economic contraction. There is nothing qualitatively distinctive about two such consecutive quarters as opposed to one or three. The label is assigned arbitrarily at a subjective point at which many economists begin to get seriously worried about economic performance. This label is nevertheless useful as long as we don’t reify it by imagining that it stands for some real ‘thing’ that happens to the economy when it experiences precisely two quarters of decline. Many other examples of useful definitions marking arbitrary points along a continuum could be cited. There is no objective way to determine the exact point at which ideological influence becomes severe and encompassing enough, and its effects long lasting enough, for it to be called brainwashing. Inevitably, there will be marginal instances that could be categorized either way. But despite the fact that the boundary is not precisely defined, it demarcates a class of events worthy of systematic study.

The Reciprocal Moral Panic

Study of brainwashing has been hampered by partisanship and tendentious writing on both sides of the conflict. In one camp, there are scholars who very badly don’t want there to be such a thing as brainwashing. Its non-existence, they believe, will help assure religious  liberty, which can only be procured by defending the liberty of the most unpopular religions. If only the non-existence of brainwashing can be proved, the public will have to face up to the hard truth that some citizens choose to follow spiritual paths that may lead them in radical directions. This camp has exerted its influence within academia. But, instead of using its academic skills to refute the  brainwashing conjecture, it has preferred to attack a caricature of brainwashing supplied by anti-cult groups for litigational rather than scientific purposes.


In the other camp, we find scholars who equally badly do want there to be such a thing as brainwashing. Its existence, they believe, will give them a rationale for opposition to groups they consider dangerous. A typical example of their reasoning can be found in the argument put forth by Margaret Singer that ‘Despite the myth that normal people don’t get sucked into cults, it has become clear over the years that everyone is susceptible to the lure of these master manipulators’ (Singer 1995: 17). Using a form of backward reasoning known as the ecological fallacy, she argues from the known fact that people of all ages, social classes, and ethnic backgrounds can be found in cults to the dubious conclusion that everyone must be susceptible. These scholars must also share some of the blame for tendentious scholarship. Lacking positions of leadership in academia, scholars on this side of the dispute have used their expertise to influence the mass media, and they have been successful because sensational allegations of mystical manipulative influence make good journalistic copy.

It’s funny in a dreary sort of way that both sides in this debate agree that it is a David and Goliath situation, but each side fancies itself to be the David courageously confronting the awesome power of the opposition. Each side makes use of an exaggerated fear of the other’s influence to create the raw materials of a moral panic (Cohen 1972; Goode and Ben-Yehuda 1994). Thus, a disinterested search for truth falls victim to the uncompromising hostility created by each side’s paranoid fear of the power of the other.

David with the head of Goliath.

The ‘cult apologists’ picture themselves as fighting an underdog battle against hostile lords of the media backed by their armies of ‘cult-bashing’ experts. The ‘cult bashers’ picture themselves as fighting an underdog battle for a voice in academia in which apologists seem to hold all the gatekeeper positions. Each side justifies its rhetorical excesses and hyperbole by reference to the overwhelming advantages held by the opposing side within its own arena. But over the years a peculiar symbiosis has developed between these two camps. They have come to rely on each other to define their positions. Each finds it more convenient to attack the positions of the other than to do the hard work of finding out what is really going on in cults. Thomas Robbins (1988: 74) has noted that the proponents of these two models ‘tend to talk past each other since they employ differing interpretative frameworks, epistemological rules, definitions… and underlying assumptions.’ Most of the literature on the subject has been framed in terms of rhetorical disputes between these two extremist models. Data-based models have been all but crowded out.

Between these two noisy and contentious camps, we find the curious but disinterested scientist who wants to find out if there is such a thing as brainwashing but will be equally satisfied with a positive or negative answer. I believe that there can and should be a moderate position on the subject. Such a position would avoid the absurdity of denying any reality to what thousands of reputable ex-cult members claim to have experienced–turning this denial into a minor cousin of Holocaust denial. At the same time, it would avoid the mystical concept of an irresistible and overwhelming force that was developed by the extremist wing of the anti-cult movement.

One of the most shameful aspects of this whole silly affair is the way pro-religion scholars have used their academic authority to foist off the myth that the concept of brainwashing needs no further research because it has already been thoroughly debunked. Misleadingly, it has been argued (Introvigne forthcoming; Melton forthcoming) that the disciplines of psychology and sociology, through their American scholarly associations, have officially declared the concept of brainwashing to be so thoroughly discredited that no further research is needed. Introvigne, by playing fast and loose with terminology, attempts to parlay a rejection of a committee report into a rejection of the brainwashing concept by the American Psychological Association. He argues that ‘To state that a report “lacks scientific rigor” is tantamount to saying that it is not scientific’ (Introvigne 1998: 3), gliding over the question of whether the ‘it’ in question refers to the committee report or the brainwashing concept.2 Conveniently, for Introvigne, the report in question was written by a committee chaired by Margaret Singer, whose involuntarist theory of brainwashing is as much a distortion of the foundational concept as Introvigne’s parody of it.

The truth is that both of these scholarly associations (American Psychological Association and American Sociological Association) were under intense pressure by a consortium of pro-religious scholars (a.k.a. NRM scholars) to sign an amicus curiae brief alleging consensus within their fields that brainwashing theory had been found to be bunk. This was in regard to a case concerning Moonie brainwashing that was before the United States Supreme Court (Molko v Holy Spirit Ass’n, Supreme Court of Calif. SF 25038; Molko v Holy Spirit Ass’n, 762 P.2d 46 [Cal. 1988], cert. denied, 490 U.S. 1084 [1989]). The bottom line is that both of the associations, after bitter debate, recognized that there was no such consensus and refused to get involved. Despite strenuous efforts of the NRM scholars to make it appear otherwise, neither professional association saw an overwhelming preponderance of evidence on either side. Both went on the record with a statement virtually identical to my argument in this chapter: that not nearly enough is known about this subject to be able to render a definitive scientific verdict, and that much more research is needed. A few years later, the Society for the Scientific Study of Religion went on record with a similar statement, affirming ‘the agnostic position’ on this subject and calling for more research (Zablocki 1997: 114).

Although NRM scholars have claimed to be opposed only to the most outrageously sensationalized versions of brainwashing theory, the result, perhaps unintended, of their campaign has been to bring an entire important area of social inquiry to a lengthy halt. Evidence of this can be seen in the fact that during the period of 1962 to 2000, a time when cults flourished, not a single article supportive of brainwashing has been published in the two leading American journals devoted to the sociology of religion, although a significant number of such articles have been submitted to those journals and more than a hundred such articles have appeared in journals marginal to the field (Zablocki 1998: 267).

Crime scene photo of Heaven’s Gate bodies found in Rancho Santa Fe, CA.

The erroneous contention that brainwashing theory has been debunked by social science research has been loudly and frequently repeated, and this ‘big lie’ has thus come to influence the thinking of neutral religion scholars. For example, even Winston Davis, in an excellent article on suicidal obedience in Heaven’s Gate, expresses characteristic ambivalence over the brainwashing concept:

‘Scholarship in general no longer accepts the traditional, simplistic theory of brainwashing… While the vernacular theory of brainwashing may no longer be scientifically viable, the general theory of social and psychological conditioning is still rather in good shape… I therefore find nothing objectionable [sic] in Benjamin Zablocki’s revised theory of brainwashing as “a set of transactions between a charismatically led collectivity and an isolated agent of the collectivity with the goal of transforming the agent into a deployable agent.” The tale I have to tell actually fits nicely into several of Robert Lifton’s classical thought reform categories’ (Davis 2000: 241-2).

The problem with this all too typical way of looking at things is that I am not presenting some new revised theory of brainwashing but simply a restatement of Robert Lifton’s (1989, 1999) careful and rigorous theory in sociological terms.

There are, I believe, six issues standing in the way of our ability to transcend this reciprocal moral panic. Let us look closely at each of these issues with an eye to recognizing that both sides in this conflict may have distorted the scientifically grounded theories of the foundational theorists–Lifton (1989), Sargant (1957), and Schein (1961)–as they apply to cults.

The Influence Continuum

The first issue has to do with the contention that brainwashing is a newly discovered form of social influence involving a hitherto unknown social force. There is nothing about charismatic influence and the obedience it instills that is mysterious or asks us to posit the existence of a new force. On the contrary, everything about brainwashing can be explained entirely in terms of well-understood scientific principles. As Richard Ofshe has argued: ‘Studying the reform process demonstrates that it is no more or less difficult to understand than any other complex social process and produces no results to suggest that something new has been discovered. The only aspect of the reform process that one might suggest is new, is the order in which the influence procedures are assembled and the degree to which the target’s environment is manipulated in the service of social control. This is at most an unusual arrangement of commonplace bits and pieces’ (1992: 221-2).

Would-be debunkers of the brainwashing concept have argued that brainwashing theory is not just a theory of ordinary social influence intensified under structural conditions of ideological totalism, but is rather a ‘special’ kind of influence theory that alleges that free will can be overwhelmed and individuals brought to a state of mind in which they will comply with charismatic directives involuntarily, having surrendered the capability of saying no. Of course, if a theory of brainwashing really did rely upon such an intrinsically untestable notion, it would be reasonable to reject it outright.

The attack on this so-called involuntarist theory of brainwashing figures prominently in the debunking efforts of a number of scholars (Barker 1989; Hexham and Poewe 1997; Melton forthcoming), but is most closely identified with the work of Dick Anthony (1996), for whom it is the linchpin of the debunking argument. Anthony argues, without a shred of evidence that I have been able to discover, that the foundational work of Lifton and Schein and the more recent theories of myself (1998), Richard Ofshe (1992), and Stephen Kent (Kent and Krebs 1998) are based upon what he calls the ‘involuntarism assumption.’ It is true that a number of prominent legal cases have hinged on the question of whether the plaintiff’s free will had been somehow overthrown (Richardson and Ginsburg 1998). But nowhere in the scientific literature has there been such a claim. Foundational brainwashing theory has not claimed that subjects were robbed of their free will. Neither the presence nor the absence of free will can ever be proved or disproved. The confusion stems from the difference between the word free as it is used in economics as an antonym for costly, and as it is used in philosophy as an antonym for deterministic. When brainwashing theory speaks of individuals losing the ability to freely decide to obey, the word is being used in the economic sense. Brainwashing imposes costs, and when a course of action has costs it is no longer free. The famous statement by Rousseau (1913: 3) that ‘Man is born free, and everywhere he is in chains’ succinctly expresses the view that socialization can impose severe constraints on human behaviour. Throughout the social sciences, this is accepted almost axiomatically. It is odd that only in the sociology of new religious movements is the importance of socialization’s ability to constrain largely ignored.


Unidirectional versus Bidirectional Influence

The second issue has to do with controversy over whether there are particular personality types drawn to cults and whether members are better perceived as willing and active seekers or as helpless and victimized dupes, as if these were mutually exclusive alternatives. Those who focus on the importance of the particular traits that recruits bring to their cults tend to ignore the resocialization process (Anthony and Robbins 1994).3 Those who focus on the resocialization process often ignore personal predispositions (Singer and Ofshe 1990).

All this reminds me of being back in high school when people used to gossip about girls who ‘got themselves pregnant.’ Since that time, advances in biological theory have taught us to think more realistically of ‘getting pregnant’ as an interactive process involving influence in both directions. Similarly, as our understanding of totalistic influence in cults matures, I think we will abandon unidirectional explanations of cultic obedience in favour of more realistic, interactive ones. When that happens, we will find ourselves able to ask more interesting questions than we do now. Rather than asking whether it is the predisposing trait or a manipulative process that produces high levels of uncritical obedience, we will ask just what predisposing traits of individuals interact with just what manipulative actions by cults to produce this outcome.

A number of the debunking authors use this artificial and incorrect split between resocialization and predisposing traits to create a divide between cult brainwashing theory and foundational brainwashing theory as an explanation for ideological influence in China and Korea in the mid-twentieth century. Dick Anthony attempts to show that the foundational literature really embodied two distinct theories. One, he claims, was a robotic control theory that was mystical and sensationalist. The other was a theory of totalitarian influence that was dependent for its success upon pre-existing totalitarian beliefs of the subject which the program was able to reinvoke (Anthony 1996: i). Anthony claims that even though cultic brainwashing theory is descendant from the former, it claims its legitimacy from its ties to the latter.

The problem with this distinction is that it is based upon a misreading of the foundational literature (Lifton 1989; Schein 1961). Lifton devotes chapter 5 of his book to a description of the brainwashing process. In chapter 22 he describes the social structural conditions that have to be present for this process to be effective. Anthony misunderstands this scientific distinction. He interprets it instead as evidence that Lifton’s work embodies two distinct theories: one bad and one good (Anthony and Robbins 1994). The ‘bad’ Lifton, according to Anthony, is the chapter 5 Lifton who describes a brainwashing process that may have gone on in Communist reindoctrination centres, but which, according to Anthony, has no applicability to contemporary cults. The ‘good’ Lifton, on the other hand, describes in chapter 22 a structural situation that Anthony splits off and calls a theory of thought reform. Anthony appears to like this ‘theory’ better because it does not involve anything that the cult actually does to the cult participant (Anthony and Robbins 1995). The cult merely creates a totalistic social structure that individuals with certain predisposing traits may decide that they want to be part of.

Unfortunately for Anthony, there are two problems with such splitting. One is that Lifton himself denies any such split in his theory (Lifton 1995, 1997). The second is that both  an influence process and the structural conditions conducive to that process are necessary for any theory of social influence. As Lifton demonstrates in his recent application of his theory to a Japanese terrorist cult (Lifton 1999), process cannot be split off from structure in any study of social influence.


Condemnatory Label versus Contributory Factor

The third issue has to do with whether brainwashing is meant to replace other explanatory variables or work alongside them. Bainbridge (1997) and Richardson (1993) worry about the former, complaining that brainwashing explanations are intrinsically unifactoral, and thus inferior to the multifactoral explanations preferred by modern social science. But brainwashing theory has rarely, if ever, been used scientifically as a unifactoral explanation. Lifton (1999) does not attempt to explain all the obedience generated in Aum Shinrikyo by the brainwashing mechanism. My explanation of the obedience generated by the Bruderhof relies on numerous social mechanisms of which brainwashing is only one (Zablocki 1980). The same can be said for Ofshe’s explanation of social control in Synanon (1976). Far from being unifactoral, brainwashing is merely one essential element in a larger strategy for understanding how charismatic authority is channelled into obedience.

James Thurber once wrote a fable called The Wonderful O (1957), which depicted the cultural collapse of a society that was free to express itself using twenty-five letters of the alphabet but was forbidden to use the letter O for any reason. The intellectual convolutions forced on Thurber’s imaginary society by this ‘slight’ restriction are reminiscent of the intellectual convolutions forced on the NRM scholars by their refusal to include brainwashing in their models. It is not that these scholars don’t often have considerable insight into cult dynamics, but the poor mugs are, nevertheless, constantly getting overwhelmed by events that their theories are unable to predict or explain. You always find them busy playing catch-up as they scramble to account for each new cult crisis as it develops on an ad hoc basis. The inadequacy of their models cries out ‘specification error’ in the sense that a key variable has been left out.

The Thurberian approach just does not work. We have to use the whole alphabet of social influence concepts from Asch to Zimbardo (including the dreaded B-word) to understand cultic obedience. Cults are a complex social ecology of forces involving attenuation effects (Petty 1994), conformity (Asch 1951), crowd behaviour (Coleman 1990), decision elites (Wexler 1995), deindividuation (Festinger, Pepitone, et al. 1952), extended exchange (Stark 1999), groupthink (Janis 1982), ritual (Turner 1969), sacrifice and stigma (Iannaccone 1992), situational pressures (Zimbardo and Anderson 1993), social proof (Cialdini 1993), totalism (Lifton 1989), and many others. Personally, I have never seen a cult that was held together only by brainwashing and not also by other psychological factors, as well as genuine loyalty to ideology and leadership.

Arguments that brainwashing is really a term of moral condemnation masquerading as a scientific concept have emerged as a reaction to the efforts of some anti-cultists (not social scientists) to use brainwashing as a label to condemn cults rather than as a concept to understand them. Bromley (1998) has taken the position that brainwashing is not a variable at all but merely a peremptory label of stigmatization–a trope for an ideological bias, in our individualistic culture, against people who prefer to live and work more collectivistically. Others have focused on the obverse danger of allowing brainwashing to be used as an all-purpose moral excuse (‘It wasn’t my fault. I was brainwashed!’), offering blanket absolution for people who have been cult members–freeing them from the need to take any responsibility for their actions (Bainbridge 1997; Hexham and Poewe 1997; Introvigne forthcoming; Melton forthcoming). While these allegations represent legitimate concerns about potential abuse of the concept, neither is relevant to the scientific issue. A disinterested approach will first determine whether a phenomenon exists before worrying about whether its existence is politically convenient.


Obtaining Members versus Retaining Members

The fourth issue has to do with a confusion over whether brainwashing explains how cults obtain members or how they retain them. Some cults have made use of manipulative practices like love-bombing and sleep deprivation (Galanti 1993), with some degree of success, in order to obtain new members. A discussion of these manipulative practices for obtaining members is beyond the scope of this chapter. Some of these practices superficially resemble techniques used in the earliest phase of brainwashing. But these practices, themselves, are not brainwashing. This point must be emphasized because a false attribution of brainwashing to newly obtained cult recruits, rather than to those who have already made a substantial commitment to the cult, figures prominently in the ridicule of the concept by NRM scholars. A typical straw man representation of brainwashing as a self-evidently absurd concept is as follows: ‘The new convert is held mentally captive in a state of alternate consciousness due to “trance-induction techniques” such as meditation, chanting, speaking in tongues, self-hypnosis, visualization, and controlled breathing exercises … the cultist is [thus] reduced to performing religious duties in slavish obedience to the whims of the group and its authoritarian or maniacal leader’ (Wright 1998: 98).

Foundational brainwashing theory was not concerned with such Svengalian conceits, but only with ideological influence in the service of the retaining function. Why should the foundational theorists, concerned as they were with coercive state-run institutions like prisons, ‘re-education centres,’ and prisoner-of-war camps, have any interest in explaining how participants were obtained? Participants were obtained at the point of a gun.4 The motive of these state enterprises was to retain the loyalties of these participants after intensive resocialization ceased. As George Orwell showed so well in his novel 1984, the only justification for the costly indoctrination process undergone by Winston Smith was not that Smith love Big Brother while in prison, but that Big Brother be able to retain that love after Smith was deployed back into society. Nevertheless, both ‘cult apologists’ and ‘cult bashers’ have found it more convenient to focus on the obtaining function.


If one asks why a cult would be motivated to invest resources in brainwashing, it should be clear that this cannot be to obtain recruits, since these are a dime a dozen in the first place, and, as Barker (1984) has shown, they don’t tend to stick around long enough to repay the investment. Rather, it can only be to retain loyalty, and therefore decrease surveillance costs, for valued members who are already committed. In small groups bound together only by normative solidarity, as Hechter (1987) has shown, the cost of surveillance of the individual by the group is one of the chief obstacles to success. Minimizing these surveillance costs is often the most important organizational problem such groups have to solve in order to survive and prosper. Brainwashing makes sense for a collectivity only to the extent that the resources saved through decreased surveillance costs exceed the resources invested in the brainwashing process. For this reason, only high-demand charismatic groups with totalistic social structures are ever in a position to benefit from brainwashing.5
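The cost-benefit logic of this argument can be stated compactly. The notation below is mine, not Zablocki’s or Hechter’s: let C be the resources a group invests in attempting to brainwash one committed member, p the probability that the attempt succeeds in producing a deployable agent, s the surveillance costs saved per year on a successful subject, and T that member’s expected remaining years in the group. On this reading, brainwashing pays for a collectivity only when

\[
p \cdot s \cdot T > C
\]

which also anticipates the point made below about staying power: screening out raw recruits raises T (and presumably p), tilting the inequality toward profitability.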

This mistaken ascription of brainwashing to the obtaining function rather than the retaining function is directly responsible for two of the major arguments used by the ‘cult apologists’ in their attempt to debunk brainwashing. One has to do with a misunderstanding of the role of force and the other has to do with the mistaken belief that brainwashing can be studied with data on cult membership turnover.

The widespread belief that force is necessary for brainwashing is based upon a misreading of Lifton (1989) and Schein (1961). A number of authors (Dawson 1998; Melton forthcoming; Richardson 1993) have based their arguments, in part, on the contention that the works of foundational scholarship on brainwashing are irrelevant to the study of cults because the foundational literature studied only subjects who were forcibly incarcerated. However, Lifton and Schein have both gone on public record as explicitly denying that there is anything about their theories that requires the use of physical force or threat of force. Lifton has specifically argued (‘psychological manipulation is the heart of the matter, with or without the use of physical force’ [1995: xi]) that his theories are very much applicable to cults.6 The difference between the state-run institutions that Lifton and Schein studied in the 1950s and 1960s and the cults that Lifton and others study today is in the obtaining function, not in the retaining function. In the Chinese and Korean situations, force was used for obtaining and brainwashing was used for retaining. In cults, charismatic appeal is used for obtaining and brainwashing is used, in some instances, for retaining.

A related misconception has to do with what conclusions to draw from the  very high rate of turnover among new and prospective recruits to cults. Bainbridge (1997), Barker (1989), Dawson (1998), Introvigne (forthcoming), and Richardson (1993) have correctly pointed out that in totalistic religious organizations very few prospective members go on to become long-term members. They argue that this proves that the resocialization process cannot be irresistible and therefore it cannot be brainwashing. But nothing in the brainwashing model predicts that it will be attempted with all members, let alone successfully attempted. In fact, the efficiency of brainwashing, operationalized as the expected yield of deployable agents7  per 100 members, is an unknown (but discoverable) parameter of any particular cultic system and may often be quite low. For the system to be able to perpetuate itself (Hechter 1987), the yield need only produce enough value for the system to compensate it for the resources required to maintain the brainwashing process.

Moreover, the high turnover rate in cults is more complex than it may seem. While it is true that membership turnover is very high among recruits and new members, this changes after two or three years of membership, when cultic commitment mechanisms begin to kick in. This transition from high to low membership turnover is known as the Bainbridge Shift, after the sociologist who first discovered it (Bainbridge 1997: 141-3). After about three years of membership, the annual rate of turnover sharply declines and begins to fit a commitment model rather than a random model.8
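The claim in note 8 can be restated in survival-analysis terms (the notation is my gloss, not Bainbridge’s). Let h(t) be the hazard of leaving after t years of membership. A random model implies a roughly constant hazard, so that membership decays exponentially, while a commitment model, in which the probability of leaving is inversely dependent on tenure, implies a declining hazard:

\[
h_{\mathrm{random}}(t) = \lambda, \qquad h_{\mathrm{commitment}}(t) \propto \frac{1}{t}
\]

The Bainbridge Shift is then the observation that the data fit the first form for roughly the first three years of membership and the second form thereafter.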

Membership turnover data is not the right sort of data to tell us whether a particular cult practises brainwashing. The recruitment strategy whereby many are called but few are chosen is a popular one among cults. In several groups in which I have observed  the brainwashing process, there was very high turnover among initial recruits. Brainwashing is too expensive to waste on raw recruits. Since brainwashing is a costly process, it generally will not pay for a group to even attempt to brainwash one of its members until that member has already demonstrated some degree of staying power on her own.9

Geronda Ephraim & Geronda Paisios

Psychological Traces

The fifth issue has to do with the question of whether brainwashing leaves any long-lasting measurable psychological traces in those who have experienced it. Before we can ask this question in a systematic way, we have to be clear about what sort of traces we should be looking for. There is an extensive literature on cults and mental health. But whether cult involvement causes psychological problems is a much more general question than whether participation in a traumatic resocialization process leaves any measurable psychological traces.

There has been little consensus on what sort of traces to look for. Richardson and Kilbourne (1983: 30) assume that brainwashing should lead to insanity. Lewis (1983: 30) argues that brainwashing should lead to diminished IQ scores. Nothing in brainwashing theory would lead us to predict either of these outcomes. In fact, Schein points out that ‘The essence of coercive persuasion is to produce ideological and behavioral change in a fully conscious, mentally intact individual’ (1959: 437). Why in the world would brainwashers invest scarce resources to produce insanity and stupidity in their followers? However, these aforementioned authors (and others) have taken the absence of these debilitative effects as ‘proof’ that brainwashing doesn’t happen in cults. At the same time, those who oppose cults have had an interest, driven by litigation rather than science, in making exaggerated claims for mental impairment directly resulting from brainwashing. As Farrell has pointed out, ‘From the beginning, the idea of traumatic neurosis has been accompanied by concerns about compensation’ (1998: 7).

Studies of lingering emotional, cognitive, and physiological effects on ex-members have thus far shown inconsistent results (Katchen 1997; Solomon 1981; Ungerleider and Wellisch 1983). Researchers studying current members of religious groups have found no significant impairment or disorientation. Such results have erroneously been taken as evidence that the members of these groups could, therefore, not possibly have been brainwashed. However, these same researchers found these responses of current members contaminated by elevations on the ‘Lie’ scale, exemplifying ‘an intentional attempt to make a good impression and deny faults’ (Ungerleider and Wellisch 1983: 208). On the other hand, studies of ex-members have tended to show ‘serious mental and emotional dysfunctions that have been directly caused by cultic beliefs and practices’ (Saliba 1993: 106). The sampling methods of these latter studies have been challenged (Lewis and Bromley 1987; Solomon 1981), however, because they have tended to significantly over-sample respondents with anti-cult movement ties. With ingenious logic, this has led Dawson (1998: 121) to suggest in the same breath that cult brainwashing is a myth but that ex-member impairment may be a result of brainwashing done by deprogrammers.

All this controversy is not entirely relevant to our question, however, because there is no reason to assume that a brainwashed person is going to show elevated scores on standard psychiatric distress scales. In fact, for those for whom making choices is stressful, brainwashing may offer psychological relief. Galanter’s research has demonstrated that a cult ‘acts like a psychological pincer, promoting distress while, at the same time, providing relief’ (1989: 93). As we shall see below, the brainwashing model predicts impairment and disorientation only for people during some of the intermediate stages, not at the end state. The popular association of brainwashing with zombie or robot states comes out of a misattribution of the characteristics of people going through the traumatic brainwashing process to people who have completed the process. The former really are, at times, so disoriented that they appear to resemble caricatures of zombies or robots. The glassy eyes, inability to complete sentences, and fixed eerie smiles are characteristics of disoriented people under randomly varying levels of psychological stress. The latter, however, are, if the process was successful, functioning and presentable deployable agents.


Establishing causal direction in the association between cult membership and mental health is extremely tricky, and little progress has been made thus far. In an excellent article reviewing the extensive literature in this area, Saliba (1993: 108) concludes: ‘The study of the relationship between new religious movements and mental health is in its infancy.’ Writing five years later, Dawson (1998: 122) agrees that this is still true, and argues that ‘the inconclusive results of the psychological study of members and ex-members of NRMs cannot conceivably be used to support either the case for or against brainwashing.’ Saliba calls for prospective studies that will establish baseline mental health measurements for individuals before they join cults, followed by repeated measures during and afterward. While this is methodologically sensible, it is impractical because joining a cult is both a rare and unexpected event. This makes the general question of how cults affect mental health very difficult to answer.

Fortunately, examining the specific issue of whether brainwashing leaves psychological traces may be easier. The key is recognizing that brainwashing is a traumatic process, and, therefore, those who have gone through it should experience an increasing likelihood in later years of post-traumatic stress disorder. The classic clinical symptoms of PTSD — avoidance, numbing, and increased arousal (American Psychiatric Association 1994: 427) — have been observed in many ex-cult members regardless of their mode of exit and current movement affiliations (Katchen 1997; Zablocki 1999). However, these soft and somewhat subjective symptoms should be viewed with some caution given recent controversies over the ease with which symptoms such as these can be iatrogenically implanted, as, for example, false memories (Loftus and Ketcham 1994).

In the future, avenues for more precise neurological tracking may become available. Judith Herman (1997: 238) has demonstrated convincingly that ‘traumatic exposure can produce lasting alterations in the endocrine, autonomic, and central nervous systems … and in the function and even the structure of specific areas of the brain.’ It is possible in the future that direct evidence of brainwashing may emerge from brain scanning using positron emission tomography. Some preliminary research in this area has suggested that, during flashbacks, specific areas of the brain involved with language and communication may be inactivated (Herman 1997: 240; Rauch, van der Kolk, et al. 1996). Another promising area of investigation of this sort would involve testing for what van der Kolk and McFarlane (1996) have clinically identified as ‘the black hole of trauma.’ It should be possible to determine, once measures have been validated, whether such traces appear more often in individuals who claim to have gone through brainwashing than in a sample of controls who have been non-brainwashed members of cults for equivalent periods of time.


Separating the Investigative Steps

The final issue is a procedural one. There are four sequential investigative steps required to resolve controversies like the one we have been discussing. These steps are concerned with attempt, existence, incidence, and consequence. A great deal of confusion comes from nothing more than a failure to recognize that these four steps need to be kept analytically distinct from one another.

To appreciate the importance of this point, apart from the heat of controversy, let us alter the scene for a moment and imagine that the scientific conflict we are trying to resolve is over something relatively innocuous — say, vegetarianism. Let us imagine that on one side we have a community of scholars arguing that vegetarianism is a myth, that nobody would voluntarily choose to live without eating meat and that anyone who tried would quickly succumb to an overpowering carnivorous urge. On the other side, we have another group of scholars arguing that they had actually seen vegetarians and observed their non-meat-eating behaviour over long periods of time, and that, moreover, vegetarianism is a rapidly growing social problem with many new converts each year being seduced by this enervating and debilitating diet.

It should be clear that any attempt to resolve this debate scientifically would have to proceed through the four sequential steps mentioned above. First, we would have to find out if anybody ever deliberately attempts to be a vegetarian. Maybe those observed not eating meat were simply unable to obtain it. If somebody could be found voluntarily attempting to follow a vegetarian diet, we would next have to observe him carefully enough and long enough to find out whether he succeeds in abstaining from meat. If we observe even one person successfully abstaining from meat, we would have to conclude that vegetarianism exists, increasing our confidence in the theory of the second group of researchers. But the first group could still argue, well, maybe you are right that a few eccentric people here and there do practise vegetarianism, but not enough to constitute a social phenomenon worth investigating. So, the next step would be to measure the incidence of vegetarianism in the population. Out of every million people, how many do we find following a vegetarian diet? If it turns out to be very few, we can conclude that, while vegetarianism may exist as a social oddity, it does not rise to the level of being a social phenomenon worthy of our interest. If, however, we find a sizeable number of vegetarians, we still need to ask, ‘So what?’ This is the fourth of our sequential steps. Does the practice of vegetarianism have any physical, psychological, or social consequences? If so, are these consequences worthy of our concern?

Each of these investigative steps requires attention focused on quite distinct sets of substantive evidence. For this reason, it is important that we not confuse them with one another as is so often done in ‘apologist’ writing about brainwashing, where the argument often seems to run as follows: Brainwashing doesn’t exist, or at least it shouldn’t exist, and even if it does the numbers involved are so few, and everybody in modern society gets brainwashed  to some extent, and the effects, if any, are impossible to measure. Such arguments jump around, not holding still long enough to allow for orderly and systematic confirmation or disconfirmation of each of the steps.

Once we recognize the importance of keeping the investigative steps methodologically distinct from one another, it becomes apparent that the study of brainwashing is no more problematic (although undoubtedly much more difficult) than the study of an advertising campaign for a new household detergent. It is a straightforward question to ask whether or not some charismatic groups attempt to practise radical techniques of socialization designed to turn members into deployable agents. If the answer is no, we stop because there can be no brainwashing. If the answer is yes, we go on to a second question: Are these techniques at least sometimes effective in producing uncritical obedience? If the answer to this question is yes (even for a single person), we know that brainwashing exists, although it may be so rare as to be nothing more than a sociological oddity. Therefore, we have to take a third step and ask: How frequently is it effective? What proportion of those who live in cults are subjected to brainwashing, and what proportion of these respond by becoming uncritically obedient? And, finally, we need to ask a fourth important question: How long do the effects last? Are the effects transitory, lasting only as long as the stimulus continues to be applied, or do they persist for a period of time thereafter, and, if so, how long? Let us keep in mind the importance of distinguishing attempt from existence, from incidence, from consequences.
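Because these four questions form a strict sequence with early stopping, the procedure can be sketched in a few lines of code. The sketch below is purely illustrative: the Evidence fields and the NEGLIGIBLE threshold are hypothetical names of mine, not anything proposed by Zablocki.

    from dataclasses import dataclass

    # Hypothetical threshold below which a practice counts as a social
    # oddity rather than a social phenomenon.
    NEGLIGIBLE = 1e-4

    @dataclass
    class Evidence:
        attempt_observed: bool    # step 1: attempt
        obedience_produced: bool  # step 2: existence (one person suffices)
        effective_cases: int      # step 3: incidence, numerator
        cult_population: int      # step 3: incidence, denominator
        effects_persist: bool     # step 4: consequence

    def investigate(e: Evidence) -> str:
        # Each step is settled on its own evidence before the next is taken up.
        if not e.attempt_observed:
            return "stop: nobody attempts it, so there can be no brainwashing"
        if not e.obedience_produced:
            return "stop: attempted, but never effective"
        if e.effective_cases / e.cult_population < NEGLIGIBLE:
            return "exists, but only as a sociological oddity"
        if e.effects_persist:
            return "a persistent phenomenon worthy of systematic study"
        return "effects are transitory; weigh their significance accordingly"

    print(investigate(Evidence(True, True, 500, 100_000, True)))

The point of the sketch is only that a negative answer at an early gate halts the inquiry, and that evidence bearing on one gate cannot be substituted for evidence bearing on another.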

To be continued…


NOTES

  1. When I speak of ego dystonic behaviour, I refer to behaviour that was ego dystonic to the person before joining the cult and after leaving the cult.
  2. I have no doubt that Introvigne, who is a European attorney, is sincere in his desire to stifle brainwashing research out of fear that any suggestion that brainwashing might possibly occur in cults will be seized on by semi-authoritarian government committees eager to suppress religious liberty. Personally, I applaud Introvigne’s efforts to protect the fragile tree of religious freedom of choice in the newly emerging democracies of Eastern Europe. But I don’t appreciate his doing so by (perhaps inadvertently) sticking his  thumb on the scales upon which social scientists attempt to weigh evidence.
  3. The Anthony and Robbins article cited demonstrates how little we really know about traits that may predispose people to join cults. They say ‘…some traditionally conservative religious groups attract people who score highly on various measures of totalitarianism, e.g., the F scale or Rokeach’s Dogmatism scale… It seems likely that these results upon certain Christian groups would generalize to alternative religious movements or cults, as many of them have theological and social beliefs that seem similar to those in some fundamentalist denominations’ (1994: 470). Perhaps, but perhaps not. No consensus has yet emerged from numerous attempts to find a cult personality type, but this seems like a promising area of research to continue.
  4. Some, it is true, were nominally volunteers into re-education programs. However, the power of the state to make their lives miserable if they did not volunteer cannot be ignored.
  5. Unfortunately, however, uncritical obedience can be wayward and dangerous. It can be useful to a cult leader when the cult is functioning well. But it often has been perverted to serve a destructive or self-destructive agenda in cults that have begun to disintegrate.
  6. Some confusion on this subject has emerged from the fact that Lifton has distanced himself from those attempting to litigate against cults because of alleged brainwashing. He has constantly argued (and I wholeheartedly agree) that brainwashing, in and of itself, where no force is involved, should not be a matter for the law courts.
  7. Formal definitions for this and other technical terms will be presented in the next section of this chapter.
  8. In other words, the probability of a person’s leaving is inversely dependent upon the amount of time he or she has already spent as a member.
  9. The ‘cult-basher’ version of brainwashing theory has played into this misunderstanding by confounding manipulative recruitment techniques (like sleep deprivation and ‘love-bombing’) with actual brainwashing. While there may be some overlap in the actual techniques used, the former is a method for obtaining new members, whereas brainwashing is a method for retaining old members.

“Cuckoo’s Nest”: Grigoriou Monastery on the Holy Mountain (Vasos Vasileiou, 2010)

NOTE: The following article is taken from the Cypriot newspaper “Phileleftheros” (Ο Φιλελεύθερος), December 18, 2010, p. 23. The article contains the accusations of a hieromonk who was ousted after 22 years and who alleges that the monastery controlled monks through the administration of psychiatric drugs. http://www.zougla.gr/page.ashx?pid=80&aid=227195&cid=122

Fr. Christodoulos

The monk, who “was expelled” from the Grigoriou monastery1 on Mount Athos 22 years after his admittance, denounces methods reminiscent of the movie One Flew Over the Cuckoo’s Nest.2 According to his allegations, the monks were controlled through the administration of psychiatric drugs. The complaints come from Father Christodoulos,3 who also produced a movie clip which shows him tied with leg padlocks to a bed in a room of the Thessaloniki Hospital, where he was brought for “treatment.”

Fr. Christodoulos maintains he was of sound mind. He cites the opinion of the Cypriot psychiatrist Yiangos Mikelidis,4 who states that he examined Father Christodoulos and that “he is not suffering from any serious mental illness and has no need of treatment.”

Word of the monk’s so-called mental illness reached, he says, as far as the Prefect of Thessaloniki, whose testimony was invoked to register a complaint against the Abbot of Grigoriou5 Monastery for slanderous libel. Furthermore, he accuses the monastery’s administration of not returning money that he had secured from the sale of his own real estate. Father Christodoulos was not the “typical” type of monk: he sought the Abbot’s resignation and went on hunger strike twice, yet, though not especially obedient, he remained an administrator of the monastery.


When they gave him a certificate of discharge and he refused to leave the monastery, the monastery’s administration called in policemen from Karyes, who escorted him off the Holy Mountain. The monk’s belongings, 47 boxes of various personal items, were transferred to Karyes but were not delivered to him upon his expulsion.

“They tried to make me crazy”

Fr. Christodoulos (Nicholas Diamantopoulos in the world) spoke to “Φ” about everything he claims happened in Grigoriou Monastery:6

“I joined the Grigoriou Monastery in 1987 at age 30. In 2003 I went on a hunger strike demanding the resignation of the Abbot, because he could no longer fully exercise his duties. The Abbot gave me a handwritten letter in which he resigned and asked me to pass it to the assembly of elders (a copy was given to “Φ”).

Geronda George Kapsanis, former Abbot of Grigoriou Monastery (d. June 2014)

“I raised the issue of resignation before the assembly of elders (composed of seven monks) but was told they did not accept it. I returned to the Abbot and asked to be heard by the whole fraternity, consisting of about 70 monks. I presented my position before them, and they thereupon prepared a document calling on the Public Prosecutor of Thessaloniki to lock me in a mental hospital. With the mobilization of the police, they led me to the mental hospital. The psychiatrist happened to be a former fellow student of the monastery’s doctor, the one who had sent me to the psychiatric hospital. I mentioned to the psychiatrist that I had differences with the monastery’s administration and explained that the administration wanted to use him to make me out to be crazy.

“I called my brother from a phone booth and explained that they wanted to declare me insane. Until he came, they kept me tied to a bed with the help of a security guard, using straps and a padlock. When my two brothers arrived and asked what had happened, no one paid them any attention. When they saw me tied up, they filmed a clip with a camera and warned those responsible at the hospital that it would be made public if they continued to keep me bound. My brothers said they would take me to another psychiatrist, one not influenced by the monastery. It took three days of contacts and interventions before I was allowed to leave.

Monk Christodoulos strapped with padlocks to a bed in Thessaloniki Hospital.


“I went to another psychiatrist who, after a month of visits, concluded that I suffer from a mixed personality disorder, which has nothing to do with mental illness or any other serious illness. The doctor told me that I could return to the monastery with no problem. I returned to the monastery, and they accepted me. (Last May I went to the psychiatrist Yiangos Mikelidis, who, after examining me, opined in writing that I do not suffer from any serious mental illness and have no need of treatment. A copy of the opinion was made available to “Φ”.)

“In 2004 a priest-monk threatened me, saying they would expel me from the monastery. I started a hunger strike and sought the Abbot’s resignation. An assembly was convened, minus the Abbot, who was then away from Mount Athos, and I was told that either they would deport me from Mount Athos or I would go to a psychiatrist in Patras.

“I told them that I would accept going to a psychiatrist. I went off to my hometown in the Peloponnese without seeing one. When I returned a week later, the Abbot said nothing to me, nor did he ask what had happened with the psychiatrist. This means that two powers coexist in the monastery: on the one hand the Abbot, and on the other an assembly of elders that insists on making me out to be a mental patient.

“The assembly of elders has a problem because, when I go out with permission, I travel abroad instead of only within Greece. For this “indiscipline” I received small chastisements, such as being barred from chanting, etc.

“In 2006 they changed the exit certificate and restricted my travels to Greece only. On one occasion the Abbot obliged me to give him 500 prayer ropes, which I had made, before enabling me to go on a pilgrimage to the Patriarchate. Since then, when I go out with permission, I travel abroad without the blessing of the Abbot.

Apolytirion (certificate of discharge)

“This year in March I went to the Abbot and asked him to convene the fraternity and invite anyone who had something against me to say it before all. He threatened me with a curse (that he would curse me) because I was asking for things beyond obedience. When he threatened me with a curse, I wrote a curse of my own. I noted that if I am right, then the curse is to fall upon the head of the Abbot; if not, then it would fall on mine.7 After that, I came to Cyprus, where I spent Pascha, and when I returned I was called to the synaxis and asked for an explanation of my behavior.

“I told them that I cannot respect them to the depth they want, since in 2003 they had tried to make me out to be crazy.

“Afterwards, they gave me a certificate for insult and contempt towards the Abbot, but I returned it because it did not have his signature.

“They insisted that I leave. I didn’t, so they brought in the police, who escorted me to Karyes.

“The Abbot told the Prefect of Thessaloniki, Mr. Psomiadis, that I am a mental patient. I then registered a lawsuit against the Abbot for slander, which is pending before the court.8

Fr. Christodoulos on Mount Athos

POST SCRIPT:

To this day, Fr. Christodoulos still speaks out and references the injustices he suffered while living as a monk at Grigoriou Monastery. Here is a recent example, dated January 15, 2016:

“Many who know the details of my monastic life urge me to write an autobiography. If I decide to do such a “crazy thing”, the dead will roll in their graves, as well as the bones of those who are alive—the guileful, treacherous rassaphore monks of Grigoriou Monastery, Mt. Athos, who, through plots and intrigues that even the Italian Camorra would envy, continually tried to shut my mouth, slander me, humiliate me, and ridicule me, with methods that reach the limits of a murder attempt against me.”

“I have evidence and documents stored electronically that would overturn the thrones of Churches (and not just sovereigns) if I were to publish them!!!”


NOTES

  1. Grigoriou Monastery (Greek: Γρηγορίου) is situated on the southwest side of the Athos Peninsula in northern Greece, between the monasteries of Dionysiou and Simonopetra. Grigoriou was originally dedicated to St. Nicholas but was later renamed in honor of its founder, Gregory. It is ranked seventeenth in the hierarchical order of the twenty monasteries located on the Mount Athos peninsula, and is reputed to be one of the most well-organized and strict coenobitic monasteries on Mount Athos.
  2. One Flew Over the Cuckoo’s Nest is a 1975 American drama film directed by Miloš Forman, based on the 1962 novel One Flew Over the Cuckoo’s Nest by Ken Kesey. Now considered to be one of the greatest films ever made, One Flew Over the Cuckoo’s Nest is No. 33 on the American Film Institute’s 100 Years… 100 Movies list. In 1993, it was deemed “culturally, historically, or aesthetically significant” by the United States Library of Congress and selected for preservation in the National Film Registry.
  3. A blog exists under the name of Χριστόδουλος Μοναχός Γρηγοριάτης (Monk Christodoulos Grigoriatis), though there is no validation that the Monk Christodoulos actually wrote the posts contained therein, especially since he continues to speak out about his experiences at Grigoriou Monastery to this day. A month after Abbot George’s resignation, the following retraction was posted on this blog: “My esteemed Geronda, beloved fathers and brothers, please consider everything I posted on this blog as invalid. I recall all of my posts and have deleted them! I seek forgiveness from all of you, hoping that I will obtain favorable treatment. Pray for my salvation, as I do for yours! My Metanoia [repentance or prostration] to all of you! Monk Christodoulos.” http://monaxoschristodoulos.blogspot.com/2014/03/blog-post.html
  4. Yiangos passed away in August 2014 at the age of 68. http://www.attacktv.gr/news/Pages/view.aspx?nID=28222
  5. Archimandrite George Kapsanis resigned from his abbacy in February 2014 for reasons unknown. He died on the day of Pentecost later that year (June 8, 2014).
  6. In April 2014, a blog existing under the name of “Monk Christodoulos Grigoriatis” posted “My Second Sorry to Grigoriou Monastery.” This “Epistle of Repentance to Geronda George Kapsanis and the Holy Monastery of Grigoriou, Mount Athos” sounds more like a PR campaign contrived by the monastery. To this day, when talking about his experiences at Grigoriou Monastery, Christodoulos speaks quite differently from the content of this epistle. Here is the epistle in its entirety:


My esteemed Geronda,


Since April 2010, I have written and published on the internet, or voiced through the mass media (television, radio, newspapers) that gave me a platform, unfounded, obscene, and other charges against you and against the brothers of the Monastery.

I recognize fully that both you and the brothers of the monastery are persons above reproach in every respect and that my accusations were untrue. Now I am fully aware of the truth and repent for what I did. I confess that I caused you great grief and psychic pain, and also scandalized many people who did not know the ethos of Grigoriou Monastery. I publicly apologize for this, both to you and the brothers of the monastery, and to the people I scandalized.

As a minimum indication of my practical repentance, I have already deleted the website I maintained with the unjust and false accusations I addressed to you and the brothers of the monastery, and I have posted two letters of apology online (this one and the preceding one that I sent).

I hope that in this way I can restore, albeit slightly, the harm I caused you.

Because I am a monk and I look forward to my salvation, I make my metanoia [repentance or prostration] and ask for your blessing.

I wish you a good and blessed Pascha in the love of the Lord!

My repentance towards my former monastery: Fr. George Kapsanis, Elder Fr. Christopher, the Fathers of the Holy Assembly, and all the fathers of the monastery. Evlogeite [your blessing]!!

The signature that follows is genuinely mine.

Monk Christodoulos

http://monaxoschristodoulos.blogspot.com/2014/04/blog-post_1.html

  7. Geronda Ephraim teaches that cursing clergymen never works and that the curse always falls back on the curser seven-fold. However, a curse by a clergyman always sticks, due to the grace of ordination. In this case, both participants are ordained priests; thus, the curse of whichever hieromonk is in the right would have stuck.
  8. There does not seem to be any information about these proceedings available on the web.


The Confidence Game: What Con Artists Reveal About the Psychology of Trust and Why Even the Most Rational of Us Are Susceptible to Deception

NOTE: The following article was written by Maria Popova and was taken from https://www.brainpickings.org/2016/01/12/the-confidence-game-maria-konnikova/

“It’s the oldest story ever told. The story of belief — of the basic, irresistible, universal human need to believe in something that gives life meaning, something that reaffirms our view of ourselves, the world, and our place in it.”


“Reality is what we take to be true,” physicist David Bohm observed in a 1977 lecture. “What we take to be true is what we believe… What we believe determines what we take to be true.” That’s why nothing is more reality-warping than the shock of having come to believe something untrue — an experience so disorienting yet so universal that it doesn’t spare even the most intelligent and self-aware of us, for it springs from the most elemental tendencies of human psychology. “The confidence people have in their beliefs is not a measure of the quality of evidence,” Nobel-winning psychologist Daniel Kahneman asserted in examining how our minds mislead us, “but of the coherence of the story that the mind has managed to construct.”

The machinery of that construction is what New Yorker columnist and science writer extraordinaire Maria Konnikova explores in The Confidence Game: Why We Fall for It … Every Time (public library) — a thrilling psychological detective story investigating how con artists, the supreme masterminds of malevolent reality-manipulation, prey on our propensity for believing what we wish were true and how this illuminates the inner workings of trust and deception in our everyday lives.

Art by Edward Gorey for a special edition of the Brothers Grimm fairy tales.

“Try not to get overly attached to a hypothesis just because it’s yours,” Carl Sagan urged in his excellent Baloney Detection Kit — and yet our tendency is to do just that, becoming increasingly attached to what we’ve come to believe because the belief has sprung from our own glorious, brilliant, fool-proof minds. Through a tapestry of riveting real-life con artist profiles interwoven with decades of psychology experiments, Konnikova demonstrates that a con artist simply takes advantage of this hubris by finding the beliefs in which we are most confident — those we’re least likely to question — and enlisting them in advancing his or her agenda.

To be sure, we all perform micro-cons on a daily basis. White lies are the ink of the social contract — the insincere compliment to a friend who needs a confidence boost, the unaddressed email that “somehow went to spam,” the affinity fib that gives you common ground with a stranger at a party even though you aren’t really a “huge Leonard Cohen fan too.”

We even con ourselves. Every act of falling in love requires a necessary self-con — as Adam Phillips has written in his terrific piece on the paradox of romance, “the person you fall in love with really is the man or woman of your dreams”; we dream the lover up, we construct a fantasy of who she is based on the paltry morsels of information seeded by early impressions, we fall for that fantasy and then, as we immerse ourselves in a real relationship with a real person, we must convince ourselves that the reality corresponds to enough of the fantasy to feel satisfying.

But what sets the con artist apart from the mundane white-liar is the nefarious intent and the deliberate deftness with which he or she goes about executing that reality-manipulation.

Konnikova begins with the story of a lifelong impostor named Ferdinand Waldo Demara, who successfully passed himself off as a psychologist, a professor, a monk, a surgeon, a prison warden, the founder of a religious college, and even his own biographer.

Ferdinand Waldo Demara (Photograph: Corbis)

Considering the perplexity of his astonishing ability to deceive, Konnikova — whose previous book examined the positive counterpart to the con, the psychology of thinking like Sherlock Holmes — writes:

“How was he so effective? Was it that he preyed on particularly soft, credulous targets? I’m not sure the Texas prison system, one of the toughest in the United States, could be described as such. Was it that he presented an especially compelling, trustworthy figure? Not likely, at six foot one and over 250 pounds, square linebacker’s jaw framed by small eyes that seemed to sit on the border between amusement and chicanery, an expression that made [his] four-year-old daughter Sarah cry and shrink in fear the first time she ever saw it. Or was it something else, something deeper and more fundamental — something that says more about ourselves and how we see the world?

It’s the oldest story ever told. The story of belief — of the basic, irresistible, universal human need to believe in something that gives life meaning, something that reaffirms our view of ourselves, the world, and our place in it… For our minds are built for stories. We crave them, and, when there aren’t ready ones available, we create them. Stories about our origins. Our purpose. The reasons the world is the way it is. Human beings don’t like to exist in a state of uncertainty or ambiguity. When something doesn’t make sense, we want to supply the missing link. When we don’t understand what or why or how something happened, we want to find the explanation. A confidence artist is only too happy to comply — and the well-crafted narrative is his absolute forte.”

Art by Lisbeth Zwerger for a special edition of Alice’s Adventures in Wonderland.

Konnikova describes the basic elements of the con and the psychological susceptibility into which each of them plays:

“The confidence game starts with basic human psychology. From the artist’s perspective, it’s a question of identifying the victim (the put-up): who is he, what does he want, and how can I play on that desire to achieve what I want? It requires the creation of empathy and rapport (the play): an emotional foundation must be laid before any scheme is proposed, any game set in motion. Only then does it move to logic and persuasion (the rope): the scheme (the tale), the evidence and the way it will work to your benefit (the convincer), the show of actual profits. And like a fly caught in a spider’s web, the more we struggle, the less able to extricate ourselves we become (the breakdown). By the time things begin to look dicey, we tend to be so invested, emotionally and often physically, that we do most of the persuasion ourselves. We may even choose to up our involvement ourselves, even as things turn south (the send), so that by the time we’re completely fleeced (the touch), we don’t quite know what hit us. The con artist may not even need to convince us to stay quiet (the blow-off and fix); we are more likely than not to do so ourselves. We are, after all, the best deceivers of our own minds. At each step of the game, con artists draw from a seemingly endless toolbox of ways to manipulate our belief. And as we become more committed, with every step we give them more psychological material to work with.”

What makes the book especially pleasurable is that Konnikova’s intellectual rigor comes with a side of warm wit. She writes:

“Religion,” Voltaire is said to have remarked, “began when the first scoundrel met the first fool.” It certainly sounds like something he would have said. Voltaire was no fan of the religious establishment. But versions of the exact same words have been attributed to Mark Twain, to Carl Sagan, to Geoffrey Chaucer. It seems so accurate that someone, somewhere, sometime, must certainly have said it.

The invocation of Mark Twain is especially apt — one of America’s first great national celebrities, he was the recipient of some outrageous con attempts. That, in fact, is one of Konnikova’s most disquieting yet strangely assuring points — that although our technologies of deception have changed, the technologies of thought undergirding the art of the con are perennially bound to our basic humanity. She writes:

“The con is the oldest game there is. But it’s also one that is remarkably well suited to the modern age. If anything, the whirlwind advance of technology heralds a new golden age of the grift. Cons thrive in times of transition and fast change, when new things are happening and old ways of looking at the world no longer suffice. That’s why they flourished during the gold rush and spread with manic fury in the days of westward expansion. That’s why they thrive during revolutions, wars, and political upheavals. Transition is the confidence game’s great ally, because transition breeds uncertainty. There’s nothing a con artist likes better than exploiting the sense of unease we feel when it appears that the world as we know it is about to change. We may cling cautiously to the past, but we also find ourselves open to things that are new and not quite expected.

[…]

No amount of technological sophistication or growing scientific knowledge or other markers we like to point to as signs of societal progress will — or can — make cons any less likely. The same schemes that were playing out in the big stores of the Wild West are now being run via your in-box; the same demands that were being made over the wire are hitting your cell phone. A text from a family member. A frantic call from the hospital. A Facebook message from a cousin who seems to have been stranded in a foreign country.

[…]

Technology doesn’t make us more worldly or knowledgeable. It doesn’t protect us. It’s just a change of venue for the same old principles of confidence. What are you confident in? The con artist will find those things where your belief is unshakeable and will build on that foundation to subtly change the world around you. But you will be so confident in the starting point that you won’t even notice what’s happened.”

Art by Maurice Sendak for The Green Book by Robert Graves.

In a sense, the con is a more extreme and elaborate version of the principles of persuasion that Blaise Pascal outlined three and a half centuries ago; it is ultimately an art not of coercion but of complicity. Konnikova writes:

“The confidence game — the con — is an exercise in soft skills. Trust, sympathy, persuasion. The true con artist doesn’t force us to do anything; he makes us complicit in our own undoing. He doesn’t steal. We give. He doesn’t have to threaten us. We supply the story ourselves. We believe because we want to, not because anyone made us. And so we offer up whatever they want — money, reputation, trust, fame, legitimacy, support — and we don’t realize what is happening until it is too late. Our need to believe, to embrace things that explain our world, is as pervasive as it is strong. Given the right cues, we’re willing to go along with just about anything and put our confidence in just about anyone.”

So what makes you more susceptible to the confidence game? Not necessarily what you might expect:

“When it comes to predicting who will fall, personality generalities tend to go out the window. Instead, one of the factors that emerges is circumstance: it’s not who you are, but where you happen to be at this particular moment in your life.”

People whose willpower and emotional resilience are strained (the lonely, the financially downtrodden, those dealing with the trauma of divorce, injury, or job loss, those undergoing major life changes) are particularly vulnerable. But these, Konnikova reminds us, are states rather than character qualities, circumstances that might and likely will befall each one of us at different points in life for reasons largely outside our control. (One is reminded of philosopher Martha Nussbaum’s excellent work on agency and victimhood: “The victim shows us something about our own lives: we see that we too are vulnerable to misfortune, that we are not any different from the people whose fate we are watching…”) Konnikova writes:

“The more you look, the more you realize that, even with certain markers, like life changes, and certain tendencies in tow, a reliably stable overarching victim profile is simply not there. Marks vary as much as, and perhaps even more than, the grifters who fool them.”

Therein lies the book’s most sobering point — Konnikova demonstrates over and over again, through historical anecdotes and decades of studies, that no one is immune to the art of the con. And yet there is something wonderfully optimistic in this. Konnikova writes:

“The simple truth is that most people aren’t out to get you. We are so bad at spotting deception because it’s better for us to be more trusting. Trust, and not adeptness at spotting deception, is the more evolutionarily beneficial path. People are trusting by nature. We have to be. As infants, we need to trust that the big person holding us will take care of our needs and desires until we’re old enough to do it ourselves. And we never quite let go of that expectation.”

Trust, it turns out, is advantageous in the grand scheme of things. Konnikova cites a number of studies indicating that people who score higher on generalized trust tend to be healthier physically, more psychoemotionally content, likelier to be entrepreneurs, and likelier to volunteer. (The most generous woman I know, who is also a tremendously successful self-made entrepreneur, once reflected: “I’ve never once regretted being generous, I’ve only ever regretted holding back generosity.”) But the greater risk-tolerance necessary for reaping greater rewards also comes with the inevitable downside of greater potential for exploitation — the most trusting among us are also the perfect marks for the player of the confidence game.

Art by Maurice Sendak for The Green Book by Robert Graves.

But the paradox of trust, Konnikova argues, is only part of our susceptibility to being conned. Another major factor is our sheer human solipsism. She explains:

“We are our own prototype of being, of motivation, of behavior. People, however, are far from being a homogeneous mass. And so, when we depart from our own perspective, as we inevitably must, we often make errors, sometimes significant ones. [Psychologists call this] “egocentric anchoring”: we are our own point of departure. We assume that others know what we know, believe what we believe, and like what we like.”

She cites an extensive study, the results of which were published in a paper cleverly titled “How to Seem Telepathic.” (One ought to appreciate the scientists’ wry sarcasm in poking fun at our clickbait culture.) Konnikova writes:

“Many of our errors, the researchers found, stem from a basic mismatch between how we analyze ourselves and how we analyze others. When it comes to ourselves, we employ a fine-grained, highly contextualized level of detail. When we think about others, however, we operate at a much higher, more generalized and abstract level. For instance, when answering the same question about ourselves or others — how attractive are you? — we use very different cues. For our own appearance, we think about how our hair is looking that morning, whether we got enough sleep, how well that shirt matches our complexion. For that of others, we form a surface judgment based on overall gist. So, there are two mismatches: we aren’t quite sure how others are seeing us, and we are incorrectly judging how they see themselves.”

Art by Maurice Sendak for a special edition of the Brothers Grimm fairy tales.

The skilled con artist, Konnikova points out, compensates for this mismatch by making an active effort to discern which cues the other person is using to form judgments and which don’t register at all. The result is a practical, non-paranormal exercise in mind-reading, which creates an illusion of greater affinity, which in turn becomes the foundation of greater trust: we tend to trust those similar to us more than the dissimilar, for we intuit that the habits and preferences we have in common stem from shared values.

And yet, once again, we are reminded that the tricks of the con artist’s exploitive game are different only by degree rather than kind from the everyday micro-deceptions of which our social fabric is woven. Konnikova writes:

“Both similarity and familiarity can be faked, as the con artist can easily tell you — and the more you can fake it, the more real information will be forthcoming. Similarity is easy enough. When we like someone or feel an affinity for them, we tend to mimic their behavior, facial expressions, and gestures, a phenomenon known as the chameleon effect. But the effect works the other way, too. If we mimic someone else, they will feel closer and more similar to us; we can fake the natural liking process quite well. We perpetuate minor cons every day, often without realizing it, and sometimes knowing what we do all too well, when we mirror back someone’s words or interests, feign a shared affinity for a sports team or a mutual hatred of a brand. The signs that usually serve us reliably can easily be massaged, especially in the short term — all a good con artist needs.”

In the remainder of the thoroughly fascinating The Confidence Game, Konnikova goes on to explore the role of storytelling in reality-manipulation, what various psychological models reveal about the art of persuasion, and how the two dramatically different systems that govern our perception of reality — emotion and the intellect — conspire in the machinery of trust. Complement it with Adrienne Rich on lying and what “truth” really means, David deSteno on the psychology of trust in work and love, and Alice Walker on what her father taught her about the love-expanding capacity of truth-telling.


Biderman’s Chart of Coercion

NOTE: This article is based on the writings of Albert D. Biderman, a sociologist who worked for the USAF in the 1950s. Biderman showed how Chinese and Korean interrogators relied on techniques such as sleep deprivation, darkness or bright light, insults, threats, and exposure, far more than on physical force, to break prisoners. A link to the entire pdf can be found at the end of the article.


“Most people who brainwash…use methods similar to those of prison guards who recognize that physical control is never easily accomplished without the cooperation of the prisoner. The most effective way to gain that cooperation is through subversive manipulation of the mind and feelings of the victim, who then becomes a psychological, as well as a physical, prisoner.” – from an Amnesty International publication, “Report on Torture,” which depicts the brainwashing of prisoners of war.


Isolation

  • Deprives individual of social support, effectively rendering him unable to resist
  • Makes individual dependent upon interrogator
  • Develops an intense concern with self

Once a person is away from longstanding emotional support, and thus from reality checks, it is fairly easy to set the stage for brainwashing. Spiritually abusive groups work to isolate individuals from friends and family, whether directly, by requiring the individuals to forsake friends and family for the sake of the “Kingdom” (group membership), or indirectly, by preaching the necessity to demonstrate one’s love for God by “hating” one’s father, mother, family, and friends.

Abusive groups are not outward-looking but inward-looking, insisting that members find all comfort, support, and a replacement family within the group. With recruits cut off from friends, relatives, and previous relationships, abusive groups surround them and hammer rigid ideologies into their consciousness, saturating their senses with the specific doctrines and requirements of the group.

Isolated from everyone but those within the group, recruits become dependent upon group members and leaders and find it difficult if not impossible to resist group teachings. They become self-interested and hyper-vigilant, deeply fearful of incurring the disapproval of the group, which now offers the only support available to them.

Monks and nuns from the various monasteries under Geronda Ephraim during St. Anthony Monastery’s Feast Day (ca. 2006)

Warning signs:
The seed of extremism exists wherever a group demands all the free time of a member, insisting he be in church every time the doors are open and calling him to account if he isn’t; is critical or disapproving of involvements with friends and family outside the group; encourages secrecy by asking that members not share what they have seen or heard in meetings or about church affairs with outsiders; is openly, publicly, and repeatedly critical of other churches or groups (especially if the group claims to be the only one which speaks for God); is critical when members attend conferences, workshops, or services at other churches; checks up on members in any way, i.e., to determine that the reason they gave for missing a meeting was valid; or makes attendance at all church functions mandatory for participating in church ministry or enjoying other benefits of church fellowship.

Once a member stops interacting openly with others, the group’s influence is all that matters. He is bombarded with group values and information, and there is no one outside the group with whom to share thoughts or who will offer reinforcement or affirmation if the member disagrees with or doubts the values of the group. The process of isolation and the self-doubt it creates allow the group and its leaders to gain power over the members. Leaders may criticize major and minor flaws of members, sometimes publicly, or remind them of present or past sins. They may call members names, insult them or ignore them, or practice a combination of ignoring members at some times and receiving them warmly at others, thus maintaining a position of power (i.e., the leaders call the shots).

The sense of humiliation makes members feel they deserve the poor treatment they are receiving and may cause them to submit to any and all indignities out of gratitude that someone as unworthy as they feel themselves to be is allowed to participate in the group at all. When leaders treat members well occasionally, members accept any and all crumbs gratefully. Eventually, awareness of how dependent they are on the group and gratitude for the smallest attention contribute to an increasing sense of shame and degradation on the part of the members, who begin to abuse themselves with “litanies of self-blame,” i.e., “No matter what they do to me, I deserve it, as sinful and wretched as I am. I deserve no better. I have no rights but to go to hell. I should be grateful for everything I receive, even punishment.”

St. Anthony’s Monastery Feast Day (early to mid-2000s)
In the monasteries it is taught that the ideal way for someone to practice Orthodoxy is through blind obedience to a Geronda (or Gerondissa).

Monopolization of Perception

  • Fixes attention upon immediate predicament; fosters introspection
  • Eliminates stimuli competing with those controlled by captor
  • Frustrates all actions not consistent with compliance

Abusive groups insist on compliance with trivial demands related to all facets of life: food, clothing, money, household arrangements, children, conversation. They monitor members’ appearance and criticize their language and childcare practices. They insist on precise schedules and routines, which may change and be contradictory from day to day or moment to moment, depending on the whims of group leaders.

At first, new members may think these expectations are unreasonable and may dispute them, but later, either because they want to be at peace, or because they are afraid, or because everyone else is complying, they attempt to comply. After all, what real difference does it make if a member is not allowed to wear a certain color, or to wear his hair in a certain way, to eat certain foods, or say certain words, to go certain places, watch certain things, or associate with certain individuals? In the overall scheme of things, does it really matter? In fact, the member begins to reason, in the long run it is probably good to learn these disciplines, and after all, as they have frequently been reminded, they are to submit to spiritual authority as unto the Lord. Soon it becomes apparent that the demands will be unending, and increasing time and energy are focused on avoiding group disapproval by not doing something “wrong.” There is a feeling of walking on eggshells. Everything becomes important in terms of how the group or its leaders will respond, and members’ desires, feelings, and ideas become insignificant. Eventually, members may no longer even know what they want, feel, or think. The group has so monopolized the members’ perceptions with trivial demands that members lose perspective on the enormity of the situation they are in.

The leaders may also persuade the members that they have the inside track with God and therefore know how everything should be done. When their behavior results in disastrous consequences, as it often does, the members are blamed. Sometimes the leaders may have moments, especially after abusive episodes, when they appear to humble themselves and confess their faults, and the contrast of these moments of vulnerability with their usual pose of being all-powerful endears them to members and gives hope for some open communication.

Threats sometimes accompany all of these methods. Members are told they will be under God’s judgment, under a curse, punished, chastised, or chastened if they leave the group or disobey group leaders. Sometimes the leaders themselves punish the members, so members can never be sure when leaders will make good on the threats which they say are God’s idea. The members begin to focus on what they can do to meet any and all group demands and how to preserve peace in the short run. Abusive groups may remove children from their parents, control all the money in the group, arrange marriages, or destroy or hide personal items of members.


Warning signs:
Preoccupation with trivial demands of daily life: demanding strict compliance with standards of appearance, dress codes, what foods are or are not to be eaten and when, and schedules; threats of God’s wrath if group rules are not obeyed; a feeling of being monitored, watched constantly by those in the group or by leaders. In other words, what the church wants, believes, and thinks its members should do becomes everything, and you feel preoccupied with making sure you are meeting the standards. It no longer matters whether you agree that the standards are correct, only that you follow them and thus keep the peace and stay in the good graces of leaders.

The monks of Holy Archangels Monastery (TX).

Induced Debility and Exhaustion

People subjected to this type of spiritual abuse become worn out by tension, fear, and continual rushing about in an effort to meet group standards. They must often avoid displays of fear, sorrow, or rage, since these may result in ridicule or punishment. Rigid ministry demands and requirements that members attend an unreasonable number of meetings and events worsen the exhaustion and further weaken the ability to resist group pressure.

The Gerondia (Head) Table at St. Nektarios Monastery (NY)

Warning Signs:
Feelings of being overwhelmed by demands, of being close to tears, or of guilt if one says no to a request or goes against church standards. Being intimidated or pressured into volunteering for church duties and subjected to scorn or ridicule when one does not “volunteer.” Being rebuked or reproved when family or work responsibilities intrude on church responsibilities.

St. Nektarios Brotherhood at The Russian Synodal Building, NY 2010

Occasional Indulgences

  • Provides motivation for compliance

Leaders of abusive groups often sense when members are making plans to leave and may suddenly offer some kind of indulgence, perhaps just love or affection, attention where there was none before, a note or a gesture of concern. Hope that the situation in the church will change, or self-doubt (“Maybe I’m just imagining it’s this bad”), then replaces fear or despair, and the members decide to stay a while longer. Other groups practice sporadic demonstrations of compassion or affection right in the middle of desperate conflict or abusive episodes. This keeps members off guard and doubting their own perceptions of what is happening.

Some of the brainwashing techniques described are extreme; some groups may use them in a disciplined, regular manner, while others use them more sporadically. But even mild, occasional use of these techniques is effective in gaining power.


Warning Signs:
Be concerned if you have had an ongoing desire to leave a church or group you believe may be abusive, but find yourself repeatedly drawn back in just at the moment you are ready to leave, by a call, a comment, or a moment of compassion. These moments, infrequent as they may be, are enough to keep hope in change alive, and thus you sacrifice years and years to an abusive group.

Feast Day of St. Thekla, 2013, Canada.

Devaluing the Individual

  • Creates fear of freedom and dependence upon captors
  • Creates feelings of helplessness
  • Develops lack of faith in individual capabilities

Abusive leaders are frequently uncannily able to pick out traits church members are proud of and to use those very traits against the members. Those with natural gifts in music may be told they are proud, puffed up, or “anxious to be up front” if they want to use their talents, and be denied the opportunity. Those with discernment are called judgmental or critical, the merciful are said to lack holiness or good judgment, and the peacemakers are reminded that the Lord came to bring a sword, not peace. Sometimes efforts are made to convince members that they really are not gifted teachers or musically talented or prophetically inclined, as they had believed. When members begin to doubt the one or two special gifts they possess, which they had always been sure were God-given, they begin to doubt everything else they have ever believed about themselves, to feel dependent upon church leaders, and to be afraid to leave the group. (“If I’ve been wrong about even *that*, how can I ever trust myself to make right decisions again?”)

There are 21 nuns residing at Life-Giving Spring Monastery.

Warning Signs:
Unwillingness to allow members to use their gifts. Establishing rigid boot camp-like requirements for the sake of proving commitment to the group before gifts may be exercised. Repeatedly criticizing natural giftedness by reminding members they must die to their natural gifts, that Paul, after all, said, “When I’m weak, I’m strong,” and that they should expect God to use them in areas other than their areas of giftedness. Emphasizing helps or service to the group as a prerequisite to church ministry. This might take the form of requiring that anyone wanting to serve in any way first have the responsibility of cleaning toilets or cleaning the church for a specified time, that anyone wanting to sing in the worship band must first sing to the children in Sunday School, or that before exercising any gifts at all, members must demonstrate loyalty to the group by faithful attendance at all functions and such things as tithing. No consideration is given to the length of time a new member has been a Christian or to his age or station in life or his unique talents or abilities. The rules apply to everyone alike. This has the effect of reducing everyone to some kind of lowest common denominator where no one’s gifts or natural abilities are valued or appreciated, where the individual is not cherished for the unique blessing he or she is to the body of Christ, where what is most highly valued is service, obedience, submission to authority, and performance without regard to gifts or abilities or, for that matter, individual limitations.

Bishop Joseph at St. John the Forerunner Monastery

Biderman Chart

Psychopaths in Sheep’s Clothing

An excerpt from the book In Sheep’s Clothing by George K. Simon


Two Basic Types of Aggression

There are two basic types of aggression: overt-aggression and covert-aggression. When you’re determined to have something and you’re open, direct and obvious in your manner of fighting, your behavior is best labeled overtly aggressive. When you’re out to “win,” dominate or control, but are subtle, underhanded or deceptive enough to hide your true intentions, your behavior is most appropriately labeled covertly aggressive. Now, avoiding any overt display of aggression while simultaneously intimidating others into giving you what you want is a powerfully manipulative maneuver. That’s why covert-aggression is most often the vehicle for interpersonal manipulation.

Acts of Covert-Aggression vs. Covert-Aggressive Personalities

Most of us have engaged in some sort of covertly aggressive behavior from time to time. Periodically trying to manipulate a person or a situation doesn’t make someone a covert-aggressive personality. Personality can be defined by the way a person habitually perceives, relates to and interacts with others and the world at large.

The tactics of deceit, manipulation, and control are a steady diet for the covert-aggressive personality. It’s the way they prefer to deal with others and to get the things they want in life.

The Process of Victimization

For a long time, I wondered why manipulation victims have a hard time seeing what really goes on in manipulative interactions. At first, I was tempted to fault them. But I’ve learned that they get hoodwinked for some very good reasons:

1. A manipulator’s aggression is not obvious. Our gut may tell us that they’re fighting for something, struggling to overcome us, gain power, or have their way, and we find ourselves unconsciously on the defensive. But because we can’t point to clear, objective evidence they’re aggressing against us, we can’t readily validate our feelings.

2. The tactics manipulators use can make it seem like they’re hurting, caring, defending, …, almost anything but fighting. These tactics are hard to recognize as merely clever ploys. They always make just enough sense to make a person doubt their gut hunch that they’re being taken advantage of or abused. Besides, the tactics not only make it hard for you to consciously and objectively tell that a manipulator is fighting, but they also simultaneously keep you unconsciously on the defensive. These features make them highly effective psychological weapons to which anyone can be vulnerable. It’s hard to think clearly when someone has you emotionally on the run.

3. All of us have weaknesses and insecurities that a clever manipulator might exploit. Sometimes, we’re aware of these weaknesses and how someone might use them to take advantage of us. For example, I hear parents say things like: “Yeah, I know I have a big guilt button.” But at the time their manipulative child is busily pushing that button, they can easily forget what’s really going on. Besides, sometimes we’re unaware of our biggest vulnerabilities. Manipulators often know us better than we know ourselves. They know what buttons to push, when and how hard. Our lack of self-knowledge sets us up to be exploited.

4. What our gut tells us a manipulator is like challenges everything we’ve been taught to believe about human nature. We’ve been inundated with a psychology that has us seeing everybody, at least to some degree, as afraid, insecure or “hung-up.” So, while our gut tells us we’re dealing with a ruthless conniver, our head tells us they must be really frightened or wounded “underneath.” What’s more, most of us generally hate to think of ourselves as callous and insensitive people. We hesitate to make harsh or seemingly negative judgments about others. We want to give them the benefit of the doubt and assume they don’t really harbor the malevolent intentions we suspect. We’re more apt to doubt and blame ourselves for daring to believe what our gut tells us about our manipulator’s character.

Recognizing Aggressive Agendas

Accepting how fundamental it is for people to fight for the things they want, and becoming more aware of the subtle, underhanded ways people can and do fight in their daily endeavors and relationships, can be very consciousness-expanding. Learning to recognize an aggressive move when somebody makes one, and how to handle oneself in any of life’s many battles, has turned out to be the most empowering experience for the manipulation victims with whom I’ve worked. It’s how they eventually freed themselves from their manipulator’s dominance and control and gained a much-needed boost to their own sense of self-esteem. Recognizing the inherent aggression in manipulative behavior, and becoming more aware of the slick, surreptitious ways that manipulative people prefer to aggress against us, is extremely important. Not recognizing and accurately labeling their subtly aggressive moves causes most people to misinterpret the behavior of manipulators and, therefore, to fail to respond to them in an appropriate fashion. Recognizing when and how manipulators are fighting with covertly aggressive tactics is essential.

Defense Mechanisms and Offensive Tactics

Almost everyone is familiar with the term defense mechanism. Defense mechanisms are the “automatic” (i.e. unconscious) mental behaviors all of us employ to protect or defend ourselves from the “threat” of some emotional pain. More specifically, ego defense mechanisms are mental behaviors we use to “defend” our self-images from “invitations” to feel ashamed or guilty about something. There are many different kinds of ego defenses and the more traditional (psychodynamic) theories of personality have always tended to distinguish the various personality types, at least in part, by the types of ego defenses they prefer to use. One of the problems with psychodynamic approaches to understanding human behavior is that they tend to depict people as most always afraid of something and defending or protecting themselves in some way; even when they’re in the act of aggressing. Covert-aggressive personalities (indeed all aggressive personalities) use a variety of mental behaviors and interpersonal maneuvers to help ensure they get what they want. Some of these behaviors have been traditionally thought of as defense mechanisms.

While, from a certain perspective, we might say someone engaging in these behaviors is defending their ego from any sense of shame or guilt, it’s important to realize that at the time the aggressor is exhibiting these behaviors, he is not primarily defending (i.e., attempting to prevent some internally painful event from occurring), but rather fighting to maintain position, gain power, and remove any obstacles (both internal and external) in the way of getting what he wants. Seeing the aggressor as on the defensive in any sense is a set-up for victimization. Recognizing that they’re primarily on the offensive mentally prepares a person for the decisive action they need to take in order to avoid being run over. Therefore, I think it’s best to conceptualize many of the mental behaviors (no matter how “automatic” or “unconscious” they may appear) we often think of as defense mechanisms as offensive power tactics, because aggressive personalities employ them primarily to manipulate, control, and achieve dominance over others. Rather than trying to prevent something emotionally painful or dreadful from happening, anyone using these tactics is primarily trying to ensure that something they want to happen does indeed happen. Using the vignettes presented in the previous chapters for illustration, let’s take a look at the principal tactics covert-aggressive personalities use to ensure they get their way and maintain a position of power over their victims:

Denial – This is when the aggressor refuses to admit that they’ve done something harmful or hurtful when they clearly have. It’s a way they lie (to themselves as well as to others) about their aggressive intentions. This “Who… Me?” tactic is a way of “playing innocent,” and invites the victim to feel unjustified in confronting the aggressor about the inappropriateness of a behavior. It’s also the way the aggressor gives him/herself permission to keep right on doing what they want to do. This denial is not the same kind of denial that a person who has just lost a loved one and can’t quite bear to accept the pain and reality of the loss engages in. That type of denial really is mostly a “defense” against unbearable hurt and anxiety. Rather, this type of denial is not primarily a “defense” but a maneuver the aggressor uses to get others to back off, back down or maybe even feel guilty themselves for insinuating he’s doing something wrong.

In the story of James the minister, James’ denial of his ruthless ambition is massive. He denied he was hurting and neglecting his family. He especially denied he was aggressively pursuing any personal agenda. On the contrary, he cast himself as the humble servant of an honorable cause. He managed to convince several people (and maybe even himself) of the nobility and purity of his intentions. But underneath it all, James knew he was being dishonest; this fact is borne out in his reaction to the threat of not getting a seat on the Elders’ Council if his marital problems worsened. When James learned he might not get what he was so aggressively pursuing after all, he had an interesting “conversion” experience. All of a sudden, he decided he could put aside the Lord’s bidding for a weekend and that he might really need to devote more time to his marriage and family. James’ eyes weren’t opened by the pastor’s words. He always kept his awareness high about what might hinder or advance his cause. He knew that if he didn’t tend to his marriage he might lose what he really wanted. So, he chose (at least temporarily) to alter course.

In the story of Joe and Mary, Mary confronted Joe several times about what she felt was insensitivity and ruthlessness on his part in his treatment of Lisa. Joe denied his aggressiveness. He also successfully convinced Mary that what she felt in her gut was his aggressiveness was really conscientiousness, loyalty, and passionate fatherly concern. Joe wanted a daughter who got all A’s. Mary stood in the way. Joe’s denial was the tactic he used to remove Mary as an obstacle to what he wanted.

Selective Inattention – This tactic is similar to and sometimes mistaken for denial. It’s when the aggressor “plays dumb,” or acts oblivious. When engaging in this tactic, the aggressor actively ignores the warnings, pleas, or wishes of others and, in general, refuses to pay attention to everything and anything that might distract them from pursuing their own agenda. Often, the aggressor knows full well what you want from him when he starts to exhibit this “I don’t want to hear it!” behavior. By using this tactic, the aggressor actively resists submitting himself to the tasks of paying attention to or refraining from the behavior you want him to change. In the story of Jenny and Amanda, Jenny tried to tell Amanda she was losing privileges because she was behaving irresponsibly. But Amanda wouldn’t listen. Her teachers tried to tell her what she needed to do to improve her grade, but she didn’t listen to them either. Actively listening to and heeding the suggestions of someone else are, among other things, acts of submission. And, as you may remember from the story, Amanda is not a girl who submits easily. Determined to let nothing stand in her way and convinced she could eventually “win” most of her power struggles with authority figures through manipulation, Amanda closed her ears. She didn’t see any need to listen. From her point of view, she would only have lost some power and control if she had submitted herself to the guidance and direction offered by those whom she viewed as less powerful, clever, and capable than herself.

Rationalization – A rationalization is the excuse an aggressor tries to offer for engaging in an inappropriate or harmful behavior. It can be an effective tactic, especially when the explanation or justification the aggressor offers makes just enough sense that any reasonably conscientious person is likely to fall for it. It’s a powerful tactic because it not only serves to remove any internal resistance the aggressor might have about doing what he wants to do (quieting any qualms of conscience he might have) but also to keep others off his back. If the aggressor can convince you he’s justified in whatever he’s doing, then he’s freer to pursue his goals without interference.

In the story of little Lisa, Mary felt uneasy about the relentlessness with which Joe pursued his quest to make his daughter an obedient, all-A student once again. And, she was aware of Lisa’s expressed desire to pursue counseling as a means of addressing and perhaps solving some of her problems. Although Mary felt uneasy about Joe’s forcefulness and sensed the impact on her daughter, she allowed herself to become persuaded by his rationalizations that any concerned parent ought to know his daughter better than some relatively dispassionate outsider and that he was only doing his duty by doing as much as he possibly could to “help” his “little girl.” When a manipulator really wants to make headway with their rationalizations they’ll be sure their excuses are combined with other effective tactics. For example, when Joe was “selling” Mary on the justification for shoving his agenda down everyone’s throat he was also sending out subtle invitations for her to feel ashamed (shaming her for not being as “concerned” a parent as he was) as well as making her feel guilty (guilt-tripping her) for not being as conscientious as he was pretending to be.

Diversion – A moving target is hard to hit. When we try to pin a manipulator down or try to keep a discussion focused on a single issue or behavior we don’t like, he’s expert at knowing how to change the subject, dodge the issue or in some way throw us a curve. Manipulators use distraction and diversion techniques to keep the focus off their behavior, move us off-track, and keep themselves free to promote their self-serving hidden agendas.

Rather than respond directly to the issue being addressed, Amanda diverted attention to her teacher’s and classmates’ treatment of her. Jenny allowed Amanda to steer her off track. She never got a straight answer to the question.

Another example of a diversion tactic can be found in the story of Don and Al. Al changed the subject when Don asked him if he had any plans to replace him. He focused on whether he was unhappy or not with Don’s sales performance, as if that’s what Don had asked him about in the first place. He never gave Don a straight answer to a straight question (manipulators are notorious for this). He told him what he thought would make Don feel less anxious and would steer him away from pursuing the matter any further. Don left feeling like he’d gotten an answer, but all he really got was the “runaround.”

Early in the current school year, I found it necessary to address my son’s irresponsibility about doing his homework by making a rule that he bring his books home every night. One time I asked: “Did you bring your books home today?” His response was: “Guess what, Dad. Instead of tomorrow, we’re not going to have our test – until Friday.” My question was simple and direct. His answer was deliberately evasive and diversionary. He knew that if he answered the question directly and honestly, he would have received a consequence for failing to bring his books home. By using diversion (and also offering a rationalization) he was already fighting with me to avoid that consequence. Whenever someone is not responding directly to an issue, you can safely assume that for some reason, they’re trying to give you the slip.

Lying – It’s often hard to tell when a person is lying at the time he’s doing it. Fortunately, there are times when the truth will out because circumstances don’t bear out somebody’s story. But there are also times when you don’t know you’ve been deceived until it’s too late. One way to minimize the chances that someone will put one over on you is to remember that because aggressive personalities of all types will generally stop at nothing to get what they want, you can expect them to lie and cheat. Another thing to remember is that manipulators (covert-aggressive personalities that they are) are prone to lie in subtle, covert ways. Courts are well aware of the many ways that people lie; that is why court oaths charge testifiers to tell “the truth, the whole truth, and nothing but the truth.” Manipulators often lie by withholding a significant amount of the truth from you or by distorting the truth. They are adept at being vague when you ask them direct questions. This is an especially slick way of lying: lying by omission. Keep this in mind when dealing with a suspected wolf in sheep’s clothing. Always seek and obtain specific, confirmable information.

Covert Intimidation – Aggressors frequently threaten their victims to keep them anxious, apprehensive and in a one-down position. Covert-aggressives intimidate their victims by making veiled (subtle, indirect or implied) threats. Guilt-tripping and shaming are two of the covert-aggressive’s favourite weapons. Both are special intimidation tactics.

Guilt-tripping – One thing that aggressive personalities know well is that other types of persons have very different consciences than they do. Manipulators are often skilled at using what they know to be the greater conscientiousness of their victims as a means of keeping them in a self-doubting, anxious, and submissive position. The more conscientious the potential victim, the more effective guilt is as a weapon. Aggressive personalities of all types use guilt-tripping so frequently and effectively as a manipulative tactic, that I believe it illustrates how fundamentally different in character they are compared to other (especially neurotic) personalities. All a manipulator has to do is suggest to the conscientious person that they don’t care enough, are too selfish, etc., and that person immediately starts to feel bad. On the contrary, a conscientious person might try until they’re blue in the face to get a manipulator (or any other aggressive personality) to feel badly about a hurtful behavior, acknowledge responsibility, or admit wrongdoing, to absolutely no avail.

Shaming – This is the technique of using subtle sarcasm and put-downs as a means of increasing fear and self-doubt in others. Covert-aggressives use this tactic to make others feel inadequate or unworthy, and therefore, defer to them. It’s an effective way to foster a continued sense of personal inadequacy in the weaker party, thereby allowing an aggressor to maintain a position of dominance.

When Joe loudly proclaimed any “good” parent would do just as he was doing to help Lisa, he subtly implied Mary would be a “bad” parent if she didn’t attempt to do the same. He “invited” her to feel ashamed of herself. The tactic was effective. Mary eventually felt ashamed for taking a position that made it appear she didn’t care enough about her own daughter. Even more doubtful of her worth as a person and a parent, Mary deferred to Joe, thus enabling him to gain a position of dominance over her. Covert-aggressives are expert at using shaming tactics in the most subtle ways. Sometimes it can just be in the glances they give or the tone of voice they use. Using rhetorical comments, subtle sarcasm, and other techniques, they can invite you to feel ashamed of yourself for even daring to challenge them. Joe tried to shame Mary when she considered accepting the educational assessment performed by Lisa’s school. He said something like: “I’m not sure what kind of doctor you are or just what kind of credentials you have, but I’m sure you’d agree that a youngster’s grades wouldn’t slip as much as Lisa’s for no reason. You couldn’t be entirely certain she didn’t have a learning disability unless you did some testing, could you?” With those words, he “invited” Mary to feel ashamed of herself for not at least considering doing just as he asked. If Mary didn’t have a suspicion about what he was up to, she might have accepted this invitation without a second thought.

Playing the Victim Role – This tactic involves portraying oneself as an innocent victim of circumstances or someone else’s behavior in order to gain sympathy, evoke compassion and thereby get something from another. One thing that covert-aggressive personalities count on is the fact that less calloused and less hostile personalities usually can’t stand to see anyone suffering. Therefore, the tactic is simple. Convince your victim you’re suffering in some way, and they’ll try to relieve your distress.

In the story of Amanda and Jenny, Amanda was good at playing the victim role too. She had her mother believing that she (Amanda) was the victim of extremely unfair treatment and the target of unwarranted hostility. I remember Jenny telling me: “Sometimes I think Amanda’s wrong when she says her teacher hates her and I hate her. But what if that’s what she really believes? Can I afford to be so firm with her if she believes in her heart that I hate her?” I remember telling Jenny: “Whether Amanda has come to believe her own distortions is almost irrelevant. She manipulates you because you believe that she believes it and allow that supposed belief to serve as an excuse for her undisciplined aggression.”

Vilifying the Victim – This tactic is frequently used in conjunction with playing the victim role. The aggressor uses it to make it appear that he is only responding to (i.e., defending himself against) aggression on the part of the victim. It enables the aggressor to better put the victim on the defensive.

Returning again to the story of Jenny and Amanda, when Amanda accuses her mother of “hating” her and “always saying mean things” to her, she not only invites Jenny to feel like the “bully,” but simultaneously succeeds in “bullying” Jenny into backing off. More than any other tactic, vilifying the victim is a powerful means of putting someone unconsciously on the defensive while simultaneously masking the aggressive intent and behavior of the person using it.

Playing the Servant Role – Covert-aggressives use this tactic to cloak their self-serving agendas in the guise of service to a more noble cause. It’s a common tactic but difficult to recognize. By pretending to be working hard on someone else’s behalf, covert-aggressives conceal their own ambition, desire for power, and quest for a position of dominance over others. In the story of James (the minister) and Sean, James appeared to many to be the tireless servant. He attended more activities than he needed to and did so eagerly. But if devoted service to those who needed him was his aim, how does one explain the degree to which he habitually neglected his family? As an aggressive personality, James submits himself to no one. The only master he serves is his own ambition. Not only was playing the servant role an effective tactic for James, but it is also the cornerstone upon which corrupt ministerial empires of all types are built. A good example comes to mind in the recent true story of a well-known tele-evangelist who locked himself up in a room in a purported display of “obedience” and “service” to God. He even portrayed himself as a willing sacrificial lamb who was prepared to be “taken by God” if he didn’t do the Almighty’s bidding and raise eight million dollars. He claimed he was a humble servant, merely heeding the Lord’s will. In reality, he was fighting to save his substantial material empire.

Another recent scandal involving a tele-evangelist resulted in his church’s governance body censuring him for one year. But he told his congregation he couldn’t stop his ministry because he had to be faithful to the Lord’s will (God had supposedly spoken to him and told him not to quit). This minister was clearly defying his church’s established authority. Yet he presented himself as a person humbly submissive to the “highest” authority. One hallmark characteristic of covert-aggressive personalities is loudly professing subservience while fighting for dominance.

Seduction – Covert-aggressive personalities are adept at charming, praising, flattering, or overtly supporting others in order to get them to lower their defenses and surrender their trust and loyalty. Covert-aggressives are also particularly aware that people who are to some extent emotionally needy and dependent (and that includes most people who aren’t character-disordered) want approval, reassurance, and a sense of being valued and needed more than anything. Appearing to be attentive to these needs can be a manipulator’s ticket to incredible power over others. Shady “gurus” like Jim Jones and David Koresh seem to have refined this tactic to an art. In the story of Al and Don, Al is the consummate seducer. He melts any resistance you might have to giving him your loyalty and confidence. He does this by giving you what he knows you need most. He knows you want to feel valued and important. So, he often tells you that you are. You don’t find out how unimportant you really are to him until you turn out to be in his way.

Projecting the Blame (blaming others) – Aggressive personalities are always looking for a way to shift the blame for their aggressive behavior. Covert-aggressives are not only skilled at finding scapegoats but also expert at doing so in subtle, hard-to-detect ways.

Minimization – This tactic is a unique kind of denial coupled with rationalization. When using this maneuver, the aggressor is attempting to assert that his abusive behavior isn’t really as harmful or irresponsible as someone else may be claiming. It’s the aggressor’s attempt to make a molehill out of a mountain.

I’ve presented the principal tactics that covert-aggressives use to manipulate and control others. They are not always easy to recognize. Although all aggressive personalities tend to use these tactics, covert-aggressives generally use them slickly, subtly and adeptly. Anyone dealing with a covertly aggressive person will need to heighten gut-level sensitivity to the use of these tactics if they’re to avoid being taken in by them.
