
Report of the APA Task Force on Deceptive and Indirect Techniques of Persuasion and Control

 

November 1986
Margaret Thaler Singer, University of California Berkeley; Harold Goldstein, National Institute of Mental Health; Michael D. Langone, American Family Foundation; Jesse S. Miller, San Francisco, California; Maurice K. Temerlin, Clinical Psychology Consultants, Inc.; Louis J. West, University of California Los Angeles

 

Topics


Historical Background
Cults
Definitional Issues
Religious Cults
Types of religious cults
Harms associated with religious cults
Methodological considerations
The brainwashing/deprogramming controversy
Psychotherapy Cults
Literature review
Legal cases
Non-professional cults
Large-Group Awareness Training
LGAT Historical Background
LGAT Review of the Literature
LGAT Conclusions
Analysis
The Continuum of Influence: A Proposal
Influence Continuum
Ethical Issues for Psychologists
Ethical Issues for Nonpsychologists
Recommendations

Abstract

Cults and large group awareness trainings have generated considerable controversy because of their widespread use of deceptive and indirect techniques of persuasion and control. These techniques can compromise individual freedom, and their use has resulted in serious harm to thousands of individuals and families. This report reviews the literature on this subject, proposes a new way of conceptualizing influence techniques, explores the ethical ramifications of deceptive and indirect techniques of persuasion and control, and makes recommendations addressing the problems described in the report.

Report of the APA Task Force on
Deceptive and Indirect Techniques of
Persuasion and Control

In recent years, cultic groups in the areas of religion, politics, and psychotherapy have generated considerable public criticism as a result of the harmful consequences of the techniques such groups use to recruit, persuade, and control their members. Many of these techniques are highly, though often subtly, manipulative and deceptive. The casualties of the nondiscriminating and unethical use of such techniques frequently wind up in the clinical or counseling psychologist's office.

The American Psychological Association has long involved itself with the ethical aspects of psychological techniques and practices, e.g., the APA's Task Force on Behavior Modification. Deceptive and indirect techniques of persuasion and control, however, have not been adequately examined; nor have the ethical principles pertinent to their application been well defined.

Therefore, the Board of Social and Ethical Responsibility in Psychology (BSERP) instituted in the fall of 1983 a Planning Committee on the Use of Coercive Psychological Techniques. (In 1984 the American Bar Association established a similar group, the Personal Litigation Subcommittee on Cults.) The Planning Committee concluded that the importance of the issue under study, especially considering the unsophisticated understanding of influence processes demonstrated by the media and the general public, called for the establishment of an APA Task Force on Deceptive and Indirect Techniques of Persuasion and Control. The Committee's conclusion rested on the following assumptions:

  1. The freedom to make informed, autonomous decisions beneficial to the individual is central to our culture.
  2. "...Freedom is determined by the number of options available to people and the right to exercise them. The more behavioral alternatives and social prerogatives people have, the greater is their freedom of action." (Bandura, 1974, p. 815)
  3. Deceptive and indirect techniques of persuasion and control limit individuals' freedom by diminishing or restricting their alternatives, causing them to incorrectly evaluate the requirements and consequences of alternatives, or inducing them to perceive fewer alternatives than in fact exist.

Given these assumptions, the research to be reviewed later in this report, and the professional and research experiences of committee members, it seemed clear that individuals can be induced to make uninformed, personally detrimental decisions while under the illusion that their decisions are voluntary and to their benefit. In order to increase understanding of this phenomenon, the Committee charged the Task Force to:

  1. Describe the deceptive and indirect techniques of persuasion and control that may limit freedom and adversely affect individuals, families, and society.
  2. Review the data base in the field.
  3. Define the implications of deceptive and indirect techniques of persuasion and control for consumers of psychological services.
  4. Examine the ethical, educational, and social implications of this problem.

In order to keep this report to a reasonable length, the Committee recommended that the Task Force concentrate its efforts on certain controversial areas in which deceptive and indirect techniques of persuasion and control are extensively employed. Hence, this report will not concern itself with brief, single-episode situations in which one person deceives another (scams, street hustles, pigeon drops, bank examiner bunco operations, flimflam sales techniques). This is the world of deceptions with which the criminal justice system deals daily. Nor will the deceptions, persuasions, and coercions used by psychopathic personalities and criminals be the center of attention. Rather, this report focuses on systematically organized efforts to influence and control behavior through the use of deceptive and indirect techniques in religious and psychotherapy cults and large-group awareness trainings (or what Cushman, 1986, calls "mass marathon psychology organizations").

Historical Background

During this century a series of events demonstrated that individual autonomy is much more fragile than was commonly believed. The Russian purge trials of the 1930s manipulated men and women into falsely confessing to crimes and falsely accusing others of having committed crimes (Mindszenty, 1974). The world press expressed bewilderment and amazement at the phenomenon, but, with few exceptions, soon lapsed into silence (Rogge, 1959). The late 1940s and early 1950s saw the effects of the revolutionary universities in China and the subjugation of an entire nation to a thought reform program which induced millions to espouse new philosophies and exhibit new behaviors (Chen, 1960; Hinkle and Wolff, 1956; Hunter, 1951; Lifton, 1961; Meerloo, 1951; Sargant, 1951, 1957, 1973; Schein, 1961).

Next came the Korean conflict in which United Nations' prisoners of war were subjected to an indoctrination program based on methods growing out of the Chinese thought reform program, combined with other social and psychological influence techniques. At that time, the term "brainwashing" was introduced into our vocabulary, "a colloquial term applied to any technique designed to manipulate human thought or action against the desire, will, or knowledge of the individual." (Encyclopedia Britannica, 1975)

After a few years, public interest in human influence and manipulation subsided. In academia, however, valuable though sometimes controversial research was conducted. Asch's (1952) conformity studies, Milgram's (1974) shock experiments, and Zimbardo's (Zimbardo, Ebbesen, & Maslach, 1977) prison role-play experiment are merely some of the many studies that delineated social-psychological influences on group behavior (see Cialdini, 1984, for a recent review).

As this academic work was proceeding, other significant events began to recapture the public's interest in influence processes. Charles Manson's diabolical control over a group of middle-class youths shocked the world during the early 1970s (Atkins, 1978; Bugliosi, 1974; Watkins, 1979). Soon after, in 1974, the Symbionese Liberation Army, a small California terrorist group, kidnapped Patricia Hearst and manipulated and controlled her behavior (Hearst, 1982). By the mid-1970s, thousands of families in the United States were puzzled and alarmed about the influence a vast array of new gurus, messiahs, and mind-manipulators had over their offspring. Then, on November 18, 1978, Jim Jones led 912 followers to death in a Guyanese jungle (Reiterman and Jacobs, 1982). Jim Jones's final hours of domination brought the concepts of influence, persuasion, thought reform, and brainwashing to the attention of the world.

In the post-Jonestown world, thousands of families who had relatives in various cultic groups began to be heard. The government responded to this pressure by conducting hearings on "mind-control" techniques in religious cults (Dole, 1977; Fraser, 1978; King, 1979; Lasher, 1979; Lefkowitz, 1974). Although these hearings ultimately led to some prosecutions of wrongdoing (e.g., Rev. Moon's tax evasion case), concerns about interfering with freedom of religion constrained authorities considerably (see Delgado, 1978, 1984, and Lucksted and Martel, 1982, for reviews of legal issues in this area). Frustrated families, then, had to learn to fend for themselves or seek help elsewhere. Many turned to mental health professionals.

The first wave of families seeking help were, for the most part, describing sudden and frightening personality changes in relatives who had become involved with various religious and philosophical cults (cf. Addis, Schulman-Miller, & Lightman, 1984; Clark, Langone, Schecter, & Daly, 1981; Langone, 1983; Singer, 1978, 1979, 1986; West and Singer, 1980). But as Singer (1979a; 1979b; 1986) and West and Singer (1980) noted, throughout history the ever-present, self-appointed messiahs, gurus, and pied pipers appear to adapt to changing times. Thus, as persuaders moved into new realms, more families began to seek consultations on situations that involved what, for lack of a better term, we call "cultic relationships." Cultic relationships, which are characterized by a state of induced dependency, can be found in certain pseudo-growth groups, pseudo-therapy groups, commercial groups, the mainline religious fringe, religious cults, and other undue influence situations.

Some of the larger, more powerful cultic groups have branches in many countries, extensive property holdings, subsidiary organizations with special names for special purposes, and a growing degree of influence. Besides the governmental investigations noted earlier, international concerns about the detrimental effects of certain cults upon the well-being of their members, families, and society in general have led to a recent national conference on the problem in Germany, a nationally televised debate in Spain, a Canadian government study (Hill, 1980), a French government investigation (Vivien, 1985), a resolution by the European Parliament (Cult Observer, 19xx), and numerous conferences in the United States and other countries.

Although estimates vary, there appear to be approximately 3,000 cultic groups in the United States alone. Most of these groups are relatively small, but others have tens of thousands of members and incomes of many millions of dollars a year. Based on observations of turnover, membership estimates of former members, and a few studies (Bird & Reimer, 1982; Zimbardo & Hartley, 1985), it seems probable that at least ten million Americans have been at least transiently involved with cultic groups during the past decade and a half.

The elderly and the very young are not excluded from cultic groups, as demonstrated by the membership of the Peoples Temple and the demography of the dead at Jonestown. However, persons between the ages of 18 and 30 are especially subject to recruitment. A recent study of students in the San Francisco area found that half were open to accepting an invitation to attend a cult meeting, and approximately 3% reported that they already belonged to cultic groups (Zimbardo & Hartley, 1985).

Mental health professionals who have studied the matter believe that no single personality profile characterizes those who join cultic groups (cf. Ash, 1985). Many well-adjusted, high-achieving individuals from intact families have been successfully recruited by cults. So have persons with varying degrees of psychological impairment. To the extent that predisposing factors exist, they may include one or more of the following: naive idealism, situational stress (frequently related to the normal crises of adolescence and young adulthood, such as romantic disappointment or school problems), dependency, disillusionment, an excessively trusting nature, coming from enmeshed families, or ignorance of the ways in which groups can manipulate individuals.

This report does not explore the personality factors which make some individuals especially susceptible to cultic enticements (see Ash, 1985; Clark, Langone, Schecter, & Daly, 1981; Schwartz, 198x; Zerin, 198x; and Zimbardo & Hartley, 1985 for discussions of personality variables). Nor does the report analyze the structure of the groups or situations in which systematic manipulations of social and psychological influence techniques bring about sometimes radical changes in behavior and belief. Instead, the report concentrates on psychological influence techniques and their consequences, as exemplified in cults and large-group awareness trainings.

Cults

Definitional Issues

Some scholars shun the word cult, preferring instead the term new religion, presumably because of the negative connotations of cult (Bromley & Shupe, 1981; Kilbourne & Richardson, 1984; Robbins & Anthony, 1981). Although this view is appealing in certain ways (because some of the several thousand known "cults" seem essentially harmless), it is misleading. New religions do not differ from old religions only in their newness. There are many other differences, not the least of which is the presence or absence of institutionalized mechanisms of accountability. Furthermore, if all groups called cults are termed new religions, what happens to the term cult? Is it banished from the English language? Is a group like the Manson family a new religion? And what about non-religious cults, e.g., the Symbionese Liberation Army, or the growing number of nameless psychotherapy cults to be described later?

Webster's Third New International Dictionary (Unabridged, 1966) provides several definitions of cult, among which are: (1) a religion regarded as unorthodox or spurious; (2) ... a system for the cure of disease based on the dogma, tenets, or principles set forth by its promulgator to the exclusion of scientific experience or demonstration; (3) ...a great or excessive devotion or dedication to some person, idea, or thing...the object of such devotion...a body of persons characterized by such devotion.

These definitions clearly describe many cultic groups, characterized by extremist tendencies of one form or another, for which the term new religion would be inappropriate.

Furthermore, such cults often develop, as noted earlier, in diverse social areas, including politics, psychotherapy, religion, education, and business. Hence relying exclusively on a term such as new religion results in a disregard for non-religious cults, an attitude of deviance deamplification toward extremist cults, and a tendency to gloss over critical differences between cultic and non-cultic groups. Representative of this point of view is an article by Kilbourne and Richardson (1984), which argues that psychotherapy and religious cults ("new religions" in the authors' terminology) are "functionally equivalent." Kriegman and Solomon (1985) criticize the logic of this position:

Kilbourne and Richardson propose that psychotherapy and the new religions (cults) are "functionally equivalent." Psychotherapy and the "new religions" offer to ameliorate emotional suffering. But so do the purveyors of drugs, astrology, exercise and diet programs, and vacation get-aways. They may all be "competitors for a limited market," to use Kilbourne and Richardson's expression, but the critical question is what differentiates them and their followers. (p. 11)

Cultic groups have caused concern because they tend, unlike psychotherapy, to be totalist in character and to exploit rather than fulfill needs. The following definition illuminates the differences between groups that are merely unorthodox or innovative and groups that are cultic:

Cult (totalist type): a group or movement exhibiting a great or excessive devotion or dedication to some person, idea, or thing and employing unethically manipulative (i.e., deceptive and indirect) techniques of persuasion and control designed to advance the goals of the group's leaders, to the actual or possible detriment of members, their families, or the community. Unethically manipulative techniques include isolation from former friends and family, debilitation, use of special methods to heighten suggestibility and subservience, powerful group pressures, information management, suspension of individuality or critical judgment, promotion of total dependency on the group and fear of leaving it, etc.

Totalist cults, then, are likely to exhibit three elements to varying degrees: (1) excessively zealous, unquestioning commitment by members to the identity and leadership of the group; (2) exploitative manipulation of members; and (3) harm or the danger of harm. Totalist cults may be distinguished from new religious movements, new political movements, and innovative psychotherapies (terms that can be used to refer to unorthodox but relatively benign groups), if not by their professed beliefs then certainly by their actual practices. The term cult as employed henceforth in this report is intended to mean "totalist cults" as defined above (and as defined by contemporary usage, which has emphasized the negative connotations of the word cult).

Religious Cults

Types of religious cults. Sociologists have proposed a variety of classification systems for religious cults. Wilson (1976) distinguishes groups according to whether they teach that salvation is gained from knowledge stemming from a mystic source, the liberation of the self's own powers, or affiliation with a saved community. Campbell (1979) also suggests a tripartite classification, consisting of illumination (mystical), instrumental (self-adjustment), and service-oriented cults. Anthony and Robbins (1983) divide groups into dualistic and monistic movements, the former tending to be concerned with changing a morally defective world, and the latter advocating inner spiritual transformation. Wallis (1984) proposes a typology in which groups are divided into world-rejecting, world-affirming, and world-accommodating new religions.

Although these sociological models have merit, their utility is closely tied to the sociological theories on which they rest. A less ambitious, though perhaps more immediately satisfying, approach is to look at religious cults and new religions in light of the religious traditions to which they are historically tied. Thus, one may speak of "Bible-based" (Christianity), Eastern (Hindu, Buddhist, Taoist, Sufi/Moslem), satanic (Satan worship), and eclectic (drawing on several traditions) groups. Those who take this approach tend to distinguish between new religions and cults according to their behavior and practices, rather than their doctrinal variations from the traditions to which they are linked. The definition of cult proposed earlier implies such a typology in that new religions depart in doctrine or practice from the various religious traditions but do not exhibit the totalist tendencies which characterize cults.

Harms associated with religious cults. Much of the controversy surrounding cults revolves around the contention that some cultic groups harm individuals and society. Although it has been argued that such accusations are mere "atrocity tales" designed to stigmatize deviant groups (Bromley, Shupe, & Ventimiglia, 1983), considerable evidence indicates that many so-called "atrocity tales" are factual. Although the quantity and quality of the evidence varies and is far from one-sided, it seems clear that many religious cults have seriously harmed the physical or psychological well-being of members or their families (Langone & Clark, 1985). There are also reports detailing concerns about legal, political, and economic abuses perpetrated by certain groups, most of which are religious in nature (Antidefamation League, 19xx; Boettcher, 1980; Casgrain, 1986 (Kropveld); Delgado, 1977, 1982, 1985; Dole, 1977; D'Souza, 1985; Grafstein, 1984; Hill, 1980; McLeod, 1986; Rudin, 1979/80; Spero, 1984; Williams, 1980).

No reliable data exist which would permit a comparison of the frequency of physical or psychological harm in religious cults and in mainstream society. Furthermore, because of the number and variety of cults and their tendency to be suspicious of outsiders, it is possible that such data will never exist. Nevertheless, if ten to fifteen cultic groups were randomly selected from current lists (Hulet, 19xx) of cults and studied systematically and in depth, reasonably confident generalizations might be made. In the absence of such a costly study, however, conclusions must be based on anecdotal reports and investigations of groups that have caught researchers' attention for one reason or another.

The most striking examples of cult-related harm involve death, violence, and child abuse. Jonestown (Reiterman & Jacobs, 1982) certainly is the preeminent example, although it is not the only instance of cultic mass suicide in history (Robbins, 198x). Law enforcement officials report increasing numbers of ritual killings (of animals and humans) apparently linked to small satanic cults (Baird, 1984; Boston Globe, 1984; Gallant, 1985; Groh, 1984; Kunhardt & Wooden, 1985; McFadden, 1984). The Centers for Disease Control found that members of Faith Assembly in Indiana had a maternal death rate 100 times the state average and a perinatal death rate 3 times the state average (MMWR, 1984), while the Fort Wayne News-Sentinel documented 65 deaths that occurred after the victims or their parents followed the sect's teachings (News-Sentinel, June 1, 1984). A lawyer investigating Synanon, a drug "rehabilitation" community turned religion, was bitten by a rattlesnake that cult members had placed in his mailbox (Anson, 1978). White supremacist groups have been linked to murders and death threats (Maddis, 1985; Ridgeway, 1985). Three top disciples of guru Bhagwan Shree Rajneesh were indicted for attempted murder, conspiracy to commit murder, and first-degree assault, all charges relating to the leaders' attempts to control members of the commune (Cult Observer, March/April 1986). So many children were reportedly beaten at the Northeast Kingdom Community Church in Vermont (Grizzuti-Harrison, 1984) that state authorities removed 112 children (Burchard, 1984), a move that was overturned by a controversial judicial decision (Boston Globe, June 23, 1984). Sixty-two youths were removed from the House of Judah following the death of a child (New York Times, July 9, 1983). And countless other cases of beatings and medical neglect of children in religious cults have been reported (Gaines, Wilson, Redican, & Baffi, 198x; Landa, 1985; New York State Assembly, 1979; Markowitz & Halperin, 1984).

Hundreds of newspaper and magazine articles, as well as legislative hearings (references), have recounted the experiences of former religious cult members and their families (Scharff, 1985, is one of the best such reports), while a number of books have been authored by parents (Adler, 1978; Allen, 1982; Ikor, 198x; Yanoff, 1981) and ex-cult members (Edwards, 1979; Kemperman, 1981; Mills, 1979; Underwood & Underwood, 1979). Most accounts published in the popular press, especially during the late 1970s and early 1980s, tell the stories of deprogrammed ex-members, who describe how cultic manipulation and exploitation made them unhappy yet unable to leave their group. Deprogramming, which generally consists of intensive interactions over a period of three to seven days, provides cult members with information about their group and manipulative influence techniques in order to help them make an informed decision about continued group affiliation. Although in popular usage the term deprogramming often implies abduction, most contemporary deprogrammings occur with the unpressured consent of the cult member. Exit counseling has come to be the most commonly used term for describing this process, although some professionals (e.g., Langone, 1983) prefer the term reevaluation counseling because it does not presuppose a goal of exiting.

Clinical reports about the experiences of cultists (Addis, Schulman-Miller, & Lightman, 1984; Clark, 1978, 1979; Etamed, 1979; Galper, 1982; Lazarus, 19xx; Levine, 1979; Maleson, 1981; Schwartz, 1983; Schwartz & Zemel, 1980; Singer, 1978, 1979, 1985; Spero, 1982; West & Singer, 1982) are generally consistent with published personal accounts, although the latter, as is to be expected, tend to be one person's story, abbreviated, and relatively unanalytical. Many cultists and ex-cultists come for professional help at the urging of their families, who have usually already consulted clinicians.

The overwhelming majority of cultists examined by clinicians appear to have been under stress (e.g., a romantic breakup, academic difficulties) prior to joining the cult, such stress magnifying whatever other susceptibilities they may have had to manipulative influences. Converts were commonly subjected to an array of deceptive and indirect techniques of persuasion and control. Although clinicians differ in their explanations of what happened to these cultists, there is general agreement that most underwent major, frequently sudden, changes in personality (which usually triggered parental alarm). In taking on the cult's values, beliefs, attitudes, and practices, most of these cultists experienced considerable conflict, which was usually suppressed (often through the use of chanting or other dissociative techniques), especially when the prescribed changes ran sharply counter to the convert's upbringing. (Some clinicians, cf. Ash, 1985, believe that many cultists experience a prolonged dissociative state while under the group's influence. See also DSM-III 300.15, Atypical Dissociative Disorder.) These changes and tensions appear to have motivated families to intervene (e.g., through deprogramming) or to have caused converts to leave the group, usually in turmoil (some converts experienced psychotic breaks, which often led to their ejection from the cult).

Ex-cultists from clinical samples had been in the cult an average of two to three years. Their readjustment to life in the mainstream world is generally not easy, with many ex-cultists exhibiting significant levels of depression, anxiety, guilt, anger, distrust, interpersonal problems, and a form of dissociation called "floating," which is somewhat analogous to drug flashbacks (Ash, 1985). Skonovd (1983), providing many quotations from interviews, describes a variety of reasons why people leave cults. Within the network of activist parents and ex-members, approximately one-third of former cultists left on their own, i.e., without involuntary deprogramming (Conway & Siegelman, 1979; Eden, 197x; Langone, 1984; Schwartz, 1983; Solomon, 1981). Informal reports and several studies (Barker, 198x; Galanter, 1983; Levine, 1985), however, suggest that the voluntary departure rate for the broader cult population is probably considerably higher. Two studies (Solomon, 1981; Wright, 1983) found a correlation between reported negativity toward the cult and exposure to the counter-cult network. Although the authors of these studies were inclined to attribute the former cultists' negativity to their contact with the counter-cult network, it seems more probable that a self-selection process occurred: those parents and cultists who were most adversely affected by the cult affiliation were most likely to seek help and, consequently, most likely to come into contact with the counter-cult network.

The results of a number of research studies are consistent with clinical reports. Conway and Siegelman (1985), who drew their sample from the counter-cult network, from which most clinical cases come, found significant correlations between participation in cult rituals and various indicators of distress. Spero (1982), who treated 65 cultists in psychotherapy for an average of 15 months, found two basic profiles: "a) significant constriction in cognitive processes with a clearly defined preference for stereotype or b) a manic denial of depressive trends, also featuring deficiencies in optimum psychological differentiation, exceedingly quick response times, emotionally labile rather than constricted responses, and featuring unrealistic and idealistic object-relational themes" (p. 338). Galanter (1983) found that 36% of 66 former members of the Unification Church "reported that they had had serious emotional problems after leaving" (p. 984). Deutsch (1975), who interviewed 14 devotees of an American guru, concluded that all appeared to have psychiatric disorders. In a clinical study in which Rorschachs were given to four Unification Church members, Deutsch and Miller (1983) found evidence of hysterical and dependency features. Schwartz (1983), in the only identified survey of parental responses to a child's cult involvement, agreed with clinical descriptions. Parents described themselves as "numb, rejected, opposed, skeptical, disappointed, angry, disapproving, devastated, guilty, 'damned mad,' stunned, ashamed" and "baffled" (p. 5).

Other research studies suggest that the level of harm associated with religious cults may be less than clinical reports indicate. Levine and Salter (1976) and Levine (198x) found little evidence of impairment in structured interviews of over 100 cultists, although Levine and Salter did note some reservation about "the suddenness and sharpness of the change" (p. 415) that was reported to them. Ross (1983), who gave a battery of tests, including the MMPI, to 42 Hare Krishna members in Melbourne, Australia, reported that all "scores and findings were within the normal range, although members showed a slight decline in mental health (as measured on the MMPI) after 1 1/2 years in the movement and a slight increase in mental health after 3 years in the movement" (p. 416). Ungerleider and Wellisch (1979), who interviewed and tested 50 members or former members of cults, found "no evidence of insanity or mental illness in the legal sense" (p. 279), although members showed elevated Lie scales on the MMPI. In studies of the Divine Light Mission (Buckley and Galanter, 1979) and the Unification Church (Galanter, Rabkin, Rabkin, and Deutsch, 1979; Galanter, 1983), the investigators found improvement in well-being as reported by members, approximately one-third of whom had received mental health treatment before joining the group.

Methodological considerations. All of these studies suffer from serious methodological flaws, including sample selection (e.g., help-seekers vs. "volunteers" from groups that have clearly tried to woo academicians, cf. Dole and Dubrow-Eichel, 1981); the use of measuring instruments with inadequate psychometric development (e.g., Galanter's General Well-Being Schedule and Neurotic Distress Scale); motivated distortion on the part of subjects (e.g., the Lie Scale elevations in Ungerleider & Wellisch, 1979, and the "moderate attempt for both men and women to 'look good'" [p. 418] reported by Ross, 1983); the inability of some standard measuring instruments to detect subtle psychopathology, such as dissociation (Ash, 1985); and bias, confusion over terms, overgeneralization, unwarranted causal inferences, and inadequate methods of collecting data (Balch, 1985).

The brainwashing/deprogramming controversy. How much of the harm associated with cults is causally related to group practices? Why, for instance, should one consider "child abuse and cults" a meaningful topic of study, but not "child abuse and Methodists" or "child abuse and sociologists"? Many would answer that cults, unlike Methodists or sociologists, tend to be very controlling and characteristically use disturbingly subtle manipulations: deliberate attempts to manipulate someone else's behavior seem exploitative when they are covert. One can always imagine that the victim might have resisted had the attempt been more overt or had informed consent been solicited (Andersen & Zimbardo, 1985, p. 197).

This emphasis on harm-producing manipulation has given rise to the brainwashing/deprogramming controversy. On one side are former cultists, their parents, and many journalists. These people compare cult conversion to Korean War indoctrination and Chinese thought reform programs, often referred to in the popular press as brainwashing. These writers note the similarities of isolation, group pressure, debilitation, induced dependency, etc. They frequently support deprogramming, and their terminology often determines the language with which the layman conceptualizes this problem. On the other side of the controversy are cultists, their public relations experts, and a few sociologists and other writers. This group attacks the validity of the brainwashing model (partly because the term in their minds implies the use of coercion and physical threats, and partly because of a fixed disregard of the active, orchestrated, and often deceptive recruiting programs used by cultic groups). They frequently treat the model as a straw man whose demolition is apparently aimed at dismissing ex-members' claims of exploitation (Shuler, 1983).

In the middle are many clinicians and other professionals who have studied cults. They recognize that the threat of physical coercion found in Korean War brainwashing is rarely present in cult conversions, that brainwashing represents one end of a continuum of influence (Langone, 1985) and is not mysterious, and that the individual's personality and actions play a significant role in his or her conversion. These professionals, however, do not gloss over the many distinctions between cultic and more traditional conversions. Their positions on deprogramming vary markedly, depending upon their ethical evaluations of the procedure.

Perhaps the question "Are cultists brainwashed?" should be replaced with two questions, addressing, respectively, the individual and social levels. With regard to the individual, one could ask, paraphrasing Bergin and Strupp (197x): "To what extent has this particular group's use of deceptive and indirect techniques of persuasion and control harmed this particular person or family at this particular time?" On the social level one could ask: "To what extent does this particular group - or class of groups - misuse deceptive and indirect techniques of persuasion and control?"

Summary. It seems that the only confident conclusion one can draw from the many studies of religious cults is that a large variety of people join diverse groups for many reasons and are affected in different ways. Nevertheless, considerable numbers of persons join cults in large part because of their vulnerability to the deceptive and indirect techniques of persuasion and control used by cults. Furthermore, a significant percentage of cultists is clearly harmed, some terribly so. Many cultists, however, appear to be unharmed or even positively affected, although some scholars (e.g., Ash, 1985) argue that most of these are disturbed in subtle ways.

Psychotherapy Cults

Literature review. Temerlin and Temerlin (1982), Hochman (1984), Singer (1986), and West and Singer (1980) each pointed to a new phenomenon: the psychotherapy cult. Cultic therapists use varying combinations of coercive, indirect, and deceptive psychological techniques in order to control patients. In so doing, these therapists violate ethical prohibitions against forming exploitative and dual relationships with clients, misusing therapeutic techniques, and manipulating therapeutic relationships to the advantage of the therapist. Therapy cults may arise from the distortion and corruption of long-term individual therapy (Temerlin and Temerlin, 1982; Conason and McGarrahan, 1986), group psychotherapy (Hochman, 1984), large-group awareness trainings, human potential groups, or any of a variety of groups led by non-professionals (West & Singer, 1980; Singer, 1983, 1986).

Temerlin and Temerlin (1982) studied five bizarre groups of mental health professionals, which were formed when five teachers of psychotherapy consistently ignored ethical prohibitions against multiple relationships with clients:

Patients became their therapists' friends, lovers, relatives, employees, colleagues, and students. Simultaneously they became "siblings" who bonded together to admire and support their common therapist. (p. 131)

These cults were formed when professionals deviated from an ethically based, fee-for-service, confidential relationship with clients and brought clients together to form cohesive, psychologically incestuous groups. Rather than transferences being studied and understood, leaders were idolized. Rather than personal autonomy being built, patients were led into submissive, obedient, dependent relationships with the therapists. Their thinking eventually resembled what Hoffer (1951) saw in "true believers" and what Lifton (1961) termed totalistic. That is, the clients were induced to accept uncritically their therapists' theories, to grow paranoid toward the outside world, to limit relationships and thinking to the elite world created by the cult-producing therapist, and to devote themselves selflessly to their therapists. The groups varied in size from 15 to 75 members. Often members had been in the groups from 10 to 15 years.

The authors concluded that membership in a psychotherapy cult was an iatrogenically induced negative effect of psychotherapeutic techniques and relationships being used in unethical ways.

Hochman (1984), writing about a now defunct school of psychotherapy, The Center for Feeling Therapy, also spoke of the many iatrogenic symptoms he found in former clients and patients who had been members of this group, which had evolved into a therapy cult. He wrote:

A cult that is destructive...veers toward remolding the individual to conform to codes and needs of the cult, institutes new taboos that preclude doubt and criticism, and produces a kind of splitting where cult members see themselves as an elite surrounded by unenlightened, and even dangerous outsiders. (p. 367)

This group lasted approximately ten years and consisted of 350 patients living near one another, sharing homes in the Hollywood district of Los Angeles. Hundreds more were non-resident outpatients, and others communicated with "therapists" (some were licensed; others allegedly were patients assigned to be therapists) by letter. Maximum benefit supposedly came only to residents, and patients were led to see themselves as the potential leaders of a therapy movement that would dominate the 21st century. The leaders promulgated a "theory" which maintained that individuals function with "reasonable insanity," but that if a person learned to "go 100%" in five areas -- expression, feeling, activity, clarity, and contact -- he could put aside his "old image" and become "sane," which was defined as the "full experiencing of feelings." This latter, ambiguous objective was purported to be the attainment of the next stage in human evolution. Thus, therapy cults use a technique also commonly seen in religious cults (Singer, 1983): the inhibition of critical thinking by encouraging the use of thought-stopping cliches.

Legal cases. A number of civil suits and hearings of the California Department of Consumer Affairs Board of Medical Quality Assurance have grown out of the activities occurring in the Center for Feeling Therapy. The following are illustrative but not exhaustive: State of California: Psychology Examining Committee Case 392, L-33445 v. Binder; State of California as cited, v. Corriere, Gold, Hart, Hopper, and Karle, Case L-30665, D.3103 through 3107; State of California as cited v. Woldenberg, No. D-3108, L-30664; Hart et al. v. McCormack et al., Superior Court of the State of California, for the County of Los Angeles, No. 00713; Raines et al. v. Center Foundation, Superior Court of the State of California, County of Los Angeles, No. 372-843 consolidated with C 379-789; Board of Behavioral Science Examiners, No. M 84, L 31542 v. Cirincione, Franklin, Gold, and Gross.

In these legal cases, defendants were charged with extreme departures from the standards of psychology, the standards of medicine, and the standards of psychotherapeutic care. The State alleged that the staff, while purporting to be providing psychotherapy:

Respondent, in order to break down and control Center members, utilized racial, religious and ethnic slurs, physical and verbal humiliation, physical, especially sexual, abuse, threats of insanity and violence and enforced states ...

False and misleading advertising was alleged. It was represented that six to twelve months were needed to complete therapy for stated amounts of money, whereas patients were intimidated during sessions into paying amounts greatly in excess of those advertised and Center therapy was designed to keep patients in therapy as long as the Center existed. (p. xx).

Timnick (1986), calling the Center "a once trendy 'therapeutic community,'" reported that the above legal hearings have "become the longest, costliest and most complex psychotherapy malpractice case in California history" (p. 3). In this case, more than 100 former patients filed complaints of fraud, sexual misconduct, and abuse. Civil cases have already settled for more than six million dollars on behalf of former clients. Testimony at hearings depicted the group as a psychotherapy cult using deceptive, manipulative, and coercive techniques to retain and control clients. The welfare of the patients was subordinated to the welfare of the therapists. Treatment plans and goals were distorted to benefit the therapists and the Center financially and personally. Instead of the usual standard of practice, in which patients are aided to achieve greater independence and self-direction, the therapists instituted a systematic social influence process and an enforced dependency situation that was cultic and controlling.

Non-professional cults. In contrast to the above psychotherapy cults directed by trained professionals, Singer (1983; 1986) reported the rise of cult-like groups conducted by non-professionals. The latter appear to have familiarized themselves with many of the tactics, techniques, and methods of both individual and group therapies, and have used these techniques to recruit, control, and maintain followers. Defining a cultic relationship as "those relationships in which a person induces others to become totally or nearly totally dependent on him or her for almost all major life decisions, and inculcates in these followers a belief that he or she has some special talent, gift, or knowledge" (1986, p. 270), Singer described a series of groups in which non-professionals using various psychological techniques were subverting the welfare of individuals who had been led, often through deceptive methods, into believing the leaders had psychological or other special skills, talents, or knowledge.

In one instance, a diet cult was described in which two leaders, a man and a woman, implied that they had special scientific methods for weight control. Followers were persuaded to leave their regular work, turn over large sums of money and property to the leaders, cut off all ties with family and former friends, and move to an isolated town to live in the leaders' domicile. There, followers spent four to five hours each day in hypnosis and self-hypnosis exercises, plus many periods of hyperventilation, interspersed with lectures and demonstrations of how to "speak in voices and hear in voices," all ostensibly to change the followers' personalities. No actual weight control programs existed. Only the pseudo-therapy and pseudo-growth programs were operative.

Earlier, Singer (1983) reported on the experiences of individuals in various groups run by non-professionals who used intense confrontational techniques, encounter group type exercises, and enforced and prolonged self-revelation processes. Some groups encouraged participants to move in with the leader; others did not. But all seemed, in one way or another, to encourage dropping former friends and relatives and becoming psychologically, socially, and otherwise dependent upon the group. Several long-standing groups reportedly are run not only by non-professionals but by persons with criminal records. The promises of the promoters lead persons entering these groups to think that they will receive various psychological and mental benefits from participation. Whether benefits accrue is not possible to ascertain, because studies have not been made of either the groups or sufficient numbers of their former members. Additionally, because these groups often operate on the fringe of legality, leaders are not likely to allow open scrutiny of their operations.

That individuals have been damaged psychologically and in other ways appears evident. That psychological and social manipulations by means of indirect, deceptive, and coercive methods are occurring is also evident.

Large-Group Awareness Training

Historical Background

The Human Potential Movement bloomed in the 1950s and 1960s. Sensitivity and encounter groups spread rapidly, promising increased communication, intensified experience, and expanded consciousness. Business, educational, and other groups were sold sensitivity training programs, some conducted by psychologists, but most led by non-professionals who used the processes and techniques developed by psychologists. There soon appeared the commercially packaged large-group awareness trainings (LGATs), which combined a number of the encounter and sensitivity techniques with various sales, influence, indoctrination, and behavior control techniques.

Most existing commercial LGATs grew out of a format developed in the early 1960s by William Penn Patrick, who labeled his venture Leadership Dynamics Institute (LDI). This was the first of what has become a smorgasbord of commercially sold LGATs.

Church and Carnes (1972) describe the original LDI program as an encounter group training session costing $1,000 in which persons "were held virtual prisoners for four days of living hell during which members of the class were beaten, deprived of food and sleep, jammed into coffins, forced to perform degrading sexual acts, and even crucified" (p. 178). Purportedly, this commercial encounter group would make persons "better leaders and executives." The seminar was supposed to rid people of their "hang-ups," teach total obedience, and motivate participants to persuade other persons to take the training. Patrick, who headed Holiday Magic Cosmetics, Mind Dynamics, LDI, and other pyramid sales organizations, decreed that attending an LDI "seminar" was required for anyone wishing a management position with Holiday Magic. Attendees were kept in the dark about what they would experience at these seminars, as "graduates" were pledged not to reveal their experiences. The venture ended amidst multiple lawsuits in California courts.

Some changes have occurred in subsequent LGATs, while certain features have remained. In most of the new groups, attendees continue to pledge secrecy and push the product on friends and acquaintances within a pyramid sales structure. The status of "graduates" of the LGATs is generally dependent upon the number of recruits they bring in. Eliciting most criticism, however, is the extensive use of deceptive and indirect, and even coercive, techniques of persuasion and control at all levels of the organization, including the training.

Review of the Literature

There is an extensive array of accounts of these "trainings," most notably about est (Erhard Seminars Training). Hundreds of journalistic reports exist. The following are but a sample of articles or books on est: Bartley, 1978; Benziger, 1976; Brewer, 1975; Bry, 1976; Fenwick, 1976; Frederick, 1974; Greene, 1976; Hargrove, 19xx; Hoyt, 1985; Leonard, 1972; Rhinehart, 1976; Tipton, 1982. Descriptions of Lifespring, another well-known LGAT, are found in Haaken and Adams (1983) and Cushman (1986), while Actualizations is described by Martin (1977).

The term training is misleading if the consumer thinks the title refers to skill-building groups (Rudestam, 1982). The LGATs are not skill training events; instead, they resemble intense indoctrination programs. In the LGATs an authoritarian leader, now minus the Marine Corps swagger stick of LDI days, persuades the consumers who purchase attendance to believe that their lives are not working, that they have caused every dire event that has happened to them, and that salvation depends upon accepting the belief system being offered, learning to talk in the jargon of the trainer, and remaining connected with the organization by becoming unpaid volunteer helpers who recruit other customers for the organization. The formats, stripped of the individual jargon characterizing each particular commercial group, remain essentially as outlined above (Baer & Stolz, 1978; Cinamon & Farson, 1979; Fenwick, 1977; Gross, 1978; Tipton, 1982; Zilbergeld, 1983).

Finkelstein, Wenegrat, and Yalom (1982) consider Lifespring, Actualizations, and est as examples of "intensive large-group awareness trainings." They describe these groups as being characterized by "the commercial, non-professional use of potentially powerful tools for personal growth," which "often evoke powerful emotions" (p. xx). These authors discuss the "trainer's extraordinary demeanor...(his/her) air of absolute authority ... no affect, even when he excoriates the trainees ... repeatedly referring to them as 'assholes'...devalues their accomplishments with the repeated assertion that their lives 'do not work'" (p. xx).

Finkelstein et al. (1982) report on est's "Truth Process," an event occurring on the second day of the training. During this exercise trainees lie on the floor, eyes closed, meditating on an individual problem they have selected.

At the trainer's command, the trainees imagine a situation in which that problem has occurred and systematically explore the detailed bodily sensations and images associated with the problem itself. As the trainer orders the trainees to examine images from the past and from childhood, powerful affects are released. The room is soon filled with the sound of sobbing, retching, and uncontrolled laughter, punctuated by the exclamations of those remonstrating with figures from their past... Later in the second day, during the so-called "Danger Process," trainees come to the dais in groups of 25 and stand facing the audience. The trainer exhorts those on the dais to "be" themselves, and reprimands those who appear to be posturing or falsely smiling, or who fail to make eye contact with the seated trainees. It is not uncommon, apparently, for trainees to faint or cry when called to the dais.

At a mid-week meeting following the first weekend, "trainees report on their experiences since the weekend, often to tell of dramatic improvements...and occasionally to complain of deterioration in their mood" (Finkelstein et al., 1982, pp. 515-521).

Finkelstein et al. (1982) also note that nearly 450,000 persons in the United States have undergone one of the several commercial large-group awareness trainings. Yet the literature on these groups resembles that of the early encounter and human potential groups:

...a few objective outcome studies which exist side by side with highly positive testimonials and anecdotal reports of psychological harm. Reports of testimonials have been compiled by est advocates and suffer from inadequate methodology. More objective and rigorous research reports fail to demonstrate that the positive testimony and evidence of psychological change among est graduates result from specific attributes of est training. Instead, non-specific effects of expectancy and response sets may account for positive outcomes. Reports of psychological harm as the result of est training remain anecdotal, but borderline or ...

The reports of psychological harm resulting from LGATs appear in Fenwick (1976), Glass, Kirsch, and Parris (1977), Kirsch and Glass (1977), Simon (1977, 1978), Higgitt and Murray (1983), and Haaken and Adams (1983). While Fenwick, a psychologist, was a participant observer at an est training, Haaken and Adams, a psychologist and a sociologist respectively, were participant observers at a Lifespring training.

Fenwick called attention to the est training selection-admission forms, in which persons were asked if they had been in therapy and, if in therapy (now or recently), whether they were "winning." She voiced the opinion that some persons might, intentionally or because of incapacity, misrepresent their psychiatric status on such a form, or might feel "their medical or psychiatric history is not appropriately revealed to a private business offering them an 'educational' service" (p. xx). She concluded:

Trainers in est do not and cannot take the precautions that would be considered appropriate for psychotherapy. They use techniques such as confrontation, which undermines psychological defenses and strips away resistances. They use some techniques whose effect is to increase anxiety and other ...

Fenwick further noted that the Lieberman and Yalom studies (19xx) of encounter groups indicated that "the people who experienced negative results in combination with the psychological casualties constituted about 19% ... or for close to one out of five people who participated in these group experiences, the results were harmful" (p. 166).

Haaken and Adams (1983) analyzed Lifespring from a psychoanalytic perspective.

Basing our conclusions on a participant-observation study, we argue that the impact of the training was essentially pathological. First, in the early period of the training, ego functions were systematically undermined and regression was promoted. Second, the ideational or interpretive ...

Cushman (1986) termed a number of groups, such as est, Lifespring, Psi World, Transformations, and Summit Workshops, "mass marathon psychology organizations." Like the many writers who have described the est trainings, he noted the highly coercive and authoritarian methods of control used in these groups. He called them restrictive groups because they depended upon strict milieu control, public rewards and punishments, and the pressuring of participants to enroll others and immerse themselves in the organizations as volunteers and companions of other graduates.

Although the LGATs promote themselves as educational experiences, the majority of the professionally trained writers (psychologists and psychiatrists) who have published comments on the groups consider them to be psychological in nature (Cushman, 1986; Hoyt, 1985; Fenwick, 1976; Glass, Kirsch, & Parris, 1977; Haaken & Adams, 1983; Higgitt & Murray, 1983; Kirsch & Glass, 1977; Paul & Paul, 1978; Simon, 1977, 1978). Glass et al. (1977) concluded that although est presents its programs as educational, they are in fact "quasi-therapeutic group experiences" (p. xx). Simon (1978) stated that "est has some powerful psychological effects on many of those who take the training... It is apparent from the progressive and regressive responses to est that some powerful change agent is at work here. It may be that... Werner Erhard has discovered an unconventional route to approach these psychotherapeutic goals" (pp. 686, 691).

Other legal cases. Space prohibits a complete description of all legal cases involving LGATs. However, there are currently more than 30 such cases, including... [to be added if deemed appropriate]

Conclusions

The preceding literature review suggests that most of the nationally known LGATs, and a burgeoning but as yet undetermined number of take-offs on them, are using powerful psychological techniques capable of stripping individuals of their psychological defenses, inducing behavioral regression, and promoting regressive modes of reasoning. Further, it appears that deceptive sales techniques are involved in promoting the trainings, since the secrecy surrounding the programs' sales promotions prevents consumers from obtaining full disclosure. Consumers are persuaded to purchase programs described as educational, while in actuality the programs consist of highly orchestrated, intense indoctrination processes capable of inducing marked psychological reactions. Consumers are not fully and adequately informed about the programs' intensity, the new philosophical formulations of reality that they imply, the potentially harmful consequences of some of the exercises to which participants will be exposed, the sometimes lurid psychological upset they will witness, or the fact that management is aware of at least some of the risks to which it subjects participants. Such practices run counter to American Psychological Association recommendations on the running of growth groups (American Psychological Association, 1973).

Analysis

As should be clear by now, criticism of cults and LGATs stems from the observation that such groups use deceptive and indirect (and sometimes coercive) techniques of persuasion and control to advance the goals of leaders, frequently to the detriment of members, their families, and society at large. The problems posed by such groups, then, have psychological and ethical aspects. The psychological aspect concerns the nature of behavior and attitude change techniques and their consequences. The ethical aspect concerns the propriety of employing such techniques.
