Notes on Brainwashing & 'Cults'

post by gwern · 2013-09-13T20:49:51.412Z · LW · GW · Legacy · 106 comments

“Brainwashing”, as popularly understood, does not exist or is of almost zero effectiveness. The belief stems from American panic over Communism post-Korean War combined with fear of new religions and sensationalized incidents; in practice, “cults” have retention rates in the single percentage point range and ceased to be an issue decades ago. Typically, a conversion sticks because an organization provides value to its members.

Some old SIAI work of mine. Researching this was very difficult because the relevant religious studies area, while apparently completely repudiating most public beliefs about the subject (eg. the effectiveness of brainwashing, how damaging cults are, how large they are, whether that’s even a meaningful category which can be distinguished from mainstream religions rather than a hidden inference - a claim, I will note, which is much more plausible when you consider how abusive Scientology is to its members as compared to how abusive the Catholic Church has been etc), prefers to publish its research in book form, which makes it very hard to review any of it. Some of the key citations were papers - but the cult panic was so long ago that most of them are neither online nor digitized! I recently added some cites and realized I had not touched the draft in a year; so while this collection of notes is not really up to my preferred standards, I’m simply posting it for what it’s worth. (One lesson to take away from this is that controlling uploaded human brains will not be nearly as simple & easy as applying classic ‘brainwashing’ strategies - because those don’t actually work.)

Reading through the literature and especially the law review articles (courts flirted disconcertingly much with licensing kidnapping and abandoning free speech), I was reminded very heavily - and not in a good way - of the War on Terror.

Old American POW studies:

Started the myth of effective brain-washing. But in practice, cult attrition rates are very high! (As makes sense: if cults did not have high attrition rates, they would long ago have dominated the world due to exponential growth.) This attrition claim is made all over the literature, with some example citations being:
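
The exponential-growth argument above can be sketched numerically. Here is a minimal, purely illustrative model - every rate below is an assumed round number for the sketch, not a figure from any study cited in these notes - showing that a group's year-over-year multiplier is the veterans' retention rate plus the contribution from first-year converts:

```python
# Illustrative sketch of why high attrition caps cult size; all rates here
# are assumed round numbers, not figures from the literature cited below.

def yearly_factor(joins_per_member, first_year_retention, veteran_retention):
    """Per-capita membership multiplier over one year.

    joins_per_member: recruits who formally join, per existing member, per year
    first_year_retention: fraction of those joiners still present a year later
    veteran_retention: fraction of existing members still present a year later
    """
    return veteran_retention + joins_per_member * first_year_retention

# Barker-style attrition (~5% of joiners remain at one year): the group
# shrinks even if every member lands one new joiner per year.
shrinking = yearly_factor(1.0, 0.05, 0.70)   # 0.75 < 1: decay

# If "brainwashing" retained nearly everyone, growth would instead compound
# exponentially - which is not what the membership data show.
growing = yearly_factor(1.0, 0.95, 0.95)     # 1.90 > 1: exponential growth

print(shrinking, growing)
```

With realistic attrition the multiplier sits below 1 and membership decays toward a small equilibrium; only near-perfect retention produces the world-dominating exponential growth that effective brainwashing would predict.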

Iannaccone 2003, “The Market for Martyrs” (quasi-review)

From the late-1960s through the mid-1980s, sociologists devoted immense energy to the study of New Religious Movements. [For overviews of the literature, see Bromley (1987), Robbins (1988), and Stark (1985).] They did so in part because NRM growth directly contradicted their traditional theories of secularization, not to mention the sensational mid-sixties claims God was “dead” (Cox 1966; Murchland 1967). NRM’s also were ideal subjects for case studies, on account of their small size, brief histories, distinctive practices, charismatic leaders, devoted members, and rapid evolution. But above all, the NRM’s attracted attention because they scared people.

…We have trouble recalling the fear provoked by groups like the Krishnas, Moonies, and Rajneeshees. Their years of explosive growth are long past, and many of their “strange” ideas have become staples of popular culture. [We see this influence not only in today’s New Age and Neo-Pagan movements, but also in novels, music, movies, TV shows, video games, university courses, environmentalism, respect for “cultural diversity,” and the intellectual elite’s broad critique of Christian culture.] But they looked far more threatening in the seventies and eighties, especially after November 18, 1978. On that day, the Reverend Jim Jones, founder of the People’s Temple, ordered the murder of a U.S. Congressman followed by the mass murder/suicide of 913 members of his cult, including nearly 300 children.

The “cults” aggressively proselytized and solicited on sidewalks, airports, and shopping centers all over America. They recruited young adults to the dismay of their parents. Their leaders promoted bizarre beliefs, dress, and diet. Their members often lived communally, devoted their time and money to the group, and adopted highly deviant lifestyles. Cults were accused of gaining converts via deception and coercion; funding themselves through illegal activities; preying upon the young, alienated, or mentally unstable; luring members into strange sexual liaisons; and using force, drugs, or threats to deter the exit of disillusioned members. The accusations were elaborated in books, magazine articles, newspaper accounts, and TV drama. By the late-1970s, public concern and media hype had given birth to anti-cult organizations, anti-cult legislation, and anti-cult judicial rulings. The public, the media, many psychologists, and the courts largely accepted the claim that cults could “brainwash” their members, thereby rendering them incapable of rational choice, including the choice to leave. [Parents hired private investigators to literally kidnap their adult children and subject them to days of highly-coercive “deprogramming.” Courts often agreed that these violations of normal constitutional rights were justified, given the victim’s presumed inability to think and act rationally (Anthony 1990; Anthony and Robbins 1992; Bromley 1983; Richardson 1991; Robbins 1985).]

We now know that nearly all the anti-cult claims were overblown, mistaken, or outright lies. Americans no longer obsess about Scientology, Transcendental Meditation, or the Children of God. But a large body of research remains. It witnesses to the ease with which the public, media, policy-makers, and even academics accept irrationality as an explanation for behavior that is new, strange, and (apparently or actually) dangerous.

…As the case studies piled up, it became apparent that both the media stereotypes (of sleep-deprived, sugar-hyped, brainwashed automatons) and academic theories (of alienated, authoritarian, neurotics) were far off the mark. Most cult converts were children of privilege raised by educated parents in suburban homes. Young, healthy, intelligent, and college educated, they could look forward to solid careers and comfortable incomes. [Rodney Stark (2002) has recently shown that an analogous result holds for Medieval saints - arguably the most dedicated “cult converts” of their day.]

Psychologists searched in vain for a prevalence of “authoritarian personalities,” neurotic fears, repressed anger, high anxiety, religious obsession, personality disorders, deviant needs, and other mental pathologies. They likewise failed to find alienation, strained relationships, and poor social skills. In nearly all respects - economically, socially, psychologically - the typical cult converts tested out normal. Moreover, nearly all those who left cults after weeks, months, or even years of membership showed no sign of physical, mental, or social harm. Normal background and circumstances, normal personalities and relationships, and a normal subsequent life - this was the “profile” of the typical cultist.

…Numerous studies of cult recruitment, conversion, and retention found no evidence of “brainwashing.” The Moonies and other new religious movements did indeed devote tremendous energy to outreach and persuasion, but they employed conventional methods and enjoyed very limited success. In the most comprehensive study to date, Eileen Barker (1984) could find no evidence that Moonie recruits were ever kidnapped, confined, or coerced (though it was true that some anti-cult “deprogrammers” kidnapped and restrained converts so as to “rescue” them from the movement). Seminar participants were not deprived of sleep; the food was “no worse than that in most college residences;” the lectures were “no more trance-inducing than those given every day” at many colleges; and there was very little chanting, no drugs or alcohol, and little that could be termed “frenzy” or “ecstatic” experience (Barker 1984). People were free to leave, and leave they did - in droves.

Barker’s comprehensive enumeration showed that among the relatively modest number of recruits who went so far as to attend two-day retreats (claimed to be Moonies’ most effective means of “brainwashing”), fewer than 25% joined the group for more than a week, and only 5% remained full-time members 1 year later. Among the larger numbers who visited a Moonie centre, not 1 in 200 remained in the movement 2 years later. With failure rates exceeding 99.5%, it comes as no surprise that full-time Moonie membership in the U.S. never exceeded a few thousand. And this was one of the most successful cults of the era! Once researchers began checking, rather than simply repeating the numbers claimed by the groups, defectors, or journalists, they discovered dismal retention rates in nearly all groups. [For more on the prevalence and process of cult defection, see Wright (1987) and Bromley (1988).] By the mid-1980s, researchers had so thoroughly discredited “brainwashing” theories that both the Society for the Scientific Study of Religion and the American Sociological Association agreed to add their names to an amicus brief denouncing the theory in court (Richardson 1985).
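
As a sanity check, the failure rate just quoted can be recomputed from the passage's own numbers (the 1,000-visitor base below is an arbitrary round figure for illustration, not Barker's raw count):

```python
# Recomputing the quoted ">99.5%" failure rate from Barker's (1984) figures
# as given in the passage above. The base of 1,000 is illustrative only.
centre_visitors = 1000
two_year_members = centre_visitors * (1 / 200)   # "not 1 in 200 remained"
failure_rate = 1 - two_year_members / centre_visitors

print(f"{failure_rate:.1%}")   # 99.5%, matching the ">99.5%" in the text
```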

Singer in particular has been heavily criticized; “Cult/Brainwashing Cases and Freedom of Religion”, Richardson 1991:

Dr. Singer is a clinical psychologist in private practice who earns a considerable portion of her income from cult cases. She has been an adjunct professor at the University of California at Berkeley, but has never held a paid or tenure-track position there. See H. Newton Malony, “Anticultism: The Ethics of Psychologists’ Reactions to New Religions,” presented at annual meeting of the American Psychological Association (New York, 1987) and Anthony, “Evaluating Key Testimony” for more details on Singer’s career.

…The [amicus curiae] brief further claimed that Singer misrepresents the tradition of research out of which terms like “thought reform” and “coercive persuasion” come. She ignores the fact that these earlier studies focus on physical coercion and fear as motivators, and that even when using such tactics the earlier efforts were not very successful. With great facility, Singer moves quickly from situations of physical force to those where none is applied, claiming that these “‘second generation’” thought reform techniques using affection are actually more effective than the use of force in brainwashing people to become members. Thus, Singer is criticized for claiming to stand squarely on the tradition of research developed by scholars such as Edgar Schein and Robert Lifton, while she shifts the entire focus to non-coercive situations quite unlike those encountered in Communist China or Korean prisoner of war camps. The brief points out, as well, that Singer ignores a vast amount of research supporting the conclusion that virtually all who participate in the new religions do so voluntarily, and for easily understandable reasons. No magical “black box” of brainwashing is needed to explain why significant numbers of young people chose, in the 1960s and 1970s, to abandon their place in society and experiment with alternative life styles and beliefs. Many youth were leaving lifestyles that they felt were hypocritical, and experimenting with other ways of life that they found to be more fulfilling, at least temporarily. Particularly noteworthy, but ignored by Singer, are the extremely high attrition rates of all the new religions. These groups are actually very small in numbers (the Hare Krishna and the Unification Church each have no more than two to three thousand members nationwide), which puts the lie to brainwashing claims. 
If “brainwashing” practiced by new religions is so powerful, why are the groups experiencing so much voluntary attrition, and why are they so small?

…Considerable research reported in refereed scholarly journals and other sources supports the idea that the new religions may be serving an important ameliorative function for American society. The groups may be functioning as “half-way houses” for many youth who have withdrawn from society, but still need a place to be until they decide to “return home.” Participation in some new religions has been shown to have demonstrable positive effects on the psychological functioning of individuals, a finding that Singer refuses to acknowledge.

“Overcoming The Bondage Of Victimization: A Critical Evaluation of Cult Mind Control Theories”, Bob and Gretchen Passantino, Cornerstone Magazine, 1994:

Neither brainwashing, mind control’s supposed precursor, nor mind control itself, have any appreciable demonstrated effectiveness. Singer and other mind control model proponents are not always candid about this fact: The early brainwashing attempts were largely unsuccessful. Even though the Koreans and Chinese used extreme forms of physical coercion as well as persuasive coercion, very few individuals subjected to their techniques changed their basic world views or commitments. The CIA also experimented with brainwashing. Though not using Korean or Chinese techniques of torture, beatings, and group dynamics, the CIA did experiment with drugs (including LSD) and medical therapies such as electroshock in their research on mind control. Their experiments failed to produce even one potential Manchurian Candidate, and the program was finally abandoned.

Although some mind control model advocates bring up studies that appear to provide objective data in support of their theories, such is not the case. These studies are generally flawed in several areas: (1) Frequently the respondents are not from a wide cross-section of ex-members but disproportionately are those who have been exit-counseled by mind control model advocates who tell them they were under mind control; (2) Frequently the sample group is so small its results cannot be fairly representative of cult membership in general; (3) It is almost impossible to gather data from the same individuals before cult affiliation, during cult affiliation, and after cult disaffection, so respondents are sometimes asked to answer as though they were not yet members, or as though they were still members, etc. Each of these flaws introduces unpredictability and subjectivity that make such study results unreliable…The evidence against the effectiveness of mind control techniques is even more overwhelming. Studies show that the vast majority of young people approached by new religious movements (NRMs) never join despite heavy recruitment tactics. This low rate of recruitment provides ample evidence that whatever techniques of purported mind control are used as cult recruiting tools, they do not work on most people. Even of those interested enough to attend a recruitment seminar or weekend, the majority do not join the group. Eileen Barker documents [Barker, Eileen. New Religious Movements: A Practical Introduction. London: Her Majesty’s Stationery Office, 1989.] that out of 1000 people persuaded by the Moonies to attend one of their overnight programs in 1979, 90% had no further involvement. Only 8% joined for more than one week, and less than 4% remained members in 1981, two years later:

. . . and, with the passage of time, the number of continuing members who joined in 1979 has continued to fall. If the calculation were to start from those who, for one reason or another, had visited one of the movement’s centres in 1979, at least 999 out of every 1,000 of those people had, by the mid-1980s, succeeded in resisting the persuasive techniques of the Unification Church.

Of particular importance is that this extremely low rate of conversion is known even to Hassan, the best-known mind control model advocate whose book [Hassan, Steven. Combatting Cult Mind Control. Rochester, VT: Park Street Press, 1990?] is the standard text for introducing concerned parents to mind control/exit counseling. In his personal testimony of his own involvement with the Unification Church, he notes that he was the first convert to join at the center in Queens; that during the first three months of his membership he only recruited two more people; and that pressure to recruit new members was only to reach the goal of one new person per member per month, a surprisingly low figure if we are to accept the inevitable success of cult mind control techniques.

Objection: High Attrition Rates

Additionally, natural attrition (people leaving the group without specific intervention) was much higher than the self-claimed 65% deprogramming success figure! It is far more likely a new convert would leave the cult within the first year of his membership than it is that he would become a long-term member.

Gomes, Unmasking the Cults (Wikipedia quote):

While advocates of the deprogramming position have claimed high rates of success, studies show that natural attrition rates actually are higher than the success rate achieved through deprogramming

“Psychological Manipulation and Society”, book review of Spying in Guruland: Inside Britain’s Cults, Shaw 1994

Eventually Shaw quit the Emin group. Two months later he checked in with some Emin members at the Healing Arts Festival, a psychic fair. He avoided many Emin phone invitations for him to attend another meeting. He discovered that most, if not all, of the people who joined with him had dropped out. This is consistent with what Shaw has noted about most cults and recruits: the dropout rate is high.

Anthony & Robbins 1992, “Law, Social Science and the ‘Brainwashing’ Exception to the First Amendment”:

Lifton and Schein are also characterized in Molko (54) as attesting to the effectiveness of brainwashing, although Schein, an expert on Chinese coercive persuasion of Korean War POWs, actually thought, as do a number of scholars, that the Chinese program was relatively ineffective (Schein, 1959, p. 332; see also Anthony, 1990a; Scheflin & Opton, 1978)…Schein appears to actually have considered the communist Chinese program to be a relative “failure” at least, “considering the effort devoted to it” (Schein, 1959, p. 332; Anthony, 1990a, p. 302)…Various clinical and psychometric studies of devotees of well-known “cults” (Ross, 1983; Ungerleider & Wellisch, 1979) have found little or no personality disorder or cognitive impairment.

  • Ross 1983. “Clinical profile of Hare Krishna devotees”, American Journal of Psychiatry
  • Schein, E. (1959). “Brainwashing and totalitarianization in modern society”. World Politics, 2, 430–441.
  • Ungerleider, T., & Wellisch, D. K. (1979). “Coercive persuasion (brainwashing), religious cults, and deprogramming”. American Journal of Psychiatry, 136(3), 279–282.

“Brainwashed! Scholars of cults accuse each other of bad faith”, by Charlotte Allen, Lingua Franca Dec/Jan 1998:

Zablocki’s conversion to brainwashing theory may sound like common sense to a public brought up on TV images of zombielike cultists committing fiendish crimes or on the Chinese mind control experiments dramatized in the 1962 film The Manchurian Candidate. But among social scientists, brainwashing has been a bitterly contested theory for some time. No one doubts that a person can be made to behave in particular ways when he is threatened with physical force (what wouldn’t you do with a gun pressed to your head?), but in the absence of weapons or torture, can a person be manipulated against his will?

Most sociologists and psychologists who study cults think not. For starters, brainwashing isn’t, as Zablocki himself admits, “a process that is directly observable.” And even if brainwashing could be isolated and measured in a clinical trial, ethical objections make conducting such a test almost unthinkable. (What sort of waivers would you have to sign before allowing yourself to be brainwashed?) In the last decade, while brainwashing has enjoyed a high profile in the media-invoked to explain sensational cult disasters from the mass suicide of Heaven’s Gate members to the twelve sarin deaths on the Tokyo subway attributed to the Aum Shinrikyo cult-social scientists have shunned the term as a symptom of Cold War paranoia and anticult hysteria. Instead, they favor more benign explanations of cult membership. Alternatives include “labeling” theory, which argues there is simply nothing sinister about alternative religions, that the problem is one of prejudicial labeling on the part of a mainstream culture that sees cult members as brainwashed dupes, and “preexisting condition” theory, which posits that cult members are people who are mentally ill or otherwise maladjusted before they join. (A couple of scholars have even proposed malnutrition as a preexisting condition, arguing that calcium deficiency may make people prone to charismatic susceptibility.)

Thus, when Zablocki published an indignant 2-part, 60-page defense of brainwashing theory in the October 1997 and April 1998 issues of Nova Religio, a scholarly journal devoted to alternative belief systems, he ignited a furor in the field. Pointing to the “high exit costs” that some cults exacted from those who tried to defect-shunning, forfeiture of parental rights and property, and veiled threats-Zablocki argued that these were indications of brainwashing, signs that some groups were using psychological coercion to maintain total control over their members. Although he admitted he could not prove brainwashing empirically, he argued that at the very least brainwashing should not be dismissed out of hand.

…Zablocki’s colleagues were unimpressed. In a response also published in Nova Religio, David Bromley, a sociologist at Virginia Commonwealth University who has studied the Reverend Sun Myung Moon’s Unification Church, complained that in Zablocki’s formulation brainwashing remained a vague, slippery, limiting, and ultimately untestable concept. Moreover, he pointed out, cults typically have low recruitment success, high turnover rates (recruits typically leave after a few months, and hardly anyone lasts longer than two years), and short life spans, all grounds for serious skepticism about the brainwashing hypothesis. Even if you overlook these facts, Bromley added, “the extraordinarily varied cultural origins, patterns of organizational development, and leadership styles of such groups pose a problem in explaining how they seem to have discovered the same ‘brainwashing’ psycho-technology at almost precisely the same historical moment.” A quick survey of the field reveals that Bromley is far from being the only doubter. Eileen Barker, a sociologist at the London School of Economics who has also studied the Unification Church, says, “People regularly leave the Moonies of their own free will. The cults are actually less efficient at retaining their members than other social groups. They put a lot of pressure on them to stay in-love-bombing, guilt trips-but it doesn’t work. They’d like to brainwash them, but they can’t.”

…To further complicate matters, researchers often bring very different, even conflicting approaches to their work. Psychologists, for example, tend to emphasize how a repeated environmental stimulus can elicit a conditioned response-depriving subjects of their autonomy. Sociologists, by contrast, typically endorse a voluntarist conversion model for religion, which posits that people join cults for generally rational reasons connected to the group’s ability to satisfy their needs: for a transcendent theology; for strong bonds of kinship and solidarity; for enough social support to enable them to quit drugs or otherwise turn their personal lives around. (For example, one study has shown that schizophrenics who joined cults functioned better than those who tried drugs or conventional psychotherapy.)

…In 1980 the New York state legislature, over objections from the American Civil Liberties Union, passed a bill that would have legalized deprogramming (it was vetoed by Governor Hugh Carey). “With deprogramming-with parents having their children abducted and held captive-the whole thing became intensely emotional,” says Thomas Robbins. “Who were the kidnappers: the parents, the cults, or the police? There were hard feelings on both sides.” Among the most outraged were social scientists who had never believed that people could be brainwashed into joining cults and who, as good civil libertarians, were appalled by deprogramming. Ofshe and Singer’s scholarly testimony (and fat fees) distressed a number of these scholars, whose credentials were equally respectable and whose own research had led them to conclude that coercive persuasion was impossible in the absence of some sort of physical coercion such as prison or torture.

…Zablocki made another, potentially more damning charge, however-one that Robbins did not take up. A significant amount of cult money, he wrote, has gone to scholars-in support of research, publication, conference participation, and other services. Zablocki did not name names. But a number of professors freely admit that nontraditional religions (in most cases, the Unificationists and Scientologists) have cut them checks. The list includes some of the most prominent scholars in the discipline: Bromley, Barker, Rodney Stark of the University of Washington, Jeffrey Hadden of the University of Virginia, and James Richardson, a sociologist of religion at the University of Nevada at Reno. All five have attended cult-subsidized conferences, and Bromley, Hadden, and Richardson have occasionally testified in court on behalf of cults or offered their services as expert witnesses against brainwashing theory. “This is an issue,” Zablocki wrote sternly, “of a whole different ethical magnitude from that of taking research funding from the Methodists to find out why the collection baskets are not coming back as heavy as they used to.”

106 comments

Comments sorted by top scores.

comment by shminux · 2013-09-13T23:52:45.674Z · LW(p) · GW(p)

An interesting question is, given the general failure of brainwashing, how do new religions manage to take hold, like Christianity, Islam, Mormonism, Sikhism, etc.? How come Christian and Islamic proselytism has been so consistently successful in many parts of the world?

Replies from: gwern, Ishaan, jimmy, name99
comment by gwern · 2013-09-14T00:22:40.045Z · LW(p) · GW(p)

These are good questions, and as you can imagine, much debated in the literature, with explanations ranging from acts of God (the paradigmatic example being the Jew quoted in Acts of the Apostles arguing that Christianity didn't need to be suppressed because if it flourished, it must be favored by God, and it would fail if it was disfavored by him) to enabling effective society coordination (particularly attractive for Islam: the horsebacked nomads managed to coordinate under Mohammed rather than feud, and did as well as the Mongols, with conversion then following from personal advantage and to escape dhimmitude) to arguments that it's just random drift (Carrier points out that the best estimates of the sizes of early Christianity as tiny even centuries after Jesus then necessarily imply that the annual growth rate must have been far tinier than commonly assumed).

comment by Ishaan · 2013-09-15T16:58:12.971Z · LW(p) · GW(p)

My uneducated guess is that it is because Christianity, Islam, Sikhism, Buddhism, and Judaism were all backed by governments and military forces during the initial stages of expansion. I don't believe there are any large religions for which this is not true - Hinduism is too old for us to say much about its origins, but there was a time when Buddhism was becoming extremely popular, and power was involved in re-establishing Hinduism.

If I'm right, then the thing that causes small memeplexes to become big memeplexes is the successful conversion of a few powerful and influential people (and that process happens through random drift in the case of religion).

Also, I think Christianity, Islam, and Judaism are the only religions which care about whether or not you believe them. (As in, members think that belief itself has consequences and so they ought to care what others believe). It's harder to leave these religions, with shadows of hell hanging over you. I think that in most other religions, people can sort of vaguely redirect worship from one set of symbols to another without really rejecting old beliefs and accepting new ones in a way that is consistent with "brainwashing" - it's more or less immaterial which religion they are following. I've got relatives who pray to little pictures of Jesus along with other Hindu idols, and I don't think they realize how odd this would seem to a Christian. The notion that deviation from a religious orthodoxy is bad tends to be absent, and I imagine that this makes conversion easier.

comment by jimmy · 2013-09-14T18:32:23.187Z · LW(p) · GW(p)

Typically, a conversion sticks because an organization provides value to its members.

People do get value from religion. The big two seem to be social conformity and fear of death, but there are others. The only atheist that I personally know who converted to Christianity got a wife out of the deal.

comment by name99 · 2019-11-12T04:35:04.828Z · LW(p) · GW(p)

The general answer seems to be that religions (just like "total" political parties) provide value for money, in particular a social environment. Friends, baby sitters, group activities, help when you lose a job or someone dies. I think academics, in particular, tend to be such loners, and to be content with such social support as is provided by the government, that they radically underestimate how hungry people are for this sort of social interaction.

comment by Viliam_Bur · 2013-09-14T22:06:42.007Z · LW(p) · GW(p)

My model of a cult is a mechanism that exploits various flaws in human thinking. For example, peer pressure turned up to eleven: if you leave a cult, you lose all your friends at the same moment. (In real life, one's friends are usually not this coordinated.) The cult keeps you busy with all the cultish stuff, which makes the natural procrastination about important decisions (such as leaving the cult) even stronger. There is the initial "love bombing", which prevents you from estimating how happy you would be if you joined. Etc.

Typically, a conversion sticks because an organization provides value to its members.

Disagree connotationally. (Also I am not sure what "conversion sticks" means precisely. If a person spends 5 or 10 years in a cult and then leaves, do we consider those initial years as a success, because the person did not run away immediately?) Yes, technically, the organization provides something, but if we want to get more specific, it usually provides promises that something very good will happen in the unspecified - but very close - future. It also provides something immediately, for example a social group, but I think those promises are very important for many people. So if we define "value" as promises that sound good but are never fulfilled, then yes, the organization provides the value to its members. But if we define "value" as the thing that was promised, then it does not really provide the value.

comment by NancyLebovitz · 2020-10-28T13:33:57.620Z · LW(p) · GW(p)

Until I read this, I didn't realize there are different possible claims about the dangers of cults. One claim-- the one gwern is debunking-- is that cults are a large-scale danger, and practically anyone can be taken over by a cult.

The other less hyperbolic claim is that cults can seriously screw up people's lives, even if it's a smallish proportion of people. I still think that's true.

comment by JQuinton · 2013-09-17T13:53:01.452Z · LW(p) · GW(p)

This is an excerpt from Valerie Tarico's web series "Christian Belief Through The Lens of Cognitive Science"

In revival meetings or retreats, semi-hypnotic processes draw a potential convert closer to the toggle point. These include repetition of words, repetition of rhythms, evocative music, and Barnum statements (messages that seem personal but apply to almost everyone– like horoscopes). Because of the positive energy created by the group, potential converts become unwitting participants in the influence process, actively seeking to make the group’s ideas fit with their own life history and knowledge. Factors that can strengthen the effect include sleep deprivation or isolation from a person’s normal social environment. An example would be a late night campfire gathering with an inspirational story-teller and altar call at Child Evangelism’s “Camp Good News.”

These powerful social experiences culminate in conversion, a peak experience in which the new converts experience a flood of relief. Until that moment they have been consciously or unconsciously at odds with the group center of gravity. Now, they may feel that their darkest secrets are known and forgiven. They may experience the kind of joy or transcendence normally reserved for mystics. And they are likely to be bathed in love and approval from the surrounding group, which mirrors their experience of God.

Also, military basic training seems to employ some of these methods too:

To do this, however, we need a form of psychological training that is able to forge individuals who can do this. That is why boot camp has evolved to become such a potent tool in today's military machine.

The most important single thing to know about boot camp is that it is 100 percent designed to reprogram children and civilians into warriors. It places within them a sense that they are expected to do important things, far more important things than could be expected from other 18-year-olds. This is all happening during one of the most intensely stressful periods of your life, when you are kept isolated from contact from your family and friends and taught that everything you were before entering the Marines was weak and lacking any real value until you too are a Marine. Cults are made this way too. I'm just saying. But in all seriousness, the psychological transformation of boot camp is a very intense and intentional effort by the Marine Corps to make warriors able to fight and kill out of kids who have just barely left high school. From the point that you graduate boot camp, you will be different and have parts of the Marine Corps culture as part of your psyche.

[...]

Now we move on to something else very important and why I say that it is "psychological" retraining. You go through the next few days running from place to place, doing this, that, this, that and you won't even realize ... you haven't slept in three days. Yeah, you will go about three days without sleep upon arrival. The whole time you are completely exhausted while running on adrenaline and hearing over and over, that you are inferior. Inferior to real Marines, which you aren't yet. You aren't thinking about it, but it is sinking in. You are completely tired and these things build up. Without realizing it, you start to believe that that which is being told to you is true, that there is a weakness in you and that you are less than perfect. In your current state, you believe them and that you must change to be good enough.

(Caveat: I've been through bootcamp)

I'm not sure you could call this brainwashing, though. Not any more than you can call singing and dancing in synchrony brainwashing or doing extreme rituals. Like someone else said, taboo the word "brainwashing"; the word itself has a bunch of negative connotations. Brainwashing in the popular sense also assumes a sort of permanence, which is probably a strawman of what's actually going on.

comment by lc · 2023-11-22T02:55:57.003Z · LW(p) · GW(p)
Nuremberg rallies - Wikipedia
One officer says to the other officer: "Mind control doesn't exist. You should check out the psychology research."
comment by shminux · 2013-09-13T22:04:15.308Z · LW(p) · GW(p)

Very interesting and surprising. A priori I would have expected the most successful NRMs to be at least 10-20% effective in one-year new member retention. I wonder how non-religious non-mainstream organizations that demand some amount of sacrifice from their members measure up? E.g. what are the retention rates in online forums, gaming communities, fitness centers, etc...?

comment by buybuydandavis · 2013-09-13T21:22:56.950Z · LW(p) · GW(p)

Cultist, n. One who is obstinately and zealously attached to an opinion that you do not entertain.

(That's actually "bigot" in the Devil's Dictionary, but cultist is a better fit to me.)

Replies from: wedrifid
comment by wedrifid · 2013-09-14T01:55:59.863Z · LW(p) · GW(p)

In the translation from 'bigot' to 'cultist' we could perhaps add "or group you do not approve of".

comment by gwern · 2014-09-29T17:58:17.150Z · LW(p) · GW(p)

Has the rehabilitation of 'cults' begun? "The Cult Deficit", Ross Douthat:

LIKE most children of the Reagan era, I grew up with a steady diet of media warnings about the perils of religious cults — the gurus who lurked in wait for the unwary and confused, offering absolute certainty with the aftertaste of poisoned Kool-Aid. From the 1970s through the 1990s, from Jonestown to Heaven’s Gate, frightening fringe groups and their charismatic leaders seemed like an essential element of the American religious landscape. Yet we don’t hear nearly as much about them anymore, and it isn’t just that the media have moved on. Some strange experiments have aged into respectability, some sinister ones still flourish, but over all the cult phenomenon feels increasingly antique, like lava lamps and bell bottoms. Spiritual gurus still flourish in our era, of course, but they are generally comforting, vapid, safe — a Joel Osteen rather than a Jim Jones, a Deepak Chopra rather than a David Koresh.

...The decline of cults, while good news for anxious parents of potential devotees, might actually be a worrying sign for Western culture, an indicator not only of religious stagnation but of declining creativity writ large. The first writer is Philip Jenkins, a prolific religious historian, who argues that the decline in “the number and scale of controversial fringe sects” is both “genuine and epochal,” and something that should worry more mainstream religious believers rather than comfort them. A wild fringe, he suggests, is often a sign of a healthy, vital center, and a religious culture that lacks for charismatic weirdos may lack “a solid core of spiritual activism and inquiry” as well. The second writer is Peter Thiel, the PayPal co-founder, venture capitalist and controversialist, who includes an interesting aside about the decline of cults in his new book, Zero to One

...From the Franciscans to the Jesuits, groups that looked cultlike to their critics have repeatedly revitalized the Catholic Church, and a similar story can be told about the role of charismatic visionaries in the American experience. (The enduring influence of one of the 19th century’s most despised and feared religious movements, for instance, is the reason the state of Utah now leads the United States on many social indicators.)...When “people were more open to the idea that not all knowledge was widely known,” Thiel writes, there was more interest in groups that claimed access to some secret knowledge, or offered some revolutionary vision. But today, many fewer Americans “take unorthodox ideas seriously,” and while this has clear upsides — “fewer crazy cults” — it may also be a sign that “we have given up our sense of wonder at secrets left to be discovered.”

comment by BrotherNihil · 2013-09-15T00:57:16.152Z · LW(p) · GW(p)

My observation about cults, from personal experience leading them, is that they are a totally normal mode of human operation. People are always looking for strong leaders with vision, passion and charisma who can organize them for a larger purpose. What distinguishes a cult from a non-cult is that they are outside the norms of the mainstream society (as established by the dominant cults -- i.e. "the culture"). "Cult", "brainwashing", "deprogramming", etc. are terms of propaganda used by the dominant culture to combat competing memeplexes.

I think of cults as testbeds for new civilizations and new ways of life. In times of change, when the old ways are failing and the civilization is falling, cults may be well-positioned to expand and become the new normal. I suppose this is the memetic equivalent of marginal species who exploit mass extinctions to become genetically dominant -- cults provide memetic diversity. This is apparently what was going on in the declining years of Rome, and I see indications that something similar is happening today.

Replies from: Dahlen, Viliam_Bur
comment by Dahlen · 2013-09-15T18:49:09.659Z · LW(p) · GW(p)

My observation about cults, from personal experience leading them

* raises eyebrow *

Replies from: niceguyanon
comment by niceguyanon · 2013-09-16T20:57:54.024Z · LW(p) · GW(p)

From BrotherNihil's website:

Lately I've been thinking a lot about how one could go about becoming an online Mohammed or Genghis Khan – a great leader who sends forth an army of trolls to conquer web sites for the Religion and the Empire. I don’t think it has been tried, but think it may be possible.

He wasn't kidding about the personal experience.

I say this because I find it quite easy to go to a web site and to begin to control the debate, stir up dissent, refute ideologies, recruit people, or otherwise manipulate the site as I see fit.

Heh... good luck with that here on LW.

Depending on the level of moderation, this may have to be done subtly, but there is always a way to counter whatever propaganda is being spread at a given site and to inject some of your own counter-propaganda. If one is clever and persistent, one should in this way be able to alter, destroy or co-opt any site according to one's agenda.

I take the crackpottery of his site as evidence to not take much of what he says seriously.

Replies from: Flaglandbase
comment by Flaglandbase · 2022-03-12T00:08:14.014Z · LW(p) · GW(p)

J.K. Rowling could probably manipulate LessWrong as she sees fit by buying the site, shadowbanning all commenters, and putting up new comments using their names (but preventing the real users from seeing these), where they would slowly become convinced witchcraft is real.

comment by Viliam_Bur · 2013-09-15T18:40:53.977Z · LW(p) · GW(p)

"Cult", "brainwashing", "deprogramming", etc. are terms of propaganda used by the dominant culture to combat competing memeplexes.

There is something like manipulation. To make this a discussion about anticipated experience, here is an experiment proposal:

Kidnap a few new members from different religious organizations. (It's just an imaginary experiment.) Keep them for one week isolated from their religious groups: no personal contact, no phone, no books. If they start to do some rituals they were told to do, for example repeat a mantra or sing a song, prevent them from doing so. Otherwise, don't do them any harm, and keep them in a nice environment. -- When the week is over, just let them go. Observe how many of them return to the original group. Compare with a control group of randomly selected people you didn't kidnap; how many of them remained in the group after the week. Are there statistically significant differences for different religious groups?

My prediction is that there would be observable differences for different religious groups. I believe there is some pressure involved in the process of recruitment in some religious (or not just religious) groups; some algorithm which increases the chances of membership when done properly, and fails when interrupted. Perhaps "brainwashing" is too strong a word, but it is a kind of manipulation. It consists of pushing the person toward more expressions of commitment, without giving them time to reflect on whether they really want it (whether it is okay with their other values).

Replies from: gwern
comment by gwern · 2013-09-15T18:49:15.750Z · LW(p) · GW(p)

Kidnap a few new members from different religious organizations. (It's just an imaginary experiment.)

This is pretty similar to what the deprogrammers did. Their success rates were not very high.

Replies from: None
comment by [deleted] · 2013-09-15T19:30:50.899Z · LW(p) · GW(p)

People like to resist coercion. Reactions to being kidnapped in order to be forced to abandon the cult could be different than reactions to being kidnapped and held for a week by a mad psychologist with a mysterious agenda. Though for the agenda to be mysterious, the idea of preventing them from engaging in rituals would have to be abandoned.

comment by Brillyant · 2013-09-16T04:10:13.276Z · LW(p) · GW(p)

Taboo "brainwashing".

What does Christianity, for instance, succeed in doing, if not brainwashing?

It seems to me that its (sincere) adherents have been persuaded to believe Christianity is the most rational choice. They've been convinced it is the best wager available.

Is it? Is Christianity (or any religion) the best wager? Is it rational?

If not, then what can we say about the mechanism(s) used to get humans to be wholly convinced otherwise? What shall we name it? How does it work?

And how is this yet-nameless, magical process different from brainwashing?

Replies from: Viliam_Bur
comment by Viliam_Bur · 2013-09-16T09:20:08.795Z · LW(p) · GW(p)

Religions succeed in making people believe in them. But how specifically? I propose three mechanisms:

First, by providing them some advantages. There may be advantages of the belief (belief in afterlife alleviates fear of death), advantages of explicitly commanded behavior (forbidding theft reduces costs of protecting property), other advantages (meeting every Sunday in church helps one meet their neighbors).

Second, by blackmailing them. If you leave the religion, you will be forever tortured in the hell; and in some situations your former friends will murder you.

Third, by modifying their thoughts or behavior into ones that make leaving less likely. For example teaching people that things can be "true" even if they are completely invisible and statistically undetectable (which makes them less likely to realize that the beliefs are wrong), by making them too busy to make any long-term thinking (such as whether to leave the religion), by removing information sources or friends that could provide information or encouragement against the religion.

If this reflects the reality well enough, I would suggest that the word "brainwashing" means using mainly the third and second kind of mechanisms (as opposed to mostly the first one, which feels legitimate). One religious group can give people friends and pleasant social activities, so the members are happy to be there. Other religious group can make them busy 16 hours a day (praying, meditating, converting new people, etc.) and destroy all out-group contacts, so the members' agency is crippled, and they stay even if they are unhappy.

comment by gwern · 2013-11-14T18:56:31.858Z · LW(p) · GW(p)

In Jin, Moon and his wife’s fourth child, seemed suited for the task. She had a modern American upbringing and a master’s degree from Harvard. In 2009, she took over the Unification Church of America and introduced a bold modernization program. Her aim, she said, was to transform the church into one that people—especially young people—were “dying to join.” She renamed the church Lovin’ Life Ministries, shelved the old hymn books, and launched a rock band, an offshoot of which played New York clubs under the moniker Sonic Cult. She also discarded the old Korean-inspired traditions: bows and chanting gave way to “Guitar Hero” parties, open mics, concerts, and ping-pong tournaments. What’s more, In Jin broke some long-standing taboos. Rather than adhering to the church line on arranged marriage, for example, she encouraged young people to play a role in choosing their own spouses. Her reforms were met with heated resistance. Across the country, Moon’s disciples took to the Internet to denounce In Jin’s “bling-bling” style and her “ridiculous accent.” One online critic dubbed her ministry the “mushroom church,” because “all you do is sit passively in the dark and are fed bovine excrement.” Within two years, nationwide monthly attendance plunged from roughly 26,000 to less than 7,500, according to internal church documents.

http://www.newrepublic.com/article/115512/unification-church-profile-fall-house-moon

In other words, some popularizing reforms which reduced apparent coercion and cultishness cut membership by 75% - more strikingly, despite being one of the most famous, notorious, politically influential 'cults', they were down to just 25k total in the USA in 2009.

Replies from: gjm
comment by gjm · 2017-01-10T16:04:20.305Z · LW(p) · GW(p)

some popularizing reforms which reduced apparent coercion and cultishness cut membership by 75%

This seems like evidence against (a perhaps overstrong version of) the thesis of the OP, namely that cult "techniques" are ineffective. But note that

  • it's perfectly consistent with them not being scarily effective; and
  • it's also possible that these changes made no difference to (or even increased) the Moonies' ability to acquire new members and keep them in the short term, and that it cut their membership because longstanding members who were used to the old way of doing things hated the reforms.
comment by private_messaging · 2013-09-14T15:05:21.294Z · LW(p) · GW(p)

It always seemed obvious to me that cults have rather low conversion rates.

Cults do not optimize for having many members. They optimize for the dedication of the members. This may be because the typical cult leader would rather have 10 people believe that he is the saviour and the messenger of God, than have 1000 people believe that he's merely a good guy.

(I tend to delineate cults/non-cults on the basis of how they resolve this trade-off between extremism and popularity)

Replies from: gwern
comment by gwern · 2013-09-14T15:34:41.946Z · LW(p) · GW(p)

Cults do not optimize for having many members. They optimize for the dedication of the members. This may be because the typical cult leader would rather have 10 people believe that he is the saviour and the messenger of God, than have 1000 people believe that he's merely a good guy.

No one in the literature suggests this, and cults (just like mainstream religions such as Mormonism) invest enormous efforts into proselytization, rather than strenuous filtering of existing converts. The efforts just don't succeed, and like the Red Queen, minority religions need to run as fast as they can just to stay in place.

Replies from: private_messaging, private_messaging
comment by private_messaging · 2013-09-14T16:44:19.163Z · LW(p) · GW(p)

The low rate of retention is extreme filtering. The cults try to get members to sever ties with the family and friends, for example - and this is a filter, most people get creeped out and a few go through with it. edit: and of course, with such extreme filtering, one needs a lot of proselytism to draw just a hundred very dedicated supporters.

Replies from: gwern, ChristianKl
comment by gwern · 2013-09-14T18:35:55.537Z · LW(p) · GW(p)

The low rate of retention is extreme filtering.

You are arguing by definition here; please consider what could falsify your mental model of cults. If my local gym discovers only 1% of the people joining after New Years will stick around for more than a year, does that necessarily imply that the gym is ruled by a charismatic leader driving people away so as to maximize the proportion of unthinkingly loyal subordinates?

Low rate of retention is simply low rate of retention. This can be for a great many reasons, such as persecution, more attractive rival organizations, members solving their problems and leaving, or (way down the list) extreme filtering for loyalty which drives away otherwise acceptable members. How often do you see a cult leader going 'well, sure, we could have thousands more members if we wanted (people are pounding down the doors to convert), and majorly increase our donations and financial holdings, but gosh, we wouldn't want to sell out like that!'

Of course, like any organization, there's concerns about freeriding and wasting club goods and it'll seek to strike a balance between inclusiveness and parasite load; but a cult which has 'successfully' shed all but a few fanatics is a cult which is about to become history.

The cults try to get members to sever ties with the family and friends, for example

Recruiting through family and friends is a major strategy of cults - indeed, perhaps the only strategy which does not have abysmally low success rates.

Replies from: private_messaging
comment by private_messaging · 2013-09-14T18:59:26.988Z · LW(p) · GW(p)

Low rate of retention is a product of many reasons simultaneously, including the extreme weird stuff creeping people out. If your local gym is creepy, it will have a lower retention rate than the same gym that is not creepy.

My mental model of failed retention includes the general low retention rate, in combination with the weird things that cult does creeping people out, on top of that.

How often do you see a cult leader going 'well, sure, we could have thousands more members if we wanted (people are pounding down the doors to convert), and majorly increase our donations and financial holdings, but gosh, we wouldn't want to sell out like that!'

I rarely see people reflect on their motives or goal structure. You often see a cult leader abusing a cultist, which leads insufficiently dedicated cultists to leave. Such actions sacrifice quantity for "quality".

Recruiting through family and friends is a major strategy of cults - indeed, perhaps the only strategy which does not have abysmally low success rates.

Yes, and a lot of the time that fails, and the family members start actively denouncing the cult, and the member has to choose between the family and friends, and the cult, at which point, well, few choose the cult.

Replies from: gwern
comment by gwern · 2013-09-14T22:31:36.495Z · LW(p) · GW(p)

Low rate of retention is a product of many reasons simultaneously, including the extreme weird stuff creeping people out.

As pointed out in the OP by one author, the cults in question have in many ways been assimilated by the mainstream and so are far less 'weird' than ever before. Has that helped their retention rates? Environmentalism and meditation are completely mainstream now, have the Hare Krishnas staged a comeback?

If your local gym is creepy, it will have a lower retention rate than the same gym that is not creepy.

The counterfactual is not available or producible, and so this is meaningless to point out. If the Hare Krishnas did not hold 'creepy' beliefs, in what sense is this counterfactual organization similar to the Hare Krishnas? If Transcendental Meditators did not do as weird a thing as meditate, how are they Transcendental Meditators? Defining away all the unique characteristics does not add any insight.

You often see a cult leader abusing a cultist, which leads insufficiently dedicated cultists to leave.

"You often see a boss abusing a subordinate, which leads insufficiently dedicated employees to leave. This is because bosses wish to sacrifice quantity and being able to handle work for 'quality' of subordinates."

No, there is nothing unique about cults in this respect. Monkeys gonna monkey. And for the exact same reason businesses do not casually seek to alienate 99% of their employees in order to retain a fanatical 1%, you don't see cults systematically organization-wide try to alienate everyone. You see a few people in close proximity to elites being abused. Just like countless other organizations.

the member has to choose between the family and friends, and the cult, at which point, well, few choose the cult.

Which explains the success of deprogrammers, amirite?

Replies from: Jiro, private_messaging
comment by Jiro · 2013-09-15T17:05:34.163Z · LW(p) · GW(p)

Environmentalism and meditation are completely mainstream now, have the Hare Krishnas staged a comeback?

I would suggest that if beliefs believed by cults become mainstream, that certainly decreases one barrier to such a cult's expansion, but because there are additional factors (such as creepiness) that alone is not enough to lead the cult to expand much. It may be that people's resistance to joining a group drastically increases if the group fails any one of several criteria. Just decrementing the number of criteria that the group fails isn't going to be enough, if even one such criterion is left.

"You often see a boss abusing a subordinate, which leads insufficiently dedicated employees to leave. This is because bosses wish to sacrifice quantity and being able to handle work for 'quality' of subordinates."

The level of abuse done by bosses and cult leaders is different, so although the statement is literally true for both bosses and cult leaders, it really doesn't imply that the two situations are similar.

Replies from: gwern
comment by gwern · 2013-09-15T18:45:06.861Z · LW(p) · GW(p)

It may be that people's resistance to joining a group drastically increases if the group fails any one of several criteria.

Maybe, but I don't know how we'd know the difference.

The level of abuse done by bosses and cult leaders is different, so although the statement is literally true for both bosses and cult leaders, it really doesn't imply that the two situations are similar.

Is it really? Remember how many thousands of NRMs there have been over the decades, and how people tend to repeatedly discuss a few salient examples like Scientology. Do regular bosses really compare that favorably with religious figures? Aside from the Catholic Church scandal (with its counterparts among other closemouthed groups like Jewish and Amish communities), we see plenty of sexual scandals in other places like the military (the Tailhook scandal as the classic example, but there's plenty of recent statistics on sexual assault in the military, often enabled by the hierarchy).

comment by private_messaging · 2013-09-14T23:25:17.478Z · LW(p) · GW(p)

I don't see how environmentalism or for that matter meditation itself is creepy.

What's creepy about Hare Krishnas is the zoned out sleep deprived look on the faces (edit: I am speaking of the local ones, from experience), and the whole obsession with the writings of the leader thing, and weirdly specific rituals. Now that environmentalism and meditation are fairly mainstream, you don't have to put up with the creepy stuff if you want to be around people who share your interests in environmentalism and meditation. You have less creepy alternatives. You can go to a local Yoga class, that manages to have same number of people attending as the local Krishna hangout, despite not trying nearly as hard to find new recruits. You can join a normal environmentalist group.

No, there is nothing unique about cults in this respect. Monkeys gonna monkey. And for the exact same reason businesses do not casually seek to alienate 99% of their employees in order to retain a fanatical 1%, you don't see cults systematically organization-wide try to alienate everyone. You see a few people in close proximity to elites being abused. Just like countless other organizations.

The difference is, of course, in extent. For example, putting up a portrait of the founder at every workplace (or perhaps in a handbook, or the like) would be something that a cult leader would do in a cult, but what a corporation would seldom ever do because doing so would be counter-productive.

edit: actually. What do you think makes joining a cult worse than joining a club, getting a job, and so on? Now, what ever that is, it makes it harder to get new recruits, and requires more dedication.

Replies from: gwern
comment by gwern · 2013-09-15T18:35:22.615Z · LW(p) · GW(p)

I don't see how environmentalism or for that matter meditation itself is creepy.

Which goes to show how far into the zeitgeist they've penetrated. Go back to the 1960s when the cult panic and popular image of cults was being set, and things were quite different. One of the papers discusses a major lawsuit accusing the Hare Krishnas of 'brainwashing' a teen girl when she ran away from home and stayed with some Krishnas; the precipitating event was her parents getting angry about her meditating in front of a little shrine, and ripping it out and burning it (and then chaining her to the toilet for a while). To people back then, 'tune in, turn on, drop out' sounds less like a life choice than a threat...

What's creepy about Hare Krishnas is the zoned out sleep deprived look on the faces (edit: I am speaking of the local ones, from experience)

Well, I can hardly argue against your anecdotal experiences.

the whole obsession with the writings of the leader thing,

Supreme Court - jurists or cultists? Film at 11. We report, you decide.

and weirdly specific rituals.

I don't even know what 'weirdly specific' would mean. Rituals are generally followed in precise detail, right down to the exact repetitive wording and special garments like Mormon underpants; that's pretty much what distinguishes rituals from normal activities. Accepting Eucharist at mass? Ritual. Filling out a form at the DMV? Not ritual.

You can go to a local Yoga class, that manages to have same number of people attending as the local Krishna hangout, despite not trying nearly as hard to find new recruits.

Hmm, where was one to find yoga back then... Ah yes, also in cults. Ashrams in particular did a lot of yoga. Interesting that you no longer have to go to an ashram or fly to India if you want to do yoga. It's almost like... these cult activities have been somehow normalized or assimilated into the mainstream...

You can join a normal environmentalist group.

And where did these environmentalist groups come from?

For example, putting up a portrait of the founder at every workplace (or perhaps in a handbook, or the like) would be something that a cult leader would do in a cult, but what a corporation would seldom ever do because doing so would be counter-productive.

Really? That seems incredibly common. Aside from the obvious examples of many (all?) government offices like post offices including portraits of their supreme leader - I mean, President - you can also go into places like Walmart and see the manager's portrait up on the wall.

What do you think makes joining a cult worse than joining a club, getting a job, and so on?

Personally? I think it's mostly competition from the bigger cults. Just like it's hard to start up a business or nonprofit.

Replies from: Luke_A_Somers, private_messaging
comment by Luke_A_Somers · 2013-09-16T15:39:30.501Z · LW(p) · GW(p)

I wasn't around in the 60s and wasn't aware for any of the 70s, but... Environmentalism seems qualitatively different from everything else here. Is there some baggage to this beyond, say, conservation, or assigning plants and animals some moral weight, that is intended here?

Something may have seemed weirder in the past because it was weirder back then.

I suspect few modern Christians would sign up for AD 200 Christianity.

Replies from: gwern
comment by gwern · 2013-09-16T16:33:59.850Z · LW(p) · GW(p)

Environmentalism seems qualitatively different from everything else here. Is there some baggage to this beyond, say, conservation, or assigning plants and animals some moral weight, that is intended here?

Not really, aside from the standard observation that you can just as easily play the 'find cult markers' game with environmental groups like Greenpeace or ELF. Cleansing rituals like recycling, intense devotion to charismatic leaders, studies of founding texts like Silent Spring, self-abnegating life choices, donating funds to the movement, sacralization of unusual objects like owls or bugs, food taboos ('GMOs'), and so on and so forth.

comment by private_messaging · 2013-09-15T18:45:08.989Z · LW(p) · GW(p)

What do you think makes joining a cult worse than joining a club, getting a job, and so on?

Personally? I think it's mostly competition from the bigger cults. Just like it's hard to start up a business or nonprofit.

That doesn't even make sense as an answer. Rest likewise doesn't seem in any way contradictory to the point I am making, but is posed as such.

Replies from: gwern
comment by gwern · 2013-09-15T18:47:41.069Z · LW(p) · GW(p)

That doesn't even make sense as an answer.

Of course it makes sense. As I've already claimed, cults are not engaged in some sort of predatory 'brainwashing' where they exploit cognitive flaws to just moneypump people with their ultra-advanced psychological techniques: they offer value in return for value received, just like businesses need to offer value to their customers, and nonprofits need to offer some sort of value to their funders. And these cults have plenty of established competition, so it makes sense that they'd usually fail. Just like businesses and nonprofits have huge mortality rates.

The rest likewise doesn't seem in any way contradictory to the point I am making, but it is posed as such.

I've given counter-examples and criticized your claims. Seems contradictory to me.

Replies from: private_messaging
comment by private_messaging · 2013-09-15T19:04:37.821Z · LW(p) · GW(p)

Of course it makes sense.

The question was, "What do you think makes joining a cult worse than joining a club, getting a job, and so on?". How is competition from other cults impacting the decision to join a cult - any cult?

As I've already claimed, cults are not engaged in some sort of predatory 'brainwashing' where they exploit cognitive flaws to just moneypump people with their ultra-advanced psychological techniques: they offer value in return for value received

Well, I know of one cult that provides value in the form of the nice fuzzy feeling of being able - through very little effort - to see various things that, say, top physicists cannot see. Except this feeling is attained entirely through self-deception, unbeknownst to the individuals, and arguing that it is providing value is akin to arguing that a scam which sells fake gold on the cheap is providing value.

(Then there's of course Jonestown, and so on and so forth.)

Replies from: gwern
comment by gwern · 2013-09-15T19:14:03.370Z · LW(p) · GW(p)

How is competition from other cults impacting the decision to join a cult?

Exactly as I said, pressure from other cults: direct retaliation (like the legal system endorsing your kidnapping), opportunity costs, lack of subsidies, regulatory capture being used against you, the risk of joining a small new organization... Many of the reasons that apply to not joining a startup and instead working at Microsoft can be tweaked to apply to small cults vs big cults.

Well, I know of one cult that provides value in the form of the nice fuzzy feeling of being able - through very little effort - to see various things that, say, top physicists cannot see. Except this feeling is attained entirely through self-deception, unbeknownst to the individuals, and arguing that it is providing value is akin to arguing that a scam which sells fake gold on the cheap is providing value.

You know what's even more awesome than self-deception? Sliming people you don't like as cults, when your ideas about what a cult is aren't even right in the first place. Sweet delicious meta-contrarianism.

True, it's not as good a racket as Singer getting paid tons of money to testify about how awful cults are and how powerful their deceptions are - but it's a lot less work and more convenient.

Replies from: private_messaging
comment by private_messaging · 2013-09-15T19:24:26.912Z · LW(p) · GW(p)

Exactly as I said, pressure from other cults: direct retaliation (like the legal system endorsing your kidnapping), opportunity costs, lack of subsidies, regulatory capture being used against you, the risk of joining a small new organization... Many of the reasons that apply to not joining a startup and instead working at Microsoft can be tweaked to apply to small cults vs big cults.

I said, joining a cult. I didn't say, joining a small cult, I didn't say, joining a big cult, I said, joining a cult.

You know what's even more awesome than self-deception? Sliming people you don't like as cults, when your ideas about what a cult is aren't even right in the first place. Sweet delicious meta-contrarianism.

Well, a scam then, if you don't want me to call it a cult. It is my honest opinion that the value arises through self-deception, which goes against the intent of the individual, and is of lesser value than what the individual is expecting to get.

Replies from: gwern
comment by gwern · 2013-09-15T19:42:04.187Z · LW(p) · GW(p)

I said, joining a cult. I didn't say, joining a small cult, I didn't say, joining a big cult, I said, joining a cult.

I'm sorry, I didn't realize I was supposed to interpret that as meaninglessly general as possible, rather than, you know, be about the topic of my post or the topic of the previous comments.

Why do all organizations and religions in particular exist? That's a tough question which I'm afraid I have no quick answer to, but the right answer looks like 'all sorts of reasons'.

Replies from: private_messaging
comment by private_messaging · 2013-09-16T12:04:50.261Z · LW(p) · GW(p)

Your barrage of non-sequiturs posed as "counterarguments" is incredibly annoying. Your post clearly doesn't deal specifically with small cults, and neither does anything else.

Replies from: gwern
comment by gwern · 2013-09-16T16:34:52.607Z · LW(p) · GW(p)

Your barrage of non-sequiturs posed as "counterarguments" is incredibly annoying.

Failure to engage noted.

Your post clearly doesn't deal specifically with small cults, and neither does anything else.

You're right, it's not like all the cites deal with small organizations or anything like that.

Replies from: private_messaging
comment by private_messaging · 2013-09-17T00:31:43.402Z · LW(p) · GW(p)

You're right, it's not like all the cites deal with small organizations or anything like that.

Krishnas, scientology, moonies, etc. how are those examples of small cults? Or do you count every major religion as a cult?

Replies from: gwern
comment by gwern · 2013-09-17T16:21:57.358Z · LW(p) · GW(p)

Krishnas, scientology, moonies, etc. how are those examples of small cults?

/sigh

You know, I suspected that you hadn't actually read what I posted and jumped straight to the comment section to vomit your preconceptions out, but that assertion pretty much confirms it. Here, from the quotes:

These groups are actually very small in numbers (the Hare Krishna and the Unification Church each have no more than two to three thousand members nationwide), which puts the lie to brainwashing claims. If “brainwashing” practiced by new religions is so powerful, why are the groups experiencing so much voluntary attrition, and why are they so small?

Two of the most famous cults together have like a third of the people the university I went to did? That sounds pretty darn small to me!

Or do you count every major religion as a cult?

I don't think there's any meaningful difference aside from things like the size or social acceptability, as I think I've been pretty clear all along.

comment by ChristianKl · 2013-09-14T20:26:46.445Z · LW(p) · GW(p)

The cults try to get members to sever ties with the family and friends, for example - and this is a filter, most people get creeped out and a few go through with it.

I'm not sure whether that's true. You have people on LessWrong talking about cutting family ties with nonrational family members and nobody gets creeped out.

I don't think I have ever witnessed people getting creeped out by such discussions in the self-help area, and I think I have frequently heard people encouraging others to cut ties with someone who "holds them back".

Replies from: yli, Costanza, wedrifid
comment by yli · 2013-09-15T01:42:40.729Z · LW(p) · GW(p)

Really? Links? A lot of stuff here is a bit too culty for my tastes, or just embarrassing, but "cutting family ties with nonrational family members"?? I haven't been following LW closely for a while now so I may have missed it, but that doesn't sound accurate.

Replies from: Douglas_Knight
comment by Douglas_Knight · 2013-09-15T03:38:52.072Z · LW(p) · GW(p)

Here's an example.

Replies from: Mestroyer, yli
comment by Mestroyer · 2013-09-15T14:56:56.723Z · LW(p) · GW(p)

diegocaleiro didn't say they were just irrational:

(1) Stupid (2) Religious (3) Non-rationalists (4) Absolutely clueless about reality (5) Pushy about inserting their ideas/ideals/Weltanschauung/motifs into you?

I strongly suspect that this isn't a case of "My family members don't believe as I do, therefore fuck those guys." but rather "These family members know that I am nonreligious and aggressively proselytize because of it." This probably isn't even about rationality or LessWrong, but rather about atheism.

Note also that it is diegocaleiro who initiated the conversation, and note the level of enthusiasm the idea received from other posters (only ChristianKl's and Benito's responses seem wholly in favor; Villiam_Bur's and drethelin's responses are against; shminux's and Ben_LandauTaylor's responses are neutral).

Replies from: Eugine_Nier
comment by Eugine_Nier · 2013-09-15T15:37:49.496Z · LW(p) · GW(p)

"These family members know that I am nonreligious and aggressively proselytize because of it."

Outside view: These family members know that [diegocaleiro joined a group with weird non-mainstream religious beliefs] and [are trying to deconvert him].

comment by yli · 2013-09-15T04:01:21.368Z · LW(p) · GW(p)

Thanks for the link. I don't really see creepy cult isolation in that discussion, and I think most people wouldn't, but that's just my intuitive judgment.

Replies from: ChristianKl
comment by ChristianKl · 2013-09-15T09:56:07.986Z · LW(p) · GW(p)

That's the point. It doesn't look that way from the inside.

If someone told those family members that the OP cut family ties with them because he made a rational analysis with help from his LessWrong friends, those family members might see it as an example of the evil influence that LessWrong has on people.

comment by Costanza · 2013-09-14T20:48:23.880Z · LW(p) · GW(p)

I'm at least mildly creeped out by occasional cultish behavior on LessWrong. But every cause wants to be a cult.

Eliezer said so, so therefore it is Truth.

comment by wedrifid · 2013-09-15T02:10:17.825Z · LW(p) · GW(p)

I'm not sure whether that's true. You have people on LessWrong talking about cutting family ties with nonrational family members and nobody gets creeped out.

I do not believe you. If it is the case that people talk about cutting family ties with 'nonrational family members' then there will be people creeped out by it.

Note that if the 'nonrational' family members also happen to be emotionally abusive family members this would not match the criteria as I interpret it. (Even then I expect some people to be creeped out by the ties cutting and would expect myself to aggressively oppose such expressions so as to suppress a toxic influence.)

Replies from: Eugine_Nier
comment by Eugine_Nier · 2013-09-15T12:55:11.897Z · LW(p) · GW(p)

Note that if the 'nonrational' family members also happen to be emotionally abusive family members this would not match the criteria as I interpret it.

You do realize that a lot of cults tend to classify normal family reactions, e.g., attempting to get the person out of the cult, as emotional abuse?

Replies from: wedrifid
comment by wedrifid · 2013-09-15T13:33:36.025Z · LW(p) · GW(p)

You do realize that a lot of cults tend to classify normal family reactions, e.g., attempting to get the person out of the cult, as emotional abuse?

I don't care and I'm somewhat outraged at this distortion of reasoning. It is so obviously bad and yet remains common and is all too seldom refuted. Emotional abuse is a sufficiently well defined thing. It is an undesirable thing. Various strategies for dealing with it are possible. In severe cases and in relationships where the gains do not offset the damage then severing ties is an appropriate strategy to consider. This doesn't stop being the case if someone else also misuses the phrase 'emotional abuse'.

Enduring emotional abuse rather than severing ties with the abuser because sometimes cultists sever ties while using that phrase is idiotic. Calling people 'creepy' for advocating sane, mainstream interpersonal strategies is absurd and evil.

Replies from: Kaj_Sotala, Eugine_Nier
comment by Kaj_Sotala · 2013-09-18T05:44:00.090Z · LW(p) · GW(p)

I don't care and I'm somewhat outraged at this distortion of reasoning. It is so obviously bad and yet remains common and is all too seldom refuted.

Sorry, exactly what is it that you're outraged about? Eugine seemed to merely be pointing out that people inside particular social groups might see things differently than people outside them, with the outsiders being creeped out and the insiders not. More specifically, that things we deem okay might come off as creepy to outsiders. That seems correct to me.

Replies from: wedrifid
comment by wedrifid · 2013-09-18T06:44:59.660Z · LW(p) · GW(p)

Sorry, exactly what is it that you're outraged about?

As a general policy:

  • All cases where non-sequitur but technically true claims are made where the actual implied rhetorical meaning is fallacious. Human social instincts are such that most otherwise intelligent humans seem to be particularly vulnerable to this form of persuasion.
  • All arguments or insinuations of the form "Hitler, Osama Bin Laden and/or cultists do [X]. Therefore, if you say that [X] is ok then you are Bad."
  • Additional outrage, disdain or contempt applies when:
    • The non-sequiturs are, through either high social skill or (as in this case) plain luck, well calibrated to persuade the audience despite being bullshit.
    • Actual negative consequences can be expected to result from the epistemic damage perpetrated.
Replies from: Kaj_Sotala, Eugine_Nier
comment by Kaj_Sotala · 2013-09-20T11:38:45.018Z · LW(p) · GW(p)

Thanks, that sounds reasonable. I didn't interpret Eugine's comments as being guilty of any of those, though.

comment by Eugine_Nier · 2013-09-19T07:28:43.353Z · LW(p) · GW(p)

All cases where non-sequitur but technically true claims are made where the actual implied rhetorical meaning is fallacious. Human social instincts are such that most otherwise intelligent humans seem to be particularly vulnerable to this form of persuasion.

In my experience nearly all accusations that someone is being "emotionally abusive" are of this type.

Replies from: wedrifid
comment by wedrifid · 2013-09-19T10:52:03.007Z · LW(p) · GW(p)

In my experience nearly all accusations that someone is being "emotionally abusive" are of this type.

If that is true then you are fortunate to have lived such a sheltered existence. If it is not true (and to some extent even if it is), then I expect being exposed to this kind of denial and accusation of dishonesty to be rather damaging to those who are actual victims of the phenomenon you claim is 'nearly all' fallacious accusation.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2013-09-19T20:31:58.580Z · LW(p) · GW(p)

If that is true then you are fortunate to have lived such a sheltered existence.

I could say the same thing about you if you've never encountered people willing to make false accusations of abuse (frequently on behalf of children) with the force of the law, or at least child services, behind them.

If it is not true (and to some extent even if it is) then I expect being exposed to this kind of denial and accusation of dishonesty to be rather damaging to those who are actual victims of the phenonemon you claim is 'nearly all' fallacious accusation.

This is as good a summary of the "how dare you urge restraint" position as any I've heard.

comment by Eugine_Nier · 2013-09-15T15:29:46.699Z · LW(p) · GW(p)

Emotional abuse is a sufficiently well defined thing. It is an undesirable thing.

So could you provide a definition. The article you linked to begins by saying:

As of 1996, there were "no consensus views about the definition of emotional abuse."

And then proceeds to list three categories that are sufficiently vague to include a lot of legitimate behavior.

Enduring emotional abuse rather than severing ties with the abuser because sometimes cultists sever ties while using that phrase is idiotic.

You don't seem to be getting the concept of "outside view". Think about it this way: as the example of cults shows, humans have a bias that makes them interpret Bob attempting to persuade Alice away from one's meme set as emotional abuse. Consider the possibility that you're also suffering from this bias.

Replies from: wedrifid
comment by wedrifid · 2013-09-16T00:19:56.033Z · LW(p) · GW(p)

So could you provide a definition.

Yes, but I do not believe this to be necessary or appropriate at this time. The sincere reader is invited to simply use their own definition in good faith. The precise details do not matter or, rather, are something that could be discussed elsewhere by interested parties or on a case by case basis. For now I will say this is an example of emotional abuse which would in most situations call for the severing of ties. Other cases are less clear but, again, can be argued about when they crop up.

You don't seem to be getting the concept of "outside view".

Don't be absurd. Conversation over. Be advised that future comments of yours on any of the subjects of emotional abuse, cults, or creepiness will be voted on without reply unless I perceive them to be a danger to others. The reasoning you are using is both non-sequitur and toxic. I don't have the patience for it.

Think about it this way: as the example of cults shows, humans have a bias that makes them interpret Bob attempting to persuade Alice away from one's meme set as emotional abuse. Consider the possibility that you're also suffering from this bias.

I don't care about evangelism. I care about gaslighting, various forms of emotional blackmail, and verbal abuse. Again, the fact that the phrase "emotional abuse" can be misused by someone in a cult does not make refusal to respond to actual emotional abuse appropriate or sane. To whatever extent your 'outside view' cannot account for that, your outside view is broken.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2013-09-16T07:10:38.692Z · LW(p) · GW(p)

For now I will say this is an example of emotional abuse which would in most situations call for the severing of ties.

I agree gaslighting is bad. Ironically, most of the examples that come to mind (and the only example of attempted gaslighting happening to someone I know) involve attempting to plant false memories that someone else was emotionally (and possibly also physically) abusing them.

Don't be absurd. Conversation over. Be advised that future comments of your on any of the subjects of emotional abuse, cults or creepiness will be voted on without reply unless I perceive them to be a danger to others. The reasoning you are using is both non-sequitur and toxic. I don't have the patience for it.

What I suspect is happening is that you perceive evil "emotional abuse" as having occurred and your reaction is "how dare Eugine urge restraint."

I care about gaslighting, various forms of emotional blackmail and verbal abuse. Again, the fact that the phrase "emotional abuse" can be misused by someone in a cult does not make refusal to respond to actual emotional abuse appropriate or sane.

Yes, but is "actual emotional abuse" (to the extent it's an objective concept) occurring? In particular, do you have any evidence that gaslighting (the only specific example you gave) occurred in any of the examples under discussion? Certainly none of the things diego mentioned even suggests gaslighting was occurring.

Replies from: wedrifid
comment by wedrifid · 2013-09-16T08:24:50.558Z · LW(p) · GW(p)

What I suspect is happening is you perceive evil "emotional abuse" as having occured and your reaction is "how dare eugine urge restraint."

This is false. I object to the reasoning used in this conversation for the previously expressed reasons. I consider it disingenuous, with the inevitable caveat that I cannot reliably distinguish between disingenuousness and a sincere inability to think in a manner which I consider coherent. That is all.

For better or worse I viscerally experience more disgust when observing clever use of non-sequitur retorts than I experience at descriptions of the hypothetical abusive behaviours. Bullshit is my enemy. "Emotional abuse" is a mere abstract evil.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2013-09-19T07:31:31.040Z · LW(p) · GW(p)

Your first response to my comment was not to declare it "bullshit" but to declare it "evil". Furthermore, all your reasons boil down to "How dare you invoke the outside view when we all know Evil Things(tm) are happening". And you don't bother to engage with any of the arguments I provided.

Replies from: wedrifid
comment by wedrifid · 2013-09-19T11:23:19.982Z · LW(p) · GW(p)

Your first response to my comment was not to declare it "bullshit" but to declare it "evil".

This accusation seems to be actively relying on the assumption that readers will not check the context to verify accuracy. The first two sentences in the reply in question seem to be quite clearly declaring 'bullshit'. In particular, note the phrases "distortion of reasoning", "so obviously bad and yet remains common" and "all too seldom refuted". I quite frequently reference On Bullshit when describing that pattern of behaviour, but it doesn't seem necessary to explicitly use the word 'bullshit' every single time. In fact, I try to make myself use natural-language descriptions like this rather than saying 'bullshit' every time, because that habit would just get weird.

Furthermore, all your reasons boil down to "How dare you invoke the outside view when we all know Evil Things(tm) are happening".

This is false. Eugine_Nier has presented approximately the same straw man previously and it was false then too. I conclude that he has little interest in making his accusations match reality.

And you don't bother to engage with any of the arguments I provided.

I did engage, and that was a mistake. Like other users have mentioned in the past I now must concur that Eugine_Nier is systematically incapable of engaging in good-faith conversation. I will henceforth refrain from communicating with Eugine_Nier except when I deem it necessary to lend support to another user I perceive to be mistreated (via straw man barrages and the like). Apart from such cases I will limit myself to downvoting as appropriate then ignoring.

comment by private_messaging · 2013-09-15T14:26:05.245Z · LW(p) · GW(p)

Ohh, another good example of filtering: prophecies.

Replies from: gwern
comment by gwern · 2013-09-15T18:20:00.273Z · LW(p) · GW(p)

That would be a much more convincing example about cults in general if it weren't about a failed dead cult. EDIT: I would add that successful cults tend to spend time and energy minimizing and explaining away failed prophecies rather than being happy to spiral into a tiny core of believers & no one else; Jehovah's Witnesses spend little time discussing failed predictions by the Watchtower, and Christian theologians for millennia have been explaining away things like Jesus's prophecy that the apocalypse would come within a generation of him.

BTW, if you are interested in cults, you should really read the book When Prophecy Fails (it's on Libgen, so no excuses!) - you'll find it doesn't match your ideas about cults, but matches the academic literature described in OP, with regard to very low retention rates, minimal efficacy of recruitment, and the cult adding value to retained members' lives.

Replies from: private_messaging, private_messaging
comment by private_messaging · 2013-09-16T12:26:49.254Z · LW(p) · GW(p)

I would add that successful cults tend to spend time and energy minimizing and explaining away failed prophecies rather than being happy to spiral into a tiny core of believers & no one else; Jehovah's Witnesses spend little time discussing failed predictions by the Watchtower, and Christian theologians for millennia have been explaining away things like Jesus's prophecy that the apocalypse would come within a generation of him.

Very few cults are as successful as the Jehovah's Witnesses or the like. Your typical cult is something fairly small: a congregation around one or a few literally insane persons who are indulging in short-sighted self-gratification, such as convincing themselves that god speaks to them, or, at times, around a person initially out for easy money (e.g. Keith Raniere). Said people need a certain number of close admirers. The danger, likewise, is not in the successes but in failure modes that infrequently include mass murder, and much more frequently include e.g. sexual abuse of minors, consequences to members' health, and so on.

Replies from: gwern
comment by gwern · 2013-09-16T16:40:28.051Z · LW(p) · GW(p)

Your typical cult is something fairly small: a congregation around one or a few literally insane persons who are indulging in short-sighted self-gratification, such as convincing themselves that god speaks to them, or, at times, around a person initially out for easy money (e.g. Keith Raniere).

The median cult may be small, but the median cult quickly dies. Why does this matter? If you were to apply your argument that cults are not intended to grow to, say, businesses, wouldn't it look completely ridiculous?

'Very few corporations are as successful as Microsoft. Your typical corporation is something fairly small, a group of one or two literally insanely optimistic entrepreneurs who are indulging in short-sighted egotistic expenditures such as the delusional belief that what the market needs is another Facebook clone, or at times, around a successful marketer out for easy money (e.g. Peter Pham). Said people need a certain number of close employees. Businesses are not intended to be successful or grow and make money, just keep loyal employees for the founder's gratification. The danger, likewise, is not in the successes but in failure modes that infrequently include mass murder like the Bhopal incident, and much more frequently include e.g. sexual abuse of minors, consequences to the customer's health, and so on.'

As I already said, cults die at such high rates that your theory is impossible because it presupposes utterly self-defeating behavior and is inconsistent with the behavior of successful cults.

Replies from: private_messaging, orthonormal
comment by private_messaging · 2013-09-17T00:54:21.811Z · LW(p) · GW(p)

The median cult may be small, but the median cult quickly dies. Why does this matter? If you were to apply your argument that cults are not intended to grow to, say, businesses, wouldn't it look completely ridiculous?

I'd say the vast majority of start-ups are founded by people with some head issues in the direction of narcissism. They may well, in some abstract sense, intend to succeed, but getting from the intent to succeed to actions takes quite a lot of intellect, which they mostly lack. Meanwhile, day-to-day actions are actually based on the desire for self-gratification (avoidance of feedback especially - things that generally make them feel good), with very short-sighted planning. The end result is a massive waste of human potential (of those unfortunate enough to end up in said startups), financial losses (often avoided by the narcissistic founder himself), and so on.

As I already said, cults die at such high rates that your theory is impossible because it presupposes utterly self-defeating behavior

What? How is a theory that presupposes utterly self-defeating behaviour at odds with, you know, defeat?

and is inconsistent with the behavior of successful cults.

I'm speaking of cults in general, which, as you yourself say, generally die. The few highly successful cults are not particularly bad, and succeed precisely by being dramatically different from the unsuccessful ones.

Replies from: gwern
comment by gwern · 2013-09-17T16:21:58.959Z · LW(p) · GW(p)

I'd say the vast majority of start-ups are founded by people with some head issues in the direction of narcissism...Meanwhile, day-to-day actions are actually based on the desire for self-gratification (avoidance of feedback especially - things that generally make them feel good), with very short-sighted planning.

I respect your bullet-biting with regard to equating startups and cults, even if I think your view is as ridiculous as it looks.

How is a theory that presupposes utterly self defeating behaviour at odds with, you know, defeat?

My point was that cult death rates are similar to those of organizations which are not generally believed to be organized to gratify narcissistic leaders' egos but to make money - which would have been a counterexample refuting your argument, except that you then chose to bite that bullet and argue that businesses are exactly like cults in this respect and aren't counterexamples at all. So you're right that that argument no longer works, but you've escaped it only by making completely absurd claims which prove too much.

If you want to argue that businesses are cults and hence the equivalent death rates are consistent with both being about leader gratification, that's consistent. But it's absurd and I don't believe it for a second and I doubt anyone else will either.

Replies from: jasticE
comment by jasticE · 2015-02-06T07:36:13.890Z · LW(p) · GW(p)

From recent personal experience at a startup, I am inclined to believe the view, as it makes said experience make a lot more sense.

comment by orthonormal · 2013-09-21T02:39:25.596Z · LW(p) · GW(p)

Your reductio ad absurdum is something I can quite easily imagine Michael Vassar saying.

comment by private_messaging · 2013-09-15T18:22:58.198Z · LW(p) · GW(p)

That would be a much more convincing example about cults in general if it weren't about a failed dead cult.

I think an example about a failed dead startup is most informative about startups in general.

edit: also, regarding the reading list, what I expect is for my interpretation of it to be quite massively different from yours. I'd be better served by picking a reputable book about cults at random anyway (cherry-picking vs. unfiltered data).

edit2: as for adding value, I'm not sure value-adding cults are nearly of as much impact-weighted interest as cults which end up in a Jonestown. Furthermore, a sunk-cost-fallacy-like failure mode seems massively relevant to retention in cults.

Replies from: gwern
comment by gwern · 2013-09-15T18:52:23.957Z · LW(p) · GW(p)

I think an example about a failed dead startup is most informative about startups in general.

What's that? Surely if a prophecy were a useful filtering mechanism as you say, then dying is a problem. A cult which fails cannot serve anyone's purpose at all...

I'd be better served by picking a reputable book about cults at random, anyway (cherry picking vs unfiltered data).

Fair enough, but shouldn't you then retract your previous claims? I mean, what with it being based on cherry picked evidence and all?

Furthermore, a sunk-cost-fallacy-like failure mode seems massively relevant to retention in cults.

You should probably know, then, that I consider it seriously questionable whether sunk costs affect individuals at all - much less whether they apply to cults - and so I reject the premise of that argument.

comment by James_Miller · 2013-09-13T22:34:33.914Z · LW(p) · GW(p)

I wonder what percentage of adult North Koreans have been successfully brainwashed by their government to the extent that, say, they believe that their country's founding dictator was one of the greatest forces for good the world has ever known. What's your estimate?

[pollid:553]

Replies from: gwern, shminux, army1987, DavidAgain, private_messaging
comment by gwern · 2013-09-14T00:04:21.636Z · LW(p) · GW(p)

In the Korean context, surveys have been done of defectors (for the obvious reasons) to try to gauge the current level of support for the regime. The result is sadly predictable for anyone who's seen Russians nostalgic for Stalin or Chinese wistfully thinking back to Mao: Il-Sung is still venerated by many North Koreans, even if they don't like his son or despise the pig-grandson.

Some survey data is summarized in *The Hidden People of North Korea: Everyday Life in the Hermit Kingdom*, and "An Assessment of the North Korean System's Durability" is an extensive discussion of defector surveys. (Apparently in the 2002 defector survey, 67% of them believed their countrymen venerated Il-Sung as the "greatest mind of humanity". Many interesting bits, like "Few North Koreans seem aware that the United States has been one of North Korea's principal food donors.")

Replies from: gwern, DanArmak
comment by gwern · 2013-09-28T23:49:37.391Z · LW(p) · GW(p)

From a new paper, "Preparing for the Possibility of a North Korean Collapse", Bennett 2013 (RAND):

...Since the end of the Korean War, the North Korean government has indoctrinated its population, only allowing them access to state-generated information. But information on the outside is spreading in North Korea, debunking at least some of the North Korean propaganda, and generating the potential for instability: “There is mounting evidence that Kim Jong Il is losing the propaganda war inside North Korea, with more than half the population now listening to foreign news, grass-roots cynicism undercutting state myths and discontent rising even among elites.”53 Analyzing the results of their survey of North Korean refugees in China and South Korea, Marcus Noland and Stephen Haggard have identified a number of significant shifts in information and resulting North Korean attitudes:

  • The survey found that roughly half of North Koreans have access to foreign news or entertainment, a sharp rise from the 1990s, eroding faith in the regime’s statements that the United States is causing its woes.54
  • “Not only is foreign media becoming more widely available, inhibitions on its consumption are declining as well,” the report said, referring to broadcasts from South Korea, China and the United States. “The availability of alternative sources of information undermines the heroic image of a workers’ paradise and threatens to unleash the information cascade that can be so destabilizing to authoritarian rule.”55
  • A survey of refugees has found that “everyday forms of resistance” in the North are taking root as large swaths of the population believe that pervasive corruption, rising inequity and chronic food shortages are the fault of the government in Pyongyang—and not of the United States, South Korea or other foreign forces. . . .
  • “Evaluations of the regime appear to be getting more negative over time,” the report said. “Although those who departed earlier were more willing to entertain the view that the country’s problems were due to foreigners, respondents who left later were more likely to hold the government accountable.” . . .
  • The survey found that cynicism about the government—and willingness to crack jokes about its failures—was higher among refugees who come from elite backgrounds in the government or military. It also found that distaste for the government was strongest among those deeply involved in the markets.56

...With much more outside information penetrating into the North Korean society, a significant number of citizens likely believe at least parts of that information:

The regime has made desperate and increasingly futile efforts to maintain a stranglehold on information, such as periodic crackdowns by the authorities on mobile phones brought in from China and seizures of widely popular and avidly watched South Korean soap operas recorded on video and DVD.57

Even the North Korean military is not exempt:

An increasing number of North Korean military officers and soldiers are caught watching South Korean films or soap operas in barracks, sources say. A Beijing-based source who visits the North often said Monday, “Several Army officers and soldiers have been caught watching South Korean movies or TV dramas since last year, and the military has been providing extensive indoctrination for all officers and soldiers with a view to preventing the cultural infiltration of imperialism.”58

Corruption in the army has become so widespread that the government authorized the civilian police (the People’s Safety Agency) to investigate cases of corrupt military personnel. Previously, the military police handled such investigations, but the government believes the military police have become corrupted, and can no longer be trusted to find and punish soldiers involved in criminal acts (stealing, or aiding smugglers to get across the border). All this reflects poorly on the National Security Agency (secret police), who are also seen as corrupted.59

...Some early defectors in 1987 said, “[w]hen we lived in the North, we were told that South Korea was a living hell.”5 A defector in 2006 said, “When I came to the South and saw how rich it was, I was very angry at the Pyongyang regime.”6 The influx into the North of information about South Korea has weakened this propaganda line. While it is still repeated on occasion, now North Koreans are told that the South

has lost its true national identity, so its inhabitants are full of admiration toward the spiritual purity of their Northern brethren. The southerners, the propaganda claims, also badly want to purify themselves under the wise guidance of the Dear Leader Kim Jong-il (allegedly a cult figure in both the South and the North).7

Brian Myers, another remarkable specialist on North Korean culture and propaganda (not quite distinguishable areas, actually), recently wrote at length about a change of tune in Pyongyang propaganda: South Korea ceased to be depicted as the living hell, the land of depravation. The new image of the South is that of the country whose population secretly (or even not so secretly) longs to join its Northern brethren in their happiness under the wise care of the Beloved General.8

Apparently, DVDs and other information from the ROK have penetrated so much into North Korea that the argument of ROK impoverishment is not credible with many in the North and undermines overall North Korean propaganda. So an alternative approach is being taken to keep the multidimensional propaganda approach viable, claiming that the ROK is now poor in wise guidance and leadership.

comment by DanArmak · 2013-09-14T08:47:01.579Z · LW(p) · GW(p)

And that's just for defectors, where there must be a selection effect in favour of being against Il-Sung.

Replies from: DavidAgain
comment by DavidAgain · 2013-09-14T14:49:06.405Z · LW(p) · GW(p)

Note that the survey says that they believe that their *countrymen* venerated Il-Sung. Defectors may be likely to dislike Il-Sung themselves, but my (low-certainty) expectation would be that they'd be more likely to see the population at large as slavishly devoted. People who take an unusual stance in a society are quite likely to caricature everyone else's position and increase the contrast with their own. Mind you, they sometimes take the 'silent majority' line of believing everyone secretly agrees with them: I don't know which would be more likely here.

But I'd guess that defectors would both be more likely to think everyone else is zealously loyal AND be more likely to believe that everyone wishes they could overthrow the government. I'd imagine them to be more likely to end up at the extremes, in short.

comment by shminux · 2013-09-13T23:47:57.024Z · LW(p) · GW(p)

Not sure what the purpose of this poll is. Brainwashing from birth with little or no exposure to alternative views is a quite different environment from the one NRMs operate in. How many Americans or Greeks (or pre-war Germans) believe that their country is the greatest? How many Russians believed in Communism in 1950s? The numbers are clearly a lot higher than any cult can hope to achieve.

Replies from: gwern
comment by gwern · 2013-09-14T00:11:38.460Z · LW(p) · GW(p)

In particular, North Korea clamps down heavily on unauthorized information and makes up a lot of stuff. When your data is bad, it's not too surprising if your conclusions are bad.

Even people who are cynical about the regime probably aren't cynical enough. I forget the book I read this in (The Cleanest Race?) but I recall reading one story about a high-level NK official who was aware of the many abuses, but it wasn't until he learned from the Russian archives that the Korean War had actually been started by Kim Il-Sung after Stalin gave his permission (the official NK version is that the bloodthirsty capitalist SK dictator Syngman Rhee invaded NK unprovoked) that he realized just how far down the rabbit hole he had to go.

Replies from: Protagoras, ikrase
comment by Protagoras · 2013-09-14T02:54:20.620Z · LW(p) · GW(p)

Admittedly, from what I recall of Rhee, it's likely that the only reason he didn't invade the North is because he knew how badly he'd lose; it's totally something he would have done if he'd had a better military.

comment by ikrase · 2013-09-14T07:56:22.823Z · LW(p) · GW(p)

Yeah, it's actually enough to make me wonder if just forcing information into the country would trigger a rebellion...

comment by A1987dM (army1987) · 2013-09-14T15:56:34.305Z · LW(p) · GW(p)

No “I'm not going to vote; just show me the results” option?

comment by DavidAgain · 2013-09-14T14:45:10.196Z · LW(p) · GW(p)

I don't think 'brainwashing' is a helpful or accurate term here, in the sense that I think most people mean it (deliberate, intensive, psychological pressure of various kinds). Presumably most North Koreans who believe such a thing do so because lots of different authority sources say so and dissenting voices are blocked out. I'm not sure it's helpful to call this 'brainwashing', unless we're going to say that people in the middle ages were 'brainwashed' to believe in monarchy, or to be racist, or to favour their country over their neighbours etc.

Even outside of repressive regimes, there are probably a whole host of things that most Americans believe that most Brits don't and vice versa, and that's in a case with shared language and culture. I'm not sure 'brainwashing' can be used just because lots of people in one place believe something that hardly anyone from outside does.

Replies from: TheAncientGeek
comment by TheAncientGeek · 2014-05-13T18:51:56.898Z · LW(p) · GW(p)

There are two theories here. One is that brainwashing is a rare and ineffective thing. The other is that acculturation, or whatever you call it, is pervasive, effective, and largely unnoticed, and the reason the NRMs aren't too effective is that the standard societal indoctrination is hard to budge.

comment by private_messaging · 2013-09-14T13:38:40.630Z · LW(p) · GW(p)

I would estimate 66% or so, on the basis that a multitude of experiments found that about 2/3 of people are considerably more susceptible to authority than the rest, but I am not sure to what extent they managed to kill off the 1/3, or to what extent the 1/3's conditional compliance counts towards "successfully brainwashed". edit: ah, you say founding dictator. Well, then it could easily be higher, because it's a much less practical thing to think rebellious thoughts about right now.

Replies from: wedrifid
comment by wedrifid · 2013-09-14T13:47:16.169Z · LW(p) · GW(p)

> about 2/3 of people are more susceptible to authority than the rest

It would seem that one could replace "2/3" with any other proper fraction and that finding would remain true.

Replies from: 4hodmt, private_messaging
comment by 4hodmt · 2013-09-14T16:41:09.143Z · LW(p) · GW(p)

Editing the quote to remove the "considerably" changes the meaning. The original is not a tautology because the "considerably" suggests a visible step in the curve.

Replies from: wedrifid, Mestroyer, ChristianKl
comment by wedrifid · 2013-09-15T00:54:10.815Z · LW(p) · GW(p)

> Editing the quote to remove the "considerably" changes the meaning. The original is not a tautology because the "considerably" suggests a visible step in the curve.

I didn't remove a word. The original was edited to change the meaning.

Replies from: private_messaging, 4hodmt
comment by private_messaging · 2013-09-16T12:01:51.769Z · LW(p) · GW(p)

Yea, you merely interpreted it in a ridiculous way that was not intended, thus requiring an extra word where none would have been needed if the maxim of relevance held at all.

Replies from: wedrifid
comment by wedrifid · 2013-09-16T12:20:05.184Z · LW(p) · GW(p)

> Yea, you merely interpreted it in a ridiculous way that was not intended, thus requiring an extra word where none would have been needed if the maxim of relevance held at all.

Your edited version is far more useful. Thank you.

comment by 4hodmt · 2013-09-15T02:51:17.024Z · LW(p) · GW(p)

My apologies then. It would be useful if LessWrong marked edited posts as edited.

Replies from: Douglas_Knight
comment by Douglas_Knight · 2013-09-15T03:24:40.363Z · LW(p) · GW(p)

It does mark edited comments, by an * after the date. It does not mark edits to top-level posts or edits by admins (even self-edits by admins, which is clearly a bug).

Replies from: 4hodmt
comment by 4hodmt · 2013-09-15T03:45:19.038Z · LW(p) · GW(p)

Thanks, I didn't notice the '*'s.

comment by Mestroyer · 2013-09-14T18:12:48.489Z · LW(p) · GW(p)

private_messaging's post is edited. I bet wedrifid quoted it as it originally was, and private_messaging edited it later to change the meaning. Edit2: (to change my post's meaning, heh) or to clarify the original intended meaning.

Edit: fixed formatting error caused by not escaping the underscore in private_messaging's name.

comment by ChristianKl · 2013-09-14T20:11:24.221Z · LW(p) · GW(p)

If there were a visible step in the curve, that would be interesting. If anyone has a source that makes such a claim, please provide it.

comment by private_messaging · 2013-09-14T14:34:17.474Z · LW(p) · GW(p)

Well, it still seems odd that with different set-ups of e.g. the Milgram experiment, various conformity experiments, and such, around 2/3 is the number rather than some dramatically different fraction (which suggests that in practice the change in susceptibility is greater around that percentile, which is of course what I meant). There really is no data to use to get any sort of specific number for North Korea at all, but if you have to guess you have to name something. I'd be cautious of over-estimating the power of brainwashing over there, especially considering how many people they did have to put through prison camps and such.

Replies from: ChristianKl
comment by ChristianKl · 2013-09-14T20:13:03.408Z · LW(p) · GW(p)

Depending on the specifics used in the Milgram experiment, you get different results. It matters whether the person being tortured is in the same room. Whether or not you use a setting that gives you 2/3 of the people is arbitrary.

comment by Phenoca · 2016-09-06T17:37:06.652Z · LW(p) · GW(p)

I would say demonization and ostracism count as coercion. Religions use sexual-identity shaming, existential fears, 'universal morality', and promises of eternal happiness in an 'afterlife' to fallaciously bring followers under bit and bridle. As soon as a religious authority stoops to the "you're being controlled by evil spirits" argument, it counts as brainwashing. Cult authorities will use this to demonize any and all forms of skepticism, sexual relationships, skipping worship sessions, or interaction with ex-members. Essentially, if you disagree with the head priest, you are going to have some sort of livestock-factory-farm-esque afterlife full of eternal torment! If that doesn't count as "systematic and often forcible pressure", then I don't know what does. Perhaps enforced chastity and demonization of orgasms? Psychological coercion is extremely easy, as humans are extremely manipulable, controlled by emotions, and irrational. Add in a few existential fears, some comforting fallacy, and perhaps some sex appeal, and you've got yourself a recruitment platform for your religion.