Unwitting cult leaders

post by Kaj_Sotala · 2021-02-11T11:10:04.504Z · LW · GW · 9 comments

An insight that I’d kind of already had, but which this interview with Michael Taft (relevant section starts at about 32 minutes) helped crystallize:

We tend to think of a “cult leader” as someone who intentionally sets out to create a cult. But most cult-like things probably don’t form like that. A lot of people feel a strong innate desire to be in a cult.

In the podcast, Taft suggests that it’s rooted in an infant’s need to attach to a caregiver, and to treat them as a fully dependable authority to fix all problems – a desire which doesn’t necessarily ever go fully away. Once someone becomes a teacher of some sort, even if they had absolutely no desire to create a cult, they will regardless attract people who want to be their cultists.

There are people who want to find a fully dependable authority figure to look up to, and are just looking for someone who feels like a good fit for the role. (I should note that I have definitely not been immune to feeling this yearning myself.) To avoid having cultists, “not intending to create a cult” isn’t enough [LW · GW]; you have to actively fight against people’s tendency to idealize you, by doing things that force them to confront the fact that you are actually just a human.

I’m reminded of something I recall Eliezer Yudkowsky once saying: “if you tell your doting followers not to form a cult, they will go around saying ‘We Must Not Form A Cult, Great Leader Mundo Said So’.”

Once people do start pulling you towards a cult leader role, it’s going to feel very appealing. What it feels like from the inside is “all of these people like me and say that I’ve done a lot of good for them, so clearly I must be doing things right, and since they also listen to me, I can use my position to help them out even more”.

It’s not just that the cultists are getting “brainwashed” by their leader; it’s also that the leader is getting brainwashed by their cultists to take the role that they want the leader to take. Cults are said to use “love bombing” to attract new recruits, but in at least some cases, it probably also happens that the cult leader is getting love bombed by their followers.

And the temptation to take on that role is powerful not only because it feels nice personally, but also because it does allow you to use your power for good. One definition for a hypnotic trance that I’ve heard is that it’s a state in which a person’s critical faculty is bypassed, which allows the hypnotist to directly make changes in the mind of the person being hypnotized. And you can do a lot of good that way, such as by implanting suggestions that help people overcome their addictions or phobias. 

Being someone’s cultist (in this sense) is kind of like them having you in a hypnotic trance. It is possible for them to use that power in a way that’s beneficial, because the critical faculty that might normally reject or modulate the leader’s suggestions gets partially bypassed.

But that same power makes it extremely dangerous, since people are not going to think critically about what you say, and may take your words far more literally than you intended, when you didn’t think of adding the obvious-to-you caveats about how it shouldn’t be interpreted.

I’ve been feeling this myself. I’ve written various things that people like. And I’ve been having a definite sense of some of my social environment trying to tug me more towards a role as a teacher and as an authority, getting the sense that some people are idealizing me. (And again, yes, there have been several times when I’ve had the cult follower energy myself, too – both towards online writers and in some of my romantic relationships.)

I’m reminded here again of Valentine’s essay on the “Intelligent Social Web [LW · GW]” and of how people tend to take the kinds of roles that their social environment recognizes and rewards… and how people try to tug others into the kinds of roles that they can recognize and know how to interact with, and the collective power of everyone doing this causes the social web as a whole to try to pull people into recognizable roles – including the role of “charismatic leader”. 

Here we come back to Taft’s suggestion that many people have an instinctive desire to get someone into a role that they recognize as that of a “trustworthy caretaker”, because the “child” role is one that feels very easy to play – just surrender your judgment to the other person and do everything the way (you think that) they want you to.

And I’m also reminded of siderea’s analysis of kingship in Watership Down, and of how Hazel never thought of himself as a leader originally in the novel, until the characters around him started treating him as one – and how that might not be as good of a deal as our society makes “kingship” sound like:

If you demonstrate a concern for the wellbeing of the people in your group, they will start seeing their wellbeing as your concern. Start taking responsibility for how things go in a group, and people will start seeing you as responsible for how things go in a group.

This, right here, is what causes many people to back away from Kingship. Which is their right, of course. It’s totally legitimate to look at that deal and say, “Oh, hell no.”

Our society tells us that being King is awesome and everyone – well, everyone normal – wants to be one. “Everybody wants to rule the world.” No, actually, they don’t. My experience tells me that most people are very reluctant to step into the job of King, and this consequence of the role is a primary reason why.

I don’t know, but it strikes me as at least plausible that prospective leaders getting partially deluded about what it is that they are signing up for is what enables them to actually step into the role rather than just saying “oh hell no”.

9 comments

Comments sorted by top scores.

comment by Viliam · 2021-02-14T21:49:22.643Z · LW(p) · GW(p)

When some people start admiring you in the cultish way, it is difficult to get rid of them, because if you disappoint them, they might switch to hating you -- like it's your fault for not being who they thought you were, and they need to punish you, and also tell everyone else what a bad person you are.

Not sure what the optimal approach is here. The easy one is "make sure strangers won't notice you, because a certain fraction of strangers is crazy, and trust me, you do not want to attract a crazy person's attention." But this means giving up all the good things that could happen. Not just good things like "influencing people for good", but also things like "using your skills in public". To avoid being noticed, you need to give up all things that require social capital.

Possible solution: pseudonymity. Make a fake persona that you are willing to sacrifice when shit hits the fan. (I said "when", not "if".) Unfortunately, doxing is a popular pastime, and proper opsec is a lot of work.

comment by Purged Deviator · 2022-03-16T01:17:19.138Z · LW(p) · GW(p)

I kinda wonder if this is what happened with Eliezer Yudkowsky, especially after he wrote Harry Potter and the Methods of Rationality?

comment by Unreal · 2021-11-30T22:20:18.616Z · LW(p) · GW(p)

I was actually thinking of writing about a concept I have called 'cult brain' but this post covers the basic idea. I'm glad! Nice work, Kaj. 

One thing that I want to see more of in this post is treating cultists as agentic. I think for some reason we tend to treat them as non-agentic and helpless, and this frustrates me to no end. Potential cultists have the ability to do something about their own minds and how they use them. 

Leaders should notice their own reactions and behaviors in response to cultists, but I think it becomes codependent the moment a leader takes responsibility for a cultist's behavior or thought process. 

comment by Raemon · 2023-01-07T00:43:57.001Z · LW(p) · GW(p)

I've referenced this post, or at least this concept, in the past year. I think it's fairly important. I've definitely seen this dynamic. I've felt it as a participant who totally wants a responsible authority figure to look up to and follow, and I've seen in how people respond to various teacher-figures in the rationalsphere.

I think the rationalsphere lucked out in its founding members being pretty wise, and going out of their way to try to ameliorate a lot of the effects here, and still those people end up getting treated in a weird cult-leader-y way even when they're trying not to. (I recall one community leader telling me, 8 years ago, "look I don't know anything please don't overupdate on what I say" and somehow that made them feel like they were even more wise and I was treating them like Yoda even more.)

My thoughts on it are somewhat connected to "In Defense of Attempting Hard Things [LW · GW]" and discussion surrounding Leverage Research (which was triggered by a particular writeup by Zoe [LW · GW], but I think was more of a broader set of pent up frustrations). The rationalsphere/longtermist/EAcosystem are trying to do pretty hard things. Pretty hard things often require both commitment/dedication, and willingness to try weird strategies. This combination tends to produce cults or cult-adjacent things as a byproduct, which is worrisome and bad, but, man, it's still important to actually try the hard things.

The "hard things / weirdness" -> "cultishness" model is separate from the model in this post, but the fact that (I think) Hard Weird Communities are important makes the failure modes of the OP more costly.

This year I ran into a person who seemed to be accidentally attracting a cult around them, despite seeming really innocuous in a lot of ways. I don't think I directly referred them to this post, but having the concept handy to talk to them about was useful.

comment by DirectedEvolution (AllAmericanBreakfast) · 2021-02-11T15:21:29.127Z · LW(p) · GW(p)

There are enormous numbers of people in the world who are leaders, but whose relationship with their followers doesn’t strike anybody as cult-like.

My intuitive model is that cult-like behavior emerges only in specific contexts. The grocery store manager probably will not turn the clerks into cultists.

Who has a shot at it? If I had to guess, it’s maybe people who deal in emotions, identities and ideas as their primary trade, and whose personal work doesn’t cash out in a concrete, worldly endeavor.

A CEO deals in ideas, but if they have to sell a product at the end of the day, I think they’re unlikely to become cult leaders. At least not the central case.

I speculate that this isn’t just because they don’t have time on their hands. I think it’s also because their image is wrapped up in something too concrete. And they’ll have to optimize for goals other than “like and let like.”

Is becoming a cult a result of Goodhart’s law? People use “do I like them” as a measure of “should I give them my attention?” And wind up with a cult?

comment by Elizabeth (pktechgirl) · 2021-12-13T09:31:28.221Z · LW(p) · GW(p)

Being someone’s cultist (in this sense) is kind of like them having you in a hypnotic trance.

To extend this: people may be ~self-hypnotizing and then be incredibly vulnerable without the leader knowing they are vulnerable or wanting that responsibility. Statements that were genuinely meant as "consider this and decide for yourself" become harmful when someone has disabled their own filter.

comment by remizidae · 2021-02-11T12:27:23.603Z · LW(p) · GW(p)

How do you think it would be possible for an incipient cult leader to fight the tendency for people to idolize him or her? While still maintaining the group and staying engaged with the group? Are there any examples of people successfully doing that?

It would be possible for the incipient leader to just walk away, and in that case I would expect the group either to lose a lot of its cohesion and the good parts of cultishness – or another incipient leader might step in.

Replies from: Kaj_Sotala, remizidae
comment by Kaj_Sotala · 2021-02-11T13:39:31.582Z · LW(p) · GW(p)

They have some suggestions of how to do that in the episode; one is just exhibiting behaviors that don't fit the idealized image they want to project on you. (Taft: "It's remarkably easy to break, at least for a little while, by just - you know - picking your nose or swearing or something. And if I notice someone doing [the idealization] - because you can tell when it's happening - I just keep breaking it and breaking it and breaking it until it breaks, and then probably they'll go away at that point if that was their goal, you know, 'he was not who I thought he was' and then they lose interest. But if they stick around after that, then they are probably seeing me quite a bit more for who I am.")

Another thing that he mentions is that while you do want to maintain boundaries - don't let crazy people call you at 3 AM - it's also good if you can reduce distance and let people in close. If people stay distant and never meet you, then it's easy to continue idealizing you, whereas meeting you in person makes it easier for them to see who you actually are. He used to invite anyone who was interested into his living room for his meditation class, and "while that was probably too much", he says it was good for getting that distance down.

comment by remizidae · 2021-02-11T12:40:37.959Z · LW(p) · GW(p)

Brainstorming some answers to my own question, I think it would help to maintain more standard social boundaries with followers. Avoid group living. Don’t have sex with followers (and don’t let on if you find any of them attractive). Don’t adopt followers into your family. Actively foster other leaders, so that group members’ dependence and demands are not centered solely on you.