Some thoughts on the cults LW had
post by Noosphere89 (sharmake-farah) · 2023-02-26T15:46:58.535Z · LW · GW · 28 comments
There have been several cults in LW history, such as the Zizians and the Leverage cult.
Today I want to talk about these cults and the tradeoffs LW makes.
Zizians
The Zizians were a cult focused on animal welfare positions that were relatively extreme even by EA standards, and they used a Timeless/Updateless decision theory under which being aggressive and escalatory was considered helpful as long as it helped other world branches, or acausally traded with other worlds, to solve the animal welfare crisis.
They apparently induced a new personality called Maia in Pasek, and this resulted in Pasek's suicide.
They also made frequent use of violence, or the threat of violence, to achieve their goals.
This caused many problems for Ziz, and she is now in police custody.
Edit: Removed the Vassarites due to updated information that they didn't use psychedelics to jailbreak people's minds.
Now I'll talk about what are, in my view, the tradeoffs around the level of cults a community has.
I think one tradeoff is that the fewer cults you have, the more you miss out on the weird opportunities that generate most of the value.
EDIT: I have removed the takeaways section because we don't have base rates for people entering cults.
28 comments
Comments sorted by top scores.
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2023-02-26T18:34:56.171Z · LW(p) · GW(p)
used a Timeless/Updateless decision theory
Please don't say this with a straight face any more than you'd blame their acts on "Consequentialism" or "Utilitarianism". If I thought they had any actual and correct grasp of logical decision theory, technical or intuitive, I'd let you know. "attributed their acts to their personal version of updateless decision theory", maybe.
Replies from: sharmake-farah, TAG, Hivewired
↑ comment by Noosphere89 (sharmake-farah) · 2023-02-27T13:18:09.396Z · LW(p) · GW(p)
I agree they misused logical decision theories; I'm just stating what they claimed to use.
↑ comment by Slimepriestess (Hivewired) · 2023-02-27T01:07:49.451Z · LW(p) · GW(p)
maybe it would be more apt to just say they misused timeless decision theory to justify their actions. Timelessly correct actions may look insane or nonsensical upon cursory inspection, and only upon later inspection are the patterns of activity they have created within the world made manifest for all to see. ^_^
comment by tailcalled · 2023-02-26T16:08:03.704Z · LW(p) · GW(p)
Did you get your information about the Zizians and the Vassarites from personal experience with them? If you did not get your information about the Zizians and the Vassarites from personal experience with them, but instead got your information from what some other people said about them, can you list who those other people are?
Replies from: sharmake-farah
↑ comment by Noosphere89 (sharmake-farah) · 2023-02-26T16:22:52.259Z · LW(p) · GW(p)
I thankfully didn't have personal experience with either of these cults. For Ziz, I was relying on Daniel Filan's post and its comment section, and for the Vassarites, I was relying on Scott Alexander's comment in the post that talked about jessicata's experiences at MIRI/CFAR.
Replies from: tailcalled
↑ comment by tailcalled · 2023-02-26T16:50:24.977Z · LW(p) · GW(p)
I basically agree with the message that the Zizians are a dangerous cult that people should stay away from, unless they are law enforcement officers trying to keep them under control. (I don't have any particular new information, beyond public knowledge similar to what you cite from Daniel Filan. It just looks the way I would expect a cult to look.) There's more that I could speculate about them, but it doesn't seem terribly relevant for this thread.
As for the Vassarites, my impression is that Scott Alexander later withdrew some of his critiques about the Vassarites, and that the matter isn't particularly clearly settled. It looks more like a political conflict between MIRI/CFAR/EA and the Vassarites than like a genuine cult situation to me. Probably the most concerning part would be the recommendation of psychedelics, though I don't think ill-advised drug use = cult. (Ill-advised drug use in politically oriented subgroups doesn't seem uncommon in the rationalist community in general. Though one could argue that all of those subgroups are cults, I suppose, but in that case we might have a big cult problem on our hands, idk.)
(I've started finding some of the Vassarites' points compelling, and so if they are a dangerous cult, it would be worth it for me to know before I get entangled with them. So I guess please send any independent evidence my way. I'm not gonna do drugs tho.)
Replies from: cousin_it, sharmake-farah
↑ comment by cousin_it · 2023-02-26T20:01:42.750Z · LW(p) · GW(p)
Idk, for me it's a matter of instinct. The stuff written by Vassar, and others like Benquo etc, feels like mindfuckery to me. Also I met Vassar in person and he came across as someone who's very confident and very wrong at the same time.
If I remember correctly, the topic of disagreement was healthcare systems. I said I liked the Swiss one. Vassar replied that it couldn't possibly work well, because the only working healthcare system in the world was Metamed, his startup at the time which later failed. A roomful of rationalists (quite high caliber, we were at a MIRI workshop) nodded along to him. I think Eliezer was in the room too but noncommittal? Anyway I was almost in disbelief about this, left the party pretty soon, and it must've played a role in my drifting away from the rationalist scene overall.
Replies from: lc, tailcalled
↑ comment by lc · 2023-02-27T09:57:46.585Z · LW(p) · GW(p)
If I remember correctly, the topic of disagreement was healthcare systems. I said I liked the Swiss one. Vassar replied that it couldn't possibly work well, because the only working healthcare system in the world was Metamed, his startup at the time which later failed. A roomful of rationalists (quite high caliber, we were at a MIRI workshop) nodded along to him. I think Eliezer was in the room too but noncommittal? Anyway I was almost in disbelief about this, left the party pretty soon, and it must've played a role in my drifting away from the rationalist scene overall.
I've never actually lived there, but this is what I imagine the Bay Area in general is like, not just rationalists.
Replies from: habryka4
↑ comment by habryka (habryka4) · 2023-02-28T02:42:08.584Z · LW(p) · GW(p)
(I live in the Bay Area, and this really seems extremely far from a representative experience of what it's like to live here. There are definitely many weird and often interesting contrarians, but 'seeing a room nod along with a crazy statement' is really not what I've observed happening here. Vigorous debate and disagreement are quite common.)
↑ comment by tailcalled · 2023-02-26T20:13:03.829Z · LW(p) · GW(p)
I do have the impression that the Vassarites are underestimating how expensive information is, and overestimating how much is known.
I think this makes them mistaken about some things both on the object level (plausibly like in your case) and on the meta-level, and that it is an obstacle for some of their communication.
Maybe if I got involved with them, I could better convince them of the cost of information.
However, I think they also have very different standards for what counts as acceptable performance, standards which seem achievable in principle if one made improvements in some of the areas the Vassarites point to, but which are rarely achieved otherwise.
Replies from: tailcalled
↑ comment by tailcalled · 2023-02-27T21:50:46.745Z · LW(p) · GW(p)
Update: In a Twitter chatroom, I was trying to ask the Vassarites what complaints they have about rationalism and EA, so I could write them up in a list and interrogate rationalists/EAs about them, etc.
We got halfway done with making the list, and then Michael Vassar got frustrated and decided to leave, with the reasoning that giving me the complaints they have requires too much effort/attention, and I should already have information sufficient to notice that "EA is in general fraudulent if taken literally and is an attempt by non-literal language to prevent literal language if taken non-literally".
... I am not active in EA, so I am not sure where I would get the information from, to be honest.
Edit: Update 2: He might have changed his mind. We will see.
Replies from: tailcalled
↑ comment by tailcalled · 2023-03-01T15:40:09.368Z · LW(p) · GW(p)
Ok NOW he left.
↑ comment by Noosphere89 (sharmake-farah) · 2023-02-26T17:02:21.209Z · LW(p) · GW(p)
The dangerous part I remember from Scott Alexander's accusations is the apparent willingness to induce psychotic breaks via psychedelics. The worry here is that, contra the movies, very few mental illnesses are actually an improvement, and the closest ones are high-functioning autism and Asperger's.
That's the problem with the Vassarites, in a nutshell.
Replies from: tailcalled
↑ comment by tailcalled · 2023-02-26T18:21:34.949Z · LW(p) · GW(p)
I assume you have no opinion on the rest of the Vassarites' ideas? The importance of sharing information, problems of fraud within EA, moral mazes, language optimized for deception, etc.?
Replies from: tailcalled
↑ comment by tailcalled · 2023-02-26T18:32:00.783Z · LW(p) · GW(p)
Or rather:
- based on your OP I presumably should assume that your opinion on the rest of the Vassarites' ideas is mostly that they are bad
- but based on your later comment that your information about the Vassarites is mostly from Scott Alexander, I assume you haven't read the writing of the Vassarites and disagreed, but rather just haven't heard of their writings.
↑ comment by Noosphere89 (sharmake-farah) · 2023-02-26T18:37:07.324Z · LW(p) · GW(p)
Mostly, I think the Vassarites' ideas sound pretty good; it's just that they really need someone to prevent Vassar from doing any more jailbreaking/giving psychedelics, since his attempts to modify people's minds are getting more dangerous, and I'm getting concerned at their willingness to promote jailbreaking their minds, given that, unlike in the movies, mental illnesses suck for the most part.
Replies from: ChristianKl, tailcalled
↑ comment by ChristianKl · 2023-02-27T02:16:30.789Z · LW(p) · GW(p)
Scott did a lot of investigation into Vassar and does not stand by his initial accusations regarding Vassar giving psychedelics to mindbreak people.
Replies from: sharmake-farah
↑ comment by Noosphere89 (sharmake-farah) · 2023-02-27T02:37:30.172Z · LW(p) · GW(p)
Thank you, I'll remove the Vassarites from the post.
↑ comment by tailcalled · 2023-02-26T21:49:40.362Z · LW(p) · GW(p)
Actually there's a concept I kind of want to share, which I think is both useful to understand some problems rationalists face, and to understand why Vassar is sometimes considered cultish/dangerous.
I call it "high-energy memes". I assume that people here are familiar with the concept of a meme; an idea that can be shared from person to person, and spread throughout society. By "high-energy", I mean a meme that in some sense demands a lot of action, or shifts the political landscape a lot, or similar. For instance, one high-energy meme is "AGI will most likely destroy civilization soon"; taken seriously, it demands strong interventions on AGI development, and if such interventions are not taking, it recommends strong differences in life choices (e.g. less long-term planning, more enjoying the little time we have left).
One can create lots of high-energy memes, and most conceivable high-energy memes are false and harmful. (E.g. "if you masturbate then you will burn in hell unless you repent and strongly act to support our religion".) Furthermore, even if a high-energy meme originates from a source that is accurate and honest, it may be transformed in the process of being shared, and the original source may not be available, which may make it less constructive in practice.
Since high-energy memes tend to be bad, lots of social circles have created protections to suppress high-energy memes. But these protections also suppress important high-energy memes such as AGI risk. And they also tend to be irrational and exploitable, and to be able to protect the people in power from being held accountable.
So I see much of Vassarism as claiming: These protections against high-energy memes are harmful. We need to break them down so that we can properly hold people in power accountable, and freely discuss important risks.
This also means that Vassarites have broken down a lot of the protections against high-energy memes, which in turn means that they are a superconductor for high-energy memes, and I assume that is part of why they are considered dangerous.
(I may be wrong about some of this, haven't read that much from Vassar.)
(I don't know much about the jailbreaks you are talking about. There's been some discussion about how Vassar doesn't tend to use the term "jailbreak" much, and I don't know where you've gotten the idea that what he's doing has been getting more dangerous. That said I don't necessarily know that you are wrong either. Just wanted to register that the fact that I'm not directly commenting much on your claims of escalation of psychedelics/jailbreaking/etc. is not because I agree, but just because I do not feel I have that much information about it, and vaguely suspect your stories to be incomplete or misleading.)
Replies from: Spiracular, M. Y. Zuo
↑ comment by Spiracular · 2023-02-27T21:05:42.169Z · LW(p) · GW(p)
I like something about this formulation? No idea if you have time, but I'd be interested if you expanded on it.
I'm not convinced "high-energy" is the right phrasing, since the attributes (as I seem them) seem to be:
- Diverges from current worldview
- High-confidence
  - Expressed or uptaken, in a way that allows little space for uncertainty/wavering. May take a dark attitude on ensembling it with other worldviews.
- May have a lot of internal consistency.
  - "Unreasonable internal consistency" is (paradoxically) sometimes a marker for reality, and sometimes a tell that something is truly mad and self-reinforcing.
- Pushes a large change in behavior, and pushes it hard
  - The change is costly, at least under your original paradigm
  - The change may be sticky (& here are some possible mechanisms)
    - Activates morality or tribal-affiliation concerns
      - "If you hear X, and don't believe X and convert X into praxis immediately... then you are our enemy and are infinitely corrupt" or similar attitudes and beliefs
    - Hard to get data that updates you out of the expensive behavior
      - ex: Ziz using revenge to try to change the incentive landscape in counterfactual/multiverse-branching universes, which you cannot directly observe? Can't observe = no clear way to learn if this isn't working, and update out. (I believe this is how she justifies resisting arrest, too.)
    - The change in behavior comes with an exhortation for you to do lots of things that spread the idea to other people.
      - This is sometimes an indicator for highly-contagious memes, that were selected more for virulence than usefulness to the bearer. (Not always, though.)
    - Leaves you with too little slack to re-evaluate what you've been doing, or corrupts your re-evaluation metrics.
      - ex: It feels like you'd need to argue with someone who is hard to argue with, or else you've dismissed it prematurely. That would be really bad. You model that argument as likely to go poorly, and you really don't want to...
      - This sentiment shows up really commonly among people deeply affected by "reality warper" people and their beliefs? It shows up in normal circumstances, too. It seems much, much more intense in "reality warper" cases, though.
I would add that some people seem to have a tendency to take what is usually a low-energy meme in most hands, and turn it into a high-energy form? I think this is an attribute that characterizes some varieties of charisma, and is common among "reality warpers."
(Awkwardly, I think "mapping high-minded ideas to practical behaviors" is also an incredibly useful attribute of highly-practical highly-effective people? Good leaders are often talented at this subskill, not just bad ones. Discernment in what ideas you take seriously, can make a really big difference in the outcomes, here.)
Some varieties of couching or argumentation will push extreme change in behavior and action, harder than others, for the same idea. Some varieties of receptivity and listening, seem more likely to uptake ideas as high-energy memes.
I feel like Pascal's Mugging is related, but not the only case. Ex: Under Utilitarianism, you can also justify a costly behavior by arguing from very high certainty of a moderate benefit. However, this is usually not as sticky, and it is more likely to rapidly right itself if future data disputes the benefit.
Replies from: tailcalled
↑ comment by tailcalled · 2023-02-27T21:42:34.795Z · LW(p) · GW(p)
I like something about this formulation? No idea if you have time, but I'd be interested if you expanded on it.
I have considered doing a series of posts on ideology and memetics. I likely will at some point in the future.
I'm not convinced "high-energy" is the right phrasing, since the attributes (as I seem them) seem to be:
The attributes... of what? Of the ideas that Vassarists want to promote? Or?
I am not sure what your list of attributes aims to explain, and the list of attributes does not map to any phenomenon that I am trying to model myself.
(Awkwardly, I think "mapping high-minded ideas to practical behaviors" is also an incredibly useful attribute of highly-practical highly-effective people? Good leaders are often talented at this subskill, not just bad ones. Discernment in what ideas you take seriously, can make a really big difference in the outcomes, here.)
I would be prone to agreeing with this.
↑ comment by M. Y. Zuo · 2023-02-27T00:09:29.315Z · LW(p) · GW(p)
This is an interesting formulation. Grading memes along a spectrum of energy intensity (?). Can you elaborate on some of these "protections against high-energy memes"?
Replies from: tailcalled
↑ comment by tailcalled · 2023-02-27T07:59:36.172Z · LW(p) · GW(p)
Authoritarian empiricism - if you discuss sensitive topics then people may escalate evidence requirements to hard data that's rarely available.
The engineer and the diplomat - people try to derail conversations when they become relevant to a person's long-term interests.
Can crimes be discussed literally? - words that describe problems become coopted as calls to action, and so people do not evaluate descriptions of problems based on whether they are true, but instead based on whether the criticized entity is considered good or not.
Replies from: M. Y. Zuo
comment by Jayson_Virissimo · 2023-02-26T19:09:46.626Z · LW(p) · GW(p)
How many LessWrong users are there? What is the base rate for cult formation? Shouldn't we answer these questions before speculating about what "should be done"?
Replies from: sharmake-farah
↑ comment by Noosphere89 (sharmake-farah) · 2023-02-26T19:17:10.533Z · LW(p) · GW(p)
Yeah, I'll probably edit the post to remove takeaways and to talk about base rates.