Religion = Cult + Culture
post by Eneasz · 2024-04-02T16:44:27.010Z · LW · GW · 9 comments
This is a link post for https://deathisbad.substack.com/p/religion-cult-culture
[copied in full -- request to develop community knowledge/practices?]
Cults are not necessarily bad. Cults provide value. People join them to get things they need which aren’t provided elsewhere. Every cult is a spiritual start-up, doing its best to serve a neglected segment of the population.
Start-ups are famous for the intensity of focus and commitment they inspire in their founding cohort. Observations from Paul Graham:
Running a startup is not like having a job or being a student, because it never stops. This is so foreign to most people's experience that they don't get it till it happens.
I didn't realize I would spend almost every waking moment either working or thinking about our startup. You enter a whole different way of life when it's your company vs. working for someone else's company.
It's surprising how much you become consumed by your startup, in that you think about it day and night, but never once does it feel like "work."
Start-ups are unsustainable. The amount of work, focus, and stress they require eventually burns people out. The runway of a start-up isn’t just measured in the money it needs to sustain itself and grow; it’s also measured in how many dozens of months its initial cohort can work this intensely before collapsing. The point of every start-up is to create something that can transition into a stable company before those resources run out.
Cults have a start-up culture. Everyone within them is excited and in love with their work and can focus on little else, and it’s great. But it is equally unsustainable. A cult is inspirational and fulfilling, but it doesn’t interface well with the wider world. The demands and pressures of real life stack higher and higher until eventually something breaks.
For a cult to continue to serve its members for many decades (or centuries), providing value to their children and grandchildren and the surrounding community, it must adopt techniques that allow its members to lead functional lives outside of its confines. It has to interface with the wider world and be shaped by its practicalities.
Phil Goetz described what this looks like in 2009: [LW · GW] the culture surrounding a cult turns it into a religion by providing it with memetic antibodies — practices which allow the standard believer to interface normally with the rest of the world.
People who grow up with a religion learn how to cope with its more inconvenient parts by partitioning them off, rationalizing them away, or forgetting about them. Religious communities actually protect their members from religion in one sense - they develop an unspoken consensus on which parts of their religion members can legitimately ignore. New converts sometimes try to actually do what their religion tells them to do.
I remember many times growing up when missionaries described the crazy things their new converts in remote areas did on reading the Bible for the first time - they refused to be taught by female missionaries; they insisted on following Old Testament commandments; they decided that everyone in the village had to confess all of their sins against everyone else in the village; they prayed to God and assumed He would do what they asked; they believed the Christian God would cure their diseases. We would always laugh a little at the naivete of these new converts; I could barely hear the tiny voice in my head saying but they're just believing that the Bible means what it says...
This necessarily means a religion in the start-up phase (or “cult”) will lose some of its edge, but this is healthy. This shift mirrors the “start-up to stable company” transition. This is the defining difference between a cult and a religion — how well it adopts and incorporates the memetic antibodies of the surrounding culture to allow its adherents to live normal, happy, functional lives, while still providing the spiritual services humans need. #SystematizedWinning [LW · GW]
This Is About QC
Rationalism isn’t a religion, but it fulfills some of the functions a religion fulfills, functions everyone needs to some degree. And like a religion, it has memetic antibodies to prevent True Believer Cultist failure modes. Many of these developed through cultural evolution, like all previous religions. Strikingly, many of them were directly injected by Eliezer [? · GW] when he first wrote the Sequences in a deliberate, heroic attempt to prevent a cult forming around his ideas.
Nonetheless, some people fall through the cracks. All the antibodies miss them and they become Zealots, doing lasting damage to their lives, and then burning out spectacularly. QC was a recent example, but isn’t a unique phenomenon. Obviously a very young religion so close to its vital source will see this more often than one established for centuries. In historic terms, we’re doing better than any spiritual movement in any previous century. But via the powers of explicit reasoning perhaps we can do even better. Every case like QC’s is tragic and should cause some measure of regret and introspection.
Where were the clergy that could see the warning signs of memetic immunity failure, and could guide QC away from fanaticism and towards greater integration with practical realities? They (we?) barely exist. Who’s even had the time to learn what to look for, or how to handle it, in the dozen+ years rationalism has been around?
What are the community norms for social protocols around such zealots? Goetz’s missionaries [LW · GW] knew to laugh at the new converts and correct them. Do we? I think in Denver we’ve lucked into a default culture that puts emphasis on first getting your life in order and functioning in default society, with rationalism complementing that rather than overriding it. Is this common?
Rationalism is now large enough and old enough that these issues demand addressing. Rationalism has an ethos, it provides inspiration and meaning, it has an internal culture. We’re doing our best to grow communities to serve our people, but there doesn’t seem to be even an acknowledgement that this comes with some measure of responsibility. One of those responsibilities is to ensure that the wider normie cultural antibodies that prevent cultish death spirals are kept fit.
And, perhaps, a resource that organizers can turn to if they notice someone slipping into fanaticism would be nice. As far as I know, there isn’t a Best Practices Doc for this sort of thing.
9 comments
comment by aphyer · 2024-04-02T17:33:56.567Z · LW(p) · GW(p)
What is QC?
Replies from: jam_brand
↑ comment by jam_brand · 2024-04-03T11:32:24.374Z · LW(p) · GW(p)
The person whose tweets were linked above when mentioning "they become Zealots, doing lasting damage to their lives, and then burning out spectacularly."
Replies from: Viliam
↑ comment by Viliam · 2024-04-07T15:39:08.625Z · LW(p) · GW(p)
I have read that entire thread, and... it is hard to say something coherent in reply, and I am probably missing a lot of context... but it seems to me that bad things are happening, and also that the people complaining about them draw the wrong conclusions (mostly in the style of: I see something bad happening, so I point at the most visible thing nearby and say: this is the cause of the bad things happening).
Makes me wonder, what would have happened if instead of living on the opposite side of the planet, I lived in the middle of all that chaos. Would I be a part of the insanity? Or a lonely voice of reason? Or just a random low-status guy whose opinion is irrelevant because no one listens to it and no one is going to remember it anyway? (Probably the last one.)
Basically, it confuses me when people point at things I consider good, and call them causes of things that I consider obviously bad and stupid. What is the proper lesson to take here? Maybe I am the stupid one, unable to see the obvious causality, and protected from my own stupidity by being far away from where important things happen. Or maybe other people are simply doing things wrong.
I keep dreaming about having a rationalist group with more than five members in my country, but if my wishes came true, would that automatically mean also getting our local version of Zizians/Leverage/etc.? Do these things happen automatically as a consequence of trying to be rational, or did just someone accidentally build the Bay Area community on top of an ancient Indian burial ground?
...but that's basically what this article is about.
I think in Denver we’ve lucked into a default culture that puts emphasis on first getting your life in order and functioning in default society, with rationalism complementing that rather than overriding it. Is this common?
The rationalist scene in Vienna is also sane, as far as I know. We need more data points from other cities.
Or maybe it's something unrelated to "antibodies", like the people in the Bay Area taking an order of magnitude more drugs than people anywhere else, and everything else is just downstream of this. (Or, from another perspective, perhaps "don't take drugs just because some guys who call themselves 'rationalists' told you it was a good idea" is the most relevant normie antibody.) The obvious counter-argument is that everyone in the Bay Area takes drugs, so the fact that the drugs were always visibly involved in the craziest cases is not as strong evidence as I make it. The obvious counter-counter-argument is that this is probably the reason why the crazy cases happen in the Bay Area, as opposed to other places.
And, perhaps, a resource that organizers can turn to if they notice someone slipping into fanaticism would be nice. As far as I know, there isn’t a Best Practices Doc for this sort of thing.
My first idea is to make a short text that will document the existing bad cases and highlight the relevant parts of the Sequences. Document the bad cases to show that the problem exists and is serious. Quote the Sequences to... dunno, probably as a way to tell the people "hey, if you decide to ignore all of these warnings and do your own thing anyway, at least do not publicly blame Eliezer when shit hits the fan".
Replies from: D0TheMath
↑ comment by Garrett Baker (D0TheMath) · 2024-04-07T17:03:23.364Z · LW(p) · GW(p)
Do these things happen automatically as a consequence of trying to be rational, or did just someone accidentally build the Bay Area community on top of an ancient Indian burial ground?
As someone “on the ground” in the Bay Area, my first guess would be that the EA and rationality community here (and they are mostly a single community here) is very insular. Many have zero friends they meet up with regularly who aren’t rationalists or EAs.
A recipe for insane cults in my book.
Replies from: Viliam
↑ comment by Viliam · 2024-04-07T21:31:05.828Z · LW(p) · GW(p)
Okay, that sounds really bad, I agree. Definitely different from e.g. Vienna.
Let's go one level deeper and ask "why".
It is tempting to interact with the fellow rationalists; I also consider them preferable to non-rationalists, ceteris paribus. But even if there were a hundred or a thousand rationalists available around me, I would still have a family, friends, colleagues, neighbors, people who share the same hobby, so I would keep interacting with many non-rationalists anyway. I suspect that in the Bay Area, many community members are either university students, or people who moved to the Bay Area recently to join a local startup or an EA organization -- in other words, people who lost access to their previous social connections.
So the obvious move is to remind them regularly to create and maintain connections outside the rationalist community, and to treat any attempt to convince them otherwise (e.g. by their employer) as a huge red flag.
And this is less likely to happen in a community where many members already lived in the city before joining.
The belief that the Singularity is near encourages you to throw all usual long-term planning out of the window: if in a year you will either be dead or living in paradise, it is not so important whether during that year you burned out, kept contact with your family and friends, etc.
I am not going to object to a belief by appealing to consequences. In a world where the Singularity actually comes in a year, and you have a 0.1% chance to change the outcome from hell to heaven, working as hard as you can is the right thing to do.
Instead, I suggest that people adjust both their timeline and the probability of their actual impact. With regards to the timeline, consider the fact that there was already a rationalist minicamp on existential risk [LW · GW] in 2011, that is, 13 years ago. And yet, the world did not end in a year, in two years, in five years, or in ten years. Analogously, there is a chance that the world will not end in the following five or ten years. In which case, burning out in one year is a bad strategy. From a psychological perspective, ten years is a lot of time; you should keep working towards the good end, but you should also take care of your health, including your mental health. Run a marathon, not a sprint. (People have criticized Eliezer for taking time to write fan fiction and indulge in polyamorous orgies, but notice that he hasn't burned out, despite worrying about AI for decades. Imagine a parallel timeline in which he burned out in 2012, went crazy in 2013, and committed suicide in 2014. Would that have helped AI safety?)
And if you are considering your personal impact on the outcome of the Singularity, most likely it is indistinguishable from zero. Before you go full Pascal and multiply the tiny probability by the number of potential future inhabitants of all galaxies in the universe, please consider that you don't even know whether that number indistinguishable from zero is positive or negative (so you can't automatically assume that even multiplying it by 3^^^3 necessarily results in a huge positive number). Working so hard that you burn out increases the absolute value a tiny bit, but still gives no guarantee about the sign, especially if other people afterwards use you as an example of how everyone who cares about AI safety goes crazy.
Ironically, unless you are one of the top AI safety researchers, if you live in the Bay Area, your best contribution would probably be keeping the rationalist community sane. Don't take drugs, don't encourage others to take drugs, help people avoid cults, be nice to people around you and help them relax, notice the bad actors in the community and call them out (but in a calm way). If this helps the important people stay sane longer, or prevents them from burning out, or just protects them from being dragged into some scandal that would have otherwise happened around them, your contribution to the final victory is more likely to be positive (although still indistinguishable from zero). Generally speaking, being hysterical does not necessarily mean being more productive.
Replies from: D0TheMath, D0TheMath
↑ comment by Garrett Baker (D0TheMath) · 2024-04-08T07:45:12.138Z · LW(p) · GW(p)
I have a bit of a different prescription than you do: instead of aiming to make the community saner, aim to make yourself saner, and especially in ways as de-correlated from the rest of the community as possible. This often means staying far away from community drama, talking with more people who think very differently than most in the community, following strings of logic in strange & un-intuitive directions, asking yourself whether claims are actually true in proportion to how confident community members seem to be in them (people are most confident when they're most wrong, because of groupthink, tails coming apart [LW · GW], and un-analyzed assumptions), and learning a lot.
A kind of "put on your own mask before others'" sort of approach.
↑ comment by Garrett Baker (D0TheMath) · 2024-04-08T07:27:01.744Z · LW(p) · GW(p)
People have criticized Eliezer for taking time to write fan fiction and indulge in polyamorous orgies, but notice that he hasn't burned out, despite worrying about AI for decades.
Not really relevant to your overall point, but I in fact think Eliezer has burnt out. He doesn't really work on alignment anymore as far as I know.
comment by Viliam · 2024-04-08T15:06:10.283Z · LW(p) · GW(p)
I was thinking about a Defense Against Predators Doc, addressing various bad things that already happened in the rationalist community. I wonder whether it should or should not be the same document as the Best Practices Doc. On one hand, those are two quite different topics. On the other hand, there is also some overlap in the form of organizational zealotry (e.g. Leverage, Nonlinear).
Any other Docs that should be written for rationalists? By a Doc I mean something that is dramatically shorter than the Sequences, because frankly most people are not going to read the Sequences. As you mentioned in the article, there already are various warnings in the Sequences, but people ignore them. The memes have a life of their own, and in the contrarian environment, the dangerous edgy ideas spread fast, and the warnings mostly do not.
comment by the gears to ascension (lahwran) · 2024-04-02T17:52:18.423Z · LW(p) · GW(p)
Nah, cults are always bad, and this does not exclude religion. The core ways to avoid cults are: don't trust someone to be an authority when they want it, and don't stay when someone insists you cannot leave. If you don't want to leave a thing, that's fine, whatever, but don't accept demands to not consider leaving; that's screwed up.