Has LessWrong Been Mind-Killed on the Topic of God and Religion?

post by Mahdi Complex (mahdi-complex) · 2021-11-06T09:13:24.137Z · LW · GW · 19 comments

Edit: Some thoughts and takeaways resulting from the conversation in the comments:

I published a post titled God Is Great a week ago.

It elicited a mixed/negative response on LessWrong (-5 karma, 17 votes at the moment of this writing).

A very quick TL;DR and clarification of the post:

I first explore how the way theists speak about God is very similar to how people on LessWrong speak about "reality" (i.e. the God of Spinoza).

I then examine how 'God' has served the function of "intersubjective Leviathan." It is the imaginary leader, the personification of the moral ideal, and has helped theists solve many coordination problems. No need for an actual mob boss to get people cooperating in a prisoner's dilemma if people believe in an imaginary one. In a sense, it has allowed this imaginary agent to become part of reality. The 'God' meme is arguably the most powerful and agentic meme humans have come up with.
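The coordination claim above can be made concrete with a toy game-theory sketch (all payoff numbers below are illustrative assumptions, not from the original post): in a one-shot prisoner's dilemma, defection is the dominant strategy, but if players believe a watching agent penalizes defection, cooperation can become dominant instead.

```python
def best_response(opponent_move, punishment=0.0):
    """Return my payoff-maximizing move given the opponent's move.

    Standard PD payoffs: T=5 (defect vs. cooperate), R=3 (mutual
    cooperation), P=1 (mutual defection), S=0 (cooperate vs. defect).
    `punishment` is the expected penalty a believer attaches to defecting.
    """
    payoff = {
        ("C", "C"): 3, ("C", "D"): 0,
        ("D", "C"): 5, ("D", "D"): 1,
    }
    return max(
        ("C", "D"),
        key=lambda my_move: payoff[(my_move, opponent_move)]
        - (punishment if my_move == "D" else 0),
    )

# Without the believed punisher, defecting is the best response to everything.
assert best_response("C") == "D" and best_response("D") == "D"

# With an expected penalty larger than the temptation gap, cooperating
# dominates, so mutual cooperation becomes the equilibrium among believers.
assert best_response("C", punishment=4) == "C"
assert best_response("D", punishment=4) == "C"
```

The point of the sketch: no actual enforcer has to exist; it's enough that players subtract the expected penalty from their own payoffs, which is exactly the sense in which a shared belief in an imaginary agent changes the real equilibrium.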

And finally I explore how the religious view of an agentic environment is increasingly coming true as humans shape their environment, and if you take the singularitarian view seriously, the whole universe is going to become agentic soon, powered by a Superintelligence. The 'AGI is God' meme feels like a tired trope, but I think it's underrated. We have a big cultural baggage of thinking about what it's like to live in an agentic environment, and, in part for the purpose of shortening inferential distances, I think we maybe should co-opt it (more on this in future posts).

So, I have three hypotheses for the negative reaction to my post:

1. It's unclear what my point is, why I'm framing things the way I do, and what my ultimate motivation is. I've chosen a fairly "creative" way of broaching the topic. I think one reason it came out the way it did was because I tried to simultaneously address theists and rationalists, and tried to send in-group signals to both communities simultaneously, which might have ended up just being confusing.

I also didn't intend to publish this on its own at first. This is the first part of a larger project I'm working on, exploring, among other things, the tensions between the EA/LW worldview and some foundational assumptions in classical liberal political philosophy. I ended up publishing God Is Great on its own anyway because it seemed like a good fit for the EAF creative writing contest.

2. Some flaw I'm unaware of.

3. LW has been mind-killed [LW · GW] on the topic of God and religion.

In Politics Is The Mind-Killer, Yudkowsky describes how authors of artificial intelligence textbooks just don't seem able to resist posing problems that take a swipe at Republicans. Similarly, Yudkowsky hasn't been able to resist taking swipes at belief in God and religion throughout the Sequences.

All this is not to say that the LW consensus view on the validity of religious claims or the harm done by religions is false. Just that the way this topic has been approached has been extremely tribal, probably influenced by the new atheist movement that was in full swing at the time of the writing of the Sequences. The effective altruist and rationalist communities do seem to have had a large overlap with new atheism.

The way I now see it, LessWrong is deeply political, in the broader sense of the term. It's just not engaged in national politics, but in cosmic politics. While regular political debates deal with the short term affairs of nations, LW is concerned with the fundamental nature and ultimate destiny of humanity and our Universe. And so it has adopted a deeply tribal attitude towards the only other actors in cosmic politics: religions. It is thus no wonder that Yudkowsky has just not been able "to resist getting in those good, solid digs" throughout the Sequences.

But trying to save the world, and shape humanity's future for the better, while going out of your way to antagonize more than half of humanity seems like a bad strategy.

Religions have represented almost the whole of culture until only a few centuries ago, and cultural evolution is what got us to where we are today. In addition, the majority of humanity still derives meaning primarily from religion. Thus, forming an accurate view of religion as a naturalistic phenomenon, and of the role it has played in our development as a species, seems like an important task. Doing so while engaged in a tribal battle against the evil irrational religionists, treating all arguments about the topic as soldiers, doesn't seem like a promising approach.

I might be wrong, but my sense is that any discussion of religion in a positive light, or exploration of the similarities between religious ideas and EA/rationalist ideas is unwelcome here. And I think that this is both a big epistemic and strategic mistake. It leads to a distorted view of religion, and closes us off to opportunities to present the LW/EA worldview in a way that's more palatable to a religious audience.

Frankly, writing God Is Great, and publishing it on LessWrong had me feeling like I was going to school in a clown suit. And maybe I am a clown. Maybe I am wrong and my post deserves the negative karma.

But can someone reassure me that if in future posts I frame religion as an important stepping stone in the process of cultural evolution, argue that belief in God has been instrumentally rational, or point out similarities in the rationalists' and theists' way of viewing the world (reversed stupidity is not intelligence), people will vote on the merits of the arguments and quality of the post, and not based on tribal antipathy towards theism, or some crude anti-religion pattern matching?

Comments sorted by top scores.

comment by Vladimir_Nesov · 2021-11-06T10:38:18.253Z · LW(p) · GW(p)

Anti-epistemology lives in particular topics, makes it hard/socially costly/illegal to discuss them without committing the errors it instills. Its presence is instrumentally convergent in topics relevant to power (over people), such as religion, politics, and anything else politically charged.

Ideas are transported by analogies, and anti-epistemology grows on new topics via analogies with topics already infected by it, if it's not fought back from the other side with sufficient clarity. The act of establishing an analogy with such a topic is, all else equal, sabotage.

Replies from: mahdi-complex
comment by Mahdi Complex (mahdi-complex) · 2021-11-06T11:11:25.942Z · LW(p) · GW(p)

So you're claiming that religion (aka team green) is so bad and irrational that any analogy of rationalism (aka team blue) with it is dangerous and sabotage? Or that any positive talk of team green is a threat?

It seems to me that the LW (over)reaction to the irrationality of religion is pretty irrational and has nothing to do with 'clarity'. If you're rejecting a priori a line of inquiry because it seems threatening to the in-group, I don't consider that "rational."

Edit: This was an overly antagonistic response, in part due to an uncharitable reading of Vladimir_Nesov's comment.

Replies from: AllAmericanBreakfast
comment by AllAmericanBreakfast · 2021-11-06T23:32:07.968Z · LW(p) · GW(p)

No, he is making a different and more precise claim.

There is a phenomenon that can be called "anti-epistemology." This is a set of social forces that penalize or otherwise impede clear thought and speech.

Sometimes, a certain topic in a certain space is free of anti-epistemology. It is relatively easy to think about, research, and discuss it clearly. A central example would be the subject of linear algebra in the context of a class on linear algebra.

Other times, anti-epistemology makes thought, research, and discussion difficult for a certain topic in a certain context. A central example here would be the topic of market economics in a Maoist prison camp.

Unfortunately, what unites both of these cases is that they are so overt. The linear algebra class is set up with the explicit goal of supporting the study of linear algebra; the Maoist prison camp with the explicit purpose of suppressing market economics (among other ideas).

Less crushing than the Maoist prison camp, but still pernicious, are settings in which a certain topic is suppressed implicitly, by a semi-spoken or unspoken set of social monitoring, implicit loyalty tests, and softly enforced or reflexive analogies and framings. This is an anti-epistemology that we might encounter in our everyday lives. You may happen to be so blessed as to be entirely free of such social dynamics, so incredibly brave as to be untouched by them, or even blissfully unaware that such pressures could exist. But for others, this is a routine struggle.

The claim that Vladimir Nesov is making is that the way such "soft suppressions" get set up in the first place is by establishing analogies. To that, I would add framings and choices of words. For example, if you wish to undermine support for market economics, start by referring to it as "capitalism." If you wish to undermine support for new development, refer to it as "gentrification." If you wish to undermine support for social justice, refer to it as "cancel culture." If you're an American who wishes to undermine support for the 2nd amendment, refer to action movies as "shoot 'em up movies."

Then you can start with the analogies. In your case, if you wish to undermine rationality, you might start by making an analogy with religion. It's a very common reflex. Social justice, capitalism, gun rights, sexual promiscuity, free speech, nationalism, and more have all been referred to many, many times as "religions" by their political opponents in one editorial after another.

Analogies aren't inherently bad. They can be useful to pick out a particular feature of a confusing thing and make it familiar to a novice by comparison with a familiar thing. I am a biomedical engineering student. If I wanted to explain what a hydrogel is to you, I might say that it's like snot. That's an analogy, and it's not anti-epistemology. I have a particular thing I want you to understand, and I choose a familiar example to help you get there.

But this has many properties in common with cherry-picking. You're selecting a particular feature, taken out of context, and focusing attention on it. You're asking for trust, putting yourself in a teaching role, conveying a picture of something with which your audience is unfamiliar to them, and putting a memory and association in their mind. For this reason, analogies can be effective ways to undermine support for a thing by making emotionally loaded but fundamentally lazy, misleading, narrow, or otherwise flawed analogies.

You are quite new to LessWrong, and will have to make up your own mind about the ideas you find here. Right now, you seem to be in a tense place - both viewing the site and its participants as irrationally religious in their denunciation of religion, and yet simultaneously feeling motivated to engage with it anyway.

My suggestion to you is to assess your own motives. Are you interested in what you find here, and do you think there's a good chance that you are the one with something to learn? If so, then consider that when you find yourself tempted to reflexively dismiss or unfavorably analogize what you find here.

If not, then consider simply stopping your participation. I say this not to be rude, but out of compassion. There is so much wrong stuff on the internet, that if you waste your time fighting it, you'll never discover what's right.

Replies from: mahdi-complex
comment by Mahdi Complex (mahdi-complex) · 2021-11-07T00:45:36.545Z · LW(p) · GW(p)

Thank you. I really appreciate this clarification.

I meant God Is Great as a strong endorsement of LessWrong. I am aware that establishing an analogy with religion is often used to discredit ideas and movements, but one of the things I want to push back against is that this move is necessarily discrediting. But this requires a lot of work (historical background on how religions got to occupy the place they do today within culture, classical liberal political philosophy...) on my part to explain why I think so, and why in the case of EA/LW, I think the comparison is flattering and useful. This is work I haven’t done yet, and I might be wrong about how I view this, so I guess I shouldn’t have been too surprised about the negative reaction.

I really should have written and posted something about my heterodox background assumptions first, and gotten feedback on them, before I published something building on them.

Replies from: AllAmericanBreakfast
comment by AllAmericanBreakfast · 2021-11-07T02:32:59.029Z · LW(p) · GW(p)

Framing, research, and communication are all skills that take practice! I hope you'll ultimately find this a helpful space to build your skills :)

comment by Richard_Kennaway · 2021-11-06T17:25:12.423Z · LW(p) · GW(p)

No, it's just that we've rejected the concept of "God" as wrong, i.e. not in accordance with reality. Some ancient questions really are solved, and this is one of them. Calling reality "God" doesn't make it God, any more than calling a dog's tail a leg makes it a leg. The dog won't start walking on it.

The claimed evolution of ideas of God towards "reality" is the evolution of those ideas towards "actually, there's no such thing."

Besides, you made a brand new account for that posting, acted plaintively injured when it got a poor reception, and then suggested we're not as open-minded as we might like to think. I've seen the pattern before on LessWrong. It was trolling then. Why should I not think that it is trolling now?

See also. [LW · GW]

Replies from: mahdi-complex
comment by Mahdi Complex (mahdi-complex) · 2021-11-06T23:45:04.632Z · LW(p) · GW(p)

No, it's just that we've rejected the concept of "God" as wrong, i.e. not in accordance with reality. Some ancient questions really are solved, and this is one of them. Calling reality "God" doesn't make it God, any more than calling a dog's tail a leg makes it a leg. The dog won't start walking on it.

The disagreement here seems to be purely over definitions? The way I use "God" to mean "reality" is the same way Scott Alexander uses "Moloch" to mean "coordination failure" or how both Yudkowsky and Scott Alexander have used "God" to mean "evolution."

The claimed evolution of ideas of God towards "reality" is the evolution of those ideas towards "actually, there's no such thing."

That's essentially the meaning I'm trying to get across in God Is Great, just using an unusual definition of 'God' to make the point more palatable to a theistic audience. The reality we have uncovered by rationality and the scientific method is all there is.

Besides, you made a brand new account for that posting, acted plaintively injured when it got a poor reception, and then suggested we're not as open-minded as we might like to think. I've seen the pattern before on LessWrong. It was trolling then. Why should I not think that it is trolling now?

You seem to consider me an outsider troll. I meant God Is Great as a ringing endorsement of the LessWrong community and worldview, admittedly presented in a heterodox fashion. I consider myself a part of this community and only wish the best for it. I meant this post as an earnest request for feedback, and to kick-start a discussion about what I think might be a blind-spot in LW’s understanding of religion and theism. I’m just trying to explore a good faith disagreement. I regret that it ended up sounding so hostile and confrontational.

Replies from: Dagon
comment by Dagon · 2021-11-08T16:07:42.562Z · LW(p) · GW(p)

The disagreement here seems to be purely over definitions? The way I use "God" to mean "reality" is the same way Scott Alexander uses "Moloch" to mean "coordination failure" or how both Yudkowsky and Scott Alexander have used "God" to mean "evolution."

Umm, ok?  Using misleading terms and then complaining that you don't generate good discussion seems unlikely to succeed here.

I don't know, but a straightforward propositional post (using more standard terms, or using a lot of non-poetic words to define rather than describe your points) might get some good traction.  Or you might just be confused and such a post is impossible because you're motte-and-bailey-ing "god", where you want it to mean "reality" when defending, but want it to mean more when people aren't paying attention.

comment by Vladimir_Nesov · 2021-11-06T11:24:48.339Z · LW(p) · GW(p)

Closer to the object level, I like the post [LW · GW] aesthetically, it's somewhat beautiful and well-crafted. I didn't find anything useful/interesting/specific in it, it only makes sense to me as a piece of art. At the same time, it fuels a certain process [LW(p) · GW(p)] inimical to the purpose of LW.

Compare this with Scott Alexander's Moloch post or even Sarah Constantin's Ra post. There's specific content that the mythical analogies help organize and present.

The positive role of the mythical analogies is the same as in your post, but my impression was that in your post the payload is missing, and the mythical texture is closer to functional anti-epistemology, where it's not yet ground down to the residue of artistic expression by distance from the origin and distortion in loose retelling.

discussion of religion in a positive light

Discussion in a negative light has its own problems, where instead of developing clarity of thought one is busy digging moats that keep an opposing ideology at bay, a different kind of activity that involves only a very pro forma version of what it takes not to drown in the more difficult questions that are relevant in their own right.

Replies from: mahdi-complex
comment by Mahdi Complex (mahdi-complex) · 2021-11-06T12:17:47.888Z · LW(p) · GW(p)

my impression was that in your post the payload is missing

Okay, that seems fair. It is true that just from that post, it's unclear what my point is (see hypothesis 1).

I think it matters how we construct our mythical analogies, and in Scott Alexander's Moloch, he argues that we should "kill God" and replace it with Elua, the god of human values. I think this is the wrong way to frame things. I assume that Scott uses 'God' to refer to the blind idiot god of evolution [LW · GW]. But that's a very uncharitable and, in my opinion, unproductive way of constructing our mythical analogies. I think we should use 'God' to refer to reality, and make our use of the word more in line with how more than half of humanity uses the word.

Is your point about "functional anti-epistemology" about it being clear from Scott Alexander's and Sarah Constantin's posts that they're not sympathetic to "actual" belief in Moloch or Ra, while in my post, I sound sympathetic to theism?

Replies from: Vladimir_Nesov
comment by Vladimir_Nesov · 2021-11-06T13:01:19.742Z · LW(p) · GW(p)

It doesn't matter if a discussion is sympathetic or not, that's not relevant to the problem I'm pointing out. Theism is not even an outgroup, it's too alien and far away to play that role.

Anti-epistemology is not a label for bad reasoning or disapproval of particular cultures, it's the specific phenomenon of memes and norms that promote systematically incorrect reasoning, where certain factual questions end up getting resolved to false answers, resisting argument or natural intellectual exploration, certain topics or claims can't be discussed or thought about, and meaningless nothings hog all attention. It is the concept for the vectors of irrationality, the foundation of its staying power.

comment by ChristianKl · 2021-11-06T13:42:33.671Z · LW(p) · GW(p)

1. It's unclear what my point is, why I'm framing things the way I do, and what my ultimate motivation is. 

Writing is not about the writer and his motivations but about whether the writing provides value for the reader.

Generally, the more charged a topic happens to be, the less free you are in the form of addressing it. If you want to have a good discussion about religion on LessWrong, don't spend your weirdness points [LW · GW] on being creative and using an unusual style to make your points.

In Politics Is The Mind-Killer [LW · GW], Yudkowsky describes how authors of artificial intelligence textbooks just don't seem able to resist posing problems that take a swipe at Republicans. Similarly, Yudkowsky hasn't been able to resist taking swipes at belief in God and religion throughout the Sequences.

The point of Politics Is The Mind-Killer is that if you talk about X and use a political example Y to illustrate X, your readers are more likely to focus on Y than on X in general.

To the extent that you claim to have written an article titled 'God is Great' that's about an X which isn't religion, and readers instead focused on the religious arguments you made, it seems like you did a poor job of writing the post.

comment by Evenflair (Raven) · 2021-11-06T13:50:40.521Z · LW(p) · GW(p)

I found the post long and difficult to read, and the bit I did read appeared to be uninteresting. I also strongly dislike religion (yes, I know it has good parts, but it also has a shit ton of bad). In particular, I have no desire to see more content glorifying it here, and your post appeared to be doing that from the cursory inspection I gave it. Thus, the downvote.

I didn't strong downvote because I hadn't read the entire post or given it a proper chance. A weak downvote is minor dislike, a desire to see less of that sort of thing. A strong downvote for me is an expression of utter contempt, or a belief that the writing is irredeemably bad in some way. I didn't think you fell into the latter category.

In other words, it was mostly #1 but #3 was the nail in the coffin.

comment by shminux · 2021-11-06T19:54:39.578Z · LW(p) · GW(p)

The other post is long and meandering, which only works if you are a good writer who expresses their point in koans or something. I couldn't even tell what your point was, or what your personal view on theism and religion is. Or why you are motivated to discuss belief in the supernatural in a generally atheist crowd. Like, what are the windmills you are fighting?

comment by Jay · 2021-11-06T17:17:08.255Z · LW(p) · GW(p)

I think most people on LW fall into one of two groups:

  • People who were raised in the urban liberal milieu.  Religion simply isn't part of their worldview; their attitude toward it is not even unbelief.  For them going to church is like raising alpacas; they are aware that some people do it but they don't see much value in it, it doesn't fit into their lifestyles, and it would take a rather long intellectual journey to convince them to do it themselves.
  • People who, like me, were raised around religion.  As LWers are generally thoughtful people, we have generally considered religion to our satisfaction many years ago.  Each of us had a particular journey but I suspect my conclusions are, if not typical, then not highly atypical either:
    • My understanding of neuroscience has convinced me that consciousness is fundamentally dependent on the brain.  I don't believe in an afterlife (cryogenics and mind-uploading ideas notwithstanding).
    • Since I do not expect to face divine judgement, I am not greatly concerned about the existence or nonexistence of God.  I think of theology as much like an ant trying to understand presidential politics.  Understanding is highly unlikely and the ant has better things to do.
    • The Durkheimian "society worshiping itself" phenomenon is real, common, and by no means limited to religion as traditionally defined.  It is often wildly irrational and is pretty much the opposite of what LW aspires to.

If you're trying to reach the first group, I would recommend trying to bring them into contact with organized religion via some sort of common interest (probably effective altruism).  The second group is generally going to be much harder to reach.

Replies from: mahdi-complex
comment by Mahdi Complex (mahdi-complex) · 2021-11-06T22:13:00.822Z · LW(p) · GW(p)

Thank you, I find this comment quite constructive.

My understanding of neuroscience has convinced me that consciousness is fundamentally dependent on the brain.

I had a similar journey.

The Durkheimian "society worshiping itself" phenomenon is real, common, and by no means limited to religion as traditionally defined. It is often wildly irrational and is pretty much the opposite of what LW aspires to.

I guess this can take a pretty nasty and irrational form, but I see this as continuous with other benign community bonding rituals and pro-social behavior (like Petrov Day or the solstice).

Replies from: Jay, Jay
comment by Jay · 2021-11-07T14:21:39.495Z · LW(p) · GW(p)

I should mention that, like many people who were raised religious and lost their faith, I miss it.  It was comforting to believe that the world was in good hands and that it all could work out in the end.  I had friends at church.  Many of them were attractive females.

Losing my religion felt less like an act of will and more like figuring out the answer to a math problem.  It wasn't something I wanted, rather the opposite.  I fought it for a while, but there's no cure for enlightenment.  I've tried to go back to church, but it just doesn't work when you don't believe in it.  I no longer see God there, just some schmuck wearing felt.

comment by Jay · 2021-11-07T00:59:20.098Z · LW(p) · GW(p)

I guess this can take a pretty nasty and irrational form, but I see this as continuous with other benign community bonding rituals and pro-social behavior (like Petrov Day or the solstice).

I agree, I just think that community bonding rituals have such a strong tendency to lead to ingroup-vs-outgroup conflicts that I am much more skeptical of the whole idea than you seem to be.  

Part of this is my perception that generally neither group is entirely right about every issue, and therefore no group I pick will have my wholehearted support.  This is acceptable; compromise on less crucial matters is often the price of working toward your most important goals.  Having said that, I think it's important to remember what your important goals are and to periodically ask yourself whether the gains are still worth the compromises.  Durkheimian worship is rather directly contrary to this sort of cost-benefit analysis.

Or it could just be that I'm Aspergian, and my normal modes of thinking are highly anti-correlated with religion.

comment by Slider · 2021-11-06T13:45:21.306Z · LW(p) · GW(p)

Inquiries along the lines of "what do they even mean when they say god?" have found some purchase, even though that is more of a deconstruction of religion.

To me it was not totally without payload, but it is more tainted than useful. Others, when discussing adjacent things, have been very careful that the concepts are groundable and clear, with very little wiggle room for natural confusion. Here no such distinction is attempted. And I additionally believe it's not just a case of having the appearance of shaky thoughts, but of actually containing shaky thoughts.

Even at the very beginning, how am I supposed to replicate divine revelation? Pray and check how I feel? This is just reference to, and reliance on, methods which have not been proven to be epistemic tools and have been proven to be epistemically misleading.

There seems to be a tone focused on the details of the impression of the idea, a kind of "appeal to aesthetics" which is difficult to engage with. It also has the smell of writing the bottom line first and then coming up with the argument. If I follow the arguments from top to bottom, they justify/ground only about half, or a slight shadow, of the "religious attitudes are reasonable" conclusion the post actually seems to be concerned with.

The topic is not an issue, but if you are playing with fire and don't even have a bucket of water nearby, I am going to discourage you: not because fire is a forbidden tool, but because you are being reckless in how you are going about it.

It would be fine if the discoverer of X-rays got burned by them; they couldn't reasonably have known. But since we know about the dangers, we won't let you get burned by your private knowledge, just as we take tinder out of the hands of children (or supervise the play so it doesn't burn down houses).