Sasha Chapin on bad social norms in rationality/EA

post by Kaj_Sotala · 2021-11-17T09:43:35.177Z · LW · GW · 22 comments

This is a link post for https://sashachapin.substack.com/p/your-intelligent-conscientious-in


So, I’ve noticed that a significant number of my friends in the Rationalist and Effective Altruist communities seem to stumble into pits of despair, generally when they structure their lives too rigidly around the in-group’s principles. Like, the Rationalists become miserable by trying to govern their entire lives through nothing but rationality, and the EAs feel bad by holding themselves to an impossible standard of ethics. [...]

I’ve tried to figure out why this happens, and I’ve tried to write about it several times, batting around complex notions that, as I examine them, reveal themselves to be fake models that make me sound smart but don’t explain anything.

But today I realized that it’s generally much simpler than I thought previously. Most of it is just toxic social norms. These groups develop toxic social norms. In the Rationalist community, one toxic norm is something like, “you must reject beliefs that you can’t justify, sentiments that don’t seem rational, and woo things.” In the EA community, one toxic norm is something like, “don’t ever indulge in Epicurean style, and never, ever stop thinking about your impact on the world.”

Generally, toxic social norms don’t develop intentionally, nobody wants them to happen, they’re not written down, and nobody enforces them explicitly. (The intentional development of toxic social norms is otherwise known as founding a cult.) What happens is that there are positive social norms, like, “talking about epistemics and being curious about beliefs is cool,” or “try to be intentional about the positive impact you can have on the world.” These norms are great! But then, the group acts like a group, which is to say, people confer status depending on level of apparent adherence to values. This leads insecure people who completely depend on the group to over-identify with the set of values, to the extent that even slightly contrary actions become forbidden. Not forbidden in the like “we’ll arrest you” way, but in the like “everyone in the room immediately looks at you like you’re being rude if you talk about spirituality” way. 

And then the second, more sinister stage occurs—the point at which these toxic norms are internalized such that they apply to you when you’re in a room alone. As Wittgenstein noted, it’s hard to tell where aesthetics end and ethics begin; it can start to feel unethical, like, dirty, to perform behaviors your peers would think distasteful. Toxic norms eventually pervade completely, to the point where you don’t even want to think bad thoughts. 

Sometimes—often—these forbidden thoughts/actions aren’t even contrary to the explicit values. They just don’t fit in with the implied group aesthetic, which is often a much stricter, more menacing guideline, all the more so because it’s a collective unwritten fiction. “Rationality is cool” becomes “rationality is the best framework” becomes “Rationalist and Rationalist-flavored stuff is a better use of your time than anything else” becomes “it’s uncool if you want to spend a lot of time doing stuff that has nothing to do with testable beliefs, or our favorite issues.” This is all unintentional and implicit. No Rationalist has ever said, to my knowledge, that you shouldn’t write poetry, but a few Rationalists have told me that they feel like they shouldn’t make weird art because it’s dumb and un-Rationalist to do so—they feel they ought to produce useful thoughts instead, even though their hearts are trying to steer them somewhere else. I point out to them that Scott Alexander wrote a fantasy novel for fun, but somehow this isn’t persuasive enough.

Here, I should probably stop and define toxic norms. I think a toxic norm is any rule where following it makes you feel like large parts of you are bad. The EA version is thinking that you’re evil if your soul/body/emotions are crying out for you to relax, slack off a bit, and spend money on yourself, because you ought to be spending every possible moment working on human flourishing. I’ve heard tales of people struggling with their decision to buy a tasty snack rather than donate $5 to charity, and, more worryingly, people feeling guilty that they want to have children, since that would distract them from the work of Improving Humanity. This obviously leads to burnout and self-loathing. Meanwhile, the Rationalist version is thinking that you’re stupid and not worth talking to if you yearn for the spiritual/aesthetic/woo/non-justifiable, or if you haven’t been able to come to grips with your issues through rational means. This leads to emotional damage being ignored, intuition being dismissed, and systematizing being preferred inappropriately above all other modes of thinking and feeling.

One sign of toxic social norms is that, if your behavior does deviate from the standard, you feel the only way of saving face is to explain your behavior via the group values. Like, if you watch the Bachelor all the time, and one of your smart peers finds out about that, you might find yourself hastily explaining that the series is enjoyable to you as an applied experiment in evolutionary psychology, when, in fact, you just like social drama because watching humans freak out is fun. I will never forget hearing a Rationalist friend ask a non-Rationalist friend whether he loved riding motorcycles because it was an experiment in social status, rather than, y’know, vroom vroom fun thing go fast.

I’m not mentioning these communities because I think they’re extra toxic or anything, by the way. They’re probably less toxic than the average group, and a lot of their principles are great. Any set of principles, if followed too strictly and assigned too much social value, can become a weird evil super-ego that creeps into the deepest crevices of your psyche. (One must imagine the serene Yogi seized with crippling shame over their perfectly normal road rage.) These groups are just the ones I’m most familiar with right now, and thus the places where I see these patterns most often. In the past, I would’ve used examples from, like, the chess scene, or the artsy prose scene, but I’m not close to those scenes currently, and haven’t been for years, so I’m not even remotely qualified to talk about them. I’ve heard that the startup scene, the internet poker scene, and the crypto scene have all kinds of native pathologies, but someone else will write those essays.

Also, these norms aren’t toxic for everyone! There are a few people who are, in fact, happiest when they’re entirely, or almost entirely, devoted to the fancy intellectual principles of a specialized group. But this is not most people. And this can actually compound the problem! If there are people in the group who are perfect examples of the desired behavior, they can be positive exemplars, but also negative exemplars—constant reminders that you are falling short. (Also, certain group leaders can quietly, and probably unintentionally, inflect the norms in a subtle way, thus accentuating the degree to which they are seen as exemplary, and the degree to which others are seen as inferior.)

This is, perhaps, an inevitable danger for nerdy people. For lots of intellectual weird people that don’t fit in, their first social stage is rejection from society in general, and then, later on, their second social stage is finding understanding in a tightly-knit subculture. And they cling to this subculture like a life-raft and are willing—happy, even—to initially reject any parts of themselves that don’t fit within this new community. And their new peers, unintentionally, facilitate this rejection. They don’t feel that this is toxic, because they feel like they’ve already seen what social toxicity is: it’s the prime normie directive that we learn in school: don’t be weird, ever. [...]

And the people being deferred to—the senior members of the group—don’t want this dynamic at all, but they don’t necessarily notice that it’s happening, because the outward manifestation of this is people being really impressed by you. Like, if you’re big in the EA scene, and a young freshly minted EA can’t stop talking about how excited they are to do good, and how inspired they are by your virtuousness, there’s maybe no obvious sign that they’ve started rejecting every part of themself that is not congruent to this new identity. You would have no reason to worry about that. You would probably just feel good, and glad that your principles are so convincing. So it’s hard to even see this issue sometimes, let alone figure out how to solve it. (Although I’ve heard from Rationalist luminary Aella that some Rationalists are, indeed, beginning to take it seriously, which is great.)

I don’t know whether all of this can be avoided entirely. Part of it is just growing up. It’s regular Kegan Stage 4 stuff. You conceive of who you are by seeing the world through some epistemic/moral lens, usually the one relied upon by the group who abuses you least. Eventually, you notice the flaws in that lens, and then you become your own thing, clearly related to the group, but distinct from it, not easily captured by any label or list of properties. 

(more behind the link)

22 comments, sorted by top scores.

comment by ChristianKl · 2021-11-17T12:54:04.706Z · LW(p) · GW(p)

My experience of the rationality community is one where we value Daniel's poems and Raemon's songs. The vibe of the LessWrong community weekend is not one of cultural values that tell people to avoid art. 

To the extent that this is true, what subset of the rationality community suffers from it?

(By the way, Eliezer Yudkowsky, this is what post-Rationalists are, it’s not that complicated—they don’t have explicit principles because they’ve moved on from thinking that life is entirely about explicit principles. Perhaps you don’t intuitively grasp this because you don’t see the social group you’ve founded as a social group.)

The key reason he won't grasp that is that he doesn't think that life is entirely about explicit principles.

Replies from: Kaj_Sotala
comment by Kaj_Sotala · 2021-11-17T14:34:25.904Z · LW(p) · GW(p)

To the extent that this is true, what subset of the rationality community suffers from it?

I recall having had this feeling; in particular, I once mentioned to another member of the community that I was thinking about working on a fiction-writing project, but I also felt bad admitting it, because I was afraid he'd look down on me for spending time on something so frivolous. (This was quite a while ago, as in 2014 or something like that.)

Replies from: ChristianKl
comment by ChristianKl · 2021-11-18T09:26:41.458Z · LW(p) · GW(p)

Was that a member of the local community in your country? How much EA contact and how much rationality contact did they have?

Replies from: Kaj_Sotala
comment by Kaj_Sotala · 2021-11-18T19:45:38.833Z · LW(p) · GW(p)

From another country; to be clear, when I told them this they were genuinely surprised and confused by me feeling that way.

comment by LVSN · 2021-11-17T12:05:25.274Z · LW(p) · GW(p)

It seems like any cultural prospiracy to increase standards to exceptional levels, which I see as a good thing, would be quickly branded as 'toxic' by this outlook. It is a matter of contextual objection-solving whether or not large parts of you can be worse than a [desirably [normatively basic]] standard. If it is toxic, then it is not obvious to me that toxicity is bad, and if toxicity must be bad, then it is not clear to me that you can, in fair order, sneak in that connotation by characterizing rationalist self-standards as toxic.

Replies from: AllAmericanBreakfast, GWS, Kaj_Sotala, alexander-1
comment by DirectedEvolution (AllAmericanBreakfast) · 2021-11-17T15:42:24.741Z · LW(p) · GW(p)

The OP provides examples to illustrate what they mean by an overly extreme standard. They also say that many EA/rationalist principles are good, and that there’s less toxicity in these communities than in others.

Might be good to taboo “toxicity.” My definition is “behavior in the name of a worthy goal that doesn’t really deliver that goal, but makes the inflictor of the toxicity feel better or get a selfish benefit in the short run, while causing problems and bad feelings for others.”

For example, a trainer berating trainees in the name of high standards after a failure, in an attempt to punish them into working harder, or in order to make the trainees into the objects of blame by the trainer’s superiors and not the trainer.

Or a person beating themselves up over a $5 purchase for themselves in the name of charity, only to burn out on EA entirely after a few years. This isn’t obviously toxic, by the definition above, except through some sort of Internal Family Systems framework in which one part of themselves is getting off on the mental berating, while another part suffers. Seems linked to Eliezer’s idea of “hammering down on one part of yourself” from his most recent post here.

comment by Stephen Bennett (GWS) · 2021-11-17T15:33:13.750Z · LW(p) · GW(p)

This critique seems to rely on a misreading of the post. The author isn’t saying the rationality community has exceptionally toxic social norms.

I’m not mentioning these communities because I think they’re extra toxic or anything, by the way. They’re probably less toxic than the average group, and a lot of their principles are great.

Rather, he's saying that goals, even worthy goals, can result in certain toxic social dynamics that no one would endorse explicitly:

Sometimes—often—these forbidden thoughts/actions aren’t even contrary to the explicit values. They just don’t fit in with the implied group aesthetic, which is often a much stricter, more menacing guideline, all the more so because it’s a collective unwritten fiction.

There’s a bit of an aesthetic parallel to AI alignment. It would be surprising if the poorly understood process that produces social dynamics just so happened to be healthy for everyone involved in the case of the rationality project. Verbalizing some of the implicit beliefs gives people the ability to reflect on which ones they want to keep.

I would expect the author to agree that most (all?) communities contain toxic dynamics.

comment by Kaj_Sotala · 2021-11-17T14:45:03.407Z · LW(p) · GW(p)

It seems like any cultural prospiracy to increase standards to exceptional levels, which I see as a good thing, would be quickly branded as 'toxic' by this outlook.

I read it not as saying that having high standards would be bad by itself, but that the toxicity is about a specific dynamic where the standards become interpreted as disallowing even things that have nothing to do with the standards themselves. E.g. nothing about having high standards for rationality requires one to look down on art.

comment by Alexander (alexander-1) · 2021-11-17T13:22:28.725Z · LW(p) · GW(p)

You make good points. Toxicity is relative to some standard. A set of social norms that are considered toxic from the perspective of, say, a postmodern studies department (where absolute non-offensiveness is prime), might be perfectly healthy from the perspective of a physics research department (where objectivity is prime). It’s important to ask, “Toxic according to who, and with respect to what?”

Emile Durkheim asked his readers to imagine what would happen in a “society of saints.” There would still be sinners because “faults which appear venial to the layman” would there create scandal.

comment by AppliedDivinityStudies (kohaku-none) · 2021-11-18T00:50:04.968Z · LW(p) · GW(p)

Command-f quote marks.

It's highly suggestive that every single "quote" Sasha uses here to illustrate the supposed social norms of the EA/Rat community is invented. He literally could not find a single actual source to support his claims about what EA/Rats believe.

“don’t ever indulge in Epicurean style, and never, ever stop thinking about your impact on the world.”

Does GiveWell endorse that message on any public materials? Does OpenPhil? FHI? The only relevant EA writing I'm aware of (Scott Alexander, Ben Kuhn, Kelsey Piper) is about how that is specifically not the attitude they endorse.

Come on, this is pure caricature.

Replies from: lincolnquirk, PeterMcCluskey, Kaj_Sotala, ChristianKl
comment by lincolnquirk · 2021-11-18T10:18:01.409Z · LW(p) · GW(p)

I don’t think I agree that this is made up, though. You’re right that the quotes are things people wouldn’t say, but they do imply it through social behavior.

I suppose you’re right that it’s hard to point to specific examples of this happening but that doesn’t mean it isn’t happening, just that it’s hard to point to examples. I personally have felt multiple instances of needing to do the exact things that Sasha writes about - talk about/justify various things I’m doing as “potentially high impact”; justify my food choices or donation choices or career choices as being self-improvement initiatives; etc.

This article points at something real.

comment by PeterMcCluskey · 2021-11-19T17:07:31.014Z · LW(p) · GW(p)

The drowning child argument comes close enough to endorsing that message that Eliezer felt a need to push back on it [LW · GW].

comment by Kaj_Sotala · 2021-11-18T22:39:59.671Z · LW(p) · GW(p)

There's certainly been discussion of people in EA feeling a moral obligation to spend all their time and money on making a positive impact. I've personally felt it and know several others who have, and e.g. these [1 2] articles discuss it, to name just a few examples.

I have probably spent hundreds of hours reading EA material, and have literally never come across an institutional publication with a phrase of the variety:

And Sasha never claims that you would! In fact he explicitly notes that you won't:

Generally, toxic social norms don’t develop intentionally, nobody wants them to happen, they’re not written down, and nobody enforces them explicitly.

comment by ChristianKl · 2021-11-18T13:16:01.880Z · LW(p) · GW(p)

Social norms and what's publicly endorsed are not the same thing. It's still debatable whether those norms exist, but this is a bad argument.

comment by Adam Zerner (adamzerner) · 2021-11-18T22:41:23.675Z · LW(p) · GW(p)

I agree with the general consensus in the comments that Sasha is under the wrong impression of what the rationality community is about. However, I think that this false impression is very telling. I suspect a lot of people also have this same wrong impression of what the rationality community is about. This seems like an important thing for us to pay attention to.

What can we do about it? I'm not sure. One thing that comes to mind is to simply repeat ourselves a lot.

comment by Rudi C (rudi-c) · 2021-11-21T19:07:38.550Z · LW(p) · GW(p)

I myself sometimes feel bad when I engage in, say, writing fiction. (Reading fiction is pretty obviously useless, so I know I am just doing it for “fun.” It doesn’t confuse me the way producing fiction does.) I was like this before I even knew there was a Rationality subculture. I don’t try to justify these behaviors at all; I am just not sure if they are aligned with my values, or not, and in what quantities they are healthy.

So while I agree with the gist of this post, I believe the core issue to be more of a tradeoff rather than an obvious evil.

comment by Alexander (alexander-1) · 2021-11-17T10:28:40.347Z · LW(p) · GW(p)

This was highly insightful. Thanks for sharing.

How would we go about disincentivizing this drift towards undesirable social norms? This seems like a situation in which individuals acting in their parochial best interest (virtue signalling, gaining social acceptance, over-identifying with an ideal) is detrimental to the group as a whole—and ultimately to the individuals whose identity has become defined by the group. I’m reminded of this quote from The Greatest Show on Earth by Dawkins:

Natural selection […] chooses between rival individuals within a population. Even if the entire population is diving to extinction, driven down by individual competition, natural selection will still favour the most competitive individuals, right up to the moment when the last one dies. Natural selection can drive a population to extinction, while constantly favouring, to the bitter end, those competitive genes that are destined to be the last to go extinct.

I don’t think these phenomena are particular to rationality and EA communities, and I don’t deny their existence. My personal experiences (for what they are worth) of these communities have been largely positive. I find LW to be reasonably tolerant. I recall reading a funny criticism of LW on RationalWiki claiming that LW is too tolerant of the less epistemically rigorous ideas. I’ve read horror stories on Reddit about toxic spirituality communities (mere anecdotes, I don’t have data on the toxicity of spirituality vs rationality). The drift towards cultishness is present in any human community, and as argued elsewhere on LW, it takes an unwavering effort to resist this drift.

Replies from: Viliam
comment by Viliam · 2021-11-17T23:47:14.815Z · LW(p) · GW(p)

How would we go about disincentivizing this drift towards undesirable social norms?

Perhaps it could be useful if we had some high-status members in the community who would sometimes very visibly do something non-rational, non-effective, non-altruistic, just because it is fun for them.

As an extreme thought experiment, imagine Eliezer Yudkowsky writing and publishing fan fiction. LOL

Replies from: alexander-1, Bezzi
comment by Alexander (alexander-1) · 2021-11-18T01:02:25.977Z · LW(p) · GW(p)

writing and publishing fan fiction

That made me chuckle. Or writing some of the funniest philosophical humour [LW · GW] I've read.

I don't understand the view that "rationalists" are emotionless and incapable of appreciating aesthetics. I haven't seen much evidence to back this claim, mere anecdotes. If anything, people who see reality more clearly can see more of its beauty. As Feynman put it, a scientist can see more beauty in the world than an artist because the scientist can see the surface level beauty as well as the beauty in the layers of abstraction all the way down to fundamental physics.

If someone consistently fails to achieve their instrumental goals by adhering too firmly to some rigid and unreasonable notion of "rationality", then what they think rationality is must be wrong/incomplete.

comment by Bezzi · 2021-11-20T10:48:46.877Z · LW(p) · GW(p)

Downvoted. Do you actually consider HPMOR non-rational and non-effective? It isn't just fan fiction, it's a tiny layer of fan fiction wrapped around the Sequences. Judging from the numerous comments in the Open Threads starting with "I've discovered LW through HPMOR", I think we could argue that HPMOR was more effective than the Sequences themselves (at least with respect to the goal of creating more aspiring rationalists).

More generally, every single piece of fiction written by EY that I've read so far involves very rational characters doing very rational things, and that's kind of the point. No one is saying that you shouldn't write fiction in general, but I do say that you shouldn't stop being rational while writing fiction. Or poetry. A rationalist poet should lean toward didactic poetry or the like (at least, that's what I would do [LW · GW]). I am probably biased against silly poetry in general, but I personally regard writing dumb verses as I regard eating unhealthy cookies... do it if you need them to have fun, but you shouldn't be proud of this.

Replies from: Viliam
comment by Viliam · 2021-11-21T00:45:17.024Z · LW(p) · GW(p)

More generally, every single piece of fiction written by EY that I've read so far involves very rational characters doing very rational things, and that's kind of the point. No one is saying that you shouldn't write fiction in general, but I do say that you shouldn't stop being rational while writing fiction.

This feels to me like a goalpost being moved.

Yes, Eliezer's characters do smart things, but the likely reason is that he likes writing them that way, the audience enjoys reading that, and he has a comparative advantage doing that. (Kinda like Dick Francis writes about horse racing all the time.)

And I guess HPMOR really is "the Sequences for people who wouldn't read the Sequences otherwise". But was it also strategically necessary to write this or this? New audience, perhaps, but strongly diminishing returns.

The original objection was that rationalists and effective altruists feel like they are not allowed to do things that are not optimal (fully rational, or fully altruistic). Writing HPMOR could have been an optimal move for Eliezer, but the following stories probably were not. They are better explained by a hypothesis that Eliezer enjoys writing.

comment by Nicholas / Heather Kross (NicholasKross) · 2021-11-19T01:14:41.949Z · LW(p) · GW(p)

I wrote my thoughts on this here [LW · GW].