Twitter thread on postrationalists

post by Eli Tyre (elityre) · 2022-02-17T09:02:54.806Z · LW · GW · 32 comments

I wrote the following as a thread on twitter. Kaj Sotala asked me to share it on LessWrong as well, so I have reproduced it here with minimal editing.

(If you are interested in my twitter content, but don't use twitter, I recommend reading it on threadreader. The user experience of reading long twitter threads on twitter is pretty frustrating, and threadreader is somewhat better. Also, I think engaging with twitter is net-bad for most people, and I feel morally obliged to make sure my content is hosted elsewhere, so that I don't add to the incentive for people to do things that are bad for them.)

Here's my theory of the cluster of people that call themselves post-rationalists.

Long ago, @ESYudkowsky wrote the sequences. The sequences had a lot of explicit content, but they also had a "vibe." 

The vibe was something like "intelligence dominates everything; the way to power and winning is thinking really well. If you think well enough, you will be awesome." 

Also, "arrogance and snark about how insane everyone / the world is."

Also, "ambitious, heroic, glorious." 

People read the sequences, and they got whatever they got out of them, but a LOT of what people picked up on was the "vibe". 

For instance, one major thing that I got from the sequences is that "rationalists should win". That I should be able to get what I want by being clever, and that I shouldn't often be in situations where I'm not getting what I want. And I should be indignant about them. 

[Note: when I say "indignant" here, I don't mean an attitude of "this is unfair. Now I will mope." It was more like, "I'm better than this. I will now rise above."]

In 2015, this became a sort of mantra for me. 

In places where other people would be satisfied with the lemons that life gave them, _I_ should win. 

The actual post where it says that rationalists should win is about Newcomb's problem: it's a [reasonably] abstract point about the philosophy of rational choice. 

I mostly wasn't, in this instance, making use of the explicitly stated content. It's not like I was using some _specific_ reasoning technique that is more effective than baseline, and that's why I would win. 

I was adopting an attitude that I picked up on. 

(Compare this to the case where a person reads the sequences and starts doing explicit Bayesian calculations, [edit: regularly], and expects to do better that way. In doing that, a person is responding more to the content, and less to the vibe.) 

I want to emphasize that there's a kind of arrogance, here, of "I'm different than other people, such that they sometimes don't get what they want, but that is beneath my dignity", which I totally had before reading the sequences. 

I didn't get this entirely from Eliezer. Rather, Eliezer put out a thing that resonated with me. 

The sequences matched with, and _justified_ some attitudes that I already had. And they inculcated some new attitudes in me, via mimesis. 

That's not unique to me. Many of the people who were attracted to the sequences were attracted by the _vibe_, mostly. 

It isn't like people were ignoring the explicit content, but reading the sequences is FUN for some people. 

Some of that is because it is just so fascinating. 

But some of it is because you either resonate with, or enjoy, or feel ego-boosted, by the vibe. 

You'll like the vibe of the sequences more if you enjoy the snark and the superior attitude, rather than being someone who is offended by it. Or if you are proud of your intelligence, maybe? 

So the sequences predictably attracted some particular types of people: 

  • Some people who are smart and non-functional, and sincerely see "rationality" as a way to turn their life around, which is comfortable for them, because it involves doing what they were already good at: school stuff / playing with ideas.
  • Some people who are smart and non-functional, who are less trying to be better, and more looking for a way to feel like _they're_ actually the superior ones; everyone else is irrational, and that's what matters.
  • Some people who want to feel important, and found this x-risk narrative as a way to feel important, or have meaning.
  • Some people who resonate a lot with "thinking better is the path to success", because that has actually worked for them in their experience.
  • Lots and lots of people who were nerd-sniped by the ideas, and find social satisfaction from the rationalist social game: jockeying with each other to show how much you know by bringing to bear community shibboleths and/or generically "smart frames" on a question.

The "people pick up on the vibe of the sequences" effect is magnified, because if you went to a rationalist meetup, it would be filled with people who self selected on the basis of resonating with or liking the vibe of the sequences. 

**And if the explicit content deviates from the vibe, the content sometimes gets lost or muddled.** 

[The paragraph below] explicitly says "Don't be a straw Vulcan. Rationality is not about being emotionless!" 

But also, the energetic "feel" of this paragraph, for me, is abstracted, and cognitive, and kind of keeping the emotions at arm's length. 

And people will often interpret sentences like these according to the vibe that they project on to them.

And the result is that people say words like "rationality is not about only using your S2", but ALSO they mostly end up trying to do things with explicit S2 thinking. 

People manage to do things like ignoring their emotions (while also thinking they're awesome at emotions) [LW · GW]. 

And it is pretty hard to notice if you're doing something like this! 

So "rationality" for a lot people, is, for better or for worse, largely about that vibe. 

(Especially, though not uniquely, for people who are far from the in-person community. People who live in the Bay have more feedback about how this rationality stuff lives in the likes of Eliezer and Anna.) 

And sometimes people live by that ethos for a while, and then find that it doesn't serve them. 

Like, the stance that they're taking towards the world, or towards themselves, isn't good for them, and doesn't help them get what they want. 

The way they relate with their emotions is bad for them. 

Or the way they "overthink" things is neurotic instead of helpful. 

Or the way that they intellectualize everything is isolating. They're not actually satisfied with the human connection that they get that way. 

And so they reject the ethos which they've come to associate with "rationality". 
(Which, they probably learned, in part, from the sequences, and in part had all along. 

And, even if they were doing it all along, rationality culture probably gave them some new and better tools and justifications for hurting themselves.) 

But when they reject the rationality ethos, something kind of funny happens, because every time they make a specific claim about what they're rejecting, some rationalist (like me!) will pop up and point out that the specific claim that they're making about why rationality is not adequate is not only a thing that rationalists are aware of, but is RIGHT THERE in the sequences! 

They'll say something like "rationality is about thinking through everything, explicitly, and that's not actually effective." 

And then I and others will face-palm and be like "did you READ the sequences? Attend a CFAR workshop? We say in like a hundred places that the point is not to reason through everything explicitly!"

https://twitter.com/ESYudkowsky/status/1462132727583440896?s=20&t=JWoyf4oI_wHmM3ZJgf_pfg

But that's ok. People picked up on a vibe, and the way they embodied that vibe didn't get them what they valued, and so they're correctly pushing back against [something] that they don't want. 

In summary: 

It's all good. People should do things that are good for them. 

Communication is hard. 

It might be nice if people were more curious about how "rationality" sits in others, instead of fighting about what it is or isn't. But no one has to do that either. : )

32 comments


comment by ztzuliios · 2022-02-17T23:32:19.694Z · LW(p) · GW(p)

I think the problem is less in the "vibes" and more in the kind of person that is attracted to rationalism for rationalism's sake.  Ironically, this is also something discussed in the sequences. [LW · GW] I once introduced the sequences to the anarchists I did activism with in college, and while some of them rejected most of it out of hand because it seemed politically unorthodox, some found real value in it and were able to use it to improve their lives or work in various ways. I've met many people since who were not "rationalists" or part of the "rationalist community" but certainly also did the same thing.  

Maybe I am just too traumatized and autistic, but I find it hard to read the sequences without seeing a very hard, emotional overtone.  The tone of the early sequences seems almost desperate.  It seems hard to read "if the hot iron approaches your face, it is rational to feel fear" as something disembodied.  It seems impossible to read "shut up and multiply!" as something unemotional.  But "disembodied" has become a very trendy accusation in the last few years, and I'm not surprised to see it standing in for the more obviously ableist slur "autistic."

Replies from: ChristianKl
comment by ChristianKl · 2022-02-18T11:19:03.245Z · LW(p) · GW(p)

The tone of the early sequences seems almost desperate.  

Dissociating from emotions is one of the standard ways people deal with being desperate. That's a way for people to become disembodied. 

Replies from: ztzuliios
comment by ztzuliios · 2022-02-20T17:32:41.464Z · LW(p) · GW(p)

So if the sequences are unemotional, they're disembodied, but if they are emotional, they're also disembodied?

Edit: In hindsight I'm conflating the article Kaj Sotala links and the OP's tweets about the sequences keeping emotion "at arm's length." I don't really agree that the sequences keep emotions at arm's length, but sure, if this is not the same thing as being "disembodied," they might still be emotional and disembodied.

That said, what is the test that tells us something is not disembodied? Is any attempt to improve one's life through reason on its own evidence for disembodiment? What does the alternate-universe "embodied" version of the sequences look like? 

Replies from: ChristianKl, gworley, mr-hire
comment by ChristianKl · 2022-02-20T19:44:20.947Z · LW(p) · GW(p)

The word disembodied is more about how people relate to their emotions than whether or not they have them. 

If I imagine myself 15 years ago, I don't think it would have been easy for me to grasp what people mean when they speak about being disembodied in a case like this. There's probably room for a good post that explains the concept.

comment by Gordon Seidoh Worley (gworley) · 2022-02-21T19:11:29.811Z · LW(p) · GW(p)

I framed it another way, but I made a whole post about how a lot of people, including non-rationalists, are disembodied in a variety of ways, although I use a different term to describe it. See "You are Dissociating (probably) [LW · GW]". Maybe that will help clear up some of the confusion about what people mean when they say "disembodied".

Replies from: ztzuliios
comment by ztzuliios · 2022-02-24T22:01:01.259Z · LW(p) · GW(p)

I'm already familiar, at least at that level, with dissociation, derealization, and depersonalization. That said, the claim made in the OP, and in the article Kaj Sotala links echoing the same viewpoint, seems to be less that there are emotions dealt with in unhealthy ways within the sequences, and more that there are no emotions at all in the sequences, that rationalism is a project to replace all intuitive/automatic/uncontrolled processing with explicit/intentional/controlled processing.

Personally I think the construct actually being discussed seems more like avoidance than "embodiment" or dissociation. Clinically, dissociation tends to be very severe; when clinicians talk about PTSD survivors feeling "disembodied," they're referring to tactile hallucinations representing a noticeable disruption of one or more senses. I think your post is similarly a much more expansive reading of the clinical definitions, though IANApsychiatrist. 

Replies from: gworley
comment by Gordon Seidoh Worley (gworley) · 2022-02-26T02:39:40.642Z · LW(p) · GW(p)

Yes, I agree. I think I say as much in the post itself.

comment by Matt Goldenberg (mr-hire) · 2022-02-21T17:25:03.343Z · LW(p) · GW(p)

If the sequences relate to emotions from a disembodied frame, they're disembodied.

Replies from: ztzuliios
comment by ztzuliios · 2022-02-24T22:12:03.711Z · LW(p) · GW(p)

What is the appropriate way to relate to emotions? How could the sequences have avoided disembodiment? The person who originally used that term seems to think that anything like the sequences would be similarly "disembodied," which makes me think that this issue is less about the inappropriate way the sequences relate to emotion, and more about the hubris of attempting to self-improve in ways other than those described in The Tao of Fully Feeling.

Replies from: mr-hire
comment by Matt Goldenberg (mr-hire) · 2022-02-28T01:54:59.766Z · LW(p) · GW(p)

I think there are other things like the sequences around that are much more embodied.

Replies from: elityre
comment by Eli Tyre (elityre) · 2022-08-20T19:55:51.952Z · LW(p) · GW(p)

Examples?

comment by Gordon Seidoh Worley (gworley) · 2022-02-18T01:13:36.527Z · LW(p) · GW(p)

Copying over my replies to this thread on Twitter:

perhaps a subtle variation on this story, but one of the things that pushed me to grow the postrationality meme in the early days was that other people were sucked into this vibe and it just seemed like the best way to get people out was to get them to move on.

many things i wrote were because people, including people in the Bay and who attended cfar workshops, just utterly sucked at applying the lessons of the sequences because they gave off a vibe that made everyone get it wrong, such that it didn't matter what was actually written

i now kind of regret it. not because i think i was wrong, but because i failed to fully appreciate how much the postrationality memeplex would suck in prerationalists. now postrationality things seem like a lot of people not getting they have to be rationalists first.

anyway, I've mostly moved on. not because it doesn't matter at all, but because I've come out the other side of the process and know there's so much work each person has to do for themselves in their own way that it mostly doesn't matter as much as i thought it did

Replies from: gworley, Chris_Leong
comment by Gordon Seidoh Worley (gworley) · 2022-02-18T03:19:05.021Z · LW(p) · GW(p)

One thing I didn't say in that thread that's probably worth adding:

I think a lot of my most contentious posts over the years come down to me arguing against the vibe of the sequences, people trying desperately to refute me by pointing to the content while actively giving off the vibe, and me flailing in desperation to get them to see that they're failing to actually live the things they're claiming to believe. I'm not sure it always came off that way, but that's certainly what it felt like.

The only reason my posts are less contentious now is that I've given up on trying to directly point out the vibe/content mismatch. I instead come at it with more subtlety and stick to the content. I'm not sure if this is actually better or not, though. I mean, sure, I don't have tons of people downvoting me in the comments and long comment threads where people talk past each other, but also maybe some people who need to hear that they are living the vibe rather than the content don't get that message because it isn't made directly enough.

This framing might at least be a way for me to return to my old ways. I can just say "hey, look, I don't care what you have to say about the content of the sequences, I'm talking about the rationalist vibe and what's wrong with it".

comment by Chris_Leong · 2022-02-18T01:52:55.367Z · LW(p) · GW(p)

Agreed, the biggest problem with post-rationality is people falling into the pre/trans fallacy:

 

Replies from: jalex-stark-1
comment by Jalex Stark (jalex-stark-1) · 2022-02-19T14:32:05.351Z · LW(p) · GW(p)

I mostly know this idea as pre-rigor and post-rigor in mathematics:
https://terrytao.wordpress.com/career-advice/theres-more-to-mathematics-than-rigour-and-proofs/

comment by Kaj_Sotala · 2022-02-17T20:05:02.092Z · LW(p) · GW(p)

Thank you for sharing the post. :)

For the benefit of others (I know you saw this already), Rival Voices also had a previous article making a very similar argument on how the content of the Sequences are separate from their vibe, and how that has affected the way they've been received. Excerpt:

This thesis shows how ambient meaning can be encoded in a work of literature, above and beyond the truth-value of the propositions in that work. The thesis is long (150 words) so here’s an intuition pump instead, using hyperintensionality: basically there are a million ways of saying the same thing, in terms of referents and their relations. (The usual example is contrasting “the evening star” to “the morning star”. They both refer to the same thing “out there” but there is a stylistic difference in using one or another and thus the choice of using one or another is informative.) So we get that stylistic choices - which don’t affect content! - are informative, but what are they informative of? The claim is that it is informative of the author’s worldview. Of how it is like that the world feels to them, of the what-it’s-like to be them, in the world. [...]

If you don’t wanna read that just think of Eliezer’s, Moldbug’s, or Zero HP’s writing. They each have a super strong vibe to their work, each distinct from the others. Ask each of them to express the same proposition and you get three very different works that feel very different.

So I’m trying to make the point that you can conceptually separate a work’s content from its vibe.

Given that, and that EY has a body of work, his work can be separated into two parts:
(1) content, and
(2) vibe.

I’d claim that the vibe can be summed up with one word: “disembodied”. Reading him you get the what-it’s-like, the feeling, of being an extremely smart mind that forgot it even has a body, much less it is one. [...]

As the work can be divided into content and vibe you can get objections to the former or the latter.

The objections to the latter, to its vibe, its mood, its aesthetics, give you the post-rats. They don’t deny Eliezer’s content, but they very much deny his vibe. In this denial they substitute Eliezer’s vibe with whichever other vibe. There is no coherent doctrine, no issue positioning, nothing unifying other than the criticism of rats - knowingly or unknowingly because of their disembodied aesthetic - and an attention to aesthetics, since that is what was being disregarded. This is why postrats aren’t a coherent group and why postrat is just an umbrella term. This is why there are “1000 schools of post-rationality”. Everyone sides with their favorite aesthetic and there is no quorum on it, so there isn’t a real group.

comment by Adele Lopez (adele-lopez-1) · 2022-02-21T00:50:16.605Z · LW(p) · GW(p)

It seems likely that the "disembodied vibe" is doing something important w.r.t. rationality. For example, it seems like being embodied could make it more difficult to be scope sensitive. From what I've seen, people who get into embodiment/post-rationality also have a strong tendency to get into woo and/or otherwise throw epistemic hygiene to the wind.

That's not to make an excuse for being comfortable with having a "disembodied vibe"; it does seem to have downsides as well (at least in my experience of being someone with such a vibe). My point is more that it's likely to be a mistake to throw the entire "disembodied vibe" out wholesale - some parts of it are worth keeping.

Replies from: mr-hire, elityre
comment by Matt Goldenberg (mr-hire) · 2022-02-21T17:23:48.725Z · LW(p) · GW(p)

I think this is more the case of the pre/trans fallacy.  There are "post rationalists" who are drawn to the type of things post-rationalists are doing, but don't have the same ability to sandbox the frames they're stepping into from their epistemic sensemaking, because they were never actually rationalists.

comment by Eli Tyre (elityre) · 2022-02-25T07:41:30.597Z · LW(p) · GW(p)

That's not to make an excuse for being comfortable with having a "disembodied vibe"

I feel like no excuses need to be made for having a disembodied vibe. The word sort of has negative connotations, but I think it doesn't have to. This is just one way that people can be, and it has costs and benefits, just like everything.

comment by Rafael Harth (sil-ver) · 2022-02-20T15:12:24.265Z · LW(p) · GW(p)

Not arguing that the vibe isn't the bigger part in many cases, but I do think there is a mechanism that makes people underestimate the content, because parts of it become so obvious that you don't question them. That is, until you talk to someone who e.g. thinks evidence = scientific evidence and that the simulation hypothesis isn't true because it's not scientific.

comment by ChristianKl · 2022-02-17T23:19:05.114Z · LW(p) · GW(p)

There's a thing with people who deconvert from Christianity. They often have something like a god-sized hole, and sometimes they start to believe in evolution in a way that's silly. They have all sorts of ideas about evolution that a professor in the subject doesn't have. You could call the professor a "postevolutionist" but that has all sorts of implications that aren't really warranted.

Taking rationality to be about "thinking through everything" comes easily to people who haven't invested much time in rationality. If I see someone with that mindset I don't think "well, that's a rationalist". On the other hand, if I hear someone say "the value of information of spending five more minutes thinking about that is good, while the value of information of spending an hour on it isn't", that's to me an immediate flag for someone showing the rationalist vibe. 

Rationality is about thinking well and that's what the vibe is about, but "all thinking should be explicit" is just a thinking error. 

Replies from: Slider
comment by Slider · 2022-02-18T18:31:37.978Z · LW(p) · GW(p)

If there are a variety of responses to the material that the author didn't intend, and we only count "positive" deviations as "being true" to the original and "negative" deviations as "not being true", that paints a rosier picture than what the communication actually accomplishes.

Then there is also the effect where a smart listener can imagine what would have been a smart thing for the speaker to say, or "what they must have meant".

If there is a reasonable listening process that results in a nonstandard interpretation, that should probably be taken seriously.

Replies from: ChristianKl
comment by ChristianKl · 2022-02-19T18:30:45.644Z · LW(p) · GW(p)

If you take Quantum physics, then you find that some people upon learning of it start talking about Quantum consciousness and others start doing proper physics.

Doing proper physics takes work. Rationality also takes work. Without putting in the work neither is worth much. 

comment by EniScien · 2022-06-05T05:37:57.601Z · LW(p) · GW(p)

I just now realized that before this I did not understand the motives of the post-rationalists at all and was extremely confused about them; this post made it much clearer for me.

comment by tristanhaze · 2022-02-18T03:11:31.239Z · LW(p) · GW(p)

This is an instance of an arc that clever people have been going through for ages, so I'd like to see more teasing apart of the broader phenomenon from the particular historical episode of the Sequences etc.

A lot of the mixed feelings and lack of identification as rationalists on the part of lots of people who found the Sequences interesting reading is to be explained in terms of their perceiving the vibe you describe and being aware of its pitfalls.

comment by Alexander (alexander-1) · 2022-02-27T01:16:21.477Z · LW(p) · GW(p)

Excellent post, thanks, Eli. You've captured some core themes and attitudes of rationalism quite well.

I find the "post" prefix unhelpful whenever I see it used. It implies a final state of whatever it is referring to.

What meaning of "rationality" does "post-rationality" refer to? Is "post-rationality" referring to "rationality" as a cultural identity, or is it referring to "rationality" as a process of optimisation towards achieving some desirable states of the world?

There is an important distinction between someone identifying as a rationalist but acting in self-defeating and antisocial ways and the abstract concept of optimisation itself.

I started attending in-person LessWrong meetups a few months ago, and I've found that they attract a wide range of people. Of course, there are the abrasive "truth-seekers" who won't miss an opportunity to make others feel terrible for saying anything that they deem to be factually or morally imperfect. However, on the whole, this is not much different from any other group of people I engage with. I fail to see how prefixing a word with "post" solves any problems.

comment by Mathieu Putz · 2022-02-19T20:05:00.426Z · LW(p) · GW(p)

Sorry for the tangent, but how do you recommend engaging with Twitter, without it being net bad?

Replies from: sil-ver, elityre, mr-hire
comment by Rafael Harth (sil-ver) · 2022-02-20T15:16:09.295Z · LW(p) · GW(p)

My advice: follow <50 people, maybe <15 people, and always have the setting on "Latest Tweets". That way, the algorithms don't have power over you, it's impossible to spend a lot of time (because there just aren't that many tweets), and since you filtered so hard, hopefully the mean level of interesting-ness is high.

comment by Eli Tyre (elityre) · 2022-02-25T07:59:32.682Z · LW(p) · GW(p)

Different setups for different people, but for me twitter is close to a write-only platform.

I use a roam extension that lets me write twitter threads from my roam, so that I can post things without needing to go to the site. (I have a special roam database for this, because otherwise I would be concerned about accidentally posting my personal journal entries to twitter.)

I have twitter blocked on my main machine. I have a separate chromebook that I use to browse twitter itself. 

Even on that twitter-specific chromebook, I've blocked the twitter feed, and I use Intention to limit my engagement to less than an hour a day, with "cool-downs". I've sometimes relaxed these constraints on my twitter laptop, but when I don't have Intention set up, for whatever reason, I'll often get sucked into four-hour, engaging/interesting, but highly inefficient twitter conversations.

Every few days, I'll check twitter on my twitter machine, mostly looking through and responding to my messages, and possibly looking at the pages of some of my favorite people on twitter. 

All of this is to avoid the dopamine loops of twitter, which can suck up hours of my life like nothing else can. 

The character of my personal setup makes me think that maybe it is unethical for me to use twitter at all. Posting, but mostly not reading other people's content, in particular, seems like maybe a defection, and I don't want to incentivize people to be on the platform. (My guess is that twitter is only a little worse for me than for the median twitter user, though there are also twitter users for whom it just straight up provides a ton of value.) 

To counter this, I add all my threads to threadreader, so that people can read my content without needing to touch the attention-eating cesspool.

Replies from: Mathieu Putz
comment by Mathieu Putz · 2022-02-27T00:13:07.467Z · LW(p) · GW(p)

That's a very detailed answer, thanks! I'll have a look at some of those tools. Currently I'm limiting my use to a particular 10-minute window per day with freedom.to + the app BlockSite. It often costs me way more than 10 minutes (checking links after, procrastinating before...) of focus though, so I might try to find an alternative.

comment by Matt Goldenberg (mr-hire) · 2022-02-21T17:28:54.395Z · LW(p) · GW(p)

Find and follow people you actually want to be friends with and interact with them as you would actual friends.

When you post, ask yourself if this is something that your friends would find fun or valuable or useful.

comment by Richard_Kennaway · 2022-06-05T09:39:20.222Z · LW(p) · GW(p)

The concept of "vibe" seems very open to being used as an excuse to read whatever you want into a text, whether it is there or not. Anyone not seeing what you think you see can then be dismissed exactly because they do not see it. Any request for evidence of the "vibe" can be dismissed as missing the point by asking for evidence. What answer can someone give to the accusation, "Yes, you didn't say that, but you vibed it"?

The concept is similar to "subtext", but fuzzier. So much fuzzier, that I think perceived "vibe" comes more from the reader than the writer.