How can we stop talking past each other when it comes to postrationality?
post by ChristianKl · 2021-03-12T15:15:20.948Z · LW · GW · 19 comments
In LessWrong: A Community for Intellectual Progress w/ Oliver Habryka and Ben Pace at The Stoa, the two answer a question about postrationalism and metarationalism.
In it they assert that David Chapman's critique isn't directed at the rationality of LessWrong and that David Chapman hasn't read LessWrong. Given that David Chapman has 220 karma on LessWrong at the time of this writing, I don't think it's fair to say that he doesn't understand LessWrong.
As I understand Oliver, from his perspective rationality at LessWrong is at the moment a pre-paradigmatic field: a field that doesn't have a paradigm of rationality yet. From David Chapman's perspective, Eliezer Yudkowsky proposed a paradigm of Bayesianism in the sequences. David Chapman wrote a critique of that paradigm in 2015, based on his work as an MIT PhD in artificial intelligence.
From David Chapman's perspective, some people adopt that paradigm after reading the sequences. Being at the stage of Kegan 4 means being able to reason according to a paradigm. When David Chapman uses the word rationalism, he means both people who picked up a paradigm of rationality after reading the sequences and people who got a paradigm of reasoning from a good university education and are able to reason according to it.
In the talk, Ben speaks about a time when he actually did reason from one paradigm and couldn't really engage with other schools of thought. Later, Ben moved on and did become able to engage with other schools of thought. This moving on is likely what David Chapman would model as the transition from Kegan 4 to Kegan 5.
One perspective would be to say that when Ben read the sequences at 13, he adopted a suboptimal paradigm and later moved on from that paradigm. From the perspective of Kegan's framework, however, adopting that paradigm was likely very good for Ben's development, as it allowed him to go from Kegan 3 to Kegan 4, which is an important step in development. Not everyone moves from Kegan 3 to Kegan 4, and many people need a good university education to make the transition. Making that step at 13 is fast cognitive development.
Globally, we have a problem: while universities used to be good at getting people to make the Kegan 3 to Kegan 4 transition, postmodern thought reduced the ability of university programs to get their students to make that transition. At Kegan 3, people have the issue that "other people are experienced as sources of internal validation, orientation, or authority". A lot of toxic culture war dynamics come from people waging power struggles who haven't made it to Kegan 4.
If people who self-educate or are in a failing university program can read the sequences and HPMOR and move from Kegan 3 to Kegan 4 by adopting a paradigm of rationality, that's great. It's a lot better than telling them: "We are pre-paradigmatic at the moment, so sorry, we don't have a paradigm to give you to make your Kegan 3 to Kegan 4 transition".
Taking that stance has the same issue that the postmodernists had when they attacked paradigms from the perspective of Kegan 5. You can't bring people directly from 3 to 5, so a lot of the new generation stayed stuck at Kegan 3, with all the bad consequences that this entails. The people who put postrationalism on their banner have no good way of dealing with people at Kegan 3 and face the risk of getting filled with people at that level.
19 comments
Comments sorted by top scores.
comment by habryka (habryka4) · 2021-03-12T17:51:13.256Z · LW(p) · GW(p)
David Chapman has said himself that when he is referring to rationality, what he is talking about has nothing to do with LessWrong. He is referring to the much older philosophical movement of "Rationalism". The whole thing with Chapman is literally just an annoying semantic misunderstanding. He also has some specific critiques of things that Eliezer said, but 95% of the time what he critiques as rationalism has absolutely nothing to do with what is written on this site.
Replies from: habryka4, Kaj_Sotala
↑ comment by habryka (habryka4) · 2021-03-12T17:55:19.310Z · LW(p) · GW(p)
Also, having 220 karma on the site is really not much evidence you understand what rationality is about. David Gerard has over 1000 karma and very clearly doesn't understand what the site is about either.
I am pretty sure Chapman has also said he hasn't read the sequences, though generally I think he understands most content on the site fine. The problem is again not that he doesn't understand the site, but just that he is using the word rationality to mean something completely different. I like a bunch of his critique, and indeed Eliezer made 90% of the same critiques when he talks about "old Rationality" in the sequences.
See this tweet:
https://twitter.com/Meaningness/status/1298019579978014720
Not sure to what extent I’m subtweeted here, but in case clarification is helpful, by “rationalism” I do NOT mean LW. It’s weird that the Berkeley people think “rationalism” is something they invented, when it’s been around for 2600+ years.
Prototypical rationalists are Plato, Kant, Bertrand Russell, not Yudkowsky. If you want a serious living figure, Chomsky or Dennett maybe.
Replies from: ChristianKl
↑ comment by ChristianKl · 2021-03-12T18:34:04.143Z · LW(p) · GW(p)
The problem is again not that he doesn't understand the site, but just that he is using the word rationality to mean something completely different.
Yes, but he does use the word Bayesianism to talk about the paradigm of the sequences. He has written a substantial criticism of Bayesianism (which is Yudkowsky and not Plato, Kant, or Bertrand Russell).
David Gerard has over 1000 karma and very clearly doesn't understand what the site is about either.
David Gerard not only has 1000 karma but also had, for a long time, admin rights at least on our Wiki. I think it's strawmanning him to say that he just doesn't understand LessWrong when he spent years in our community and then decided that it's not the right place for him anymore.
I also think there's an issue here with saying that people who spent most of their time on LessWrong long before you signed up your account, and who then left with critiques, simply don't understand what LessWrong was about.
I think David has a strong sense that it's important to put faith in established authorities, and he correctly assesses that LessWrong is not about following established authority. It's the same clash that gets him to write against cryptocurrency.
Replies from: Viliam, habryka4, habryka4
↑ comment by Viliam · 2021-03-12T23:59:38.904Z · LW(p) · GW(p)
I looked at how exactly David [LW · GW] got the 1000 karma points here, curious whether there were some super popular articles I missed. The answer seems to be that he created lots of Open Threads and similar, each of them getting a few upvotes. The second most frequent thing is links to external articles (including two articles about why you shouldn't buy Bitcoins, written in 2011 [LW · GW] and 2014 [LW · GW], heh).
Looking for a text he wrote that isn't a short comment on a linked external page, I found "Attempt to explain Bayes without much maths, please review [LW · GW]", a reasonable short non-technical summary of what Bayesianism means, currently at 17 karma.
Plus lots of comments, most written in 2014 or earlier, and many of those are quite good! (Then he got downvoted in debates about chiropractors [? · GW] and neoreaction [? · GW], and then he quit. He returned briefly in 2017 when I triggered him with my comment [LW · GW] on his edits to the Wikipedia page about LessWrong.)
I think David has a strong sense that it's important to put faith in established authorities, and he correctly assesses that LessWrong is not about following established authority. It's the same clash that gets him to write against cryptocurrency.
To me, this seems like the essence of RationalWiki. Rationality = what the mainstream authorities approve of. Going beyond that, that's crazy talk, and needs to be called out. For example, the RW page on LW cannot resist pointing out that the Sequences present "many-worlds (a mainstream, but by no means universally accepted interpretation of quantum mechanics) as if it was proven fact". Like, the authorities don't even oppose this, they merely don't universally accept it, and RationalWiki already needs to warn the reader about the heresy.
Dunno, seems to me that David is really passionate about things, and he means well, and he tried to like LessWrong despite all its faults... but he has a very strong need to be right / to be on the right side, and he takes it really hard when something wrong happens; he can't tolerate disagreement even if it's in the past and no one other than him cares anymore. The basilisk is history: it was scary at first, then it was boring, and then it got annoying. Neoreaction is history: we flirted with them a bit, then they got boring and annoying, and we showed them the door; we don't think about them anymore. For David, it's probably still alive, still worth talking about; he needs closure. It must be documented on Wikipedia that he was right and we were wrong. Even his bullying of Scott is kinda like religious fanatics harassing an apostate; from their own perspective they mean well, it is tough love, they are trying to make him repent and save his soul! (Except, oops, now David is officially banned from editing the Wikipedia article on Scott.)
Of course, this is all mind-reading, and I may be completely wrong. But it's my current best guess. I know David is a good guy in his own story; now I believe I got a glimpse of it, and it just makes me feel sad. I wish him well, and I also wish he could get over LessWrong and the rationalist community already. Though that seems unlikely to happen; the internet is a small place, and we will keep bumping into each other.
Replies from: TAG
↑ comment by TAG · 2021-03-13T00:32:57.275Z · LW(p) · GW(p)
LW cannot resist pointing out that the Sequences present “many-worlds (a mainstream, but by no means universally accepted interpretation of quantum mechanics) as if it was proven fact”. Like, the authorities don’t even oppose this, they merely don’t universally accept it, and RationalWiki already needs to warn the reader about the heresy
You say that like it's a bad thing.
Knowing how sure science is about its claims is an important part of science. A member of the reading public who believes that the interpretation of quantum mechanics is a complex subject that even the experts don't understand has a better understanding than someone who thinks MWI is 99.999% certain... even if MWI is correct.
...warn the reader about the heresy
Science says there are a number of possible answers; LessWrong says there is one... who is being more religious?
Replies from: Viliam
↑ comment by Viliam · 2021-03-13T15:49:18.046Z · LW(p) · GW(p)
Suppose you have three hundred scientists, and three competing interpretations. A hundred scientists believe S1, a hundred believe S2, and a hundred believe S3. LW believes S1.
I guess my point is that I wouldn't consider it necessary to add a disclaimer to the three hundred scientists. Therefore I don't consider it necessary to add such a disclaimer to LW.
But of course, adding disclaimers to everyone is also a consistent opinion.
To me it seems like a funny mission creep in the context of RationalWiki: start with calling out pseudoscience, end with calling out people who agree with some-but-not-all mainstream scientists.
Replies from: TAG
↑ comment by TAG · 2021-03-13T16:10:05.079Z · LW(p) · GW(p)
If you have a hundred scientists scattered through the world who believe S1, and they have nothing else in common, that's one thing. If they all live together, know each other, and go to the same church, then there is reason to believe their acceptance of S1 is groupthink, and not pure scientific objectivity.
↑ comment by habryka (habryka4) · 2021-03-12T20:43:47.973Z · LW(p) · GW(p)
No, his critique of Bayesianism is also attacking something very different from the sequences; it is again talking about something much narrower. Indeed, substantial fractions of the sequences overlap with his critique of Bayesianism (in particular all the stuff about embeddedness, logical uncertainty, incomputability, and TDT-style concerns). I don't think he agrees with everything in the sequences, but when he writes critiques, I am pretty sure he is responding to something other than the sequences.
↑ comment by habryka (habryka4) · 2021-03-12T20:45:18.570Z · LW(p) · GW(p)
David Gerard not only has 1000 karma but also had, for a long time, admin rights at least on our Wiki. I think it's strawmanning him to say that he just doesn't understand LessWrong when he spent years in our community and then decided that it's not the right place for him anymore.
No, just because you spend years here does not mean you understand the core ideas.
I think we have plenty of evidence that David Gerard frequently makes up random strawmen that have nothing to do with us. Maybe there is a small corner of his mind that does have an accurate model of what we are about, but almost always when he writes something, he says random things that have very little to do with what we actually do.
↑ comment by Kaj_Sotala · 2021-03-12T18:13:59.480Z · LW(p) · GW(p)
Chapman has also specifically said that he does not understand LW:
I frequently emphasize that by “rationalism” I am specifically NOT referring to the LW usage, and that is not the target of my critique. I gave up on trying to understand LW rationalism ~5 years ago.
comment by spkoc · 2021-03-12T16:10:24.142Z · LW(p) · GW(p)
I have a shallow overview of the post-rationality vs. rationality debate (I've read a few posts about it), but to me it just seems like a semantic debate.
Camp "post-rationalism isn't a thing" argues that rationality is the art of winning. Therefore any methods that camp "post-rationalism" uses that work better than a similar method used by people in camp "post-rationalism isn't a thing" is the correct method for all rationalists to use.
The rationalist definition is sort of recursive. If you live the ideology correctly, then you should replace worse methods with better ones. If it turns out that Bayesian thinking doesn't produce good (or better than some alternative) results, rationalist dogma says to ditch Bayesianism for the new hotness.
Taken to an extreme: in a brute survival context a lot of the current ... aesthetics or surface level features of rationalism might have to be abandoned in favour of violence, since that is what survival/winning demands.
But it can't be that simple or there wouldn't be a debate so what am I missing?
Replies from: Viliam
↑ comment by Viliam · 2021-03-13T00:40:36.606Z · LW(p) · GW(p)
I believe this is mostly correct, and the missing part is the "post-rationalism" camp saying: this is only what you say you would do, but you never actually do it. Talking about the nameless virtue of rationality [LW · GW] is not the same as actually practicing it. Like, you preach that a map is not the territory, and then you take your Bayesian map and refuse to look at anything else. You don't even have the ability to seriously consider multiple maps at the same time, a.k.a. Kegan level 5.
Well, that's the motte. The bailey is that the "post-rationalism" camp believes they already found the better method, and it's mysterious [? · GW], cannot be explained to someone who doesn't have it already (but the people who have it can recognize each other); it is related to Buddhism, but wise spiritually grownup people of other religious traditions probably also have it, only atheist nerds don't have it; and it gives you an immediate perception of reality. From their perspective, "rationalists" are just retarded kids. If you don't believe this, it just proves you're stupid. (Rationalist: "What?!" Post-rationalist: "Exactly.")
My preferred answer is that I am at Kegan level 6 (a level so high that Kegan himself didn't know it existed), so of course the post-rationalists who are mere Kegan level 5 simpletons cannot understand me. Your turn.
comment by Ben Pace (Benito) · 2021-03-14T06:42:14.096Z · LW(p) · GW(p)
In the talk, Ben speaks about a time when he actually did reason from one paradigm and couldn't really engage with other schools of thought. Later, Ben moved on and did become able to engage with other schools of thought. This moving on is likely what David Chapman would model as the transition from Kegan 4 to Kegan 5.
I think I've never used the term 'Kegan levels' before, but I admit I'm more likely to start using it if you keep going around saying that I'm at level 5 ;)
comment by Emiya (andrea-mulazzani) · 2021-03-12T15:53:26.089Z · LW(p) · GW(p)
One perspective would be to say that when Ben read the sequences at 13, he adopted a suboptimal paradigm and later moved on from that paradigm. From the perspective of Kegan's framework, however, adopting that paradigm was likely very good for Ben's development, as it allowed him to go from Kegan 3 to Kegan 4, which is an important step in development. Not everyone moves from Kegan 3 to Kegan 4, and many people need a good university education to make the transition. Making that step at 13 is fast cognitive development.
I think this would be easier to follow if you defined what Kegan 3 means before you use the term in this paragraph. It left me struggling a bit to follow, but I'm pretty tired at the moment, so maybe that's just on me.
It was a very interesting read and I think it's a good frame to look at things.
I'm a bit puzzled that someone who read (if meant as "learned") the sequences would be left at Kegan 4. If I understood correctly and all there is to Kegan 5 is being able to engage with other schools of thought, then the sequences seem to hammer on that point pretty often. I've made it a personal rule to allow people to try to persuade me of their reasons/schools of thought specifically because it was one of the main points of the sequences.
Replies from: ChristianKl
↑ comment by ChristianKl · 2021-03-12T18:49:15.811Z · LW(p) · GW(p)
I think this would be easier to follow if you defined what Kegan 3 means before you use the term in this paragraph.
Kegan stages are a complex concept, and I don't think I can do a good job of getting the concept across by writing a one-paragraph definition.
I'm a bit puzzled that someone who read (if meant as "learned") the sequences would be left at Kegan 4
That's not the claim I'm making.
You can read the sequences in many different ways, and depending on where you start, they will have a different outcome. If you read them when you are at Kegan's stage 4 and already have an established paradigm according to which you can reason, you get into a situation where you are faced with two competing paradigms. It's not surprising if that has the effect of going "There's not one paradigm that has all the answers; different paradigms have their use at different points in time" and opening up to more different paradigms.
↑ comment by ChristianKl · 2021-03-15T10:13:26.704Z · LW(p) · GW(p)
I care more about substance than about what things sound like.
comment by tomNth · 2021-03-20T13:43:24.381Z · LW(p) · GW(p)
There are different uses of the word rationality in different contexts, and part of the work is finding out which use is which.
Even in Plato, Kant, Bertrand Russell, Chomsky, and Dennett, there are probably several uses (some overlap with those of the others, and with the LW and David Chapman uses).
↑ comment by ChristianKl · 2021-03-16T09:50:00.682Z · LW(p) · GW(p)
It seems how things sound is very important to you. As I said above, I care more about the underlying substance.
↑ comment by ChristianKl · 2021-03-15T19:33:20.534Z · LW(p) · GW(p)
I think there's a huge difference between using concepts about adult cognitive development from a Harvard psychology PhD and what a science fiction author threw together.