Rationalism before the Sequences 2021-03-30T14:04:15.254Z
Eric Raymond's Shortform 2021-03-26T15:59:39.649Z


Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-04-29T13:33:35.832Z · LW · GW

One big difference is that there are theoretical cracks in the lightspeed wall that don't have any go-to-another-quantum-world analog.  The Alcubierre solution to the field equations is a thing, after all. More importantly for this discussion, we can construct thought experiments about superluminal travel that have truth conditions because we know what a starfield would look like from N lightyears thataway. Quantumporting doesn't have analogues of either of those things.

But that's kind of a distraction.  The interesting question for this discussion is how, if at all, the two claims "galaxies receding outside our light cone continue to exist" and "Russell's teapot exists" are different.  I think we agree that there is a predictivist account of "teapot". 

You assert that a predictivist definition of meaning and truth value cannot sustain an account of the "galaxies" claim, and that predictivism is therefore insufficient. I, a predictivist, deny your assertion - you have smuggled in an assumption that predictivists somehow aren't allowed to assign meaning to counterfactuals that violate physical law, which I (a predictivist) am quite willing to do as long as hypothetically violating that physical law would not bar us from being able to cash out a truth claim in expected experiences.

I believe I am a predictivist who understands predictivism correctly and consistently.  I believe you are a predictivist in practice who has failed to understand predictivism in theory.

How can we investigate, confirm, or refute these claims?

Comment by Eric Raymond (eric-raymond) on Against "Context-Free Integrity" · 2021-04-15T13:43:35.076Z · LW · GW

Terminological point: I don't think you can properly describe your hypothetical rationalist in Stalinist Russia as "paranoid".  His belief that he is surrounded by what amounts to a conspiracy out to subjugate and destroy him is neither fixated nor delusional; it is quite correct, even if many of the conspiracy's members would choose to defect from it if they believed they could do so without endangering themselves.

I also note that my experience of living in the US since around 2014 has been quite similar in kind, if not yet in degree.  I pick out 2014 because of the rage-mobbing of Brendan Eich; that was the point at which "social justice" began presenting to me as an overtly serious threat to free speech.  Six years later, political censorship and the threat from cancel culture have escalated to the point where, while we may not yet have achieved Soviet levels of repression, we're closing in fast on East Germany's.

Comment by Eric Raymond (eric-raymond) on Specializing in Problems We Don't Understand · 2021-04-15T12:35:55.090Z · LW · GW

Endorsed.  A lot of this article is strongly similar to an unfinished draft of mine about how to achieve breakthroughs on unsolved problems.

I'm not ready to publish the entire draft yet, but I will add one effective heuristic.  When tackling an unsolved problem, try to model how other people are likely to have attacked it and then avoid those approaches.  If they worked, someone else would probably have achieved success with them before you came along. 

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-04-15T10:36:36.032Z · LW · GW

To be fair, I haven't followed Less Wrong all that closely over the years. It's more that I've known some of the key people for a while, notably Eliezer himself and Scott Alexander.

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-04-15T03:12:36.358Z · LW · GW

It seems to me that you've been taking your model of predictivism from people who need to read some Kripke. In Peirce's predictivism, to assert that a statement is meaningful is precisely to assert that you have a truth condition for it, but that doesn't mean you necessarily have the capability to test the condition.

Consider Russell's teapot.  "A teapot orbits between Earth and Mars" is a truth claim that must unambiguously have a true or false value.  There is a truth condition on it: if you build sufficiently powerful telescopes and perform a whole-sky survey, you will settle the question. It would be entirely silly to call the claim meaningless because the telescopes don't exist.

The claim "Galaxies continue to exist when they exit our light-cone" has exactly the same status. The fact that you happen to believe the right sort of telescope not only does not exist but cannot exist is irrelevant - you could after all be mistaken in believing that sort of observation is impossible.  I think it is quite likely you are mistaken, as nonlocal realism seems the most likely escape from the bind Bell's inequalities put us in.

MWI presents a subtler problem, unlike Russell's Teapot, because we haven't the faintest idea what observing another quantum world would be like.  In the case of the overly-distant galaxies, I can sketch a test condition for the claim that involves taking a superluminal jaunt 13 billion light-years thataway and checking all around me to see if the distribution of galaxies has a huge NOT THERE on the side away from Earth.  I think a predictivist would be right to ask that you supply an analogous counterfactual before the claim "other quantum worlds exist" can be said to have a meaning.

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-04-08T04:12:49.951Z · LW · GW

Eliezer was more influenced by probability theory, I by analytic philosophy, yes.  These variations are to be expected.  I'm reading Jaynes now and finding him quite wonderful.  I was a mathematician at one time, so that book is almost comfort food for me - part of the fun is running across old friends expressed in his slightly eccentric language.

I already had a pretty firm grasp on Feynman's "first-principles approach to reasoning" by the time I read his autobiographical stuff.  So I enjoyed the books a lot, but more along the lines of "Great physicist and I think alike! Cool!" than being influenced by him.  If I'd been able to read them 15 years earlier I probably would have been influenced.

One of the reasons I chose a personal, heavily narrativized mode for the essay was exactly so I could use it to organize what would otherwise have been a dry and forbidding mass of detail. Glad to know that worked - and, from what you don't say, that I appear to have avoided the common "it's all about my feelings" failure mode of such writing.

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-04-08T04:01:50.519Z · LW · GW

I have run across Bucky Fuller, of course.  Often brilliant, occasionally cranky; his geodesic domes turned out to suck because you can't seal all those joints well enough.  We could use more like him.

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-04-08T03:58:50.973Z · LW · GW

Great Mambo Chicken and Engines of Creation were in my reference list for a while, until I decided to cull the list for more direct relevance to systems of training for rationality.  It was threatening to get unmanageably long otherwise. 

I didn't know there was a biography of Korzybski.  Thanks!

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-04-08T03:55:30.257Z · LW · GW

"Galaxies continue to exist after the expanding universe carries them over the horizon of observation from us" trivially unpacks to "If we had methods to make observations outside our light cone, we would pick up the signatures of galaxies after the expanding universe has carried them over the horizon of observation defined by c."

You say "Any meaningful belief has a truth-condition".  This is exactly Peirce's 1878 insight about the meaning of truth claims, expressed in slightly different language - after all, your "truth-condition" unpacks to a bundle of observables, does it not?

The standard term of art you are missing when you say "verificationist" is "predictivist".

I can grasp no way in which you are not a predictivist other than terminological quibbles, Eliezer. You can refute me by uttering a claim that you consider meaningful, i.e. having a "truth-condition", where the truth condition does not implicitly cash out as hypothetical-future observables - or, in your personal terminology, "anticipated experiences".

Amusingly, your "anticipated experiences" terminology is actually closer to the language of Peirce 1878 than the way I would normally express it, which is influenced by later philosophers in the predictivist line, notably Reichenbach.

Comment by Eric Raymond (eric-raymond) on Mythic Mode · 2021-03-31T19:33:42.191Z · LW · GW

The reference to the Book of the Law was intentional.  The reference to chaos magic was not, as that concept had yet to be formulated when I wrote the essay - at least, not out where I could see it.

I myself do not use psychoactives for magical purposes; I've never found it necessary and consider them a rather blunt and chancy instrument.  I do occasionally take armodafinil for the nootropic effect, but that is very recent and long postdates the essay.

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-03-31T19:03:59.175Z · LW · GW

Probably, but there is something else more subtle.

Both the cultures you're pointing at are, essentially, engines to support achieving right mindset. It's not quite the same right mindset, but in either case you have to detach from "normal" thinking and its unquestioned assumptions in order to be efficient at the task around which the culture is focused.

Thus, in both cultures there's a kind of implicit mysticism.  If you recoil from that word because you associate it with anti-rationality I can't really blame you, but I ask you to consider the idea of mysticism as "techniques for consciousness alteration" detached from any particular beliefs about the universe.

This is why both cultures have a use for Zen. It is a very well developed school of mystical technique whose connection to religious belief has become tenuous.  You can take the Buddhism out of it and the rest is still coherent and interesting.

Perhaps this implicit mysticism is part of the draw for you. It is for me.

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-03-31T18:39:51.667Z · LW · GW

I think a collection of examples and analysis would be a post in itself.

But I can give you one suggestive example from Twelve Virtues itself: "If you speak overmuch of the Way you will not attain it."

It is a Zen idea that the essence of enlightenment cannot be discovered by talking about enlightenment; rather one must put one's mind in the state where enlightenment is.  Moreover, talk and chatter - even about Zen itself - drives that state away.

Eliezer is trying to say here that the center of rationalist practice is not in what you know about rationality or how much cleverness you can demonstrate to others but in achieving a mental stance that processes evidence correctly and efficiently.

He is borrowing the rhetoric of Zen to say that because, as with Zen, the center of our Way is found in silence and non-attachment.  The Way of Zen wants you to lose your attachment to desires; the Way of rationality wants you to lose your attachment to beliefs.

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-03-31T13:39:07.704Z · LW · GW

I actually wouldn't call Zen a "central theme".  More "a recurring rhetorical device".  It's not Zen Buddhist content that the Sequences use, it's the emulation of Zen rhetoric as a device to subtly shift the reader's mental stance. 

Comment by Eric Raymond (eric-raymond) on Eric Raymond's Shortform · 2021-03-31T07:04:41.771Z · LW · GW

I described myself as a subject-matter expert in epistemology.  That means I'm familiar with the branch of philosophy that considers the maintenance and justification of knowledge, and with competing theories of same.

Since you're using the name 'metatroll', I think I'll leave it at that. 

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-03-31T06:13:31.895Z · LW · GW


Comment by Eric Raymond (eric-raymond) on Eric Raymond's Shortform · 2021-03-31T01:16:32.487Z · LW · GW

I know who Deutsch is, and I'd never even heard that he had a movement around him.

Which is relevant.  I've had my ear to the ground for interesting rationality training since, oh, 1975 or so, and I definitely run in the right circles to pick up rumors of stuff like this.  The fact that your report is my first sign of that crew is, from my POV, pretty good evidence that its impact was very, very low.

I also question some of your other premises.  Speaking as a person who approaches the Yudkowskian reform from a perspective formed by a previous rationality movement, I don't think it has all that much difficulty communicating with outsiders, certainly not compared to the culture around General Semantics.  To the extent it does: well, science is hard.  There's not much point in trying to pitch the Sequences to people much below the American mean IQ level, at least not before our tutorial techniques get a lot better than they are now.

Nor, speaking as a person with considerable subject-matter expertise in epistemology, do I think this movement has a particularly "immodest" epistemology.  If one doesn't think one's theory of knowledge can explain the justification of knowledge in very broad generality, there's not much point in maintaining it at all, is there?

Speaking as a semi-outsider, it's not clear to me that this community has mandatory writings at all. Yes, a lot of us have read parts of the Sequences, if not all (I'm in the not-all camp myself), but I see no sign that one's in-groupness depends on having done that.  It's very easy for me to imagine someone fitting into this movement despite never having read a word of Yudkowsky, simply by adopting the community's discourse habits and its concerns.

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-03-30T23:49:11.015Z · LW · GW

There's a technical problem.  My blog is currently frozen due to a stuck database server; I'm trying to rehost it.  But I agree to your plan in principle and will discuss it with you when the blog is back up.

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-03-30T23:46:11.909Z · LW · GW

Heh. Come to think of it from that angle, "a bit true, but not really" would have been exactly my assessment if I were in your shoes. Thanks, I appreciate the nuanced judgment. 

Comment by Eric Raymond (eric-raymond) on Eric Raymond's Shortform · 2021-03-30T21:01:07.477Z · LW · GW

Since you've mentioned Rootless Root, I will say that there is another essay I am now thinking of writing about the playful use of Zen tropes.  The rationalist community and the hacker culture both have strong traditions of this sort of play...but, the functional reasons for the tradition are not the same!  And the way they differ is interesting.

That's enough of a teaser for now. :-)

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-03-30T20:25:49.283Z · LW · GW

I don't really have an interesting answer, I'm afraid. Busy life, lots of other things to pay attention to, never got around to it before.

Now that I've got the idea, I may re-post some rationality-adjacent stuff from my personal blog here so the LW crowd can know it exists.

Comment by Eric Raymond (eric-raymond) on Mythic Mode · 2021-03-30T20:19:08.143Z · LW · GW

Author of "Dancing with the Gods" checks in.

First, to confirm that you have correctly understood the points I was trying to make. I intended "Dancing with the Gods" to be a rationalist essay, in the strictest Yudkowskian-reformation sense of the term "rationalist", even though the beginnings of the reformation were seven years in the future when I wrote it. 

<insert timeless-decision-theory joke here>

Second, that I 100% agree with your analysis of why "Meditations on Moloch" was important.

Third and most importantly, to say that I like your use of the term "sandbox" a lot, and I'm going to adopt it. Maintaining a hard distinction between inside the sandbox and outside really is an important tactic for dealing with mythic mode in general, and magic/theurgy in particular.

You got it from infosec jargon, of course, and I'm going to emphasize its use as a verb. A lot of people have damaged themselves through not understanding that they need to sandbox, and a lot of other people (including, as you imply, many rationalists) fear mythic mode unnecessarily because they don't know that sandboxing is possible.

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-03-30T19:43:27.858Z · LW · GW

You have an outside view of my writing, so I'm curious. On a scale of 0 = "But of course" to 5 = "Wow, that was out of left field", how surprising did you find it that I would write this essay?

If you can find anything more specific to say along these lines (why it's surprising/unsurprising) I would find that interesting.

Comment by Eric Raymond (eric-raymond) on Rationalism before the Sequences · 2021-03-30T18:35:28.777Z · LW · GW

Ironically, I disagree a bit with lukeprog here - one of the few flaws I think I detect in the Sequences is due to Eliezer not having read enough philosophy.  He does arrive at a predictivist theory of confirmation eventually, but it takes more effort and gear-grinding than it would have if he had understood Peirce's 1878 demonstration and expressed it in clearer language.

Ah well.  It's a minor flaw.

Comment by Eric Raymond (eric-raymond) on Eric Raymond's Shortform · 2021-03-30T14:11:11.629Z · LW · GW

Essay is up.

Comment by Eric Raymond (eric-raymond) on Eric Raymond's Shortform · 2021-03-27T18:34:03.452Z · LW · GW

Alas, I can't give you a sweeping history of a bunch of movements and factions.  The last group really comparable to today's rationalist movement was the community around Alfred Korzybski's General Semantics.  My essay will talk about them.

What is now being mulled over by my beta readers is somewhat more personal and depends on the premise that my experience was representative of a lot of 20th-century proto-rationalists, including in particular Eliezer.  Fortunately I don't have to handwave this; there's reasonably good evidence that it's true, some of which is indicated in the essay itself.

Comment by Eric Raymond (eric-raymond) on Eric Raymond's Shortform · 2021-03-27T17:46:17.271Z · LW · GW

I have a draft I'm fairly pleased with.  Has gone out to some beta readers. 

Comment by Eric Raymond (eric-raymond) on Eric Raymond's Shortform · 2021-03-26T18:45:43.408Z · LW · GW

Now I'm laughing, because looking through those explicit lists I am finding pretty much all of the two dozen or so sources I expected to find based on various hints and callbacks. Almost all of them books very familiar to me as well.

Yes, this essay is going to be fun to write.

Comment by Eric Raymond (eric-raymond) on Eric Raymond's Shortform · 2021-03-26T18:41:07.997Z · LW · GW

Broader history, focusing on certain important developments in the 20th century.

Comment by Eric Raymond (eric-raymond) on Eric Raymond's Shortform · 2021-03-26T16:42:31.870Z · LW · GW

I actually have not seen such a bibliography, though I could infer a lot from his language choices in essays like Twelve Virtues.  Can you share a pointer to his list of forerunners?  

I don't expect there is much on it that will surprise me, but I would very much like to read it nevertheless.

Comment by Eric Raymond (eric-raymond) on Open, Free, Safe: Choose Two · 2021-03-26T16:18:14.916Z · LW · GW

I agree that the distinction you pose is important. Or should be.  I remember when we could rely on it more than we can today.

Unfortunately, one of the tactics of people gaming against freedom is to deliberately expand the definition of "interpersonal attack" in order to suppress ideas they dislike. We have reached the point where, for example: 

  1. The use/mention distinction with respect to certain taboo words is deliberately ignored, so that mention is conflated with use and use with attack.
  2. Posting a link to a peer-reviewed scientific paper on certain taboo subjects is instantly labeled "hate facts" and interpreted as interpersonal attack.

Can you propose any counterprogram against this sort of dishonesty other than rejecting the premise of safetyism entirely?  

Comment by Eric Raymond (eric-raymond) on Eric Raymond's Shortform · 2021-03-26T15:59:39.881Z · LW · GW

I'm considering writing, as a first post, a reflection on "Rationality Before The Sequences": some history on what the public project of less-wrongness looked like before Eliezer's heroic attempt at systematization.

This is a probe to discover if there would be significant interest in such an essay.

Comment by Eric Raymond (eric-raymond) on Open, Free, Safe: Choose Two · 2021-03-21T00:13:55.494Z · LW · GW

I agree with the reasoning in this essay.

Taken a bit further, however, it explains why valuing "safety" is extremely dangerous - so dangerous that, in fact, online communities should consciously reject it as a goal.

The problem is that when you make "safety" a goal, you run a very high risk of handing control of your community to the loudest and most insistent performers of offendedness and indignation. 

This failure mode might be manageable if the erosion of freedom by safetyism were still merely an accidental and universally regretted effect of trying to have useful norms about politeness.  I can remember when that was true, but it is no longer the case. 

These days, safetyism is often - even usually - what George Carlin memorably tagged "Fascism masquerading as good manners".  It's motivated by an active intention to stamp out what the safetyists regard as wrongspeech and badthink, with considerations of safety an increasingly thin pretext.

Whenever that's true, the kinds of reasonable compromises that used to be possible with honest and well-intentioned safetyists cannot be made any more. The only way to counterprogram against the dishonest kind is radical rejection - telling safetyism that we refuse to be controlled through it.

Yes, this means that enforcing useful norms of politeness becomes more difficult.  While this is unfortunate, it is becoming clearer by the day that the only alternative is the death of free speech - and, consequently, the strangulation of rational discourse.

Comment by Eric Raymond (eric-raymond) on In defence of epistemic modesty · 2017-11-02T11:33:27.591Z · LW · GW

I think this is utterly horrible advice.

I have blogged a detailed response at Against modesty, and for the Fischer set.