Rationality Reading Group: Part P: Reductionism 101

post by Gram_Stone · 2015-12-17T03:03:49.172Z · LW · GW · Legacy · 2 comments


This is part of a semi-monthly reading group on Eliezer Yudkowsky's ebook, Rationality: From AI to Zombies. For more information about the group, see the announcement post.


Welcome to the Rationality reading group. This fortnight we discuss Part P: Reductionism (pp. 887-935). This post summarizes each article of the sequence, linking to the original LessWrong post where available.

P. Reductionism 101

189. Dissolving the Question - This is where the "free will" puzzle is explicitly posed, along with criteria for what does and does not constitute a satisfying answer.

190. Wrong Questions - Where the mind cuts against reality's grain, it generates wrong questions - questions that cannot possibly be answered on their own terms, but only dissolved by understanding the cognitive algorithm that generates the perception of a question.

191. Righting a Wrong Question - When you are faced with an unanswerable question - a question to which it seems impossible to even imagine an answer - there is a simple trick which can turn the question solvable. Instead of asking, "Why do I have free will?", try asking, "Why do I think I have free will?"

192. Mind Projection Fallacy - E. T. Jaynes used the term Mind Projection Fallacy to denote the error of projecting your own mind's properties into the external world. The Mind Projection Fallacy is a very general error: it appears in the argument over the real meaning of the word sound, in the magazine cover of the monster carrying off a woman in the torn dress, in Kant's declaration that space by its very nature is flat, and in Hume's definition of a priori ideas as those "discoverable by the mere operation of thought, without dependence on what is anywhere existent in the universe"...

193. Probability is in the Mind - Probabilities express uncertainty, and it is only agents who can be uncertain. A blank map does not correspond to a blank territory. Ignorance is in the mind.

194. The Quotation is Not the Referent - It's very easy to derive extremely wrong conclusions if you don't make a clear enough distinction between your beliefs about the world, and the world itself.

195. Qualitatively Confused - Using qualitative, binary reasoning may make it easier to confuse belief and reality; if we use probability distributions, the distinction is much clearer.

196. Think Like Reality - "Quantum physics is not 'weird'. You are weird. You have the absolutely bizarre idea that reality ought to consist of little billiard balls bopping around, when in fact reality is a perfectly normal cloud of complex amplitude in configuration space. This is your problem, not reality's, and you are the one who needs to change."

197. Chaotic Inversion - If a problem that you're trying to solve seems unpredictable, then that is often a fact about your mind, not a fact about the world. Also, this feeling that a problem is unpredictable can stop you from trying to actually solve it.

198. Reductionism - We build models of the universe that have many different levels of description. But so far as anyone has been able to determine, the universe itself has only the single level of fundamental physics - reality doesn't explicitly compute protons, only quarks.

199. Explaining vs. Explaining Away - Apparently "the mere touch of cold philosophy", i.e., the truth, has destroyed haunts in the air, gnomes in the mine, and rainbows. This calls to mind a rather different bit of verse:

One of these things
Is not like the others
One of these things
Doesn't belong

The air has been emptied of its haunts, and the mine de-gnomed—but the rainbow is still there!

200. Fake Reductionism - There is a very great distinction between being able to see where the rainbow comes from, and playing around with prisms to confirm it, and maybe making a rainbow yourself by spraying water droplets, versus some dour-faced philosopher just telling you, "No, there's nothing special about the rainbow. Didn't you hear? Scientists have explained it away. Just something to do with raindrops or whatever. Nothing to be excited about." I think this distinction probably accounts for a hell of a lot of the deadly existential emptiness that supposedly accompanies scientific reductionism.

201. Savannah Poets - Equations of physics aren't about strong emotions. They can inspire those emotions in the mind of a scientist, but the emotions are not as raw as the stories told about Jupiter (the god). And so it might seem that reducing Jupiter to a spinning ball of methane and ammonia takes away some of the poetry in those stories. But ultimately, we don't have to keep telling stories about Jupiter. It's not necessary for Jupiter to think and feel in order for us to tell stories, because we can always write stories with humans as their protagonists.



This has been a collection of notes on the assigned sequence for this fortnight. The most important part of the reading group, though, is the discussion, which happens in the comments section. Please remember that this group contains a variety of levels of expertise: if a line of discussion seems too basic or too incomprehensible, look around for one that suits you better!

The next reading will cover Part Q: Joy in the Merely Real (pp. 939-979). The discussion will go live on Wednesday, 30 December 2015, right here on the discussion forum of LessWrong.

2 comments

Comments sorted by top scores.

comment by torekp · 2015-12-24T18:41:05.900Z · LW(p) · GW(p)

Righting a Wrong Question is indeed a great trick, but not so simple. It can take a lot of tries to get a right question. You can even, sometimes, make things worse when you try to replace one set of terms (e.g., about external world events) with another allegedly more epistemically primitive set (e.g., about subjective perceptions). Philosophy is a bitch.

Replies from: username2
comment by username2 · 2015-12-25T14:39:05.471Z · LW(p) · GW(p)

Interesting. Suppose we select a random bunch of questions and do that. Are we likely to make things better or worse? Was Eliezer's example specifically chosen to make it work? I have no idea.