Rationality Reading Group: Part C: Noticing Confusion

post by Gram_Stone · 2015-06-18T01:01:27.351Z · LW · GW · Legacy · 3 comments


This is part of a semi-monthly reading group on Eliezer Yudkowsky's ebook, Rationality: From AI to Zombies. For more information about the group, see the announcement post.


Welcome to the Rationality reading group. This week we discuss Part C: Noticing Confusion (pp. 81-114). This post summarizes each article of the sequence, linking to the original LessWrong post where available.

C. Noticing Confusion

20. Focus Your Uncertainty - If you are paid for post-hoc analysis, you might like theories that "explain" all possible outcomes equally well, without focusing uncertainty. But what if you don't know the outcome yet, and you need to have an explanation ready in 100 minutes? Then you want to spend most of your time on excuses for the outcomes that you anticipate most, so you still need a theory that focuses your uncertainty.

21. What Is Evidence? - Evidence is an event connected by a chain of causes and effects to whatever it is you want to learn about. It also has to be an event that is more likely if reality is one way than if reality is another. If a belief is not formed this way, it cannot be trusted.

22. Scientific Evidence, Legal Evidence, Rational Evidence - For good social reasons, we require legal and scientific evidence to be more than just rational evidence. Hearsay is rational evidence, but as legal evidence it would invite abuse. Scientific evidence must be public and reproducible by everyone, because we want a pool of especially reliable beliefs. Thus, Science is about reproducible conditions, not the history of any one experiment.

23. How Much Evidence Does It Take? - If you are considering one hypothesis out of many, or that hypothesis is more implausible than the others, or you wish to know with greater confidence, you will need more evidence. Ignoring this rule will cause you to jump to a belief without enough evidence, and thus be wrong. (A short worked example follows this list.)

24. Einstein's Arrogance - Albert Einstein, when asked what he would do if an experiment disproved his theory of general relativity, responded with "I would feel sorry for [the experimenter]. The theory is correct." While this may sound like arrogance, Einstein doesn't look nearly as bad from a Bayesian perspective. In order to even consider the hypothesis of general relativity in the first place, he would have needed a large amount of Bayesian evidence.

25. Occam's Razor - To a human, Thor feels like a simpler explanation for lightning than Maxwell's equations, but that is because we don't see the full complexity of an intelligent mind. However, if you try to write a computer program to simulate Thor and a computer program to simulate Maxwell's equations, the Maxwell's equations program will be far shorter, because it does not have to encode an intelligent mind. This is how the complexity of a hypothesis is measured in the formalisms of Occam's Razor.

26. Your Strength as a Rationalist - A hypothesis that forbids nothing permits everything, and thus fails to constrain anticipation. Your strength as a rationalist is your ability to be more confused by fiction than by reality. If you are equally good at explaining any outcome, you have zero knowledge.

27. Absence of Evidence Is Evidence of Absence - Absence of proof is not proof of absence. But absence of evidence is always evidence of absence. According to the probability calculus, if P(H|E) > P(H) (observing E would be evidence for hypothesis H), then P(H|~E) < P(H) (absence of E is evidence against H). The absence of an observation may be strong evidence or very weak evidence of absence, but it is always evidence. (A numerical check follows this list.)

28. Conservation of Expected Evidence - If you are about to make an observation, then the expected value of your posterior probability must equal your current prior probability. On average, you must expect to be exactly as confident as when you started out. If you are a true Bayesian, you cannot seek evidence to confirm your theory, because you do not expect any evidence to do that. You can only seek evidence to test your theory. (See the sketch after this list.)

29. Hindsight Devalues Science - Hindsight bias leads us to systematically undervalue scientific findings, because we find it too easy to retrofit them into our models of the world. This unfairly devalues the contributions of researchers. Worse, it prevents us from noticing when we are seeing evidence that doesn't fit what we really would have expected. We need to make a conscious effort to be shocked enough.
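
The bit-counting in item 23 can be made concrete. Here is a minimal Python sketch, with function names and numbers that are purely illustrative (they are not taken from the original post), showing how many bits of evidence it takes to lift one hypothesis out of a million equally plausible ones to 99% confidence, given independent tests of a fixed likelihood ratio.

```python
import math

def bits_required(prior, target):
    """Bits of evidence needed to move a hypothesis from `prior` to `target`
    probability, measured as the log2 shift in odds."""
    prior_odds = prior / (1 - prior)
    target_odds = target / (1 - target)
    return math.log2(target_odds / prior_odds)

def bits_per_observation(p_e_given_h, p_e_given_not_h):
    """Bits supplied by one observation with the given likelihoods."""
    return math.log2(p_e_given_h / p_e_given_not_h)

# One hypothesis out of a million equally plausible ones (prior = 1e-6),
# and we want to end up 99% confident in it.
needed = bits_required(1e-6, 0.99)

# Each independent test flags a true hypothesis 90% of the time and a false
# one 10% of the time: a likelihood ratio of 9, about 3.17 bits per test.
per_test = bits_per_observation(0.9, 0.1)

print(f"{needed:.1f} bits needed, {per_test:.2f} bits per test, "
      f"about {math.ceil(needed / per_test)} independent tests")
```

With these illustrative numbers, roughly 26.6 bits are required at about 3.17 bits per test, so around nine independent positive results.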
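The inequality quoted in item 27 is easy to check numerically. The sketch below uses made-up numbers (a prior of 0.3, with E twice as likely under H as under ~H) purely for illustration: observing E raises the probability of H, and failing to observe E lowers it.

```python
def posterior(prior, p_e_given_h, p_e_given_not_h, observed_e=True):
    """Bayes' rule for a binary hypothesis H and a binary observation E."""
    if observed_e:
        like_h, like_not_h = p_e_given_h, p_e_given_not_h
    else:
        like_h, like_not_h = 1 - p_e_given_h, 1 - p_e_given_not_h
    return like_h * prior / (like_h * prior + like_not_h * (1 - prior))

prior, p_e_h, p_e_not_h = 0.3, 0.8, 0.4

p_h_given_e = posterior(prior, p_e_h, p_e_not_h, observed_e=True)       # ~0.46 > 0.3
p_h_given_not_e = posterior(prior, p_e_h, p_e_not_h, observed_e=False)  # 0.125 < 0.3
```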
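Item 28's identity - P(H) = P(H|E)P(E) + P(H|~E)P(~E), i.e. the expected posterior equals the prior - can be verified with the same kind of toy numbers; again, the figures are illustrative only.

```python
# Conservation of expected evidence, checked with illustrative numbers.
prior = 0.3            # P(H)
p_e_given_h = 0.8      # P(E|H)
p_e_given_not_h = 0.4  # P(E|~H)

p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)        # P(E)
p_h_given_e = p_e_given_h * prior / p_e                          # P(H|E)
p_h_given_not_e = (1 - p_e_given_h) * prior / (1 - p_e)          # P(H|~E)

expected_posterior = p_h_given_e * p_e + p_h_given_not_e * (1 - p_e)
assert abs(expected_posterior - prior) < 1e-12   # comes back to 0.3
```

A strong but expected confirmation moves the estimate only a little, while the corresponding rare disconfirmation would move it a lot; the two always balance out in expectation.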



This has been a collection of notes on the assigned sequence for this week. The most important part of the reading group, though, is the discussion, which takes place in the comments section. Please remember that this group contains a variety of levels of expertise: if a line of discussion seems too basic or too incomprehensible, look around for one that suits you better!

The next reading will cover Part D: Mysterious Answers (pp. 117-191). The discussion will go live on Wednesday, 1 July 2015 at or around 6 p.m. PDT, right here on the discussion forum of LessWrong.

3 comments


comment by Gram_Stone · 2015-06-18T01:04:52.486Z · LW(p) · GW(p)

Reposting something that I originally posted elsewhere:

I'm reading Rationality and I'm paraphrasing/summarizing the posts so that I understand and internalize them better. I got to Focus Your Uncertainty, and it felt a bit more opaque than the others, what with all of the sarcasm and italics. I compared my summary with the wiki summary, but I felt like the wiki summary was sort of like a dictionary definition that uses its own word. I'd appreciate it if someone could give me feedback on my summary:

Even when you don't have an explicit prior to refer to, you still have an implicit, intuitive prior that is more likely to be accurate than a prior that assigns equal probability to all outcomes, and you should use it.

comment by [deleted] · 2015-06-30T13:03:38.269Z · LW(p) · GW(p)

This was one of the ideas I never really understood in the sequences; I filed it away as a high-level power tool for the unusually intelligent and knowledgeable. I mean, the most common reason for feeling confused is not having complete information: we blindly feel three different parts of an elephant, find a tube, a column and a rope, and just go WTF could it be??? It is not usually fiction versus reality that confuses us - you have to be a really top-level brain master to be in that sort of situation. It is just missing pieces from the puzzle.

I think I understand conservation of expected evidence; I just don't understand how to use it, and especially what to use it for, or more precisely, how to notice when I am not using it. Of course, if I am confident about something and then fail an important prediction, my jaw is all over the floor and I am speechless - how else should it feel? But why does it require an article - doesn't everyone do this already? Suppose John is really confident his spouse loves him and therefore predicts it is really, really unlikely that she would cheat on him. Everybody expects that if he finds evidence of cheating he will be utterly shocked and surprised, no? What else would "confident" mean? Does anyone in that sort of situation start rationalizing the new evidence away? No, when it matters, people don't, I think. I think Conservation of Expected Evidence deserves a deeper meta, namely "Do you really care about what you believe, or are you just pretending?"

Replies from: Gram_Stone
comment by Gram_Stone · 2015-07-01T00:15:26.541Z · LW(p) · GW(p)

Maybe I'm just a top-level brain master, but I would disagree; or at least claim that we're using different words for the same thing. There are plenty of times I can think of where noticing confusion would have been possible and useful. My sister lies a lot, and there have been lots of times where I have this wordless sensation that roughly corresponds to the internal vocalization, "This does not make sense;" I think lots of people experience things like that, and I think that's what we're all talking about here, regardless of whether you call it missing puzzle pieces, or the difference between fiction and reality.

And it doesn't have to be someone explicitly lying as in my example and Eliezer's. I would say that it applies more broadly: to resisting the impulse to immediately explain everything, the impulse that so often accompanies a feeling of confusion. I think it's telling that your example about the cuckold is very mundane. By juxtaposition, it reminds me of the Parable of the Dragon in the Garage: when your epistemic standards don't have a very clear immediate effect on you, you can easily go wild. You don't just pray when your kid is starving; you pray and you try to fix it yourself, promises that He helps those who help themselves notwithstanding. Likewise, delude yourself as to your wife's affairs, and you know it's going to turn out badly. But other things can be so far removed from everyday experience that being practical is not the default action; there's no immediately apparent consequence for being right or wrong. And that's when it's really important to say, not "That's confusing; well, it's probably this," or, "That's confusing; whatever," but, "That's confusing; wait just a cotton-picking minute."