Expecting Short Inferential Distances

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-10-22T23:42:01.000Z · LW · GW · Legacy · 106 comments

Homo sapiens’s environment of evolutionary adaptedness (a.k.a. EEA or “ancestral environment”) consisted of hunter-gatherer bands of at most 200 people, with no writing. All inherited knowledge was passed down by speech and memory.

In a world like that, all background knowledge is universal knowledge. All information not strictly private is public, period.

In the ancestral environment, you were unlikely to end up more than one inferential step away from anyone else. When you discover a new oasis, you don’t have to explain to your fellow tribe members what an oasis is, or why it’s a good idea to drink water, or how to walk. Only you know where the oasis lies; this is private knowledge. But everyone has the background to understand your description of the oasis, the concepts needed to think about water; this is universal knowledge. When you explain things in an ancestral environment, you almost never have to explain your concepts. At most you have to explain one new concept, not two or more simultaneously.

In the ancestral environment there were no abstract disciplines with vast bodies of carefully gathered evidence generalized into elegant theories transmitted by written books whose conclusions are a hundred inferential steps removed from universally shared background premises.

In the ancestral environment, anyone who says something with no obvious support is a liar or an idiot. You’re not likely to think, “Hey, maybe this person has well-supported background knowledge that no one in my band has even heard of,” because it was a reliable invariant of the ancestral environment that this didn’t happen.

Conversely, if you say something blatantly obvious and the other person doesn’t see it, they’re the idiot, or they’re being deliberately obstinate to annoy you.

And to top it off, if someone says something with no obvious support and expects you to believe it—acting all indignant when you don’t—then they must be crazy.

Combined with the illusion of transparency and self-anchoring (the tendency to model other minds as though they were slightly modified versions of oneself), I think this explains a lot about the legendary difficulty most scientists have in communicating with a lay audience—or even communicating with scientists from other disciplines. When I observe failures of explanation, I usually see the explainer taking one step back, when they need to take two or more steps back. Or listeners assume that things should be visible in one step, when they take two or more steps to explain. Both sides act as if they expect very short inferential distances from universal knowledge to any new knowledge.

A biologist, speaking to a physicist, can justify evolution by saying it is the simplest explanation. But not everyone on Earth has been inculcated with that legendary history of science, from Newton to Einstein, which invests the phrase “simplest explanation” with its awesome import: a Word of Power, spoken at the birth of theories and carved on their tombstones. To someone else, “But it’s the simplest explanation!” may sound like an interesting but hardly knockdown argument; it doesn’t feel like all that powerful a tool for comprehending office politics or fixing a broken car. Obviously the biologist is infatuated with their own ideas, too arrogant to be open to alternative explanations which sound just as plausible. (If it sounds plausible to me, it should sound plausible to any sane member of my band.)

And from the biologist’s perspective, they can understand how evolution might sound a little odd at first—but when someone rejects evolution even after the biologist explains that it’s the simplest explanation, well, it’s clear that nonscientists are just idiots and there’s no point in talking to them.

A clear argument has to lay out an inferential pathway, starting from what the audience already knows or accepts. If you don’t recurse far enough, you’re just talking to yourself.

If at any point you make a statement without obvious justification in arguments you’ve previously supported, the audience just thinks you’re crazy.

This also happens when you allow yourself to be seen visibly attaching greater weight to an argument than is justified in the eyes of the audience at that time. For example, talking as if you think “simpler explanation” is a knockdown argument for evolution (which it is), rather than a sorta-interesting idea (which it sounds like to someone who hasn’t been raised to revere Occam’s Razor).

Oh, and you’d better not drop any hints that you think you’re working a dozen inferential steps away from what the audience knows, or that you think you have special background knowledge not available to them. The audience doesn’t know anything about an evolutionary-psychological argument for a cognitive bias to underestimate inferential distances leading to traffic jams in communication. They’ll just think you’re condescending.

And if you think you can explain the concept of “systematically underestimated inferential distances” briefly, in just a few words, I’ve got some sad news for you . . .

106 comments

Comments sorted by oldest first, as this post is from before comment nesting was available (around 2009-02-27).

comment by Constant2 · 2007-10-23T01:24:07.000Z · LW(p) · GW(p)

The explanation from the ancestral environment seems likely. However, there is also a rational argument for refusing to accept a claim unless all the steps from your own knowledge to the claim have been laid out. While there are genuine truth seekers who have genuinely found truth and whom we therefore should, ideally, believe, a blanket policy of simply taking such people at their word has the unfortunate side effect of rendering us vulnerable to humbug, because we are not equipped to tell the humbug apart from true statements many steps removed from our knowledge.

At the same time, people do not universally reject claims that are many steps removed from their own experience. After all, scientists have made headway with the public. And unfortunately, humbug also regularly makes headway. There have always been niches in society for people claiming esoteric knowledge.

Replies from: None
comment by [deleted] · 2015-07-08T09:47:21.497Z · LW(p) · GW(p)

I think it's about the extent to which you have reason to trust authority without evidence. What if someone meets 'omega', who in their empirical experience is as 100% trustworthy as the laws of gravity? Then it's 100% rational to trust them, perhaps even over their own senses, which are sometimes illusory.

Induce fear to get people to stick with the status quo or make a non-choice, and frustrated anger to get them to take risks. When people are told something without explanation, they might react with fear out of awe, or anger out of frustration that you haven't presented something rational to them. Vice versa is possible too. Therefore, I would predict that inferential distance doesn't have a 1:1 relationship with the uptake of information.

comment by Robin_Hanson2 · 2007-10-23T01:33:03.000Z · LW(p) · GW(p)

Eliezer, this is a great insightful observation.

comment by Constant2 · 2007-10-23T01:40:08.000Z · LW(p) · GW(p)

The young seem especially vulnerable to accepting whatever they are told. Santa Claus and all that, but also any nonsense fed to them by their schools. Schools for the young are particularly effective instruments for indoctrinating a population. In contrast, the old tend to be quite a bit more resistant to new claims - for better and for worse.

An evolutionary explanation for this is fairly easy to come up with, I think. Children have a survival need to learn as much as they can as quickly as they can, and adults have a vital role as their teachers. In their respective roles, it is best for adults to be unreceptive to new claims, so that their store of knowledge remains a reliable archive of lessons from the past, and it is best for the young to accept whatever they are told without wasting a lot of time questioning it.

Replies from: meanerelk, AlanCrowe
comment by meanerelk · 2010-03-02T02:08:38.857Z · LW(p) · GW(p)

It is too easy to come up with a just so story like this. How would you rephrase it to make it testable?

Here is a counterstory:

Children have a survival need to learn only well-tested knowledge; they cannot afford to waste their precious developmental years believing wrong ideas. Adults, however, have already survived their juvenile years, and so they are presumably more fit. Furthermore, once an adult successfully reproduces, natural selection no longer cares about them; neither senescence nor gullibility affect an adult's fitness. Therefore, we should expect children to be skeptical and adults to be gullible.

Replies from: Kingreaper
comment by Kingreaper · 2011-10-05T13:29:00.594Z · LW(p) · GW(p)

This counterstory doesn't function.

A child's development is not consciously controlled; and they are protected by adults; so believing incorrect things temporarily doesn't harm their development at all.

If you wish to produce a counterstory, make it an actual plausible one. Even if it were the case that children tended to be more skeptical of claims, your story would REMAIN obviously false; whereas Constant's story would remain an important factor, and would raise the question of why we don't see what would be expected given the relevant facts.

comment by AlanCrowe · 2011-10-05T14:12:03.074Z · LW(p) · GW(p)

I've just learned that there is interesting research on this topic. Sorry I don't have better links.

Replies from: JoshuaZ
comment by JoshuaZ · 2011-10-05T14:52:39.073Z · LW(p) · GW(p)

Interesting, although that strongly suggests that children are in fact more gullible specifically about religious stories. I'd have to wonder whether they are actually more gullible about those, have been primed to think that religious stories are allowed to have more fantastic elements and still be true, or have found out that expressing skepticism of such stories is more likely to result in negative consequences. The last seems unlikely to me.

comment by TGGP4 · 2007-10-23T04:12:13.000Z · LW(p) · GW(p)

As long as we're on the subject of evolutionary psychology/sociobiology/whatever: if someone tries to argue against it by saying it's just a bunch of reactionaries trying to justify inequity, you can point to the data which says it ain't so. Another soldier sent against the army of reductionism defeated; surely a signal from Providence that all will be assimilated.

comment by tc · 2007-10-23T05:20:11.000Z · LW(p) · GW(p)

For example, talking as if you think "simpler explanation" is a knockdown argument for evolution (which it is)

I don't quite agree - by itself, X being "simpler" is a reason to increase my subjective belief that X is true (assuming that X is in a field where simplicity generally works) but it's not enough to prove e.g. creationism false. Rather, it is the total lack of evidence for anything supernatural that is the knockdown argument - if I had reason to believe that even one instance of say, ghosts or the effects of prayer were true, then I'd have to think that creationism was possible as well.

comment by Psychohistorian2 · 2007-10-23T06:29:54.000Z · LW(p) · GW(p)

This is certainly an insightful post. I'm not sure the example is that compelling though.

If you argue with a young-earth creationist, they could well understand exactly what you mean, but simply disagree and claim that "God did it" is a simpler explanation still. In fact, if we were to presuppose that an intelligent being of infinite power existed and created things, it seems it would actually be the simpler explanation.

Most people, though perhaps not all, who have no belief in an omnipotent designer will pretty quickly accept evolution. So that might not be the cognitive problem in that situation.

I'm sure there are examples out there (opposition to free trade, perhaps?), but the rejection of evolution in favor of creationism is rather more complex and deep rooted.

Replies from: Arandur
comment by Arandur · 2011-07-31T19:30:58.513Z · LW(p) · GW(p)

Not necessarily. The introduction of God into the story actually makes the theory quite a bit more complex, as far as amount of information stored goes. The length of time it takes to explain your theory does not necessarily correlate to how simple it is. "God did it" is monumentally more complex than "The random process of natural selection ensures that those organisms which have mutations that lend them a better chance of survival will, on average, be more likely to survive and pass those mutated genes on to the next generation than an organism without beneficial mutations, etc etc etc."

Though actually, if you look closely at the two arguments above, they don't necessarily contradict each other. :3 I personally feel that "God did it" is a simpler explanation than "Amino acids magically combined via processes we don't understand and haven't been able to duplicate, creating life essentially ex nihilo"... but that doesn't at all mean that either of these explanations is objectively simple!

Replies from: shokwave, ciphergoth, rdecal, dlthomas
comment by shokwave · 2011-08-01T02:06:31.040Z · LW(p) · GW(p)

I personally feel that "God did it" is a simpler explanation than "Amino acids magically combined via processes we don't understand and haven't been able to duplicate, creating life essentially ex nihilo"

Is "God did it" a simpler explanation than "amino acids combined via complex and unlikely processes we understand and can even replicate crudely, creating life from a perhaps murky but essentially non-magical source"?

What is your gut reaction?

Replies from: Arandur
comment by Arandur · 2011-08-01T02:14:57.184Z · LW(p) · GW(p)

When isolated in this manner, my gut reaction is "no".

comment by Paul Crowley (ciphergoth) · 2011-08-01T05:38:53.426Z · LW(p) · GW(p)

Have you read Occam's Razor?

Replies from: Arandur
comment by Arandur · 2011-08-01T17:53:45.233Z · LW(p) · GW(p)

I just reread it; thank you for allowing me to see one of Eliezer's posts in a new light. Always a pleasure.

However, I have other data at hand that seems to lend credence to the "God exists" theory; I don't have to rely on the results of one test. If I did, then by that same logic, we would always have to assume that a coin once flipped would be 100% biased toward the side upon which it landed.

Your program, in order to describe the universe, has to be the best model of every single point in the universe. I'm sure there were people who argued that Newton's equations were simpler than General Relativity. But the data cannot be denied.

Replies from: Nebu
comment by Nebu · 2015-01-14T05:33:46.094Z · LW(p) · GW(p)

I think there are two distinct concepts here: One of them is Bayesian reasoning, and the other is Solomonoff induction (which is basically Occam's Razor taken to its logical extreme).

Bayesian reasoning is applicable when you have some prior beliefs, usually formalized as probabilities for various theories being true (e.g. 50% chance God did it, 50% amino acids did it), and then you encounter some evidence (e.g. observe angels descend from the sky), and you now want to update your beliefs to be consistent with the evidence you encountered (e.g. 90% chance God did it, 10% amino acids did it). To emphasize, Bayesian reasoning is simply not applicable unless you have some prior belief to update.

However, I have other data at hand that seems to lend credence to the "God exists" theory;

Sounds like you're referring to Bayesian reasoning here. You're saying without that "other data", you have some probabilities for your various theories, but then when you add in that data, you're inclined to update your probabilities such that "God did it" becomes more probable.

In contrast, Occam's Razor and Solomonoff induction do not work with "prior beliefs" (in fact, Solomonoff is often used, in theory, to bootstrap the Bayesian process, providing the "initial belief" from which you can start using Bayesian updating). When using Solomonoff, you enumerate all conceivable theories, and then for each theory, you check whether it is compatible with the data you currently have. You don't think in terms of "this theory is more probable given data set 1, but that theory is more probable given data set 2". You simply mark each theory as "compatible" or "not compatible". Once you've done that, you eliminate all theories which are "not compatible" (or equivalently, assign them a probability of 0). Now all that remains is to assign probabilities to the theories that remain (i.e. the ones which are compatible with the data you have). One naive way to do that is to just assign uniform probability to all remaining theories. Solomonoff induction actually states that you should assign probabilities based on the complexity of the theory.
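The procedure described above can be sketched in a few lines: eliminate incompatible theories, then weight the survivors by simplicity. This is only a toy illustration; the theory names and description lengths below are made up for the example, and real Solomonoff induction runs over all programs, which is uncomputable.

```python
# Toy Solomonoff-style weighting: drop theories incompatible with the
# data, then assign probability proportional to 2^(-description length).
theories = {
    # name: (description_length_in_bits, predicts(data) -> bool)
    "all heads": (5,  lambda flips: all(flips)),
    "all tails": (5,  lambda flips: not any(flips)),
    "fair coin": (20, lambda flips: True),  # compatible with any sequence
}

data = [True, True, True]  # three heads observed

# Step 1: mark each theory "compatible" or "not compatible", keep the former.
compatible = {name: bits for name, (bits, fits) in theories.items() if fits(data)}

# Step 2: weight survivors by simplicity and normalize.
total = sum(2 ** -bits for bits in compatible.values())
posterior = {name: (2 ** -bits) / total for name, bits in compatible.items()}
# "all tails" is eliminated outright; of the survivors, the shorter
# description "all heads" gets almost all of the probability mass.
```

Replacing the `2 ** -bits` weights with a constant recovers the naive uniform assignment the comment mentions.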

If I did, then by that same logic, we would always have to assume that a coin once flipped would be 100% biased toward the side upon which it landed.

That's actually not true. Coincidentally, I wrote a web app which illustrates a similar point: http://nebupookins.github.io/binary-bayesian-update/

Mentally relabel the button "Witnessed Failure" with "Saw a coin come up tails" and "Witnessed Success" with "Saw a coin come up heads", then click the "Witnessed Success"/"Saw a coin come up heads" button.

Note that the result is not "You should assume that the coin is 100% biased towards heads."

Instead, the results are "There's a 0% chance that the coin is 100% biased towards tails, a tiny chance that the coin is 99% biased towards tails, a slightly larger chance that the coin is 98% biased towards tails", and so on until you reach "about a 2% chance the coin is 100% biased towards heads", which is currently your most probable theory. But note that while "100% biased towards heads" is your most probable theory, you are extremely non-confident in that theory (only a 2% chance that it is true). You need to witness a lot more coin flips to increase your confidence levels (go ahead and click on the buttons a few more times).

Disclaimer: This web app actually uses the naive solution of initially assigning uniform probability to all possible theories, rather than the Solomonoff solution of assigning probability according to complexity.
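The update the web app performs can be reproduced in a few lines. This is a minimal sketch, assuming (as the disclaimer says) 101 discrete bias hypotheses with a uniform prior; after observing a single heads, the most extreme heads-biased hypothesis wins but only with about 2% probability, matching the comment above.

```python
# Discretized Bayesian update over coin-bias hypotheses:
# bias toward heads = 0.00, 0.01, ..., 1.00, uniform prior.
n = 101
biases = [i / (n - 1) for i in range(n)]
prior = [1 / n] * n  # naive uniform prior over all 101 hypotheses

def update(posterior, heads):
    """Multiply each hypothesis by its likelihood for one flip, renormalize."""
    likelihoods = [p if heads else (1 - p) for p in biases]
    unnorm = [pr * lk for pr, lk in zip(posterior, likelihoods)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

posterior = update(prior, heads=True)  # "Saw a coin come up heads" once
print(f"P(100% heads-biased) = {posterior[-1]:.3f}")  # ~0.020, not 1.0
print(f"P(100% tails-biased) = {posterior[0]:.3f}")   # exactly 0.000
```

Calling `update` repeatedly with more observed heads concentrates the posterior toward the heads-biased hypotheses, just as clicking the button repeatedly does in the app.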

comment by rdecal · 2011-11-09T00:46:15.659Z · LW(p) · GW(p)

There is no scientist who claims amino acids magically appeared on earth. We have been able to simulate amino acid synthesis using conditions and simple inorganic molecules present on the young earth. Read the Wikipedia article for abiogenesis for a primer if you want to educate yourself.

comment by dlthomas · 2011-11-09T00:58:45.603Z · LW(p) · GW(p)

Once you have posited a God to take care of the creation of the amino acids, "God did it" becomes much simpler an explanation of the rest - referring to an entity that has been established to exist is not a terribly long message.

comment by igor · 2007-10-23T08:58:09.000Z · LW(p) · GW(p)

When you say "A clear argument has to lay out an inferential pathway, starting from what the audience already knows or accepts. If you don't recurse far enough, you're just talking to yourself."

this strongly reminds me of what it is like to try talking, as an atheist, with a Christian about any religious issue. I concluded years ago that I just shouldn't try anymore, that reasonable verbal exchange is not possible...

I suppose that I should recurse... but how, and how far, I am not sure.

Replies from: Arandur
comment by Arandur · 2011-07-31T19:33:04.870Z · LW(p) · GW(p)

I'm sure that the Christian feels the same way. ;D The problem there isn't inferential differences. It's belief in belief. The best way to disabuse a Christian of any false notions - under the assumption that those notions are false - would be to lead them to Less Wrong. :P

Of course, you can lead a horse to water...

Replies from: wedrifid
comment by wedrifid · 2011-07-31T21:30:21.239Z · LW(p) · GW(p)

The best way to disabuse a Christian of any false notions - under the assumption that those notions are false - would be to lead them to Less Wrong. :P

I don't agree. I think the best way to disabuse them of such notions would be to lead them to extremely high status atheists including a community of highly attractive potential mates. You change group affiliation beliefs by changing desired group affiliation.

Replies from: Arandur, pianoforte611, Insert_Idionym_Here
comment by Arandur · 2011-07-31T21:41:27.700Z · LW(p) · GW(p)

I think our disagreement stems from a fuzzy definition of the word "best". I believe that it is better to believe something for good (read: valid) reasons than to believe it for bad reasons, regardless of the truth value of the thing being believed. So yes, your suggestion may lead more Christians to toss their Christianity, but mine makes them more rational thinkers, which (under the assumption that their Christian beliefs are wrong, which assumption I decline to assign a truth value in this post) leads them to atheism as a side benefit.

Essentially, this is the question posed: Which is the greater sin, if Christianity is wrong? Christianity, or irrationality?

Replies from: wedrifid, Pavitra
comment by wedrifid · 2011-07-31T21:51:10.585Z · LW(p) · GW(p)

So yes, your suggestion may lead more Christians to toss their Christianity, but mine makes them more rational thinkers

The same influences that make people toss Christianity are also what will influence people to become more rational. Leading people to lesswrong on average makes them scoff then add things to their stereotype cache.

Which is the greater sin, if Christianity is wrong?

If Christianity is wrong then I'd say neither. ;)

Replies from: Arandur, Will_Newsome, Bongo
comment by Arandur · 2011-07-31T22:13:27.638Z · LW(p) · GW(p)

Leading people to lesswrong on average makes them scoff then add things to their stereotype cache.

This, if true, is horribly sad, and I concede the point, letting go of my faith in the inherent open-mindedness of humanity. Of course, I might have known better; my own efforts have reaped no fruit except my wife thinking of Eliezer Yudkowsky as a rabid crackpot. :/

If Christianity is wrong then I'd say neither. ;)

Ha! Then let me elucidate, and define the term "sin" to mean that action which runs against a given moral code.

comment by Will_Newsome · 2011-07-31T23:39:34.949Z · LW(p) · GW(p)

Leading people to lesswrong on average makes them scoff then add things to their stereotype cache.

You often say things with a certain simple realism that jives with me. I've definitely learned to appreciate the style more since I joined LW, and 10 times moreso since really absorbing a few subskills of a few SingInst folk. How much social psychology-like stuff have you studied? I get a weak impression that it's not much more than the average LW regular but that unlike the average LW regular you have the good habit of regularly explicitly talking about (and thus assuredly explicitly thinking about) certain simple but oft-ignored phenomena of standard social epistemology---or perhaps they'd generally be better described as signalling games/competitions with an epistemic flavor. The very-related skill of "being constantly up a meta level" is really the only prerequisite skill for building the master-skill of being able to automatically immediately generate decent models of any real or imagined social epistemic scenario or automatically with-some-effort generate thorough complex models. You strike me as one of the people on LW who could build up this skill and make it a very sharp weapon, which would be generally useful to any community or organization in the coming years that is trying to raise its sanity waterline. (Vladimir_M also obviously has some kind of related skillset.)

I could link you to a concrete example or two in LW comments if you don't quite follow what skill it is I'm getting at or how it's cool.

Replies from: wedrifid
comment by wedrifid · 2011-08-01T09:22:06.246Z · LW(p) · GW(p)

How much social psychology-like stuff have you studied?

Quite a lot but it is not specialised (into PUA etc). I've also probably forgotten a lot, since my interest peaked a few years back.

comment by Bongo · 2011-08-01T10:37:14.788Z · LW(p) · GW(p)

Leading people to lesswrong on average makes them scoff then add things to their stereotype cache.

This is probably because of the site design and not necessary.

Replies from: wedrifid
comment by wedrifid · 2011-08-01T19:26:31.521Z · LW(p) · GW(p)

Leading people to lesswrong on average makes them scoff then add things to their stereotype cache.

This is probably because of the site design and not necessary.

That no doubt makes a difference but my appeal was to universal human behavior. Exposure to new, unusual behaviours from a foreign tribe will most often invoke a rejection and tweaking of social/political positions rather than an object level epistemic update. Because that's what humans care about.

(This doesn't preclude directing interested parties to lesswrong or other sources of object level information. We must just allow that there will be an extremely low rate of updating.)

comment by Pavitra · 2011-08-01T00:59:58.895Z · LW(p) · GW(p)

Which is the greater sin, if Christianity is wrong? Christianity, or irrationality?

I think this would depend considerably on which particular non-Christian set of beliefs turned out to be right. Asking "how should we behave in a non-Christian universe?" sounds to me like asking "what should we feed to a non-cat?".

Replies from: Arandur
comment by Arandur · 2011-08-01T01:09:05.442Z · LW(p) · GW(p)

I'll ask you to review the child of this post wherein I provide a clearer definition of the term "sin". It is a generally held consensus that there is in fact an objective morality which is causally disconnected from (or at least causally unaffected by) any extant religion. In that sense, my question is, I believe, sensical.

The above is predicated upon my inference, from your comment, that you read into my use of the word "sin" a religious connotation. Another possible inference is that you legitimately believe that we live in a Christian universe, and therefore that supposing counterfactuals is useless. In that case, I wonder how you get by during the day without making any plans based upon hypothetical events.

.... and I also, in that case, appreciate not being the only Christian on this site. ;D But that doesn't forgive your error.

Replies from: Pavitra
comment by Pavitra · 2011-08-01T01:20:26.579Z · LW(p) · GW(p)

I did see the comment in which you defined sin.

I'm not sure where our assumptions disconnect, so I'll just try to spell out as many of mine as I can think of.

I assume that Christianity contains or constitutes claims about what the correct moral code is, such that accepting Christianity is true necessarily implies accepting a certain standard of right and wrong. I further assume that there exist at least two mutually-incompatible non-Christian claims about what the correct moral code is.

That is, if we reject Christian moral values, we still have to decide between Buddhism and Hinduism.

Replies from: Arandur
comment by Arandur · 2011-08-01T01:40:28.518Z · LW(p) · GW(p)

Let me verify your meaning before I respond in earnest: You are operating under the proposition that morality necessarily derives from religion?

Replies from: Pavitra
comment by Pavitra · 2011-08-01T01:58:11.499Z · LW(p) · GW(p)

...not exactly. It would be more accurate to say that I'm assuming that most religions, and Christianity in particular, imply moralities, but there may also be nonreligious moralities.

I realize I'm hugely oversimplifying (for example, by treating "Christianity" as internally homogeneous), but I need to omit most of the variables in order to get anything done in finite time.

This started with the phrase "if Christianity is wrong"; are you saying that this was not meant to imply anything along the lines of "if Christian morality is wrong", that it was meant entirely as an empirical proposition, holding moral values constant? [edit: ...holding terminal moral values constant?]

Replies from: Arandur
comment by Arandur · 2011-08-01T01:59:55.355Z · LW(p) · GW(p)

Oh! I see. :3 Yes, that is what I'm saying. If I wasn't Christian, I certainly wouldn't start murdering people.

Replies from: Pavitra
comment by Pavitra · 2011-08-01T02:01:46.766Z · LW(p) · GW(p)

Interesting.

Do you believe, then, that God commands a thing because it is good, rather than that a thing is good because God commands it?

Replies from: Arandur
comment by Arandur · 2011-08-01T02:11:45.847Z · LW(p) · GW(p)

Yes and no. :3 This is one of those "large inferential distances" things, but I'll take a stab at explaining.

First, there are laws that God is bound to; laws of morality, not just laws of physics, although I think He's also, in all probability, bound by the laws of physics (not necessarily as we understand them). This is evidenced by the number of times that God has told us that He is "bound"; if He did not follow these rules, He would "cease to be God".

On the other hand! God gave rules to the Jews (a la all of Deuteronomy) that do not apply to modern-day Christians, because Jesus' sacrifice "fulfilled" that law. God gives different commands at different times to different people: for example, God has at various times in history endorsed polygamy for various peoples, but He has indicated that polygamy outside His explicit instructions is sinful (cf. Jacob 2, D&C 132).

So: Everything that God commands us to do is Good, but not everything that is Good is something that God has explicitly commanded us to do.

comment by pianoforte611 · 2012-08-17T12:27:50.403Z · LW(p) · GW(p)

Is reviving dead threads frowned upon here? That was an incredibly insightful comment to me because it explains my deconversion (from Catholicism) and Leah Libresco's conversion to it (she has a blog on patheos called unequally yoked)*. I wonder how general this is?

*Status is obviously defined by the person whose group affiliation is changing. The high status atheists that changed my desired group affiliation were some atheists on debate.org, who were a lot more like me than any catholics I had met. The high status Catholics that changed Leah's desired group affiliation were her friends, the people in her debating club and her Catholic boyfriend, whom she went to mass with (willingly) for more than a year.

Replies from: wedrifid, DaFranker
comment by wedrifid · 2012-08-17T17:13:07.069Z · LW(p) · GW(p)

Is reviving dead threads frowned upon here?

No, by all means go ahead and comment wherever you have something to say.

comment by DaFranker · 2012-08-17T18:10:54.635Z · LW(p) · GW(p)

As wedrifid said, reviving "dead threads" is fully acceptable and even encouraged on many occasions, AFAICT.

The one thing to be careful of is entering argument mode, asking questions, or offering specific, targeted insight to a particular poster on a very old post. Many of us have wasted some time early on by answering the questions or debating the assertions of an old comment originally made on Overcoming Bias before the transfer, where the author is long gone or never came to LessWrong in the first place.

comment by Insert_Idionym_Here · 2012-09-11T05:04:00.243Z · LW(p) · GW(p)

That is what happened to me.

comment by Gray_Area · 2007-10-23T09:16:41.000Z · LW(p) · GW(p)

This reminds me of teaching. I think good teachers understand short inferential distances at least intuitively if not explicitly. The 'shortness' of inference is why good teaching must be interactive.

Replies from: Viliam_Bur
comment by Viliam_Bur · 2011-08-24T16:53:46.366Z · LW(p) · GW(p)

I think Vygotsky's expression "zone of proximal development" means "one inferential step away", so in theory professional teachers should understand this. I prefer to imagine knowledge like a "tech tree" in a computer game.

When teaching one student, it is possible to detect their knowledge base and use their preferred vocabulary. I remember explaining some programming topics to a manager: source code is like a job specification; functions are employees; data are processed materials; exceptions are emergency plans.

Problem is, when teaching a whole class, everyone's knowledge base is very different. In theory it shouldn't be so, because they all supposedly learned the same things in recent years, but in reality there are huge differences -- so the teacher basically has to choose a subset of the class as the target audience. Writing a textbook, where there is no interaction at all, is even more difficult.

comment by g · 2007-10-23T10:32:31.000Z · LW(p) · GW(p)

Psychohistorian: it depends on what you mean by "simple" and "explanation". The sense in which "it's the simplest explanation" is a powerful argument for something is not one in which "God did it" is the simplest explanation for anything.

comment by Silas · 2007-10-23T14:01:50.000Z · LW(p) · GW(p)

Eliezer_Yudkowsky: I've seen the kinds of failures of explanation you refer to, and there's also the possibility that the explainer just isn't capable of explaining all of the inferential steps because he doesn't know them. In that case, the explainer is basically "manipulating symbols without understanding them". This is why I've formulated that principle (sort of a corollary to what you've argued here) that:

"If you can't explain your idea/job/research to a layman, given enough time, and starting from reference to things he already understands, you don't understand it yourself."

Replies from: Arandur
comment by Arandur · 2011-07-31T19:39:54.279Z · LW(p) · GW(p)

That seems so simple as to be tautological. After all, you were a layman yourself once. Ideas/jobs/research don't spring whole-spun from the ether. You have to be led along that same path yourself - either by a teacher, or by your own mind bumping along down dark corridors.

Replies from: AndyC
comment by AndyC · 2014-04-22T11:01:35.242Z · LW(p) · GW(p)

But it's not true. Consider by analogy: if you can't explain something to a 4-year-old, you don't understand it yourself. After all, you were a 4-year-old once yourself.

No, actually, sometimes you can't explain something to someone because you don't have a good enough understanding of their mental processes. It doesn't matter if you once experienced those same mental processes; the relevant memories of that time are very likely lost to you now. Explaining math to novices is a different skill than understanding math. It requires the ability to figure out why the other person has got it wrong and what they need to hear. That isn't a mathematical skill.

A distinguished math professor is probably worse at explaining arithmetic to 8-year-olds than an experienced mathematics educator, but that doesn't mean the educator has the better understanding of math. They just have a better understanding of 8-year-olds.

comment by Laura · 2007-10-23T16:38:59.000Z · LW(p) · GW(p)

I have experienced this problem before-- the teacher assumes you have prior knowledge that you just do not have, and all of what he says afterwards assumes you've made the logical leap. I wonder to what extent thoughtful people will reconstruct the gaps in their knowledge assuming the end conclusion is correct and working backwards to what they know in order to give themselves a useful (but possibly incorrect) bridge from B to A.

For example, I recently heard a horrible biochem lecture about using various types of protein sequence and domain homology to predict function and cellular localization. Now, the idea that homology could be used to partially predict these things just seemed logical, and I think my brain just ran with the idea and thought about how I would go about using the technique, and placed everything he said piece-wise into that schema. When I actually started to question specifics at the end of the lecture, it became clear that I didn't understand anything the man was saying at all outside of the words "homology" and "prediction", and I had just filled in what seemed logical to me. How dangerous is it to try to "catch up" when people take huge inferential leaps?

comment by Noumenon · 2007-10-23T17:36:29.000Z · LW(p) · GW(p)

Yes, this is good stuff, I wish I could identify the inferential gaps when I communicate!

comment by TGGP4 · 2007-10-24T01:49:53.000Z · LW(p) · GW(p)

Silas, aren't there some things it is simply impossible for some people to understand?

Replies from: Arandur
comment by Arandur · 2011-07-31T19:44:49.462Z · LW(p) · GW(p)

Yes (maybe?), but that lends no argument against Silas' corollary.

If you cannot explain, then you do not understand.

Therefore: If you do understand, then you can explain.

If no one can understand, then the antecedent in the above is false, meaning that we cannot give the consequent any truth value.

comment by Silas · 2007-10-24T02:08:47.000Z · LW(p) · GW(p)

TGGP: Yes for people below some IQ threshold. No for someone of the same IQ as the explainer.

(I probably should have added the intelligence criterion the first time around, I guess, but I was simplifying a bit.)

comment by Daniel_Humphries · 2007-10-24T03:00:48.000Z · LW(p) · GW(p)

This is an excellent post, Eliezer!

Taking this phenomenon into consideration not only gives me cause to go back over my own teaching technique (of a rather specialized trade) and make sure I am not leaving out any steps that seem obvious to me (the specialist), but, like Laura, it helps me to understand times when I was baffled by a speaker or writer whose tone implied I'd be an idiot not to follow along easily.

comment by Richard_Hollerith · 2007-10-24T08:44:23.000Z · LW(p) · GW(p)

When I write for a very bright "puzzle-solving-type" audience, I do the mental equivalent of deleting every fourth sentence, or at least the tail of every fourth sentence, to prevent the reader from getting bored. I believe that practice helps my writings compete with the writings around them for the critical resource of attention. There are of course many ways of competing for attention, and this is one of the least prejudicial to rational thought. I recommend this practice only in forums in which the reader can easily ask follow-up questions. Nothing about this practice is incompatible with the practices Eliezer is advocating. This week I am experimenting with adding three dots to the end of a sentence to signal to the reader the need to mentally complete the sentence.

So, what sentence did I delete from the above? A sentence to the effect that I only do this for writing that resembles mathematical proof fairly closely: "Suppose A. Because B, C. Therefore D, from which follows E, which is a contradiction, so our original assumption A must be false."

After writing a first draft, I go back and add a lot more words than I had saved with the "do not bore the reader" practice. E.g. I add sentences explicitly to contradict interpretations that would lead to my being dismissed as hopelessly socially inept, eccentric or evil. Of course because I advocate outlandish positions here, I still get dismissed a lot.

comment by g · 2007-10-24T11:01:53.000Z · LW(p) · GW(p)

Richard, you may or may not care that having read the above my willingness to read anything you write in future has somewhat decreased.

comment by TGGP4 · 2007-10-24T21:42:12.000Z · LW(p) · GW(p)

I would add, Richard, that writing "dear reader" on a medium like this comes off as patronizing.

comment by Charlie2 · 2007-10-28T01:00:42.000Z · LW(p) · GW(p)

Some of your claims about the EEA are counterintuitive to me. Basically, it's not obvious that all information not strictly private would have been public. I'm thinking, for example, of present-day isolated cultures in which shamans are trained for several years: surely not all of their knowledge can be produced in a publicly comprehensible form. There must be a certain amount of "Eat this herb -- I could tell you why, but it would take too long to explain". Or so I imagine.

So how much of your description of knowledge in the EEA is your guesstimation, and how much is the consensus view? And where can I find papers on the consensus view? My Google-fu fails me.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2008-01-08T10:26:49.000Z · LW(p) · GW(p)

I present to you Exhibit A from the field of computer programming.

comment by MrHen · 2010-02-09T17:33:33.726Z · LW(p) · GW(p)

I find an easy way to get some of the complicated inferential jumps for free is to find a similar set of inferential jumps they have made in a similar subject. It is much easier to correct a "close" inferential jump than it is to create a new one out of thin air.

Example: When discussing the concept of programming you can use the concept of an assembly line to get their head into a procedural mode of thinking. Once they think about an object visiting a bunch of stations in a factory you can replace "object" with "program" and "station" with "line of code." They still have no idea how programming works, but they can suddenly create a bunch of inferential jumps based on assembly lines.

In my experience, they now start asking questions about programming as related to assembly lines and you can fill in the gaps as you find them.

"So what happens at the end of the line?"
"Well, the program generally loops back around and starts over."
"Oh. So it follows the same line forever?"
"Not necessarily. Sometimes the line takes a detour and heads off into a new area of the plant for awhile. But it generally will come back to the main assembly line."
"But what's the point? Like, how does that make my computer run?"
"Think of the computer like the company. The company owns a whole bunch of assembly lines all over the place and, periodically, it will ask a certain plant to start up and keep running. One of the stations in the assembly line is something like, 'Give report to company'. The company looks at the report, realizes it is a visual report, and hands it to the assembly line that processes visual reports. That assembly line takes the report, breaks it down into RGB pixels, and puts it on the monitor for you to see. For that to happen, all of these programs have to keep spinning on their assembly lines and doing the work at each station and keep sending reports back to the company. You don't have to worry about all of the lines because the computer is doing it for you."
"Wow, that's complicated."
"Yeah, it can get pretty crazy. As a programmer, I design the assembly lines."

Or whatever.
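The assembly-line analogy above can be sketched directly in code. This is a toy illustration with invented names (stations, reports), not a real graphics or OS API:

```python
# A program as an "assembly line": the product visits each station in order,
# then the line loops back around and starts over, periodically handing a
# "report" back to the "company" (the operating system).

def run_assembly_line(stations, cycles):
    """Visit each station in order, looping back to the start each cycle."""
    reports = []
    for _ in range(cycles):
        product = {}
        for station in stations:          # one trip down the line
            product = station(product)
        reports.append(product)           # "give report to company"
    return reports

# Two toy stations on the line:
def gather_input(product):
    product["input"] = "keypress"
    return product

def render(product):
    product["pixels"] = f"drawn from {product['input']}"
    return product

frames = run_assembly_line([gather_input, render], cycles=3)
print(len(frames))  # 3 reports sent back to the "company"
```

Each element of `frames` is one finished trip down the line, just as each loop iteration of a real program produces one unit of visible work.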

Replies from: Benito
comment by Ben Pace (Benito) · 2017-01-02T19:52:05.847Z · LW(p) · GW(p)

I like how it took me until the end to realise you'd re-invented the concept of analogies :-)

Replies from: snewmark
comment by snewmark · 2017-01-05T17:51:52.083Z · LW(p) · GW(p)

And I had to read past the end to realize that...

comment by Lorenzo · 2010-04-19T20:24:18.653Z · LW(p) · GW(p)

As someone who has done (some) teaching, I think this is absolutely correct. In fact, the most difficult thing I find about teaching is trying to find the student's starting knowledge, and then working from there. If the teacher does not go back enough 'inferential steps', the student won't learn anything - or worse, they might think they know when they don't.

Excellent stuff.

comment by Lorenzo · 2010-04-19T20:38:42.843Z · LW(p) · GW(p)

Now I think of it, this reminds me of something Richard Dawkins used to say at some talks: that we (the modern audience) could give Aristotle a tutorial. Being a fantasist myself, I've sometimes wondered how that could be possible. Leaving aside the complications of building a time machine (I leave that to other people), I wondered how it would be to actually meet Aristotle and explain to him some of the things we now know about life, the universe & everything.

First of all, I'd have to learn Ancient Greek, of course, or no communication would be possible. That would be the easy (and the only easy) part. More complicated is that, to teach anything modern to Aristotle, one would have to teach an incredible amount of previous stuff. That is, one would have to take quite a large number of inferential steps. If I wanted to explain, for example, the theory of evolution, that would require a lot of anatomy, geography, zoology, botany, and even mathematics and philosophy. One would have to be a true polymath to achieve the feat. It's not that we don't know more about the universe than Aristotle; it is that to cross the inferential 'gap' between Aristotle and us would require an inordinate amount of knowledge.

Maybe a good metaphor is based on Dennett's crane idea: we develop ideas that help us reach higher levels of understanding, but as soon as we reach those upper levels we discard them to build new ones for higher levels. To help someone on the floor, one has to 'rebuild' these old cranes no longer in use.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2010-04-19T21:13:32.393Z · LW(p) · GW(p)

Actually, evolution might be the easiest one. It's inevitable if you have variation and selection. It's a really pretty theory.

I don't know how hard it would be to convey that observation and experimentation will take you farther than just theorizing.

If I brought back some tech far advanced over Aristotle's period (and I wonder what would be most convincing), it might add weight to my arguments.

And personally, even if I had a time machine and the knowledge of Ancient Greek, I don't know how hard it would be to get him to listen to a woman.
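The "it's inevitable if you have variation and selection" claim can be made concrete with a toy simulation. This is my own illustrative sketch with made-up parameters (population size, mutation width), not anything from the comment:

```python
# Variation + selection is enough for improvement: mutate each organism a
# little, keep the fitter half, and mean fitness climbs generation by
# generation.

import random

random.seed(0)  # make the toy run repeatable

def evolve(generations=50, pop_size=100):
    """Return the mean fitness after repeated variation and selection."""
    population = [0.0] * pop_size  # each number is an organism's "fitness"
    for _ in range(generations):
        # Variation: each offspring differs slightly from its parent.
        offspring = [x + random.gauss(0, 0.1) for x in population]
        # Selection: the fitter half survives and reproduces.
        offspring.sort(reverse=True)
        survivors = offspring[: pop_size // 2]
        population = survivors + survivors
    return sum(population) / len(population)

print(evolve())  # mean fitness has risen well above the starting value of 0
```

Nothing in the loop "aims" at improvement; the upward drift falls out of the two ingredients alone, which is why the theory is so pretty.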

Replies from: Lorenzo, Tyrrell_McAllister, NancyLebovitz
comment by Lorenzo · 2010-04-19T21:46:34.882Z · LW(p) · GW(p)

You're right - evolution might be easier than, say, how an iPhone works (not that an iPhone would work very well in Ancient Greece, or for very long, anyway). Having some high tech to show good old Aristotle might convince him you come from a very strange land, and maybe he would want to hear more of what you have to say instead of just dismissing you as a lunatic.

But imagine how much you would have to explain to make him even dimly aware of the way an iPhone works! Electronics, electricity, computation, satellites and astronomy (goodbye lunar sphere), calculus, chemistry, physics... I can barely think of all the relevant topics!

Of course, as you point out, misogyny would be a great obstacle too. One more of the 'steps' that separate ancient peoples from modern societies.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2010-04-19T23:10:39.076Z · LW(p) · GW(p)

What you want to teach depends on what you're trying to accomplish. I don't think there's much point in trying to give Aristotle an overview of modern scientific conclusions.

Assuming we want to accelerate technological progress, I'd rather teach him scientific method, decimal notation, evolution, and maybe what Feynman said (iirc) was the most important conclusion-- that matter is made of tiny bits of elements. I don't know what other specific subjects might be a good idea. Bayes? Calculus?

I don't know what would be convincing experiments for atoms.

One more thing I'd want to teach him is that you can learn a lot by doing careful measurement and thinking about the results.

I don't know what Aristotle would come up with, given all that-- he was very smart.

Replies from: RobinZ, mattnewport, NancyLebovitz
comment by RobinZ · 2010-04-20T00:21:10.065Z · LW(p) · GW(p)

I don't know what would be convincing experiments for atoms.

Assuming you convinced him of the epistemological primacy of experiment, I see two obvious paths:

  1. The kinetic theory of gases, particularly the ideal gas law;

  2. Stoichiometry in chemistry - for example, electrolysis of water.

Replies from: tut
comment by tut · 2010-04-20T07:07:14.269Z · LW(p) · GW(p)

I would add Brownian motion to that list.

comment by mattnewport · 2010-04-20T00:26:27.271Z · LW(p) · GW(p)

From a practical point of view teaching the germ theory of disease would probably have the most immediate benefit.

comment by NancyLebovitz · 2010-04-20T09:10:46.603Z · LW(p) · GW(p)

Using water droplets as rudimentary microscopes.

How big a jump would it be to give them lens-making tech?

Replies from: RobinZ
comment by RobinZ · 2010-04-20T10:58:31.727Z · LW(p) · GW(p)

You could probably explain geometrical optics without too much trouble.

comment by Tyrrell_McAllister · 2010-04-19T22:05:02.033Z · LW(p) · GW(p)

I don't know how hard it would be to get him to listen to a woman.

I would sort of expect any woman who showed up with apparently magical powers to be put into the goddess category. Even someone like Aristotle, who probably didn't believe that gods and goddesses literally existed, would be culturally conditioned to treat a woman who appeared to have super-powers with some respect.

Replies from: RobinZ
comment by RobinZ · 2010-04-19T23:08:15.112Z · LW(p) · GW(p)

I would sort of expect any woman who showed up with apparently magical powers to be put into the goddess category. Even someone like Aristotle, who probably didn't believe that gods and goddesses literally existed, would be culturally conditioned to treat a woman who appeared to have super-powers with some respect.

How would you implement that? What do we have the tech to build today for a reasonable outlay of money (less than a million euros, for example) that could blow minds in that era?

Replies from: NancyLebovitz, arfle, Polymeron, jeronimo196
comment by NancyLebovitz · 2010-04-19T23:16:17.792Z · LW(p) · GW(p)

I don't know what it would take to pass as a goddess, but a stash of cool stuff could be impressive. An iPad (with an appropriate power source-- what would it take to power it from a water wheel?). Stainless steel blades. A Jacquard loom. What else?

Replies from: mattnewport
comment by mattnewport · 2010-04-19T23:24:41.740Z · LW(p) · GW(p)

A hunting or sniper rifle, a pistol, a remote controlled helicopter with wireless video, broad spectrum antibiotics, powerful painkillers, explosives.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2010-04-19T23:28:46.470Z · LW(p) · GW(p)

A better list than mine. Do you think you'd need to go back with a group just to not have your stuff stolen?

If you want to bring back something useful for the educational project rather than just being impressive, a batch of slide rules would be good.

Could ancient Greeks make printing presses if they had designs for them? I'm sure they could at least do wood block printing.

Replies from: mattnewport
comment by mattnewport · 2010-04-19T23:37:21.708Z · LW(p) · GW(p)

I think that would somewhat depend on how convinced people were by your 'godlike' powers. That's where modern day weaponry would prove quite effective I'd imagine. A taser would probably be useful as a bullet wound would be recognizable physical damage whereas the effect of a taser would probably seem like the power of the gods. If I was on my own I'd probably want body armor, motion sensors and other defensive equipment as well to be on the safe side.

Mixing healing powers in would be just as valuable in self-preservation as demonstrating offensive capability. You would probably want to obfuscate the nature of your 'healing magic' so that people would not easily be able to replicate it if they managed to steal some of your stock of medical supplies. Special pills that had to be given in combination to be effective would be useful.

Replies from: NancyLebovitz
comment by NancyLebovitz · 2010-04-19T23:47:14.913Z · LW(p) · GW(p)

If you're planning to teach a scientific worldview, it might be well to not be too godlike.

Replies from: mattnewport
comment by mattnewport · 2010-04-20T00:07:38.151Z · LW(p) · GW(p)

True, but at least initially personal survival and not getting all your stuff nicked would probably require some compromises on teaching science. Once you had an established power base and some loyal local followers you could start to focus on the teaching.

comment by arfle · 2010-08-12T20:37:07.466Z · LW(p) · GW(p)

A plastic bottle out of the trash. It's transparent but flexible and almost weightless. See how well the lid has been made? It makes a water-tight seal.

It might be the most valuable object in Greece.

Replies from: arfle
comment by arfle · 2010-08-12T20:56:44.312Z · LW(p) · GW(p)

And then when you've got his attention, show him decimal notation.

And stirrups for his horse. And lances.

Once he's hooked, show him why things float. And how a ball rolling down an inclined plane moves 1, 4, 9, 16 as it accelerates.

Show him Cartesian geometry. And how to play go with lines scratched in the ground and coloured stones. Make a recorder and play him some songs.

He'll teach you Greek.

Show him how to send messages using flashing mirrors. Show him Playfair's cipher. Perspective drawing. How to make a magnifying glass. Newton's cradle. Make a model boat out of bronze.

I suspect in a day in Ancient Greece, you'd see so many easily solved problems that my list would look naive. You don't need modern technology. You need the things that were discovered just after the mediaevals recovered what the Greeks already knew.

Replies from: gwern, viktor-riabtsev-1
comment by gwern · 2010-10-27T04:14:23.876Z · LW(p) · GW(p)

I suspect in a day in Ancient Greece, you'd see so many easily solved problems that my list would look naive.

This is one of the more interesting approaches to the Connecticut Yankee in King Arthur's Court (as I dub this species of thought problem) - that you don't need any special preparation because your basic background means that you'll spend the rest of your life in the past rushing around yelling 'don't do that, you idiot, do it this way!'

Diplomacy might actually be the best preparation.

comment by Viktor Riabtsev (viktor-riabtsev-1) · 2018-10-10T23:13:06.105Z · LW(p) · GW(p)
Show him how to send messages using flashing mirrors.

Oh god. That is actually just humongous in its possible effect on warfare.

I mean add simple ciphers to it and you literally add another whole dimension to warfare.

Communication lines set up this way are almost like adding radio. Impractical in some situations, but used in regional warfare with multiple engagements? This is empire-forming stuff: a reflective stone plus semi-trivial education equals dominance.

comment by Polymeron · 2011-05-04T13:28:40.218Z · LW(p) · GW(p)

Heck, you don't need a million Euros. I could easily blow minds with 100.

A simple Zippo lighter should do the trick. So could an adjustable-beam flashlight, for that matter. A music player with earphones or speakers is another obvious choice. Candy bars maybe? They'd be shocked you brought ambrosia...

Pretty much anything that emits light, sound, heat, cold, etc. is likely to have some serious impact. Remember superstimulus.

Replies from: RobinZ
comment by RobinZ · 2011-05-04T14:40:58.756Z · LW(p) · GW(p)

Ultimately, I suppose the key question is, "how long do you need to keep up the act?"

Replies from: Polymeron
comment by Polymeron · 2011-05-04T17:34:05.056Z · LW(p) · GW(p)

With a budget closer to 5,000EUR, access to firearms, and enough willingness to use Dark Arts, I could probably keep it up for a decade or more. Possibly even pass on knowledge to selected disciples who would likewise guard these technological secrets, even as they rule the ignorant peasants.

If, on the other hand, our purpose is as originally stated - prove to the scholars of the time that I have knowledge worthy of them becoming my disciples so I can impart as much knowledge to them as possible - I probably won't need much more than the superstimuli I described and a couple of afternoons. Something decidedly useful could cement this, and still on budget: a map of the world, a geographically appropriate taxonomy book, and a wristwatch (doubles as a nautical navigational aid) would be enough. And I don't think I've even reached 100EUR yet, all told :)

What we can achieve with today's technology is so marvelous, it's amazing how ordinary it seems to us. One day I turned on the faucet at my house and just marveled at the incredible and unlikely wonder of having fresh drinking water at practically limitless capacity being instantly transported to my residence, at my whim.

This isn't just magic. It's better than magic.

comment by jeronimo196 · 2020-02-18T08:50:18.273Z · LW(p) · GW(p)

A gun could blow minds in any era.

I'm sorry, I couldn't help myself.

comment by NancyLebovitz · 2010-04-20T00:34:26.038Z · LW(p) · GW(p)

One more thing beside a time machine, knowledge of ancient Greek, and a stash of cool stuff-- the ability to argue well enough to convey your ideas to Aristotle and convince him you're right.

This is probably at least as hard as it sounds.

comment by Document · 2011-07-23T22:41:22.752Z · LW(p) · GW(p)

See also: www.justfuckinggoogleit.com, www.lmgtfy.com, Reddit anti-"repost" rage and the comments like this that appear in practically every online community.

comment by Arandur · 2011-07-31T19:22:46.006Z · LW(p) · GW(p)

This is the reason it's a Bad Thing that so many of the deeper concepts of Mormonism have become public knowledge. The first question I get asked, upon revealing that I'm a Mormon, is often, "So, you believe that if you're good in this life, you'll get your own planet after you die?" There are at least three huge problems with this question, and buried deep beneath them, a tiny seed of truth. But I can't just say "The inferential distance is too great for me to immediately explain the answer to that question. Let me tell you about the Plan of Salvation, and we'll move from there," because that sounds like I'm Trying To Convert You, which is a Scary and a Bad Thing, because... out of explanations come brainwashing. Or something.

2 Nephi 28:30:

30 For behold, thus saith the Lord God: I will give unto the children of men line upon line, precept upon precept, here a little and there a little; and blessed are those who hearken unto my precepts, and lend an ear unto my counsel, for they shall learn wisdom; for unto him that receiveth I will give more...

Replies from: Alicorn
comment by Alicorn · 2011-07-31T19:39:03.558Z · LW(p) · GW(p)

The "your own planet" thing isn't a huge selling point that you'd want to lead with?

Replies from: Arandur
comment by Arandur · 2011-07-31T19:52:14.335Z · LW(p) · GW(p)

Ha! I'd never thought of it like that! :3 Unfortunately, I have a problem with the idea of "selling" a religion. Just because you like an idea doesn't mean it's true...

Besides, the type of person who bothers saying "You get your own planet?" instead of "You're religious?" usually views getting your own planet as The Ultimate Sacrilege, so it's not the best selling point, no. :/

comment by Michelle_Z · 2011-09-23T23:56:26.155Z · LW(p) · GW(p)

This is one of those things that seems so obvious once I think about it, but until it was pointed out to me, I would never have seen it.

comment by gwern · 2012-07-12T18:22:37.668Z · LW(p) · GW(p)

To Mazur’s consternation, the simple test of conceptual understanding showed that his students had not grasped the basic ideas of his physics course: two-thirds of them were modern Aristotelians...“I said, ‘Why don’t you discuss it with each other?’” Immediately, the lecture hall was abuzz as 150 students started talking to each other in one-on-one conversations about the puzzling question. “It was complete chaos,” says Mazur. “But within three minutes, they had figured it out. That was very surprising to me—I had just spent 10 minutes trying to explain this. But the class said, ‘OK, We’ve got it, let’s move on.’...More important, a fellow student is more likely to reach them than Professor Mazur—and this is the crux of the method. You’re a student and you’ve only recently learned this, so you still know where you got hung up, because it’s not that long ago that you were hung up on that very same thing. Whereas Professor Mazur got hung up on this point when he was 17, and he no longer remembers how difficult it was back then. He has lost the ability to understand what a beginning learner faces.”

http://harvardmagazine.com/2012/03/twilight-of-the-lecture

comment by private_messaging · 2012-07-12T18:47:57.607Z · LW(p) · GW(p)

If there is a probability of faulty inference at each step, then longer inferences are exponentially less likely to be valid, with the achievable length of a valid inference chain proportional to the logarithm of the process fidelity. Long handwaved inferences can have unbelievably low probability of correctness, and thus be incredibly weak as evidence.

Furthermore, informal arguments very often rely on 'I can't imagine an alternative' in multiple of their steps, and this itself has proven unreliable. It is also too easy to introduce, deliberately or otherwise, a huge number of implicit assumptions, all of which must be true for the argument to be valid.

With this logarithmic dependence on the fidelity of the argument, even dramatically more reliable informal arguments do not permit dramatically longer inference chains. One has to use formal methods to produce long inference chains.
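The exponential-decay argument above can be sketched numerically. The per-step reliabilities here are my own illustrative numbers, not figures from the comment:

```python
# If each inferential step is valid with probability p, an n-step chain of
# independent steps is valid with probability p**n -- exponential decay.

import math

def chain_validity(p: float, n: int) -> float:
    """Probability that an n-step chain is valid, assuming independent steps."""
    return p ** n

def max_chain_length(p: float, target: float) -> int:
    """Longest chain whose validity probability still meets `target`."""
    # p**n >= target  <=>  n <= log(target) / log(p)
    return math.floor(math.log(target) / math.log(p))

# An informal argument at 90% per-step reliability collapses quickly:
print(chain_validity(0.9, 20))       # ~0.12
# Raising per-step fidelity stretches the feasible chain enormously:
print(max_chain_length(0.9, 0.5))    # 6 steps
print(max_chain_length(0.999, 0.5))  # 692 steps
```

Going from 90% to 99.9% per-step reliability lets the chain grow from 6 steps to nearly 700 before its overall validity drops below even odds, which is the sense in which only formal methods sustain long inference chains.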

comment by Insert_Idionym_Here · 2012-09-11T04:58:50.548Z · LW(p) · GW(p)

The lack of this knowledge got me a nice big "most condescending statement of the day award" in lab a year ago.

comment by lukeprog · 2013-05-14T20:33:19.043Z · LW(p) · GW(p)

I don't think this is quite right, but taking up the challenge may be helpful when writing:

I have always figured that if I can't explain something I'm doing to a group of bright undergraduates, I don't really understand it myself, and that challenge has shaped everything I have written.

Daniel Dennett

comment by Mass_Driver · 2016-12-23T10:08:49.201Z · LW(p) · GW(p)

And if you think you can explain the concept of "systematically underestimated inferential distances" briefly, in just a few words, I've got some sad news for you...

"I know [evolution] sounds crazy -- it didn't make sense to me at first either. I can explain how it works if you're curious, but it will take me a long time, because it's a complicated idea with lots of moving parts that you probably haven't seen before. Sometimes even simple questions like 'where did the first humans come from?' turn out to have complicated answers."

Replies from: snewmark
comment by snewmark · 2016-12-23T14:01:25.469Z · LW(p) · GW(p)

Sometimes even simple questions like 'where did the first humans come from?' turn out to have complicated answers

Of course it's not actually a simple question; it's really a broad inquiry. In fact, it doesn't even need to have an answer, and even when it does, it usually alters the question slightly... the hard part is asking the right questions, not finding the answer.

(It just dawned on me that this was the whole point of The Question in The Hitchhiker's Guide to the Galaxy, thanks for that.)

comment by enye-word · 2017-05-31T07:03:28.338Z · LW(p) · GW(p)

And if you think you can explain the concept of "systematically underestimated inferential distances" briefly, in just a few words, I've got some sad news for you...

"This is going to take a while to explain."

Did I do it? Did I win rationalism?!

Replies from: TheAncientGeek, luke-allen
comment by TheAncientGeek · 2017-05-31T15:15:49.700Z · LW(p) · GW(p)

“If you understood everything I said, you’d be me” ― Miles Davis

comment by Luke Allen (luke-allen) · 2019-11-06T21:39:02.921Z · LW(p) · GW(p)

I'd go with "echo chambers." Or if I weren't feeling pedantic, I'd say "There's a reason this concept takes a whole semester to teach."

comment by arisen · 2017-07-02T23:10:39.964Z · LW(p) · GW(p)

Expecting short inferential distances: wouldn't that be a case of rational thought producing beliefs which are themselves evidence? :P Manifested in over-explaining to the point of cognitive dissonance? How about applying Occam's Razor and going the shorter distance: improve the clarity of the source by means of symbolism through a reflective correction (as if to compensate for the distortion in the other lens). To me it means to steel-man the opponent's argument to the point where it becomes non-falsifiable. See, the fact that science works by falsification and pseudoscience by verification puts them in different paradigms that will only be reconciled by verification alone. Meaning also, science will have value because it can predict, so who cares about its inner workings of reason! This to me makes sense, because right now we seem to rank our intelligence superior to that of a virus, which is a problem of underestimating your enemy :). We are Neurotribes; autistic kids, for example, think in pictures; a different type of intelligence may be emerging, maybe one without beliefs :)

"It is a profoundly erroneous truism, repeated by all copy-books and by eminent people when they are making speeches, that we should cultivate the habit of thinking of what we are doing. The precise opposite is the case. Civilization advances by extending the number of important operations which we can perform without thinking about them. Operations of thought are like cavalry charges in a battle — they are strictly limited in number, they require fresh horses, and must only be made at decisive moments." Alfred North Whitehead

comment by Jman9107 · 2019-11-27T21:06:56.475Z · LW(p) · GW(p)

that last sentence ha

comment by Blo · 2020-12-01T12:37:38.619Z · LW(p) · GW(p)

I think this concept may be fundamental in explaining the mechanics of cohesion and division in society. It could help us understand why politics tends to get more and more divided. Especially on the internet, but also IRL, people tend to confirm their ideas rather than confront them with different ones, as first observed by Wason (Peter C. Wason, "On the failure to eliminate hypotheses in a conceptual task", Quarterly Journal of Experimental Psychology, 1960), and confirmed since. One could argue that reinforcing one's ideas (the ideas of a person or a community), building a shield of protective arguments around them — whether these arguments are solid and rational or merely whatever means are available to deter enemy attacks — is comparable to building ever more steps between oneself and the people attacking those ideas, trying to change or oppose them.

When it comes to people who have radicalized themselves to the point that they refuse to accept reality, in the sense of something we all agree to build from (global warming is real and manmade, there is no such thing as "races" in humanity, if it's raining it is not not raining, etc.), one could say they have built not inferential steps but an inferential wall. Conspiracy theorists, for instance, or at least some of them, share some mechanisms with people suffering from schizophrenia, in the sense that they will actually take arguments against their position as reinforcing the very position being contradicted, thinking that counterarguments are just trying to prevent them from discovering the truth, or that all people disagreeing with them are part of the conspiracy. It is as if they are building new steps while one tries to cross the already existing ones.

Nevertheless, this grim presentation of mine shouldn't undermine the one great thing about this concept of inferential distance: if the distance can be divided into steps, we can all reach each other, even people who are staircases away.

comment by Snazster · 2020-12-18T18:10:20.041Z · LW(p) · GW(p)

It has been nearly a century since relativity and quantum physics took the stage; we have gathered considerably more data and filled in a great many areas since then, yet we are still waiting for the next big breakthrough.

The problem may be in the way we approach scientific discovery.

Currently, when a new hypothesis is advanced, it is carefully considered as to how well it stands on its own and whether, eventually, it is worthy of being adopted as a recognized theory and becoming a part of our overall model of how everything works.

This might be the problem. Suppose the next advance cannot be made in this fashion? By this I am proposing that the next breakthrough may involve not a single new concept, that can be tested independently for worthiness, but several that cannot be tested individually.

For example, when you build a stone wall, you can test its strength and stability with each new stone placed. When you build a stone arch, attempting to test its strength and stability after each piece is placed will get you labeled as incompetent, because the uppermost pieces will always fall if you merely try to set them in place one at a time. Insisting that several pieces must be placed at once before the structure can be tested is necessary, yet in theoretical physics it will get you labeled as a crackpot.

For example, suppose one were to approach a transportation company with a radical new idea on how to improve airplanes, and even air travel in general. The company would want to test it and validate the concept before adoption. But suppose you told them that the idea could not even be evaluated unless it simultaneously includes the research and development of radical new ideas in such seemingly unrelated fields as personnel management, inventory control, and submarine transports. Chances are they would politely (or perhaps not politely) decline any further involvement.

Yet that is precisely the problem with the Standard Model.

We have conundrums in explaining consciousness, the dual-slit experiment, Schrodinger's cat, the number of spatial dimensions required, the expansion of the universe, the acceleration of the expansion of the universe, dark matter, dark energy, quantum uncertainty, the speed of light, singularities, the Big Bang, the heat-death (or Big Chill) of the universe, the Big Rip, gravity, entropy, and the list continues. Anyone that attempts to address more than one or two of these things at one time is likely to be dismissed at once as a crackpot. 

Yet it is fairly widely believed that Einstein's classic papers, submitted in today's climate, would go straight to the crackpot slush pile.

There is also the problem that, for a proposed idea in physics to be given a hearing of any sort, advanced degree work in physics, with appropriately degreed instructors and sponsors, is almost invariably required. A paid position in the field is very nearly a prerequisite as well. Further, given the preoccupation with string theory that has consumed so many physicists, and so restricted the opportunities of those who are not adherents . . . I've heard there may be fewer than 200 individuals in the world employed as theoretical physicists who are not dedicated to string theory (which doesn't seem to yield useful results in terms of advancing or redefining the Standard Model).

Additionally, given that they all go through a similar process to become recognized theoretical physicists, it almost certainly channels and colors their thinking on the subject, which is the same thing as saying that it limits them.