post by [deleted]
comment by Ben Pace (Benito) ·
2017-10-12T19:37:42.240Z
I agree with the central claim of this post: if you explain a complex idea quickly or badly to someone, this will often cause them to cache a misrepresentation of it and be more likely to dismiss your argument in the future, because they round it off to something worse. In general I really like posts that add together two previously discussed concepts to show how they build on each other.
I'd just like to register disagreement with this part in your explanation of inferential distance:
If you and I grew up in the same town, went to the same schools, have the same color of skin, have parents in the same economic bracket who attended the same social functions, and both ended up reading Less Wrong together, the odds are that the inferential distance between us for any given set of thoughts is pretty small. If I want to communicate some new insight to you, I don't have to reach out very far.
This feels like a large overweighting of group-identity factors relative to things like 'whether you've ever studied a technical subject' or 'how often you feel curious'. I don't know if you intended to make such a strong claim, but whether you can grok 'Moloch' as a concept seems much more dependent on whether you've taken an econ class than on most of the things you mention.
comment by Jayson_Virissimo ·
2017-10-12T20:20:28.051Z
I had exactly the same reaction: the inferential distance between me and my childhood friends completely dwarfs that between me and the typical member of the coding bootcamp I attended in San Francisco, with whom I share almost none of those features.
comment by magfrump ·
2017-10-16T20:05:32.043Z
My experience is that I have different inferential gaps between people who share different features with me. In most of my life and conversations, the biggest gaps are coming from things like reading the sequences. But among people who are similar on a lot of levels, coming from a different socio-economic background means we have other inferential gaps, especially in terms of the way we handle money or define personal success.
I do agree that mentioning some factors and not others implies a weighting, and I don't mean to endorse that exact implicit weighting 100%. I just want to note that we focus on some inferential gaps in our lives and not others for lots of reasons, and that there are many gaps arising from the factors he did mention that are important, large, and often invisible.
comment by Hazard ·
2017-10-12T18:57:49.918Z
A takeaway could be to get in the habit of checking in with all parties involved to see whether they have the time and mental bandwidth to really discuss the issue, and if not, to defer the conversation to a later date. This would be easier in a community like LW, but I think it could still be fluidly pulled off in various other social contexts.
I think the key to pulling it off would be making it clear that the issue is being deferred only because you want to give it your full attention and consideration, not because you are trying to avoid confrontation.
comment by Conor Moreton ·
2017-10-12T19:31:04.142Z
+1 for a concrete operationalization. In fact, I think this may be the impulse that's behind behavior (from some of my allies) that I've previously felt annoyed by, and I expect to react more charitably as a result of your comment.
comment by jsalvatier ·
2018-03-19T18:06:02.386Z
Thanks for writing this. Inferential distance plus inoculation is a huge problem for transmitting large bodies of understanding in domains that previously didn't look like domains to the student. The student frequently gets a smaller version of the ideas before they can get the full version, and that shuts off further interest because they feel they've "got it".
comment by whales ·
2017-10-15T16:59:00.723Z
Also in favor of not only reserving judgment but ideally deferring exposure until one can seriously evaluate things: You Can't Not Believe Everything You Read. And then there's the mere-exposure effect to worry about, especially from prolific authors or in environments with a lot of repetition. (This is again the odd situation where apparently opposite biases show up in similar circumstances, and it may not be obvious which direction you'll be pulled. In this case I'd guess it depends on one's initial disposition and the level of conscious attention the idea is getting. In particular, inferential distance isn't the determinant: with the illusion of transparency, the gap can go unrecognized by either party and lead to unjustified agreement. Luckily, one is led to similar reading/discussion policies either way.)
comment by Ben Pace (Benito) ·
2017-10-13T14:45:53.728Z
Promoted to Featured for showing the connection between ideas previously discussed in the community. This helped me understand the earlier ideas better, and it was a useful post in its own right too.
comment by Chris_Leong ·
2017-10-13T01:01:11.899Z
Thanks for introducing the term "idea inoculation" to Less Wrong. We've had this idea in the community since The Cowpox of Doubt, but I don't think we've had a catchy term for it.
(Also: when trying to introduce a new term into usage, it's important to have a canonical article with a clear explanation and the term in the title.)