Map Errors: The Good, The Bad, and The Territory

post by orthonormal · 2020-06-27T05:22:58.674Z · score: 24 (9 votes) · LW · GW · 4 comments

What happens when your map doesn't match the territory?

There's one aspect of this that's potentially very helpful to becoming a rationalist, and one aspect that's very dangerous. The good outcome is that you could understand map errors more deeply; the dangerous outcome is that you could wind up stuck somewhere awful, with no easy way out.

The first version, where you notice that the map is wrong, comes when the map is undeniably locally wrong. The map says the path continues here, but instead there's a cliff. (Your beliefs strongly predict something, and the opposite happens.)

The ordinary result is that you scratch out and redraw that part of the map – or discard it and pick up an entirely different map – and continue along the new path that looks best. (You decide you were wrong on that one point without questioning any related beliefs, or you convert to a completely different belief system which was correct on that point.)

The really valuable possibility is that you realize that there are probably other errors besides the one you've seen, and probably unseen errors on the other available maps as well; you start to become more careful about trusting your maps so completely, and you pay a bit more attention to the territory around you.
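The intuition above — one confirmed error is evidence about the map's overall error rate — can be made quantitative. Here's a toy Bayesian sketch (my own illustration, not anything from the post, and the Beta(1, 9) prior is an arbitrary assumption standing in for "I mostly trusted my map"):

```python
# Toy model: each region of the map is independently wrong with some
# unknown rate p, and we hold a Beta(1, 9) prior on p (~10% expected errors).
a, b = 1.0, 9.0                      # Beta prior pseudo-counts
prior_mean = a / (a + b)             # expected error rate before any evidence

# We walk off a cliff: one region is undeniably wrong. Conjugate update
# for a Beta prior with one observed error and no observed successes:
a_post, b_post = a + 1.0, b
posterior_mean = a_post / (a_post + b_post)

print(f"P(a given other region is wrong): {prior_mean:.2f} -> {posterior_mean:.2f}")
# prints: P(a given other region is wrong): 0.10 -> 0.18
```

The point is only directional: seeing one error should raise your credence that the *unseen* parts of the map are also wrong, which is exactly the "pay a bit more attention to the territory" response.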

This is a really important formative experience for many rationalists: discovering that an Obvious But False Belief is false, and then asking what else on the map might be wrong.

(For me the Obvious But False Belief was about religion; for others it was politics, or an academic field, or even their own identity.)


Now, the dangerous outcome – getting trapped in a dismal swamp, with escape very difficult – comes when you never see an undeniable local map failure, so you never notice (or never have to admit) that the map isn't matching the territory very well until it's too late.

(I'm thinking of making major life decisions badly, where you don't notice or admit the problem until you're trapped in a situation where every option is a disaster of some sort.)

Sometimes you really do need to make bold plans based on your beliefs; how can you do so without taking a big risk of ending up in a swamp?

I suggest that you should ensure things look at least decent, according to a more "normal" map, while trying to do very well on yours. That is, make sure that your bold plan fails gracefully if the more normal worldview around you is correct. (Set up your can't-miss startup such that if it fails, you're back to the grind rather than in debt to the Mob.)
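The rule here is a constrained optimization: maximize value on your own map, subject to clearing a "decent" floor on the normal map. A minimal sketch, with made-up plan names and scores purely for illustration:

```python
# Toy decision rule (my framing, not the post's): among candidate plans,
# pick the one that scores best on YOUR map, restricted to plans that are
# at least "decent" on the normal map.
plans = {
    # name: (value_on_my_map, value_on_normal_map)
    "safe job":           (2, 5),
    "startup, bootstrap": (8, 3),
    "startup, mob loan":  (9, -10),
}
DECENT = 0  # floor: the normal map must not predict outright disaster

viable = {name: v for name, v in plans.items() if v[1] >= DECENT}
best = max(viable, key=lambda name: viable[name][0])
print(best)  # prints: startup, bootstrap
```

Note that the mob-loan plan scores highest on "my" map but is filtered out first: the floor on the normal map is a hard constraint, not a term you trade off against upside.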

And get advice. Always get advice from people you trust and respect before doing something very uncommon. I could try to fit this into the map framework, but it's just common sense, and far too many good people neglect it regardless.

Best of luck adventuring out there!


Comments sorted by top scores.

comment by TAG · 2020-06-28T22:20:44.195Z · score: 1 (1 votes) · LW(p) · GW(p)

What happens when your map doesn’t match the territory?

What does it mean for the map not to match the territory?

  1. The map is less detailed than the territory?

  2. The map does not predict correctly?

  3. The map predicts, but only under limited circumstances?

  4. The map predicts well, but nonetheless does not correspond to the territory?

The really pessimistic possibility is that you can't avoid all the problems simultaneously.

comment by orthonormal · 2020-06-29T03:27:58.976Z · score: 2 (1 votes) · LW(p) · GW(p)

#1 is just inevitable in all but a few perfectly specified domains. The map can't contain the entire territory.

#2 is what I'm discussing in this post; it's the one we rationalists try most to notice and combat. (Beliefs paying rent and all.)

#3 is fine; I'm not as worried about [maps that admit they don't know what's beyond the mountain] as I'm worried about [maps that fabricate the territory beyond the mountain].

#4: For sufficiently perfect predictive power, the difference between map and territory becomes an epiphenomenon, so I don't worry about this either.

comment by Rudi C (rudi-c) · 2020-06-27T18:02:08.750Z · score: 1 (1 votes) · LW(p) · GW(p)

I can summarize this post as follows:


There is always a danger of being overconfident in our beliefs. So it is a very good idea to take the conventional wisdom seriously when we are dealing with high-stakes situations, and plan in a way that won’t lead to a disaster according to the outside view.

comment by orthonormal · 2020-06-27T19:03:14.761Z · score: 2 (1 votes) · LW(p) · GW(p)

That more or less covers the advice at the end, but the rest of my post feels very valuable to my model of rationality.