Intuition should be applied at the lowest possible level

post by Rafael Harth (sil-ver) · 2018-02-27T22:58:42.000Z · LW · GW · 9 comments

Earlier today I lost a match of Prismata, a turn-based strategy game without RNG. When I analyzed the game, I discovered that changing one particular decision I had made on one turn from A to B would have let me win comfortably. A and B had seemed very close to me at the time, and even knowing for a fact that B was far superior, it still wasn't intuitive why.

Then I listed the main consequences of A and B, valued those by intuition, and B immediately looked far better.

One can model these problems on a bunch of different levels, where going from level n to n+1 means hiding the details of level n and approximating their results in a cruder way. On level 1, one would compare the two subtrees whose roots are decisions A and B (this should work just like in chess). Level 2 would be to track exact resource and attack numbers over the subsequent turns, level 3 to categorize the main differences between A and B and give them intuitive values, and level 4 to decide between A and B directly. What my mistake showcases is that, even in a context where I am quite skilled and which has limited complexity, applying intuition at level 4 instead of level 3 led to a catastrophic error.
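
To make the contrast between level 3 and level 4 concrete, here is a minimal sketch in Python. The option names, consequence categories, and all the numbers are invented purely for illustration; the point is only the structure: the level-4 call scores each option with one gut judgment, while the level-3 call lists the main consequences of each option and applies intuition to those pieces instead.

```python
# Hypothetical sketch of "level 4" vs. "level 3" intuition (all names and
# numbers are made up; only the structure matters).

# Level 4: apply intuition directly to the whole decision.
def gut_score(option: str) -> float:
    """One holistic intuitive judgment per option."""
    return {"A": 6.0, "B": 5.5}[option]  # A and B feel "very close"

# Level 3: list the main consequences of each option, then apply intuition
# to each consequence separately and add up the values.
consequences = {
    "A": {"extra economy next turn": +2.0, "defender left exposed": -3.5},
    "B": {"slightly slower economy": -1.0, "key attacker survives": +4.0},
}

def decomposed_score(option: str) -> float:
    return sum(consequences[option].values())

for option in ("A", "B"):
    print(option, gut_score(option), decomposed_score(option))
# Level 4 says the options are nearly tied; level 3 says B is clearly better.
```

The specific numbers are not what matters; forcing the decomposition is what catches the level-4 error.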

If you can't go lower, fine. But there are countless cases of people using intuition on a level that's unnecessarily high. Hence: if it's worth doing, it's worth doing with made-up numbers. That is just one example of where applying intuition one level further down (asking "what quantity of damage arises from this?" rather than "how bad is it?") can make a big difference. On questions of medium importance, briefly asking yourself "is there any point where I apply intuition on a level that's higher than necessary?" seems like a worthwhile exercise.

Meta: I write this in the spirit of valuing obvious advice, and out of the suspicion that this error is still made fairly often.

9 comments

comment by gwillen · 2018-02-28T10:23:52.332Z · LW(p) · GW(p)

I think this is a really interesting framing. I like to do something that seems related but slightly different. Where I see what you're describing as something like "explicitly (System 2) take something down one level (potentially into smaller pieces), and apply intuition (System 1) to each of the pieces", I like to do "explicitly (System 2) consider the problem from a number of different angles / theories, try applying intuition (System 1) to each angle, and see whether the results agree or how they differ."

Replies from: gwillen
comment by gwillen · 2018-02-28T10:28:29.924Z · LW(p) · GW(p)

To give an example, because I think I'm being too abstract: If I am thinking of making an investment decision, I won't just query my intuition "is this a good investment?" because it doesn't necessarily have useful things to say about that. Instead I will query it "how does this seem to compare to an equity index fund", and "what does an adequacy analysis say about whether there could plausibly be free money here", and "how does this pattern-match against scams I'm familiar with", and "what does the Outside View say happens to people who make this type of investment", and "what does Murphyjitsu predict I will regret if I invest thusly?" This seems similar to your described approach, if not quite the same.

Replies from: sil-ver
comment by Rafael Harth (sil-ver) · 2018-02-28T12:48:42.676Z · LW(p) · GW(p)

That looks like a more general approach to me, where going one level deeper could be one of the angles considered, and appealing to the outside view another.

comment by Said Achmiz (SaidAchmiz) · 2018-02-28T01:48:31.181Z · LW(p) · GW(p)

This looks like an effect of computational costs, not a strategic mistake. Listing the results of two decisions costs time / cognitive effort (i.e., computation); applying a heuristic (intuitively comparing action A to action B) is computationally cheaper, but—as you discovered—more error-prone.

Thus, though you chide people for “using intuition on a level that’s unnecessarily high” [emphasis mine], in fact applying intuition (i.e. heuristics) on a higher level may be quite necessary, for boundedness-of-rationality / computational-cost reasons.

Replies from: sil-ver
comment by Rafael Harth (sil-ver) · 2018-02-28T08:06:47.358Z · LW(p) · GW(p)

That's why I said it should be used "on questions of medium importance". For small recurring decisions, the computational cost could be too high, and for life-changing decisions, one would hopefully have covered this ground already (although on reflection, there are probably counter-arguments here, too). But for everything that we explicitly spend some time on anyway, not bothering to list consequences seems like a strategic mistake to me. Even in the example I used, with only 45 seconds available per turn, I had enough time to do this. And I did spend some time on this decision; I just used it to double- and triple-check with my intuition, rather than going lower.

comment by Teja Prabhu (0xpr) · 2018-03-01T00:14:30.115Z · LW(p) · GW(p)

I reflexively tried to reverse the advice, and found it surprisingly hard to think of situations where applying higher-level intuition would be better.

There's an excerpt by chess GM Mikhail Tal:

We reached a very complicated position where I was intending to sacrifice a knight. The sacrifice was not obvious; there was a large number of possible variations; but when I began to study hard and work through them, I found to my horror that nothing would come of it. Ideas piled up one after another. I would transport a subtle reply by my opponent, which worked in one case, to another situation where it would naturally prove to be quite useless. As a result my head became filled with a completely chaotic pile of all sorts of moves, and the infamous "tree of variations", from which the chess trainers recommend that you cut off the small branches, in this case spread with unbelievable rapidity.
And then suddenly, for some reason, I remembered the classic couplet by Korney Ivanovic Chukovsky: "Oh, what a difficult job it was. To drag out of the marsh the hippopotamus". I don't know from what associations the hippopotamus got into the chess board, but although the spectators were convinced that I was continuing to study the position, I, despite my humanitarian education, was trying at this time to work out: just how WOULD you drag a hippopotamus out of the marsh ? I remember how jacks figured in my thoughts, as well as levers, helicopters, and even a rope ladder. After a lengthy consideration I admitted defeat as an engineer, and thought spitefully to myself: "Well, just let it drown!" And suddenly the hippopotamus disappeared. Went right off the chessboard just as he had come on ... of his own accord!
And straightaway the position did not appear to be so complicated. Now I somehow realized that it was not possible to calculate all the variations, and that the knight sacrifice was, by its very nature, purely intuitive. And since it promised an interesting game, I could not refrain from making it.

But this is a somewhat contrived example, since it is reminiscent of the pre-rigor, rigor, and post-rigor phases of mathematics (or, more generally, of mastering any skill). And one could argue that chess GMs have so thoroughly mastered the lower levels that they can afford to skip them without making catastrophic errors.

Another example that comes to mind is from Marc Andreessen in the introduction to Breaking Smart:

In 2007, right before the first iPhone launched, I asked Steve Jobs the obvious question: The design of the iPhone was based on discarding every physical interface element except for a touchscreen. Would users be willing to give up the then-dominant physical keypads for a soft keyboard?
His answer was brusque: “They’ll learn.”

It seems quite clear that Jobs wasn't applying intuition at the lowest level here. And it seems like the end result could have been worse if he had applied intuition at lower levels. He even explicitly says:

You can't connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something - your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.

I find neither of the examples I came up with convincing. But are there circumstances where applying intuition at lower levels is a strategic mistake?

Replies from: SaidAchmiz, SaidAchmiz
comment by Said Achmiz (SaidAchmiz) · 2018-03-01T00:36:36.648Z · LW(p) · GW(p)

I find neither of the examples I came up with convincing. But are there circumstances where applying intuition at lower levels is a strategic mistake?

Applying intuition at lower levels is a strategic mistake when you are substantially more certain that your high-level intuition is well-honed, than you are of your ability to explicitly decompose the high level into lower-level components.

(It can also be a strategic mistake for computational-cost reasons, as I outline in my other comment.)

comment by Said Achmiz (SaidAchmiz) · 2018-03-01T00:45:33.599Z · LW(p) · GW(p)

Also, why is the Steve Jobs example unconvincing? It seems, in fact, to be an example of the sort of thing I am talking about.

Here’s something that Bruce Tognazzini (HCI expert and author of the famous Apple Human Interface Guidelines) said about Steve Jobs:

Steve Jobs was also one of the greatest human-computer interaction designers of all time, though he would have adamantly denied it. (That’s one of Apple’s problems today. They lost the only HCI designer with any power in the entire company the day Steve died, and they don’t even know it.)

Had you asked Steve Jobs to break down his intuitions into lower-level components, and then evaluate those, he might well have failed. And yet he made incredible, groundbreaking, visionary products, again and again and again. He had good reason to be confident in his high-level intuitions. Why would he want to discard those, and attempt a lower-level analysis?

Replies from: 0xpr
comment by Teja Prabhu (0xpr) · 2018-03-01T01:31:30.164Z · LW(p) · GW(p)

I had worded it somewhat poorly; I wasn't intending to say that Steve Jobs should have attempted a lower-level analysis in technology design.

I just found it unconvincing in the sense that I couldn't think of an example where applying intuition at lower levels was a strategic mistake for me in particular. As you mention in your other comment, I am not substantially more certain that my high-level intuition is well-honed in any particular discipline.

More generally, Steve Jobs consistently applied high-level intuition to big life decisions too, as evidenced by his commencement speech. On the whole it worked out for him, I guess, but he also tried to cure his cancer with alternative medicine, which he later regretted.

I completely agree with your computational tradeoff comment though.