On Being Robust

post by TurnTrout · 2020-01-10T03:51:28.185Z · LW · GW · 7 comments

Contents

  Concrete examples
  The general philosophy
7 comments

Inspired in part by Being a Robust Agent [LW · GW]. Flipside of Half-assing it with everything you've got.

Do you ever feel... fake? Like, at any minute, Scooby Doo and the gang might roll up and unmask you as a freeloading fraud impostor[1] in front of everyone?

There are a lot of things to say about the impostor syndrome on a psychological basis (the fears are often unrealistic / unmerited, etc). But I'd like to take another angle. For a few years, I've tried to just make a habit of being un-unmaskable. Although this is a useful frame for me, your mileage may vary.

My point isn't going to just be "do the things you know you should". I think we're often bad at judging when corners are okay to cut, so you probably do better just by having the policy of not cutting corners, unless it's extremely obviously alright to do so. That is, generally err against using scissors when confronted with corners, even if it makes sense in the moment.

Concrete examples

The general philosophy

This robustness is a kind of epistemic humility - it's the kind of reasoning that robustly avoids the planning fallacy, only generalized. It's the kind of reasoning that double-checks answers before turning in the test. It's best practices, but for your own life.

I try to live my mental life such that, if people could read my thoughts, they would think I'm doing things right. That doesn't mean I'm always being polite to people in my mind, but it means that I'm not being deceitful, or unfair, or secretly cutting corners on work I'm doing for them.[2]

Again, the point isn't "have good habits and be happy". The point is that I think we often cut too many corners, and so I recommend a policy which leans towards not cutting corners (even when it locally makes sense). The benefits for me have been twofold: getting better results, and feeling more secure about myself while getting those results.


  1. Ironically, the first draft of this spelled "impostor" as "imposter". ↩︎

  2. Naturally, I probably still fail sometimes, because I'm somewhat biased / unable to achieve full transparency into my thoughts. ↩︎

7 comments


comment by Raemon · 2020-01-10T05:57:50.797Z · LW(p) · GW(p)

Hmm, this all roughly makes sense, but I feel like there was some kind of important generator here that you were aiming to convey that I didn't get. 

I think you should probably do most of these things, but not sure which order to do them in, and meanwhile, I think so long as you're afraid of being unmasked, part of the problem seems like it's about the fear itself?

Replies from: TurnTrout
comment by TurnTrout · 2020-01-10T06:55:29.160Z · LW(p) · GW(p)

I think the important generator is: being robust seems like a solution to this "generalized planning fallacy"[1], where you don't correctly anticipate which corners should not be cut. So, even though you could theoretically excise some wasted motions by cutting pointless corners, you can't tell which corners are pointless. Therefore, a better policy is just not cutting corners by default.

I think you should probably do most of these things, but not sure which order to do them in,

TBC, the main point isn't that people should do these specific things per se, the main thing is the overall mindset.

and meanwhile, I think so long as you're afraid of being unmasked, part of the problem seems like it's about the fear itself?

This is what I was getting at with

There are a lot of things to say about the impostor syndrome on a psychological basis (the fears are often unrealistic / unmerited, etc). But I'd like to take another angle.

I think the fear itself is the key problem with impostor syndrome, and I wasn't trying to say "just be so good you feel secure" should be the main line of attack on that insecurity.


  1. I don't particularly like this name, but it's just a temporary handle. In the planning fallacy, you're overly optimistic when stress-testing your planned schedule. In this case, you're overly optimistic when stress-testing your life in a broader sense; being robust attempts to counter that. ↩︎

Replies from: Raemon, jmh
comment by Raemon · 2020-01-12T21:06:58.841Z · LW(p) · GW(p)

I think the important generator is: being robust seems like a solution to this "generalized planning fallacy"[1], where you don't correctly anticipate which corners should not be cut. So, even though you could theoretically excise some wasted motions by cutting pointless corners, you can't tell which corners are pointless. Therefore, a better policy is just not cutting corners by default.

Ah, that does make the point much clearer, thanks!

Replies from: TurnTrout
comment by TurnTrout · 2020-01-12T23:43:56.906Z · LW(p) · GW(p)

Awesome. I should also note this generator is post hoc; I (tried to) do this for a few years before I was even thinking about the planning fallacy.

comment by jmh · 2020-01-12T19:21:59.390Z · LW(p) · GW(p)

This reflects both a couple of comments I've made regarding rules versus analyzing/optimizing, as well as a very unclear thought that's been bouncing around in my head for a little while now. The thought is about the tendency of discussion here to be very formal and model-oriented, as if we really can optimize in our daily lives the way we do in the theoretical world of our minds and equations and probabilities. I'm not saying that approach is something to avoid, only that it does tend to set the focus on a degree of precision that probably does not obtain for most people in a majority of situations. (The recent post about acting on intuitions rather than the calculations, then tracking those over a year or two to see what you learn, fits here too.)

This rule-based approach clearly takes the case-by-case decision analysis away, if one buys into following the rule. However, we all know that in some cases you can get away with violating the rule (and perhaps even should violate it -- or revise it, as has been suggested in other threads/comments as I recall). At the same time, it can be difficult, as you mention, to know just which cases are good candidates for violating the general rule. I would add that it might not be enough to keep track of when we violate the rule and what the results were -- that probably hangs on just how representative the individual cases are and how well we can tell the differences in any truly complex case.

This all seems very paradoxical to me, and generally I can deal with paradox and a contradictory world (or perhaps just with it in my own behavior when I try viewing it from an outside perspective). Still, I find myself wanting to wrap my own thinking around the situation a bit better as I read various approaches or thoughts by people here.

comment by Vaughn Papenhausen (Ikaxas) · 2020-01-14T15:37:53.733Z · LW(p) · GW(p)

This seems like another angle on "Play in Hard Mode". Is that about right?

Replies from: TurnTrout
comment by TurnTrout · 2020-01-14T16:26:51.495Z · LW(p) · GW(p)

Yeah, I think that's quite close to this concept - thanks for the link.