Generic advice caveats

post by Saul Munn (saul-munn) · 2024-10-30T21:03:07.185Z · LW · GW · 1 comment

This is a link post for https://www.brasstacks.blog/caveats/


You were (probably) linked here from some advice. Unfortunately, that advice has some caveats. See below:

from dynomight's essay on advice.

Oh, and one last caveat: all of the above apply to all of the above.

1 comment


comment by Parker Conley (parker-conley) · 2024-11-07T20:20:41.683Z · LW(p) · GW(p)

Another caveat:

  • I am believable and have expertise in very few major life skills, and possibly don't have expertise in the thing you're asking advice for.

Related note: I think developing the skill of identifying believability and expertise is very powerful (though I have only been applying this skill explicitly for a couple of years; caveat emptor, lol).


Here's Cedric Chin outlining believability as defined by Ray Dalio:

Technique summary:
Believable people are people who 1) have a record of at least three relevant successes and 2) give great explanations of their approach when probed.

You may evaluate a person's believability on the subject matter at hand by applying this heuristic. When interacting with them:

  1. If you’re talking to a more believable person, suppress your instinct to debate and instead ask questions to understand their approach. This is far more effective in getting to the truth than wasting time debating.
  2. You’re only allowed to debate someone who has roughly equal believability to you.
  3. If you’re dealing with someone of lower believability, spend the minimum amount of time needed to see if they have objections you hadn’t considered before. Otherwise, don’t spend much time on them.
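The three rules above amount to a small decision procedure. As a toy illustration only (the numeric `believability` scores are a hypothetical stand-in of mine; Dalio's actual criteria are qualitative), the rule could be sketched as:

```python
def interaction_mode(my_believability: int, their_believability: int) -> str:
    """Toy sketch of Dalio's believability heuristic.

    Scores are hypothetical integers standing in for the qualitative
    test (three relevant successes + good explanations when probed).
    """
    if their_believability > my_believability:
        # Rule 1: suppress the instinct to debate; ask questions instead.
        return "ask questions"
    elif their_believability == my_believability:
        # Rule 2: debate is allowed between roughly equal parties.
        return "debate"
    else:
        # Rule 3: briefly check for objections you hadn't considered.
        return "briefly check for novel objections"
```

Of course, the hard part in practice is estimating believability at all, not applying the rule.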

Here's Gary Klein, founder of Naturalistic Decision Making, outlining seven dimensions of expertise:

We want pragmatic guidelines for deciding which, if any, purported experts to listen to when making a difficult and important decision. How can we know who is really credible?

Bottom line: We cannot know for sure. There are no iron-clad criteria.

However, there are soft criteria, indicators we can pay attention to. I have identified seven so far, drawing on papers such as Crispen & Hoffman, 2016, and Shanteau, 2015, and on suggestions by Danny Kahneman and Robert Hoffman. Even though none of these criteria are fool-proof, all of them seem useful and relevant:

(a) Successful performance—measurable track record of making good decisions in the past. (But with a large sample, some will do very well just by luck, such as stock-pickers who have called the market direction accurately in the past 10 years.)

(b) Peer respect. (But peer ratings can be contaminated by a person’s confident bearing or fluent articulation of reasons for choices.)

(c) Career—number of years performing the task. (But some 10-year veterans have one year of experience repeated 10 times and, even worse, some vocations do not provide any opportunity for meaningful feedback.)

(d) Quality of tacit knowledge such as mental models. (But some experts may be less articulate because tacit knowledge is by definition hard to articulate.)

(e) Reliability. (Reliability is necessary but not sufficient. A watch that is consistently one hour slow will be highly reliable but completely inaccurate.)

(f) Credentials—licensing or certification of achieving professional standards. (But credentials just signify a minimal level of competence, not the achievement of expertise.)

(g) Reflection. When I ask "What was the last mistake you made?" most credible experts immediately describe a recent blunder that has been eating at them. In contrast, journeymen posing as experts typically say they can't think of any; they seem sincere but, of course, they may be faking. And some actual experts, upon being asked about recent mistakes, may for all kinds of reasons choose not to share any of these, even ones they have been ruminating about. So this criterion of reflection and candor is not any more foolproof than the others.