When should you defer to expertise? A useful heuristic (Crosspost from EA forum)

post by Noosphere89 (sharmake-farah) · 2022-10-13T14:14:56.277Z · LW · GW · 3 comments

This is a link post for https://forum.effectivealtruism.org/posts/ngFdWkELusg5gAKdp/when-should-you-defer-to-expertise-a-useful-heuristic


One important part of the search for knowledge is knowing what, and whom, to defer to, since we must take much of what we know on trust in expertise. But how should you decide when to defer, given that entire fields can be badly wrong?

Well, I'll introduce some heuristics from Chris Hallquist that might help you defer better.

They can be used in the following ways:

  1. When an EA defers to a non-EA expert, or the movement as a whole defers to non-EA expertise.

  2. When a less-knowledgeable EA defers to a more knowledgeable EA on something.

  3. When someone outside a field defers to an insider expert.

Now, before I begin, I want to list some caveats:

  1. The heuristic only applies to non-moral fields.

  2. The heuristic assumes the field is sound. In an upcoming post, I'll talk about signs a field may have unsound bases, and what to expect there.

  3. It's not a replacement for expected-value (EV) calculations.

  4. If you're in a field, or plan to work in a cause area, it's best to set this heuristic aside in favor of this post by Emrik: The underappreciated value of original thinking below the frontier.

https://www.lesswrong.com/posts/KmkZriGwkn2vDx8gB/the-underappreciated-value-of-original-thinking-below-the

But let's begin.

Conclusion

When the data show an overwhelming consensus in favor of one view (say, if the share of dissenters is below the Lizardman's Constant of roughly 4%), this almost always ought to swamp any other evidence a non-expert might think they have on the issue.

When a strong but not overwhelming majority of experts favors one view, non-experts should take this as strong evidence for that view, but there's a greater chance that this evidence could be outweighed by other evidence (even from a non-expert's point of view).

When there is only a bare majority view among experts, or no agreement at all, this is much less informative than the previous two conditions. It may indicate that agnosticism is the appropriate attitude, but in many cases non-experts needn't hesitate to form their own opinions.

Expert opinion should be discounted when it could be predicted solely from information not relevant to the truth of the claims. This may be the only reliable, easy heuristic a non-expert can use to figure out that a particular group of experts should not be trusted.
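To make these four conditions concrete, here's a minimal sketch in Python. This is my own illustration, not code from Hallquist's post, and the cutoffs are assumptions: the roughly-4% Lizardman's Constant marks "overwhelming" consensus, and 75% is an arbitrary stand-in for a "strong" majority.

```python
def deference_advice(consensus_fraction: float,
                     predictable_from_irrelevant_info: bool = False) -> str:
    """Map the share of experts endorsing a view to a rough deference level.

    consensus_fraction: fraction of experts (0.0 to 1.0) endorsing the view.
    predictable_from_irrelevant_info: True if the experts' opinions can be
        predicted from information unrelated to the truth of their claims.
    """
    # Condition 4: discount experts whose views track irrelevant information.
    if predictable_from_irrelevant_info:
        return "discount this group of experts"
    # Condition 1: dissent below the Lizardman's Constant (~4%).
    if consensus_fraction >= 0.96:
        return "defer: this should swamp your other evidence"
    # Condition 2: strong but not overwhelming majority (illustrative 75% cutoff).
    if consensus_fraction >= 0.75:
        return "strong evidence, but other evidence can outweigh it"
    # Condition 3: bare majority or no agreement.
    return "weak evidence: agnosticism or your own opinion may be fine"


# Example: 85% of experts endorse a claim, with no sign of bias.
print(deference_advice(0.85))  # -> "strong evidence, but other evidence can outweigh it"
```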

What about selection bias?

Emrik raised a concern about deferring to experts: the most informed people are also selected for believing that their field is sound. See his post The Paradox of Expert Opinion, linked below.

https://www.lesswrong.com/posts/S6Qcf5EgX5zAozTAa/the-paradox-of-expert-opinion

This is why the first, strongest condition so rarely holds in practice. This isn't fatal to the heuristic, but unless selection effects are controlled for, deferring to raw expert consensus will produce wrong results.
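To see why, here's a toy simulation (my own construction with made-up numbers, not a model from Emrik's post). Everyone receives a noisy private signal about whether a field is sound, and only people with a favorable enough signal join the field and become experts. Even when the field is unsound, the self-selected experts mostly believe in it:

```python
import random

random.seed(0)

def expert_consensus(field_is_sound: bool,
                     n_people: int = 100_000,
                     join_threshold: float = 0.4,
                     belief_threshold: float = 0.5) -> float:
    """Fraction of self-selected experts who believe their field is sound."""
    truth = 1.0 if field_is_sound else 0.0
    believers, experts = 0, 0
    for _ in range(n_people):
        signal = truth + random.gauss(0.0, 0.5)  # noisy evidence of soundness
        if signal > join_threshold:              # self-selection into the field
            experts += 1
            believers += signal > belief_threshold
    return believers / max(experts, 1)

print(expert_consensus(field_is_sound=True))   # ~0.95
print(expert_consensus(field_is_sound=False))  # ~0.75, despite the field being unsound
```

The absolute numbers are arbitrary; the point is that consensus among self-selected experts stays high whether or not the field is actually sound, so consensus alone can't distinguish the two cases.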

So what's next? I hope this post serves as a useful resource, so that you can defer quite a bit better, and for better reasons, than before.

3 comments


comment by Jiro · 2022-10-14T22:30:16.580Z · LW(p) · GW(p)

Expert opinion should be discounted when it could be predicted solely from information not relevant to the truth of the claims.

I can't think of how to usefully determine that some sort of information is not relevant to the truth of the claims. In some sense, everything is; I can predict someone's opinion on homeopathy by observing that they're a doctor. Although you could say that being a doctor is relevant to the truth of the claims (people who choose to become doctors rather than homeopaths make this choice because medicine works and homeopathy doesn't), it's a rather indirect relevance.

comment by Dagon · 2022-10-13T17:02:57.049Z · LW(p) · GW(p)
  1. The heuristic only applies to non-moral fields.
  2. The heuristic assumes the field is sound. In an upcoming post, I'll talk about signs a field may have unsound bases, and what to expect there.

My initial reaction is "are there significant fields for which the advice is necessary (it's not obvious to most readers that experts are on the right track) and for which either of these, let alone both, are true?"  A few examples of things which you think your readers are incorrectly down-weighting expert opinion would help a lot.

Your given examples aren't about fields, but individuals, and the first two seem like morally-relevant fields - that's fine, but specifying what questions and what expertise-differential levels you're using (and where the boundary conditions are where it's neutral whether to defer or to think originally) would go a long way.

Replies from: sharmake-farah
comment by Noosphere89 (sharmake-farah) · 2022-10-14T11:30:50.215Z · LW(p) · GW(p)

A good example: economics. While some of the work is tainted by ideology, most of it isn't. And this leads to a few conclusions.

  1. Capitalism with a government is the best economic system in practice.

  2. Tariffs aren't good things, contra Trump.