Realistic epistemic expectations
post by JonahS (JonahSinick) · 2015-05-31T22:52:30.303Z · LW · GW · Legacy · 11 comments
When I state a position and offer evidence for it, people sometimes complain that the evidence I've given doesn't suffice to establish the position. Usually, though, I'm not trying to give a rigorous argument, and I don't claim that the evidence I provide suffices to establish the position.
My goal in these cases is to offer a high-level summary of my thinking, and to provide enough evidence so that readers have reason to Bayesian update and to find the view sufficiently intriguing to investigate further.
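To make "Bayesian update" concrete, here's a minimal sketch of a single update in odds form. The prior and likelihood ratio are made-up numbers, purely for illustration:

```python
# A minimal sketch of a single Bayesian update in odds form.
# The prior and likelihood ratio below are made-up illustrative numbers.
def bayes_update(prior: float, likelihood_ratio: float) -> float:
    """Posterior probability given a prior and a likelihood ratio
    P(evidence | hypothesis) / P(evidence | not hypothesis)."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Evidence that's 3x likelier under the hypothesis moves a 20% prior to ~43%:
print(bayes_update(0.20, 3.0))  # 0.4285...
```

Modest evidence moves the needle without settling the question, and that's all I'm aiming for in a conversation.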
In general, when a position is non-obvious, a single conversation is nowhere near enough time to convince a rational person that it's very likely to be true. As Burgundy recently wrote:
When you ask Carl Shulman a question on AI, and he starts giving you facts instead of a straight answer, he is revealing part of his book. The thing you are hearing from Carl Shulman is really only the tip of the iceberg because he cannot talk fast enough. His real answer to your question involves the totality of his knowledge of AI, or perhaps the totality of the contents of his brain.
If I were to restrict myself to making claims that I could substantiate in a mere ~2 hours, that would preclude the possibility of me sharing the vast majority of what I know.
In math, one can give rigorous proofs starting from very simple axioms, as Gauss described:
I mean the word proof not in the sense of lawyers, who set two half proofs equal to a whole one, but in the sense of mathematicians, where 1/2 proof = 0, and it is demanded for proof that every doubt becomes impossible.
Even within math, as a practical matter, proofs that appear to be right are sometimes undercut by subtle errors. But outside of math, the only reliable tool at one's disposal is Bayesian inference. In 2009, the charity evaluator GiveWell made very strong efforts to apply careful reasoning to identify its top-rated charity, and gave a "conservative" cost-effectiveness estimate of $545/life saved, which turned out to have been wildly optimistic. Argumentation that looks solid on the surface often breaks down under close scrutiny. This is closely related to why GiveWell emphasizes the need to look at giving opportunities from many angles, and gives more weight to robustness of evidence than to careful chains of argumentation.
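One way to see why is a toy calculation (the numbers are hypothetical, and this is not GiveWell's actual model): if a conclusion requires every step of a long argument to hold, confidence in the whole chain decays multiplicatively.

```python
# Toy illustration (hypothetical numbers, not GiveWell's actual model):
# if a conclusion requires every step of an argument to hold, confidence
# in the whole chain decays multiplicatively with its length.
p_per_step = 0.9  # each step looks quite solid on its own
for n_steps in (1, 3, 5, 10):
    print(n_steps, round(p_per_step ** n_steps, 2))
# 1 0.9
# 3 0.73
# 5 0.59
# 10 0.35  <- a "solid-looking" ten-step argument is more likely wrong than right
```

Several independent lines of evidence don't share this failure mode, which is one way to understand the emphasis on robustness.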
Eliezer named this website Less Wrong for a reason: one can never be certain of anything, and all rational beliefs reflect degrees of confidence. I believe that discussion advances rationality the most when it involves sharing perspectives and evidence, rather than argumentation.
11 comments
comment by Vladimir_Nesov · 2015-06-01T02:40:44.599Z · LW(p) · GW(p)
A lot of communication is about explaining what you mean, not about proving that something is true. In many cases, you don't need to provide any evidence at all, as it's already available or trivially obtainable to your audience; the bottleneck is knowing what to look for. So it may well be sufficient to give a bit of motivation to keep them learning (such as the beauty of the concepts, or of their presentation). The evidence about the truth of the eventual conclusions, or a clear idea of what they are, could remain irrelevant throughout.
↑ comment by JonahS (JonahSinick) · 2015-06-01T02:51:06.318Z · LW(p) · GW(p)
So it may well be sufficient to provide a bit of motivation to keep them learning, with evidence about truth of the eventual conclusions irrelevant throughout.
Are you saying that it might be best to provide no evidence, and instead just give references?
↑ comment by Vladimir_Nesov · 2015-06-01T03:12:43.616Z · LW(p) · GW(p)
It's often possible to sidestep the difficulty of communicating evidence by focusing on explaining relevant concepts, which usually doesn't require evidence (or references), except as clarifying further reading. Evidence may be useful as motivation, when it's easier to communicate in the outline than the concepts, but not otherwise. And after the concepts are clear, evidence may become easier to communicate.
(Imagine trying to convince a denizen of Ancient Greece that there is a supermassive black hole at the center of our galaxy. You won't get to presenting actual astronomical observations for quite some time, and might start with the entirely theoretical geometry and mechanics. Even the mechanics doesn't have to be motivated by experimental verification, as it's interesting as mathematics on its own. And mentioning black holes may be ill-advised at that stage.)
↑ comment by JonahS (JonahSinick) · 2015-06-01T04:40:51.447Z · LW(p) · GW(p)
Yes, thanks, this is helpful.
↑ comment by taryneast · 2015-06-04T23:55:01.994Z · LW(p) · GW(p)
I think this depends strongly on whether the person you're explaining to is initially open or closed to your ideas.
An example: if a new-earth creationist approached me to talk about their ideas on the creation of Earth, I would not want them to spend time explaining those ideas until they had shown me sufficient evidence to warrant my expenditure of time.
By comparison, I'm willing to give most people on LessWrong the benefit of the doubt: let them explain the idea, then go look up evidence to confirm whether it has a solid foundation.
Keeping in mind our tendency to auto-accept ideas that we're told...
comment by Shmi (shminux) · 2015-06-01T00:44:51.200Z · LW(p) · GW(p)
If I were to restrict myself to making claims that I could substantiate in a mere ~2 hours, that would preclude the possibility of me sharing the vast majority of what I know.
Using your mathematical analogy, would it help to explicitly assume a few plausible lemmas whose proof can be given later, and derive your main result from them, in order to cut down on the size of the post?
↑ comment by JonahS (JonahSinick) · 2015-06-01T01:51:59.120Z · LW(p) · GW(p)
Yes, that's a good suggestion.
comment by SatvikBeri · 2015-06-26T16:35:02.292Z · LW(p) · GW(p)
One approach I've been working with is sharing models rather than arguments. For example, nbouscal and I recently debated the relative importance of money for effective altruism. It turned out that our disagreement came down to a difference in our models of self-improvement: he believes that personal growth mostly comes from individual work and learning, while I believe that it mostly comes from working with people who have skills you don't have.
Any approach that started with detailed arguments would have been incredibly inefficient, because it would have taken much longer to find the source of our disagreement. Starting with a high-level approach that described our basic models made it much easier for us to home in on arguments that we hadn't thought about before and make better updates.
comment by KnaveOfAllTrades · 2015-05-31T23:54:13.163Z · LW(p) · GW(p)
I used to have an adage to the effect that if you walk away from an argument feeling like you've processed it before a month has passed, you're probably kidding yourself. I'm not sure I would take such a strong line nowadays, but it's a useful prompt to bear in mind. Might or might not be related to another thing I sometimes say, that it takes at least a month to even begin establishing a habit. While a perfect reasoner might consider all hypotheses in advance or be able to use past data to test new hypotheses, in practice it seems to me that being on the lookout for evidence for or against a new idea is often necessary to give the idea a fair shake, which feels like a very specific case of noticing (namely, noticing when incoming information bears on some new idea you heard and updating).
comment by hairyfigment · 2015-06-01T00:19:05.722Z · LW(p) · GW(p)
In general, when a position is non-obvious, a single conversation is nowhere near enough time to convince a rational person that it's very likely to be true.
Oh, the situation is much worse than that. If your claim is empirical, and needs many background claims to even be coherent, then the evidence required may far exceed what any one person could gather. Compare the claim that one gene with a silly name controls the expression of another; consider the amount of evidence needed to justify the jargon in that case.
↑ comment by JonahS (JonahSinick) · 2015-06-01T00:24:52.081Z · LW(p) · GW(p)
then the evidence required may far exceed what any one person could gather
Yes, it can.
But it's possible to do much better in these situations than people usually do, by developing good heuristics for how to weight the beliefs of others who have relevant subject-matter knowledge. If you haven't seen them, see e.g. Vladimir M's post Some Heuristics for Evaluating the Soundness of the Academic Mainstream in Unfamiliar Fields and Nick Beckstead's post Common sense as a prior.
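As a toy illustration of what such a heuristic might look like, one could pool several people's credences by averaging their log-odds, weighted by expertise. To be clear, this particular formula is my own sketch, and the numbers and weights are hypothetical; the posts above discuss such heuristics qualitatively rather than prescribing a formula:

```python
import math

# Sketch of one possible heuristic for combining others' credences:
# average their log-odds, weighted by relevant expertise. The formula,
# numbers, and weights are illustrative assumptions, not taken from
# the posts cited above.
def pool_credences(credences, weights):
    log_odds = [math.log(p / (1 - p)) for p in credences]
    avg = sum(w * lo for w, lo in zip(weights, log_odds)) / sum(weights)
    return 1 / (1 + math.exp(-avg))

# Two domain experts at 80% and one layperson at 40%, experts weighted 3x:
print(round(pool_credences([0.8, 0.8, 0.4], [3.0, 3.0, 1.0]), 2))  # ~0.76
```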