Weak Evidence is Common

post by dkl9 · 2023-07-16T23:37:40.983Z · LW · GW · 5 comments

This is a link post for https://dkl9.net/essays/weak_evidence.html

Epistemic status: a well-intentioned parody with known mistakes (see comments); more confident about the clarification at the end

Followup to: Strong Evidence is Common [LW · GW]

One time, someone asked me if I was evil. I said "no". Afterward, they probably didn't believe I wasn't evil. I'm guessing they wouldn't have happily accepted a bet at 20:1 odds that I don't deliberately hurt people.

The prior odds that someone is evil are realistically 1:10. Posterior odds above 1:20 imply that the likelihood ratio of me saying "no" is less than 2:1, or at most 1 bit of evidence. That's a little bit of evidence.
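
For concreteness, here is that odds-form arithmetic as a minimal Python sketch (the 1:10 prior and the refused 20:1 bet are just the numbers from the paragraph above):

```python
from math import log2

prior_odds_evil = 1 / 10      # prior odds that someone is evil: 1:10
posterior_odds_evil = 1 / 20  # refusing a 20:1 bet means posterior odds of "evil" stay above this

# Odds-form Bayes: posterior odds = prior odds * likelihood ratio.
# So the likelihood ratio in favor of "not evil" from hearing my "no" is at most:
max_lr_not_evil = prior_odds_evil / posterior_odds_evil  # = 2.0

print(f"at most {max_lr_not_evil:.0f}:1, i.e. at most {log2(max_lr_not_evil):.0f} bit of evidence")
```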

Seeing an Onion headline say "X did Y" is teeny evidence that X didn't Y. Someone telling you "I am not gay" is small evidence that they are not gay. Putting a complicated, unclear technical claim into ChatGPT and getting "correct" is little evidence that the claim is correct. Tiny odds ratios lurk behind many encounters.

On average, people are overconfident, but 12% aren't. It takes roughly 8:1 evidence to conclude someone isn't overconfident. Asking if they're overconfident and getting "yes" won't be enough, and might even point the other way.
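
A similar sketch for the overconfidence numbers, assuming "conclude" means reaching roughly even posterior odds:

```python
# 12% of people aren't overconfident, so the prior odds (not-overconfident : overconfident) are:
prior_odds_not_overconfident = 0.12 / 0.88   # about 1 : 7.3

# To reach even (1:1) posterior odds, the evidence needs a likelihood ratio of about:
required_likelihood_ratio = 1 / prior_odds_not_overconfident   # about 7.3

print(f"need about {required_likelihood_ratio:.1f}:1 evidence, roughly 8:1")
```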

Running through Bayes' Rule explicitly might produce a bias towards significant values. Extraordinary claims require extraordinary evidence, and a lot of the evidence you get is quite ordinary.

Clarification

This is not to contradict the original, but to remind you not to take it too far.

Strong evidence is common (but certainly not guaranteed) for questions with mostly-honest sources and tiny priors, such as the examples from the original post.

In many of those cases, the priors are tiny because there are thousands or millions of plausible options you'd consider before seeing the evidence.

Weak evidence is common when the source of evidence is likely to lie or be stupid, and that situation seems to be especially common with binary questions.

5 comments

comment by Dweomite · 2023-07-17T04:55:18.238Z · LW(p) · GW(p)

If you took the original post to mean that weak evidence isn't common, I'd contend you took the wrong lesson. Encountering strong evidence can be a common occurrence while still being much less common than encountering weak evidence.

You are constantly bombarded by evidence, so something can be true of only a tiny fraction of all the evidence you encounter and still be something you experience many times per day.

Also, each observation is simultaneously evidence of many different things.  When someone asked if you were evil and you said "no", that was weak evidence against you being evil, but also fairly strong evidence that you speak English.  If I put my hand out the window and it gets wet, that's pretty strong evidence that it's raining outside my window, but also weaker evidence that it's raining on the other side of town.

I think you're right to point out that just because strong evidence is common in general doesn't necessarily mean that strong evidence about some specific question you are interested in will be common.  There are definitely questions for which it is hard to find strong evidence.

But I don't especially trust your generalization of when strong evidence should be expected, and I think some of your examples are confused.

In asking how hard it is to get evidence that you can beat the stock market, I think you are misleadingly combining the questions "how likely is it that you can beat the market?" and "in worlds where you can, how hard is it to get evidence of that?" in order to imply that the evidence is hard to get, when I think most of the intuition that you are unlikely to see such evidence is coming from the "it's unlikely to be true" side. (Also, it is actually not true that getting >50% of trades right means you will be profitable, because the size of the gains or losses matters.)

And asking someone "are you overconfident?" may not give you very much evidence about whether they are overconfident or not, but that's probably far from the best strategy for gathering evidence on that question.

comment by ChristianKl · 2023-07-17T10:56:45.577Z · LW(p) · GW(p)

One time, someone asked me if I was evil. I said "no". Afterward, they probably didn't believe I wasn't evil. I'm guessing they wouldn't have happily accepted a bet at 20:1 odds that I don't deliberately hurt people.

What would you expect a person who's evil to answer? I wouldn't expect them to answer "yes"; "no" is the most likely answer I would expect an evil person to give.

Saying "no" means that you refuse to give any evidence that you aren't evil when exposed to the question, which seems to me a tiny bit more likely for someone who doesn't really have that evidence. 

Theoretically, a trading method that gets over 50% of its trades right will be profitable. 

That's not true. Different trades have different impacts. A trading strategy that gets 90% of its trades right but makes one trade that bankrupts the whole company is not a profitable strategy.
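
A toy sketch of that point, with made-up numbers, showing that a win rate above 50% can still have negative expected value per trade:

```python
# Hypothetical strategy: wins 90% of trades for +1% each, loses 50% on the other 10%.
win_rate, win_return, loss_return = 0.90, 0.01, -0.50

expected_return = win_rate * win_return + (1 - win_rate) * loss_return
print(f"expected return per trade: {expected_return:+.3f}")  # -0.041: right 90% of the time, still losing
```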

Seeing an Onion headline say "X did Y" is teeny evidence that X didn't Y. 

I think that's doubtful. Most Onion headlines are about events where there's a strong prior against the event happening.

Sometimes a true statement still makes for a good ironic article. One of the greatest articles by the German equivalent of the Onion, Der Postillon, was about how politicians wanted to change the processes around rescuing migrant ships because too many of them had migrants drown in a short time frame. Being very explicit about the game-theoretic position, in which Western politicians are okay with a certain number of drownings, isn't something you would read in normal news sources, but it's both true and funny.

Replies from: dkl9
comment by dkl9 · 2023-07-17T15:43:10.134Z · LW(p) · GW(p)

I don't understand what you're getting at with your response to the question of personal evil.

You're right about trading.

Seeing an Onion headline say "X did Y" is teeny evidence that X didn't Y. 

...

I think that's doubtful.

In which direction? Do you mean to say that it's no evidence, or it's strong evidence? You speak of "a strong prior against the event", but the strength of the prior doesn't have any necessary relation to the strength of the evidence.

comment by aphyer · 2023-07-16T23:48:00.071Z · LW(p) · GW(p)

Putting a claim into ChatGPT and getting "correct" is little evidence that the claim is correct. 

Unless your claims are heavily preselected, e.g. for being confusing ones where conventional wisdom is wrong, I think this specific example is inaccurate? If I ask ChatGPT 'Is Sarajevo the capital of Albania?', I expect it to be right a large majority of the time.

Replies from: dkl9
comment by dkl9 · 2023-07-16T23:59:40.643Z · LW(p) · GW(p)

Fixed, thanks. I implicitly assumed that all ChatGPT use we cared about was about complicated, confusing topics, where "correct" would be little evidence.