Being right isn't enough. Confidence is very important.

post by mako yass (MakoYass) · 2020-04-07T01:10:52.517Z · LW · GW · 8 comments


There are two machines that predict weather. One machine always reports the probability of a storm as 1%, because 1% is the local yearly rate of storms. This machine turns out to be perfectly well calibrated and totally unbiased.

Another machine pays attention to local data, and processes it in a clever way. It always gives a number like either 98% or 0.05%, depending on the day. Impressively, it also turns out to be perfectly well calibrated and unbiased.

Both are predicting the same category of event, and both are perfectly calibrated: the event happens or fails to happen at the rate they claim it will. But the confident machine is much more useful than the simpler, low-confidence machine.

Being right and sane means being well calibrated. It imposes no further requirements. Being right and sane is not enough. Moving well in the world requires your predictions to also sometimes be confident.
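To make this concrete, here is a minimal sketch of the two machines. The 1% base rate and the 98%/0.05% forecasts are taken from the story above; the generative model (a mix of dangerous and calm days tuned to a 1% base rate), the 50% shelter rule, and all the names are just illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_days = 1_000_000

# Assumed world: a small fraction of days carry a 98% storm chance, the rest 0.05%,
# with the fraction chosen so the overall base rate comes out at 1%.
p_high, p_low, base_rate = 0.98, 0.0005, 0.01
frac_high = (base_rate - p_low) / (p_high - p_low)

dangerous = rng.random(n_days) < frac_high
p_true = np.where(dangerous, p_high, p_low)
storm = rng.random(n_days) < p_true

forecast_simple = np.full(n_days, base_rate)  # machine one: always reports 1%
forecast_confident = p_true                   # machine two: tracks the local data

def calibration(forecast, outcome):
    """Observed storm frequency among days with each distinct forecast value."""
    return {float(p): float(outcome[forecast == p].mean()) for p in np.unique(forecast)}

print(calibration(forecast_simple, storm))     # roughly {0.01: 0.01}
print(calibration(forecast_confident, storm))  # roughly {0.0005: 0.0005, 0.98: 0.98}

# Usefulness: suppose you only take shelter when the forecast exceeds 50%.
sheltered_simple = storm & (forecast_simple > 0.5)        # never sheltered for a storm
sheltered_confident = storm & (forecast_confident > 0.5)  # sheltered for most storms
print(int(sheltered_simple.sum()), "vs", int(sheltered_confident.sum()))
```

Both calibration checks come back clean, but only the confident machine's forecasts ever cross a decision threshold, which is the sense in which it is more useful.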


Feel the pain of confidence's absence. Imagine that an enemy tank has entered your firing range. You can hear it rolling and occasionally clambering through the foliage, but you can't see it; your night cameras are damaged. Your network, however, holds a lot of good historical data. You figure out the most likely place for the tank to be, given what you can hear and what you all remember, and you fire in that direction. You make the right choice given the data and the cognitive resources available to you. You are following a sound algorithm. Nobody can call you insane. Your record as a predictor will have a perfectly triangular histogram.

You miss. Your enemy's night vision is fine, and so they shoot back at you. Some of your parts can be salvaged, but they probably won't be. It would have been really good if you'd had more eyes.

8 comments


comment by cousin_it · 2020-04-07T10:37:09.999Z · LW(p) · GW(p)

The phrasing "being right isn't enough" seems a bit off, because the second machine is right more often than the first. So maybe the post is more about calibration vs confidence. But there's no tension between these two things, because they can be combined into the log score, a single dimension that captures the best parts of both.
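To spell that out: for a forecast probability p of a binary outcome y, the per-forecast log score is y log p + (1 − y) log(1 − p), averaged over all forecasts. Under the post's numbers (98%/0.05% days mixed to give a 1% base rate), the always-1% machine averages roughly −0.056 per day and the confident machine roughly −0.005, so the single number already rewards both calibration and the kind of confidence the post is asking for.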

comment by Gordon Seidoh Worley (gworley) · 2020-04-09T00:57:33.172Z · LW(p) · GW(p)

You seem to be pointing at what I would describe as the difference between accuracy and precision.

comment by rpapp · 2020-04-07T04:39:25.276Z · LW(p) · GW(p)

I think what you have done here is pinpoint the difference between true belief and knowledge. It is indeed a very important distinction. I would be wary of using the word “confidence” though, because in US parlance it usually denotes a state of mind (which can be unsubstantiated) rather than express the measure by which your prediction can be considered well-founded.

Replies from: Jadael, MakoYass
comment by Trevor Hill-Hand (Jadael) · 2020-04-07T12:59:23.609Z · LW(p) · GW(p)

The above comment just helped me realise that this connotation is why I like the word "credence". Does "credence" have similar problems in other cultures though?

comment by mako yass (MakoYass) · 2020-04-07T23:14:52.563Z · LW(p) · GW(p)

It seems relevant to that, I guess.

I was considering discussing that. It seems to me that the common image of confidence is just the appearance that results from having a lot of epistemic confidence (about the things we're usually interacting with). I think the contemporary understanding of the word is probably just confusion; it will wash away when people learn what the real underlying thing is.

comment by Dagon · 2020-04-07T15:52:21.487Z · LW(p) · GW(p)

I don't think both machines are correct at the same rates. If they wager among themselves every day, with machine one taking 99-1 against a storm, and machine two taking whatever odds it calculates, they'll OFTEN find bets that both think they can win, and machine two will get the vast majority of the money.

Machine one is WRONG most days. That's not about confidence, that's about specificity of prediction.
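Roughly, with the post's numbers: on a day machine two reads 98%, if machine one lays 99-to-1 against a storm, then machine two's expected return from staking one unit on the storm is about 0.98 × 99 − 0.02 × 1 ≈ +97 units. That's the size of the edge I mean.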

Replies from: dxu
comment by dxu · 2020-04-07T21:25:51.648Z · LW(p) · GW(p)

The original post wasn’t talking about “correctness”; it was talking about calibration, which is a very specific term with a very specific meaning. Machines one and two are both well-calibrated, but there is nothing requiring that two well-calibrated distributions must perform equally well against each other in a series of bets.

Indeed, this is the very point of the original post, so your comment attempting to contradict it did not, in fact, do so.

Replies from: Dagon
comment by Dagon · 2020-04-07T23:45:36.197Z · LW(p) · GW(p)

The post is titled:

Being right isn't enough. Confidence is very important.

It's not talking about calibration; both machines are asserted to be equally well-calibrated. It's talking about a difference it labels "confidence", and I assert "correctness" or "usefulness" would be better words.