post by [deleted]

This is a link post.

Comments sorted by top scores.

comment by mtcc · 2022-10-06T12:54:22.896Z

A similar idea, Correlation Neglect, was introduced in the paper "Correlation Neglect in Belief Formation".

TLDR: People double-count information from correlated sources. This behavior is probably a type of confirmation bias.
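
For concreteness (my toy example, not the paper's): in odds form, treating a repeated report as if it were independent evidence applies the same likelihood ratio twice, as in this Python sketch:

```python
def update_odds(prior_odds: float, likelihood_ratio: float) -> float:
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * likelihood_ratio

prior = 1.0   # 1:1 odds, i.e. 50% credence in the hypothesis
lr = 4.0      # each report is 4x likelier if the hypothesis is true

# Correct: the second report just repeats the first source, so update once.
correct = update_odds(prior, lr)

# Correlation neglect: treat the repeat as independent evidence and update twice.
neglect = update_odds(update_odds(prior, lr), lr)

to_prob = lambda odds: odds / (1 + odds)
print(f"update once:  {to_prob(correct):.0%}")  # 80%
print(f"double-count: {to_prob(neglect):.0%}")  # 94%
```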

comment by [deleted] · 2022-10-07T00:16:11.971Z

comment by swarriner · 2022-10-06T14:32:55.470Z

Aside from double-counting, here's a problem: you should have just set your starting priors on the false and true statements as x and 1-x respectively, where x is the chance your whole ontology is screwed up, and you'd have been equally well calibrated and much more precise. You've correctly identified that the perfect calibration on 90% is meaningless, but that's because you explicitly introduced a gap between what you believe to be true and what you're representing as your beliefs. Maybe that's your point: that people are trying to earn a rationalist merit badge by obfuscating their true beliefs. But I think at least many people treat the exercise as a serious inquiry into how well-founded their beliefs feel from the inside.
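
To make that concrete (my numbers, purely illustrative): under a proper scoring rule like the Brier score, the honest x / 1-x assignment beats the engineered 90% one, even though both are perfectly calibrated within their buckets:

```python
def brier(forecasts):
    """Mean squared error between stated probability and outcome (1 = true, 0 = false)."""
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Strategy A: chase the calibration badge by stating 90% across a deck built
# to contain 9 statements you believe true and 1 you believe false.
badge = [(0.9, 1)] * 9 + [(0.9, 0)]

# Strategy B: state your actual credences, 1-x on believed-true statements and
# x on believed-false ones, where x is the chance your whole ontology is off.
x = 0.02
honest = [(1 - x, 1)] * 9 + [(x, 0)]

print(f"badge strategy:  Brier = {brier(badge):.4f}")   # 0.0900
print(f"honest strategy: Brier = {brier(honest):.4f}")  # 0.0004
```

Both strategies get each bucket's frequency right; the score gap comes entirely from precision.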

comment by [deleted] · 2022-10-07T00:20:39.273Z