[SEQ RERUN] When (Not) To Use Probabilities

post by MinibearRex · 2012-07-10T05:33:02.802Z · LW · GW · Legacy · 6 comments


Today's post, When (Not) To Use Probabilities, was originally published on 23 July 2008. A summary (taken from the LW wiki):

 

When you don't have a numerical procedure to generate probabilities, you're probably better off using your own evolved abilities to reason in the presence of uncertainty.


Discuss the post here (rather than in the comments to the original post).

This post is part of the Rerunning the Sequences series, where we'll be going through Eliezer Yudkowsky's old posts in order so that people who are interested can (re-)read and discuss them. The previous post was Fake Norms, or "Truth" vs. Truth, and you can use the sequence_reruns tag or RSS feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it here, posting the next day's sequence reruns post, or summarizing forthcoming articles on the wiki. Go here for more details, or to have meta discussions about the Rerunning the Sequences series.

6 comments


comment by moridinamael · 2012-07-10T16:12:11.721Z · LW(p) · GW(p)

I've found that I have benefited from "making up" probabilities in some circumstances. I have an unfortunate but probably normal tendency to think something along the lines of, "Just as I would have predicted," whenever something happens which confirms my general attitude about that thing. Of course, I don't notice when the converse happens, and it doesn't occur to me to be surprised, and thus I don't update. Standard confirmation bias, really.

But if I explicitly think, "There's a 70% chance Bob will be late to our meeting," and then Bob is early, I've provided myself with the useful and inescapable information that I may be miscalibrated about Bob's punctuality and may be selectively remembering one time his lateness cost me something. If I never tried to make an internalized probability assessment, I would not have been surprised in the same way. This is probably because I would cover up my mistaken nonverbal expectation of Bob's lateness with some other fallacious thought like, "Well, nobody is late all the time."
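A minimal sketch of what that kind of explicit bookkeeping could look like, assuming you simply log each stated probability and whether the event happened, then score yourself with the Brier score (the names and numbers below are illustrative, not anything from the original comment):

    # Log explicit probability predictions and score them afterwards.
    # The Brier score is the mean squared error between the stated
    # probability and the 0/1 outcome; lower is better, and always
    # saying "50%" scores 0.25.
    predictions = []  # list of (probability_assigned, event_happened)

    def record(probability, happened):
        """Store one explicit prediction and whether it came true."""
        predictions.append((probability, happened))

    def brier_score():
        """Mean squared error of stated probabilities against outcomes."""
        return sum((p - (1.0 if happened else 0.0)) ** 2
                   for p, happened in predictions) / len(predictions)

    # "There's a 70% chance Bob will be late to our meeting" -- and he was early.
    record(0.70, False)
    record(0.80, True)
    record(0.60, True)

    print(f"Brier score over {len(predictions)} predictions: {brier_score():.3f}")

The surprise described above is exactly the first record call doing its job: the logged 0.70 can't be quietly reinterpreted after Bob shows up early.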

comment by maia · 2012-07-12T20:52:23.386Z · LW(p) · GW(p)

It's interesting that Eliezer wrote this, because the Calibration Game, for instance, is based on exactly this premise: getting better at pulling numbers out of your butt.

comment by VincenzoLingley · 2012-07-10T14:27:12.177Z · LW(p) · GW(p)

I don't want to sound rude, but what is the point of this rerun? Looking at the reposted articles from the last month, most have fewer than 10 comments each.

Replies from: RolfAndreassen
comment by RolfAndreassen · 2012-07-10T16:45:06.316Z · LW(p) · GW(p)

I like the reruns; they give me a chance to re-read the Sequences in a structured way, without doing a full formal Archive Binge. I suggest that comments are not the metric to go by; presumably it is less interesting to make comments that Eliezer definitely will not respond to.

Replies from: Rubix
comment by Rubix · 2012-07-10T22:23:29.260Z · LW(p) · GW(p)

I'd love to see a discussion space on Sequence posts that's closed to high-status Less Wrong folk; it might be really interesting.

Replies from: MinibearRex
comment by MinibearRex · 2012-07-11T06:52:34.267Z · LW(p) · GW(p)

I doubt you'd actually get much. A lot of what's in the original sequences was novel, and somewhat controversial at the time. Now, most of the content is essentially taken for granted.