Where Are We the Weakest?

post by DiscyD3rp · 2013-07-09T18:07:25.823Z · LW · GW · Legacy · 9 comments

As rationalists, we should be able to consistently and accurately make predictions that enable us to act effectively.

As humans, we don't. At least not perfectly.

We need to improve. Many of us have, or at least believe we have, but the improvement so far has been notably ad hoc. PredictionBook is an excellent source of feedback on how well we're doing, yet there is more detailed information, not easily available today, that I think could be incredibly useful. Questions I would like to see answered are:

Before we are able to improve as a community, we need to know where to improve. I'd love to hear suggestions on how to answer these questions in the comments.
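
For concreteness, the kind of detailed feedback I have in mind could be computed from any record of predictions stored as (stated probability, outcome) pairs, such as an export of one's PredictionBook history. Here is a minimal sketch in Python; the record format and function names are illustrative assumptions, not an actual PredictionBook export or API:

```python
from collections import defaultdict

def calibration_report(predictions):
    """Group predictions by stated confidence (rounded to the nearest 10%)
    and report the observed frequency of the event in each bucket."""
    buckets = defaultdict(list)
    for probability, happened in predictions:
        buckets[round(probability, 1)].append(happened)
    return {
        confidence: (sum(outcomes) / len(outcomes), len(outcomes))
        for confidence, outcomes in sorted(buckets.items())
    }

def brier_score(predictions):
    """Mean squared error between stated probability and outcome; lower is better."""
    return sum((p - happened) ** 2 for p, happened in predictions) / len(predictions)

# Hypothetical records; a real analysis would start from an export of one's
# actual prediction history (e.g. from PredictionBook).
records = [(0.9, True), (0.9, True), (0.9, False), (0.6, True), (0.6, False), (0.3, False)]
print(calibration_report(records))  # stated confidence -> (observed frequency, count)
print(brier_score(records))
```

Splitting the records by topic or prediction type before running something like this would point directly at where our calibration and accuracy are weakest.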

9 comments

comment by buybuydandavis · 2013-07-09T21:57:32.761Z · LW(p) · GW(p)

I think we're weaker on acting according to our predictions than on the predictions themselves.

Replies from: afterburger
comment by afterburger · 2013-07-11T22:04:47.698Z · LW(p) · GW(p)

I agree. Whatever process copies rational conclusions back into the subconscious emotional drivers of behavior doesn't seem to work very well. For example, I enjoy cookies just about every day, despite having no rational reason to eat them that often. Eating cookies does not fit into my long-term utility-maximizing plans, but I am reluctant to brainwash myself.

Replies from: rosecongou
comment by rosecongou · 2013-07-18T10:02:24.980Z · LW(p) · GW(p)

In all seriousness, how do you know that you're not simply brainwashed into believing cookies are making you happy?

For example, during my religious years, attending a 5-hour prayer meeting made me feel happier -- even ones where not much English was spoken. In retrospect, much of this was a learned association between attendance and the feeling of "doing the right thing." Once I no longer thought of it as "the right thing," the happiness I derived from it waned.

Replies from: afterburger
comment by afterburger · 2013-07-19T03:24:11.223Z · LW(p) · GW(p)

I know cookies make me unhappy in the long run, but I enjoy eating cookies in the short run. I could name a bunch of parts of the cookie-eating experience that I like, such as the feeling of sleepiness and contentment caused by eating a lot.

You could argue that any feeling is "brainwashing", in the sense that my feelings are controlled by my physical brain, which is something separate from me. I am deeply uncomfortable with all of the current solutions to the hard problem of consciousness. If I am self-aware and not a philosophical zombie, then it seems like all matter must be aware in the same sense that I am.

comment by Qiaochu_Yuan · 2013-07-09T23:43:09.459Z · LW(p) · GW(p)

"What kinds of predictions are we the least successful at predicting? (weakest calibration, smallest accuracy)"

This doesn't seem like a useful question to answer in isolation. It's easy to come up with extremely hard but also extremely useless prediction questions.

comment by Cthulhoo · 2013-07-09T22:04:27.351Z · LW(p) · GW(p)

I usually proceed by asking a different question (the question, if you want): where am I not winning? Which areas of my life am I unsatisfied with? Then I go on to ask the questions you mentioned (or very similar ones):

  • Why am I not winning?
  • How important is this area to me? It should matter at least a little, since the issue came to mind; minor problems are usually filtered out automatically (not always, of course: a problem I notice could be the symptom of a bigger one, but I should at least be aware of something)
  • Can I identify a solution?
  • Would it be easy to apply? Better: would the cost/benefit ratio be worth it?
  • Can I generalize one solution to many problems?

comment by [deleted] · 2013-07-09T19:53:30.307Z · LW(p) · GW(p)

The easiest predictions to improve on, and the most immediately beneficial, are falsifiable predictions. See 'Conjectures and Refutations' by Sir Karl Popper. An apparently highly accurate prediction could still be a false positive.

Replies from: Jayson_Virissimo
comment by Jayson_Virissimo · 2013-07-10T17:22:17.948Z · LW(p) · GW(p)

Arguably, if it is unfalsifiable, then it isn't a "prediction."

comment by summerstay · 2013-07-10T15:19:16.816Z · LW(p) · GW(p)

Perhaps a good place to start would be the literature on life satisfaction and happiness. Statistically speaking, which changes that can be made voluntarily lead to the greatest increase in life satisfaction at the least cost in effort/money/trouble?