11 core rationalist skills

post by Roko · 2009-12-02T08:09:05.922Z · LW · GW · Legacy · 36 comments

An excellent way to improve one's skill as a rationalist is to identify one's strengths and weaknesses, and then expend effort on the things one can most effectively improve (which are often the areas where one is weakest). This seems especially useful if one is very specific about the component parts of rationality and describes them in detail.

In order to facilitate improving my own and others' rationality, I am posting this list of 11 core rationalist skills, thanks almost entirely to Anna Salamon.



comment by Jordan · 2009-12-02T21:05:03.050Z · LW(p) · GW(p)

Great post.

If only we could flesh this out into a four year long class and call it high school...

comment by Tom_Talbot · 2009-12-02T16:07:33.018Z · LW(p) · GW(p)

This is an excellent list, and would serve well as an introduction to Less Wrong.

comment by brazil84 · 2009-12-17T20:17:00.285Z · LW(p) · GW(p)

One of the more common irrationalities I see on the internet is the sin of foolish consistency: people post something without much thought, and then, when evidence or arguments arise that undermine their statement, they are overly dismissive because they do not want to appear inconsistent.

comment by ChristianKl · 2009-12-02T15:47:34.028Z · LW(p) · GW(p)

Actually make written-down predictions about the future. If you don't make real, written-down predictions, you will never know how bad you are at predicting the world around you.
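To show how little machinery this takes, here is a hedged sketch in Python. Everything in it (the `Prediction` structure, the field names, the sample prediction) is my own invention, not anything PB or anyone here prescribes; the Brier score at the end is just one standard way to score probabilistic predictions.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Prediction:
    claim: str                      # what you predict will happen
    confidence: float               # your probability, 0.0 to 1.0
    due: date                       # when the prediction can be judged
    outcome: Optional[bool] = None  # fill in True/False once judged

# Record a prediction now; judge it later.
log = [
    Prediction("It will rain on Saturday", 0.7, date(2009, 12, 5)),
]
log[0].outcome = False  # judged after the fact

# Brier score: mean squared gap between confidence and outcome.
# 0 is perfect; 0.25 is what always saying 50% would get you.
judged = [p for p in log if p.outcome is not None]
brier = sum((p.confidence - p.outcome) ** 2 for p in judged) / len(judged)
print(round(brier, 2))
```

A spreadsheet with the same four columns would do just as well; the point is only that the record is written down before the outcome is known.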

Replies from: None, Morendil, Roko
comment by [deleted] · 2009-12-02T16:28:12.732Z · LW(p) · GW(p)

The thing is, if you actually make a written-down prediction, you're more likely to alter your actions purely for the sake of making it come true.

Replies from: gwern
comment by gwern · 2009-12-02T17:38:34.695Z · LW(p) · GW(p)

You say that like it's a bad thing.

Replies from: None
comment by [deleted] · 2009-12-02T17:45:22.686Z · LW(p) · GW(p)

It is a bad thing, if you predict that something bad will happen.

Replies from: billswift, gwern
comment by billswift · 2009-12-03T03:28:25.016Z · LW(p) · GW(p)

If you don't foresee/predict bad things happening, then you can't do anything to prevent them. UnFriendly AI anyone?

This reminds me of an architectural designer I used to work for. When we were laying out what needed to be done, I would try to foresee what could go wrong, so I could head it off. I finally quit doing that with him because he kept accusing me of "negativity," as though thinking about bad things makes them happen. (Since I was the one actually doing most of the work, I still tried to predict what could go wrong and head it off; I just quit talking to him about it.)

Replies from: None
comment by [deleted] · 2009-12-03T06:44:03.012Z · LW(p) · GW(p)

I can imagine someone predicting something bad happening, seeing that it probably won't happen, and causing it to happen in order to prove they were right.

Replies from: alexflint
comment by alexflint · 2009-12-03T16:58:14.437Z · LW(p) · GW(p)

It's probably best to do this with things that we have almost no control over.

Replies from: None
comment by [deleted] · 2009-12-04T00:44:15.261Z · LW(p) · GW(p)

Quite right.

comment by gwern · 2009-12-03T00:59:14.802Z · LW(p) · GW(p)

It's the poor craftsman who blames his tools!

(Or as Gibbon says, 'The wind and waves are always on the side of the ablest navigators.')

comment by Morendil · 2009-12-02T18:31:22.132Z · LW(p) · GW(p)

Your work context may provide you with frequent opportunities to do that.

For instance, if you are a programmer, you can make predictions about how long a given task is going to take, or alternately how many tasks you can take on in a given period.

If you train or teach people, you can predict what they will have understood at the end of a given session, and test those predictions by asking questions at the end of the session.

More generally, predictions of the form "I will achieve objective X by time T" are a useful class, as you normally have quite a lot of the relevant information, which ought to narrow your confidence bounds.

ETA: keeping appointments is another similar class. If you're never late, you're probably underconfident. (See Umeshisms.) You should have a general degree of confidence in your timeliness, e.g. "I will seek to show up on time 80% of the time." You may adjust that depending on criticality in given contexts, e.g. "...except that I hate to disappoint employers, so I'll show up to work on time 95% of the time".
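The calibration idea behind these suggestions can be made concrete: bucket your judged predictions by the confidence you stated, then compare with how often each bucket actually came true. A minimal sketch in Python, with invented sample data (no real track record is being reported here):

```python
from collections import defaultdict

# (stated confidence, came true?) pairs -- invented for illustration
predictions = [
    (0.8, True), (0.8, True), (0.8, False), (0.8, True), (0.8, True),
    (0.6, True), (0.6, False), (0.6, False), (0.6, True),
    (0.95, True), (0.95, True), (0.95, True), (0.95, True),
]

# Group outcomes by the confidence level stated at prediction time.
buckets = defaultdict(list)
for conf, hit in predictions:
    buckets[conf].append(hit)

# Well-calibrated: observed frequency roughly matches stated confidence.
for conf in sorted(buckets):
    hits = buckets[conf]
    observed = sum(hits) / len(hits)
    print(f"stated {conf:.0%}: observed {observed:.0%} over {len(hits)} predictions")
```

If your 80% bucket comes true 80% of the time you are calibrated at that level; if your 95% bucket comes true 100% of the time over many predictions, that is the underconfidence the Umeshism points at.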

comment by Roko · 2009-12-02T19:00:48.265Z · LW(p) · GW(p)

How many people actually bother to do this?

Written predictions seem like they would be a lot of work, but maybe this is indeed worth trying.

Replies from: Jack, ChristianKl
comment by Jack · 2009-12-02T19:14:31.495Z · LW(p) · GW(p)

Prediction Book

Replies from: anonym
comment by anonym · 2009-12-02T22:06:31.536Z · LW(p) · GW(p)

Prediction Book is a really promising app, but it unfortunately has lots of problems, and it doesn't look like it's being improved very much at all.

Among the problems are:

  • performance is terrible, making it really frustrating to use, because you're forever waiting for pages to load
  • the UI is confusing, which leads to lots of wrong judgments, though it has improved from what it was
  • the right/wrong judgment is global, not per user, so if 2 people make a prediction on "I will lose 5 pounds before Jan. 1, 2010", then it can only be judged right or wrong for both of them, not right for one and wrong for the other. The only alternative is to make everything private, which eliminates the benefits of seeing other people's estimates on the same issue and of the comments and feedback from other users.
Replies from: Jack
comment by Jack · 2009-12-02T23:07:53.391Z · LW(p) · GW(p)
  1. Despite any problems, it is still a pretty convenient place to record predictions and that was the topic at issue.
  2. I agree with the first two complaints.
  3. I'm not sure I would want predictions with indexicals referring to the user who posted them to apply to all users. That makes little sense, since the prediction that you lose weight is totally different from the prediction that anyone else does. It is definitely a little strange to see "I will do x" and not estimate as if it is yourself, but I have no idea why it would be helpful to see "other people's estimates on the same issue," since it isn't the same issue. What would be the point of a composite estimate? Maybe it would be nice to have a tool that grouped predictions like this so people could talk about them, but that is hardly the most pressing issue.
  4. You've voted in the site's feedback section, yes?
Replies from: anonym
comment by anonym · 2009-12-03T04:46:17.090Z · LW(p) · GW(p)

Yes, it's definitely convenient. I guess I am a bit frustrated with PB because I thought it had tremendous potential and was very excited about using it. After using it for a while now and making more than a hundred predictions with it, the problems have worn me down and there seems little chance that it'll be improved, since they've said they'll only improve it if it's used a lot (and it doesn't get very much use because of all the problems).

The issue with public predictions not allowing per-user judgments is that there are lots of public predictions that include indexicals and that multiple people vote on (I've done it myself, before I realized what it leads to), and there will probably continue to be plenty of them, since nothing on PB suggests that public predictions should avoid indexicals or that people should avoid providing estimates on public predictions that include them.

What happens is that multiple people make predictions, and then the judgment swings back and forth between Right and Wrong as different people judge it for themselves, many of them probably thinking that they're rendering judgment only for themselves and not for everybody else as well. Now I make everything private to avoid these problems, but a site like that with most content private is much less useful than it would be if things were more public, where people could get ideas for predictions from one another and could comment on each other's predictions.

Another problem is that if, every time I see a prediction with an indexical (of which there are many) that I would like to add an estimate for, I have to create a new prediction and copy/paste the text or type it out again, then it becomes too much of a hassle -- especially given how slow everything is. I don't care how it's implemented, but I should just have to add an estimate and click a button. Anything more than that is too much work when the exact wording for the prediction I want to make already exists and I'm looking at it. Perhaps they could add another button to allow making a private estimate for that prediction, and then allow a private judgment for it that is only visible to the user as well.

And no, I didn't vote in the feedback. Requiring your users to sign up for a different account in order to provide feedback is just obnoxious.

Replies from: matt
comment by matt · 2009-12-06T08:28:47.123Z · LW(p) · GW(p)

Dude, we're trying to help on many fronts. We host OB at no charge; we developed and host LW at no charge; we wrote PB and offer it at no charge. If you tried a little harder, do you think you could come up with an explanation for why we'd use an external feedback service other than that we're obnoxious?

ETA 08:39:51 UTC: Sorry - that was overly snarky. You obviously want to be a passionate user but are being let down by our lack of time to tune and improve the site. Watch for a top level post on PB and its future coming soon.

Replies from: anonym, anonym
comment by anonym · 2009-12-06T09:15:48.162Z · LW(p) · GW(p)

I didn't say you or anybody else developing PB is obnoxious. I said a certain behavior (requiring signing up for 2 accounts) was obnoxious. And since behaviors aren't intrinsically obnoxious or not, I obviously meant that I judge that requiring users to sign up for a second account to give feedback is obnoxious. Colloquially, this just means that I find it annoying, and it doesn't imply that you're trying to annoy anyone or say anything about you as a person. I find the behavior annoying, which I gave as an explanation for why I didn't bother to provide feedback.

I can of course imagine plenty of reasons why you'd use an external feedback service, just like I can imagine plenty of reasons that the performance would be what it is, none of them involving any kind of malevolent intention or lack of skill on your part. Nevertheless, I and quite a few others find PB frustrating to use, which is a real shame for an app that holds so much promise.

For what it's worth, I applaud your pro bono work for OB and LW, and I hope you keep up the good work. I think PB holds incredible promise, and I hope that you do find a way to improve it.

comment by anonym · 2009-12-06T19:46:56.764Z · LW(p) · GW(p)

No apology necessary. I'd probably react similarly if I felt that somebody was being unconstructively critical of an app that I created.

I was just frustrated, as you guessed, because I really care about the idea and the app, and I see so much promise there. I should have just said to the original poster that I didn't provide feedback because I didn't want to sign up for a second account, but my frustration made me get snarky, which I apologize for.

Thanks again to you and everybody else at Tricycle.

comment by ChristianKl · 2009-12-02T19:39:50.462Z · LW(p) · GW(p)

It's not really that there's much time involved.

Let's say you are waiting in line at the supermarket to pay. It costs you no additional time to take out a piece of paper and note down an estimate of the amount you will have to pay.

It's probably rather like quitting smoking. Going through life while being fuzzy about your expectations is just easier than making predictions.

However, all that talk about cognitive biases doesn't do much if you just gather knowledge but don't change any of your deep-seated habits.

Replies from: Roko, Roko
comment by Roko · 2009-12-02T20:44:28.265Z · LW(p) · GW(p)

It would cost me effort and thought cycles; I would pay $100 a year at least to have this prediction/calibration thing done by magic.

comment by Roko · 2009-12-02T20:49:37.447Z · LW(p) · GW(p)

Though, one alternative that seems to work is making bets with people. It seems to work because the thought of gaining coolness/status points over the other person overcomes the resistance to putting in the effort of thinking about the prediction. Apparently some major financial firms (Renaissance?) have a culture where you are actively encouraged to do this.

comment by PlaidX · 2009-12-02T19:08:41.896Z · LW(p) · GW(p)

Actually want an accurate map, because you have Something to protect.

Why does protection have to be everyone's Capitalized goal?

Replies from: DanArmak, Nanani
comment by DanArmak · 2009-12-02T19:27:24.311Z · LW(p) · GW(p)

Yeah, what's wrong with having Something to Destroy?


comment by Nanani · 2009-12-03T00:45:00.717Z · LW(p) · GW(p)

You probably need to find Something to understand that.

I am still looking, myself.

Replies from: DanArmak
comment by DanArmak · 2009-12-03T11:47:00.324Z · LW(p) · GW(p)

So your Capitalized Goal is having Something to Look For?

Replies from: Nanani
comment by Nanani · 2009-12-04T00:13:30.720Z · LW(p) · GW(p)

Ha. You could put it that way. I still don't know what Something is, though.

comment by MendelSchmiedekamp · 2009-12-04T16:35:33.378Z · LW(p) · GW(p)

Or more succinctly and broadly, learn to:

  • pay attention

  • correct bias

  • anticipate bias

  • estimate well

With a single specific enumeration of means to accomplish these competencies you risk ignoring other possible curricula. And you encourage the same blind spots for the entire community of aspiring rationalists so educated.

Replies from: MendelSchmiedekamp, ciphergoth
comment by MendelSchmiedekamp · 2009-12-04T17:23:13.678Z · LW(p) · GW(p)

Proof of how dangerous this sort of list can be.

I entirely forget about:

  • act effectively

After all, how can you advance even pure epistemic rationality without constructing your own experiments on the world?

comment by Paul Crowley (ciphergoth) · 2009-12-04T16:54:29.983Z · LW(p) · GW(p)

Also, the first eleven Virtues of Rationality should be removed from the list.

comment by woozle · 2009-12-04T02:18:52.783Z · LW(p) · GW(p)

I want to add "be wary of conclusions which make you feel safer or require less action", but that may just be one of the "standard biases". (I have come to the conclusion that I don't have time to read the referenced book just now, but I suppose I should be suspicious of that conclusion, because the alternative requires more work and may challenge the validity of this comment, thus making me feel less safe in making it...)

comment by djcb · 2009-12-03T18:02:38.583Z · LW(p) · GW(p)

I like the idea of a list -- maybe it should really be limited to some fixed number -- say 13 to clarify the rationalist stance on superstition :)

Anyway, perhaps this current list is somewhat unbalanced -- for example, before including analytic philosophy, I think familiarity with, e.g., game theory seems much more important.

Also, the first couple of points are simple rules to follow, while many of the later ones point to fields of knowledge rather than giving something short to keep in mind. There's something to be said for both, but it might be clearer to have separate lists: a list of short rules one can remember ("The map is not the territory"), and a list of fields that are important, such as parts of economics, game theory, information theory, and so on.

comment by IncidentalEcon · 2009-12-02T16:27:34.326Z · LW(p) · GW(p)

Bravo. Were this a religion, I'd be a member. Wait, I already am. Or is that self-contradictory?