Is there a practitioner's guide for rationality?

post by cedric · 2018-08-13T05:39:06.563Z · LW · GW · 9 comments

Contents

9 comments

Hi everyone, I'm new to the community, and am currently working my way through the sequences — yes, all of them.

In the introduction to the first book of Rationality A-Z, Eliezer says:

I didn’t realize that the big problem in learning this valuable way of thinking was figuring out how to practice it, not knowing the theory. I didn’t realize that part was the priority; and regarding this I can only say “Oops” and “Duh.” Yes, sometimes those big issues really are big and really are important; but that doesn’t change the basic truth that to master skills you need to practice them and it’s harder to practice on things that are further away. (Today the Center for Applied Rationality is working on repairing this huge mistake of mine in a more systematic fashion.)

Just wanted to ask if CFAR has got any of those reorganised materials up, and if they're linked to from anywhere on this site? Any links to other rationality-as-practice blog posts or books or sequences would also be incredibly appreciated!

9 comments

Comments sorted by top scores.

comment by Vaughn Papenhausen (Ikaxas) · 2018-08-13T06:25:18.226Z · LW(p) · GW(p)

I don't know of a full guide, but here's a sequence exploring applications for several CFAR techniques: https://www.lesswrong.com/sequences/qRxTKm7DAftSuTGvj [? · GW]

comment by Jsevillamol · 2018-08-13T20:05:33.585Z · LW(p) · GW(p)

At risk of stating the obvious, have you considered attending a CFAR workshop in person?

I found them to be a really great experience, and now that they have started organizing events in Europe they are more accessible than ever!

Check out their page.

Replies from: rk, cedric
comment by rk · 2018-08-14T21:51:08.152Z · LW(p) · GW(p)

I wasn't aware that CFAR had workshops in Europe before this comment. I applied for a workshop off the back of this. Thanks!

comment by cedric · 2018-08-14T05:24:02.997Z · LW(p) · GW(p)

I'm afraid I'm based out of Singapore and Saigon at the moment; really far away from where CFAR is active. =[

comment by [deleted] · 2018-08-13T17:26:52.582Z · LW(p) · GW(p)

Ikaxas has already linked to alkjash's sequence, which is excellent. I also wrote a sequence, mostly on habits and planning here [? · GW].

comment by Elo · 2018-08-13T06:24:15.715Z · LW(p) · GW(p)

https://www.lesswrong.com/posts/pjGGqmtqf8vChJ9BR/unofficial-canon-on-applied-rationality [LW · GW]

I have a few dojos published on my own site like this one about Zen koans: http://bearlamp.com.au/zen-koans/


comment by cedric · 2018-08-13T07:37:03.993Z · LW(p) · GW(p)

Thank you so much, keep the links coming.

comment by TruePath · 2018-08-16T09:13:15.373Z · LW(p) · GW(p)

We can identify places where we know (inductively) we tend to be led astray, and even identify tricks that help us avoid the common fallacies that afflict humans. However, it's not at all clear that this actually makes us more rational in any sense.

If you mean act-rationality, we'd have to study whether this was a good life choice. If you mean belief-rationality, you'd have to specify some measure or notion of importance to decide when it really matters that you believed the true thing. After all, if the goal is just maximizing the number of true beliefs you hold, the best way to be rational is to memorize giant tables of dice rolls. If it's minimizing false beliefs, you might want to avoid forming any beliefs at all. Even if you find some more appropriate function to maximize, some beliefs obviously should count more than others: you don't want to spend your time memorizing lists of dice rolls and forget the fact that walking into the street can get you killed by a bus.

But once you realize this point, then who knows. It could be the most rational thing in the world to hold a totally dogmatic, evidence-unresponsive belief in the existence of some beardy dude in the sky, because it's the belief that matters the most, and the rule "always believe in God Beard" will thus maximize getting important beliefs right.

I know what you mean. You mean something like avoiding the kinds of fallacies that people who always talk about fallacies care about avoiding. But why should those be the most important fallacies to combat?