How can I get help becoming a better rationalist?

post by TeaTieAndHat (Augustin Portier) · 2023-07-13T13:41:46.670Z · LW · GW · No comments

This is a question post.


Well…

Right now, being ‘a rationalist’ could be said to be a massive part of my identity, at least judging by the absurd amount of time I’ve spent reading posts here, on SSC/ACX, and in a few other places. Yet I’m still a mere lurker, unfamiliar with most of the local customs.

But that’s not what matters. What does matter is that I’m a terrible rationalist.
You see, rationality takes practice. And reading stuff on LW isn’t practice at all. If anything, it’s just a great way of filling my brain with a lot of useful concepts, and then either blaming myself for not using them, or using them for something entirely unrelated to their normal purpose. Often, to make myself feel worse, and think worse.

As the saying goes, rationality is a martial art. Learning it by reading the rules, or by watching other people apply the rules, is about as effective as developing one’s muscles by watching sports on TV.

I know of CFAR and of various related groups, meetups for ACX readers or for other people, etc. But apart from ACX meetups, which aren’t about becoming better rationalists per se, I don’t have easy access to any of those, nor to an environment that generally welcomes this sort of thing. You know, not being in the Bay Area and all.

And yet, I want to be more rational as much as anyone who’s been lurking here for five years wants it, and given how depressed I was until very recently, I probably badly need it, too.

I’m not sure what kind of answers I expect, but, like, how can I push myself to learn more, and especially to practice more, and ideally to actually use rationality to improve my life?

Answers

answer by Adam Zerner · 2023-07-13T18:14:12.957Z · LW(p) · GW(p)

This is an awesome and heart-warming question. Tsuyoku naritai!

In Julia Galef's book The Scout Mindset, she talks about how attitude is usually more important than knowledge.

Knowing that you should test your assumptions doesn't automatically improve your judgement, any more than knowing you should exercise automatically improves your health. Being able to rattle off a list of biases and fallacies doesn't help you unless you're willing to acknowledge those biases and fallacies in your own thinking. The biggest lesson I learned is something that's since been corroborated by researchers, as we'll see in this book: our judgment isn't limited by knowledge nearly as much as it's limited by attitude.

It sounds like you have a great attitude. I suspect that you're a much better rationalist than you claim to be.

Some thoughts on your original question:

  • Where do you live? There are actually tons of meetups outside of the Bay Area. Check out the community page [? · GW].
  • If there isn't a meetup near you, hey, maybe you can start one [? · GW]!
  • There are also various online groups that you can join: Slack, Discord, etc.
  • I hear that The Guild of the Rose is pretty cool and leans more towards pragmatic, real-life improvement types of things. I haven't done it myself but I get the sense that you'd like it a lot.
  • You might like Beeminder. You also might enjoy discussing self improvement on their forum.
  • The CFAR handbook is probably a great resource.
  • Be careful about stumbling into a Valley of Bad Rationality [? · GW] by accident. When in doubt, I think it'd be good to default to common sense.
comment by TeaTieAndHat (Augustin Portier) · 2023-07-13T19:14:14.393Z · LW(p) · GW(p)

Well, if you want the complicated answer: four or five years ago, about when I started lurking here, I came with a good mindset, intent on learning skills rather than rules, for precisely the reason you mention. That was right before I entered university, and I think I screwed up that bit. Got depressed, and have remained too out of it to actually use rationality to improve my life much since then. But that’s changing, and the first step is to pick up my rationality practice where I left off four years ago, when I started focusing only on reading blog posts and forgetting them soon afterward.
 

In other words: yeah, I know that ‘valley of bad rationality’ thing. Just digging myself out of one as we speak, actually. 
 

And, apart from that, thanks for the great advice. Will heed it :-)

Replies from: adamzerner
comment by Adam Zerner (adamzerner) · 2023-07-13T20:41:14.690Z · LW(p) · GW(p)

Gotcha. I'm sorry to hear about the setback, but I'm glad that things have been getting better recently.

So there aren't any meetups near you?

answer by Sable · 2023-07-13T16:20:30.718Z · LW(p) · GW(p)

Broadly speaking, I'm in favor of defining Rationality as systematized winning.

To become a better rationalist, you have to have something to win (or something to protect).

So you need a goal that isn't directly "become a better rationalist". This goal could be to learn a new skill, accomplish some simple task, or literally anything else.

Some suggestions:

  • Learn a Skill (Chess, Math, Foreign Language, Juggling)
  • Dealing with Uncertainty (start using a prediction market, play poker for real money)
  • Study the World (Pick a topic (Housing, Nutrition, anything), exhaustively research it, write up your results and post them)
  • Optimize your Life (Research sleep habits, diet/exercise, etc., then apply results to your own life to improve it)

Attempt to complete the goal, using what you've learned. Take notes. Reflect. Get better. Iterate.

Remember, the goal is to cut the enemy; the Art is not to be studied solely in isolation.

comment by Adam Zerner (adamzerner) · 2023-07-13T19:49:33.089Z · LW(p) · GW(p)

I think this is incorrect. From Levels of Action [LW · GW]:

One of the most useful concepts I have learned recently is the distinction between actions which directly improve the world, and actions which indirectly improve the world.

Suppose that you go onto Mechanical Turk, open an account, and spend a hundred hours transcribing audio. At current market rates, you'd get paid around $100 for your labor. By taking this action, you have made yourself $100 wealthier. This is an example of what I'd call a Level 1 or object-level action: something that directly moves the world from a less desirable state into a more desirable state.

On the other hand, suppose you take a typing class, which teaches you to type twice as fast. On the object level, this doesn't move the world into a better state - nothing about the world has changed, other than you. However, the typing class can still be very useful, because every Level 1 project you tackle later which involves typing will go better - you'll be able to do it more efficiently, and you'll get a higher return on your time. This is what I'd call a Level 2 or meta-level action, because it doesn't make the world better directly - it makes the world better indirectly, by improving the effectiveness of Level 1 actions. There are also Level 3 (meta-meta-level) actions, Level 4 (meta-meta-meta-level) actions, and so on.

 

The most important difference between Level 1 and Level 2 actions is that Level 1 actions tend to be additive, while Level 2 actions tend to be multiplicative. If you do ten hours of work at McDonald's, you'll get paid ten times as much as if you did one hour; the benefits of the hours add together. However, if you take ten typing classes, each one of which improves your ability by 20%, you'll be 1.2^10 = 6.2 times better at the end than at the beginning: the benefits of the classes multiply (assuming independence).

I too like the idea of rationality being about winning (although you can also argue that it is about epistemics, independent of how epistemics relate to winning). But I think that rationality usually helps with winning via higher-level actions. For example, learning about biases is a higher-level action that helps in a whole bunch of different scenarios.

That said, I also agree with the warning in Levels of Action about focusing too much on the higher levels of action.

It is also possible to have the opposite problem, of under-valuing Level 1, and I suspect that quite a few people in the nerdier communities do. People sometimes fall into the trap of noticing that the higher levels are (when applied properly) far more useful on the margin than Level 1, and then reacting by giving blind praise [? · GW] to the meta level at the expense of the object level. One cultural example is the ancient Greeks - who, though they were good thinkers for their day, didn't invent science. Science involved actually going out and looking at the world [? · GW], and that was manual labor, and manual labor was for slaves. The ultimate extreme of this is Aristotle, who got philosophy off to an unfortunate beginning by starting his Metaphysics with the assumption that the most noble knowledge would be the most useless.

The problem there is that, because Level 2 actions are multiplicative and not additive, you still need at least some Level 1 actions to multiply by. It doesn't matter how high the value of one's labor is, if one never actually goes out and does labor. A very large number, multiplied by zero, is still zero. If one just does Level 2 actions, without any Level 1 actions, it is a failure to do something instead of nothing. Taking only meta-level actions accomplishes less, in the end, than the ten-year-old who just mowed the neighbor's lawn for a dollar.
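To make the quoted arithmetic concrete, here is a minimal throwaway sketch (Python, using only the numbers from the excerpts: roughly $1 per hour of transcription and a 20% boost per typing class; nothing beyond that is implied):

```python
# Minimal sketch of additive (Level 1) vs. multiplicative (Level 2) returns,
# using the illustrative numbers from the quoted post.

def level1_earnings(hours, rate_per_hour=1.0):
    # Object-level work: benefits add up linearly with the hours put in.
    return hours * rate_per_hour

def level2_multiplier(classes, boost=0.20):
    # Meta-level work: each class multiplies effectiveness (assuming independence).
    return (1 + boost) ** classes

print(level1_earnings(100))                        # 100.0 -> about $100 for 100 hours
print(round(level2_multiplier(10), 1))             # 6.2   -> 1.2 ** 10
print(level1_earnings(0) * level2_multiplier(10))  # 0.0   -> any multiplier times zero object-level work is still zero
```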

comment by TeaTieAndHat (Augustin Portier) · 2023-07-13T17:23:08.057Z · LW(p) · GW(p)

How did I not notice ‘systematized winning’ meant that? I think I actually had no clue what it meant :/ Still, sounds great! And it’s actually a big part of what I’m trying to do, but I’ve been depressed for a long while and it’s only getting better now, so I’m a bit late at that game :-)

So, I’ll have to find a goal. Even then, that sounds way easier to do in the Bay, where one supposedly has other LWers around to talk to, but that shouldn’t be too much of a problem.

answer by Garrett Baker · 2023-07-13T17:35:53.154Z · LW(p) · GW(p)

I’ve heard good things about the Guild of the Rose, a virtual community made by rationalists to help each other level up in practical success in everyday life. You may want to look into joining them.

comment by Said Achmiz (SaidAchmiz) · 2023-07-13T22:18:30.208Z · LW(p) · GW(p)

What good things have you heard, could you be more specific?

Replies from: D0TheMath
comment by Garrett Baker (D0TheMath) · 2023-07-13T23:26:42.512Z · LW(p) · GW(p)

They run interesting-seeming workshops on a variety of subjects. The ones most salient to me are using decision theory practically via cost-benefit analyses, teaching people how to develop their own clothing style, team community projects like making a street-cleaning robot for participants’ local communities (likely misremembering specifics here; this may have been an aspirational goal of theirs), and using LLMs to automate tasks. Much of my knowledge comes from this podcast episode.

Edit: Re-listening to some parts of the podcast episode, it seems like they start talking about the guild at about 00:26:47.

Replies from: SaidAchmiz
comment by Said Achmiz (SaidAchmiz) · 2023-07-13T23:51:26.124Z · LW(p) · GW(p)

using decision theory practically via cost-benefit analyses

Could you say more about this? (Or, link to some written commentary on the matter?)

using LLMs to automate tasks

Ditto?

Is there a transcript for the podcast episode?

Replies from: D0TheMath, moridinamael
comment by Garrett Baker (D0TheMath) · 2023-07-14T00:04:52.785Z · LW(p) · GW(p)

To my knowledge there’s no such transcript. The podcast is small, and this episode was made before Whisper, so at the time a transcript would have been super expensive (even if you did use Whisper, you’d need to pay someone to label who’s talking, which likely isn’t cheap). You can find info about their workshops on their workshops page. Probably more informative than hearing me describe a podcast I last heard over a year ago.

comment by moridinamael · 2023-07-14T03:56:56.392Z · LW(p) · GW(p)

Here are a couple of examples of our decision theory workshops:

https://guildoftherose.org/workshops/decision-making

https://guildoftherose.org/workshops/applied-decision-theory-1

There are about 10 of them so far covering a variety of topics related to decision theory and probability theory.

comment by TeaTieAndHat (Augustin Portier) · 2023-07-13T17:44:05.130Z · LW(p) · GW(p)

Seems really interesting! Like, really interesting! Thanks

answer by Stephen James · 2023-07-13T14:52:28.161Z · LW(p) · GW(p)

If your head is full of concepts but you haven't applied them, there are a few things you can start practicing - easily, right now - to begin living rationally.

  1. Open the CFAR handbook, turn to page 135 (in the 2019 edition) and do the Resolve Cycle Technique, top to bottom. Review the background if you need a refresher.
  2. (Same book) Read about OODA loops and consciously do them for the rest of the day; if a problem comes up, apply Frame-by-Frame Debugging.
  3. Read "Thinking Better On Purpose [? · GW]" and take every call to action literally.

Let me know how it goes. All of these can be done on the order of minutes.

comment by TeaTieAndHat (Augustin Portier) · 2023-07-13T17:45:38.229Z · LW(p) · GW(p)

Will do this evening, thanks for the advice!

answer by Viliam · 2023-08-02T08:52:07.396Z · LW(p) · GW(p)

Late to the debate, so just a few notes:

Some things are way more important than others. If you are currently doing something really stupid in your life, fixing it is more important than learning a list of 1000 cognitive biases and rationality techniques.

It is much better to have 5 techniques that work for you and use them regularly, than to memorize a list of 1000 techniques and actually never use them.

Make logs. (And make them simple.) For example, if your goal is to exercise regularly, put a circle in your calendar whenever you exercise, or something like that. This gives you quick feedback whether you are actually doing things, or just lying to yourself.

Think about how you reward or punish yourself. (Read the book Don't Shoot the Dog.) Punishment, including self-punishment, is almost always the wrong approach. Why? Because you indirectly punish yourself for noticing the problem, and for trying to overcome it, which is the opposite of what you would want to do. Partial successes are a reason to celebrate! (Ancient wisdom says: "All you need is love" -- or rather, positive reinforcement.)

Your body has a huge impact on your mind. Getting enough sleep, exercising regularly, getting enough sunlight, seeing your friends... are powerful (indirect) rationality techniques.

comment by TeaTieAndHat (Augustin Portier) · 2023-08-02T12:06:44.629Z · LW(p) · GW(p)

Interesting comment. You know, the weird thing is that I knew all of that long before I started implementing it (that is, quite recently). And it’s not even surprising: for the longest time I knew that avoiding bad situations, making logs, making a deliberate effort to stay healthy, and avoiding self-punishment were important, yet I hardly ever did any of it. I don’t think I fully grok why our brains work like that.
 

And now, I’m actually a little concerned that I may have taken up negative reinforcement and other such bad habits in a way where it’ll be hard to uproot them. I guess if there is good advice on that, I could probably use it.

Replies from: Viliam
comment by Viliam · 2023-08-02T14:07:14.646Z · LW(p) · GW(p)

It's as if our brains are made of rubber -- you can flex your brain a little, but it soon reverts to its original shape. The question is how to make a change permanent.

  1. Create habits -- but this is a "chicken and egg" problem.
  2. Make changes in your environment -- add/remove/move objects, add reminders.
  3. Set up external reminders -- friends, calendar, alarm clock.

For logging, I made a simple calendar in Excel (one sheet of paper covers six months, with a small rectangle for each day), printed it, and put it on a magnetic board. At the top I write the topic, e.g. "exercise", and every day I exercise, I circle that number with a pen. The point is, I can make such a calendar in 5 minutes... otherwise I would procrastinate on making the calendar.
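(For anyone who prefers scripting it to Excel, a few lines of Python could generate the same kind of grid -- purely an illustrative sketch:)

```python
# Illustrative sketch: print a six-month habit grid (one row per month,
# one number per day) that can be printed out and circled with a pen.
import calendar

def habit_grid(year, start_month, topic, months=6):
    print(topic)
    for i in range(months):
        extra_years, month_index = divmod(start_month - 1 + i, 12)
        y, m = year + extra_years, month_index + 1
        days_in_month = calendar.monthrange(y, m)[1]
        row = " ".join(f"{d:2d}" for d in range(1, days_in_month + 1))
        print(f"{calendar.month_abbr[m]} {y}: {row}")

habit_grid(2023, 8, "exercise")
```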

Furthermore, I keep my weights under my working desk, so whenever I look at them, it is a reminder that I want to use them. I also made a playlist for exercising. So when I am at my computer (with the calendar in my peripheral vision) and the thought crosses my mind "actually, I should exercise", I just need to start the playlist and pick up the weights; it only takes a few seconds.

And now, I’m actually a little concerned that I may have taken up negative reinforcement and other such bad habits in a way where it’ll be hard to uproot them.

Being nice to yourself can be surprisingly hard for many people. Myself included.

A part of that is various bullshit beliefs that we have accumulated. Such as "but if you stop being hard on yourself, you will never achieve anything". It may sound obviously correct, even when the reality is the exact opposite -- you are not doing anything useful, because instead of dreaming and planning and doing, you spend your time unproductively beating yourself up.

Uhm, it's difficult to put into words. There is a right place for everything, even for being angry at yourself. Sometimes you need to stop doing stupid shit immediately, and a blast of anger could be a way to achieve exactly that. But if you do it more than once a week, you should use some other mental tool instead. Observe. Never be angry while observing; that ruins the data.

Similarly, I am tempted to write "work smarter, not harder". But this also isn't true: working hard is useful; you just need to make sure that you did the smart things first. Work hard in the situations when working hard is efficient, not when you should be doing something smarter instead. Willpower is a scarce resource (even more so if you have ADHD), so first make sure you did the easy things.

I think visualization helps (although I often forget to do it myself). Remind yourself why you are doing something. Just relax for one minute, and imagine the glorious outcome you are trying to achieve; imagine already being there and how it would feel. Then starting the work may get easier.

You probably just need to start trying something, observing how it works, and updating your strategy. And maybe share advice with some people at an ACX meetup, though I would expect that most of them are not interested in that and only came to chat.

For me, the best resources were Don't Shoot the Dog and maybe Games People Play; I would strongly recommend them both.

A great obstacle to improvement is the status instinct. Sometimes the right actions are unimpressive (e.g. doing the dirty work), and the wrong actions are impressive (e.g. sharing insights or debating philosophically). People are afraid to admit weakness and ask for help/advice, because that makes them less impressive. (Then again, sometimes the instinct is right; some people will just feel superior to you when you admit a weakness, and will not provide any useful help.)

People avoid hard work, because if you are cool, the impressive results are supposed to come "naturally". (In fact, many impressive results are a result of hard work done in the past. And "natural" often means well-trained.) Everyone wants to be an expert, no one wants to be a beginner, but you cannot have the former without the latter.

People avoid improvement, because it may upset the existing social balance. (You never know whose fragile ego in your proximity is supported by "at least, you aren't better either". Your self-improvement may kick this down, and the person may take revenge; this can come as a huge surprise.) Even being hard on yourself is probably about this, because it seems "fair" that if you want to be great, you should suffer for your audacity. But in fact there is no law of physics that requires that.

answer by trevor · 2023-07-13T20:16:41.040Z · LW(p) · GW(p)
  1. Sequences highlights [? · GW], then read all the posts from Rationality: From AI to Zombies, at whatever pace you feel like (e.g. one or ten per morning); you can read them in order or in random order [LW · GW]. 
  2. Many recommend the CFAR handbook [LW · GW] (that one in particular seems like a good fit for you), Scott Alexander's Codex [? · GW], and the book Inadequate Equilibria.
    1. Unlike the sequences, with the CFAR handbook you have to consciously practice, not just read and know it. Once each habit is built it's automatic, but the muscle memory requires repetition.
  3. If you find a topic you like, e.g. security mindset (which you totally can and should devote your life to), you can look for tons of top-upvoted articles on the tag for that topic [? · GW].
  4. I have also been recommended the books Super Thinking and Superforecasting, and the post on Functional Decision Theory [LW · GW]. They're all great, but they aren't standard texts (yet). I personally recommend Tuning your Cognitive Strategies [LW · GW], but if that one causes you harm somehow, then please stop and write a LessWrong post about the details, or about anything else you discover; it's a neglected area, so it's a big deal whenever someone makes a discovery there. 
  5. Go outside more for more brainpower [LW · GW] and less depression [LW · GW].
