How has rationalism helped you?

post by Sunny from QAD (Evan Rysdam) · 2019-08-24T01:31:06.616Z · LW · GW · No comments

This is a question post.

Contents

  Answers
    22 Wei_Dai
    5 Three-Monkey Mind
    3 Raven
    2 Callmesalticidae
None
No comments

This November, I will be participating in NaNoWriMo, an online event where participants have one month to write a 50,000-word manuscript for a novel. I'm fairly settled on the idea of writing about a person who is quite smart but has no rationalist training, and who discovers rationalism and develops into a fully-fledged rationalist.

I'm looking for inspiration for what kind of problems they might learn to solve. How has rationalism helped you? There is no answer too big or too small. If rationalism helped you realize that you needed to divorce your spouse and change careers, that's a good answer; if rationalism changed the way you tie your shoelaces, I'm all ears. In particular, I'd like to hear: the thing you used to do, which was lacking in some way; the rationalist concept that challenged your habit; and what you do now.

Answers

answer by Wei Dai (Wei_Dai) · 2019-08-24T02:07:40.646Z · LW(p) · GW(p)
  1. I had a business dispute and other interpersonal conflicts, where knowing about human biases made me realize that my own perspective was probably biased, and that others' were as well, but that's only human. This made me less indignant and kept the conflicts from escalating further.
  2. Learning the concept of moral uncertainty made me less angsty about not knowing what real morality is, and more okay with pursuing selfish and altruistic goals at the same time.
  3. Eschewing conventional notions of success and doing things that are better for both my selfish and altruistic goals. (Not sure which rationalist concept this corresponds to exactly.)
comment by Sunny from QAD (Evan Rysdam) · 2019-08-24T04:24:08.662Z · LW(p) · GW(p)

I like these answers a lot because they strike just the right balance between specificity and generality: they'll let me render similar character developments in my story world without feeling like I'm copying something down directly. Thanks!

answer by Three-Monkey Mind · 2019-08-25T02:23:53.336Z · LW(p) · GW(p)

Over on the "too small" end of the spectrum…

I wrote about how rationality made me better at Mario Kart, which I linked from here a while ago. In short, it's a reminder to think about your sources of evidence and how much weight each one deserves.

More recently, I've been watching The International, a Dota 2 competition. Last night I was watching yet another game where I wasn't at all sure who would win. That said, I thought Team Liquid might win (p = 60%). When I saw Team Secret win a minor skirmish (teamfight) against Team Liquid, I made a new prediction of "Team Secret will win (p = 75%)". However, my original guess was correct: Team Secret eventually won that game.

I then thought about the current metagame and how, this year, any team can go from "winning" to "lost" with only a small error or two, and the outcome of any individual skirmish doesn't matter much.

I then imagined Bart Simpson repeatedly writing "I WILL NOT MAKE LARGE UPDATES BASED ON THE OUTCOME OF A SINGLE TEAMFIGHT" on a large blackboard and stopped making that mistake.


I think the major takeaway I've gotten from reading The Sequences is the vocabulary around updating beliefs by varying amounts based on evidence.
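The "don't over-update on a single teamfight" lesson can be made concrete with Bayes' rule in odds form: the size of a justified update depends on the likelihood ratio of the evidence. This is an illustrative sketch with made-up numbers; the 55/45 teamfight rate is a hypothetical assumption, not a measured Dota statistic.

```python
def bayes_update(prior, likelihood_ratio):
    """Update a probability given evidence, using Bayes' rule in odds form.

    likelihood_ratio = P(evidence | hypothesis) / P(evidence | not hypothesis)
    """
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Prior: 60% that Team Liquid wins, i.e. 40% that Team Secret wins.
p_secret = 0.40

# Suppose (hypothetically) the eventual game winner takes any given
# teamfight only 55% of the time. Then "Secret won one teamfight" carries
# a likelihood ratio of only 0.55 / 0.45, i.e. about 1.22.
weak_evidence = 0.55 / 0.45

p_after = bayes_update(p_secret, weak_evidence)
print(round(p_after, 3))  # 0.449 -- a modest bump from 0.40, nowhere near 0.75
```

Under these assumptions, one minor skirmish only justifies moving from 40% to about 45%, which is exactly the Bart Simpson blackboard lesson: jumping straight to 75% would require far stronger evidence.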

comment by Sunny from QAD (Evan Rysdam) · 2019-08-25T07:19:28.589Z · LW(p) · GW(p)

Vocabulary is big. What I'm about to say is anecdotal, but I think having the words to express a concept makes that concept a LOT more readily available when it's relevant. Thanks for the response!

comment by digital_carver · 2019-08-30T12:22:40.900Z · LW(p) · GW(p)

I thought Team Liquid might win (p = 60%). When I saw Team Secret win a minor skirmish (teamfight) against Team Liquid, I made a new prediction of "Team Secret will win (p = 75%)". However, my original guess was correct: Team Secret eventually won that game.

I think you mean "Team Liquid eventually won the game" here, since that seems to have been your original guess.

Also, it would be interesting to see how the Dota Plus win probabilities at, say, 15 minutes into the match hold up against the actual wins and losses. On the one hand, it seems very difficult to make good predictions in a game like Dota, where things can turn around at the drop of a hat; on the other hand, we have OpenAI Five claiming an 85% win chance just at the end of the drafting phase.
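One standard way to check how such mid-game win probabilities "hold up" is a proper scoring rule like the Brier score, which penalizes both overconfidence and underconfidence. A minimal sketch, using entirely made-up forecasts and outcomes rather than real Dota Plus data:

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between predicted win probabilities and results.

    Lower is better; always guessing 0.5 scores exactly 0.25, so a useful
    forecaster should come in below that.
    """
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical 15-minute win probabilities and actual results (1 = win, 0 = loss).
forecasts = [0.85, 0.60, 0.70, 0.40, 0.90]
outcomes = [1, 0, 1, 1, 1]

print(round(brier_score(forecasts, outcomes), 2))  # roughly 0.17
```

Run over a large set of real matches, a score well below 0.25 would suggest the 15-minute probabilities carry genuine signal despite how swingy individual games feel.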

answer by [deleted] · 2019-09-18T19:43:08.551Z · LW(p) · GW(p)
The thing you used to do, which was lacking in some way.

I used to be a loyal follower of Traditional Rationality. Emotions were the enemy; frigid logic and reason were my allies. Combined with a burning desire to win at all costs, it sort of worked. Since I wasn't learning from a specific person, I even managed to reinvent a lot of stuff from true rationality (in a hazy, unspecific sort of way). But the emotion thing... I was miserable, and my response was to bury those feelings instead of doing the introspection necessary to figure out why I felt so bad all the time.


The rationalist concept that challenged your habit.

Yudkowsky's vision of a unified rationality+emotions, and Kahneman's S1/S2 model of cognition, particularly the idea that S1 had something valuable to contribute.

I had heard all the standard "don't bottle up your emotions" advice before, but none of it addressed the fundamental problem that I wanted to win -- and as far as I could tell, emotions were nothing but dead weight. People who were emotional lost. They got angry. They cheated on their relationships. They flitted around life, tugged on a leash by their feelings. There was no focus, no coherence, no master plan behind it all. Just chaos. It seemed to me that the choice was to either be strong and cold, or weak and warm.

Reading about an alternative to what I was doing that didn't have massive, immediately obvious flaws... that was enough to convince me to start the introspection process instead of ignoring how I felt.


What you do now.

I don't feel like crap all the time. Turns out there was quite the surprise waiting for me at the end of the rabbit hole (I'm trans), and now that I'm addressing the root problem, all the negative side effects are disappearing. I don't believe in a rationality vs emotion dichotomy anymore. I'd like to say that this has spread into my general behavior, but unfortunately it's only been a few months since my crisis of faith, and change takes longer than that. But when I catch myself trying to quash a feeling or disregard an intuition, I stop and ask myself whether this is the sort of situation where intuition could reasonably be expected to work well (frequent, rapid feedback, and so on).

comment by Sunny from QAD (Evan Rysdam) · 2019-09-18T21:00:52.366Z · LW(p) · GW(p)

I like this answer a lot because this is something I can have running in parallel to the main plot. In fact, I can just add a separate character who starts out as a follower of Traditional Rationality and then acts as a foil for my main character. Thanks!

comment by eigen · 2019-09-18T19:59:13.375Z · LW(p) · GW(p)

Thank you for an excellent answer and for sharing your experience. I'm glad you're doing better now!

I very much agree, BTW, with Yudkowsky's take on the 'rationality vs emotion dichotomy', and I'm glad he addressed it early in the sequences.

answer by Callmesalticidae · 2019-08-27T03:36:32.318Z · LW(p) · GW(p)

Rationalism has...

1. Helped me to get out of a cult.

2. Given me the mindset that any negative aspect of my life is (probably) fixable if I can just find the right angle to approach it from.

3. Kept me from falling into all kinds of political weirdness, and reminded me that My Guys shouldn't exist as such: I might hold certain political positions that are shared by certain other people, but if I start thinking in terms of teams then I'm going to fall down a very bad hole.

comment by Sunny from QAD (Evan Rysdam) · 2019-08-27T03:59:42.935Z · LW(p) · GW(p)

I upvoted for the last one. For the other two, would you mind sharing details? Specifically, what did you use to do, what insight did you have, and what do you do now?

Replies from: Callmesalticidae
comment by Callmesalticidae · 2019-09-09T23:09:51.801Z · LW(p) · GW(p)

Sure thing.

Re #2: I have bipolar disorder, and it would be very easy to give up on a lot of pursuits or decide that I wasn't getting anywhere with them. In particular, I often feel bad about my writing, but the records I keep (inspired in large part by rationalism, but also partly by my time as a cult recruiter, where there was a focus on tracking various "key indicators") demonstrate that I'm doing better than I usually think.

The emphasis on taking "The Outside View" is also immensely helpful. Ten years ago, I wanted to publish novels, and I still haven't finished a novel, let alone published one, but I've done a lot of other things that weren't in the original game plan (e.g. maintained a good GPA throughout grad school, run several successful Kickstarters, published a line of nonfiction resource booklets). If I were talking to someone with those accomplishments, I'd say they had done well, so I have to say the same of myself.

Re #3: I'm a socialist, and without rationalism it'd be very easy for me to slip into black-and-white "my side is always right and the other guys never have anything worth paying attention to" thinking. Instead, I'm more critical of everyone (even if I'm probably not quite as critical of My Guys as I think I am, because that's how humans work) and I'm more willing to change my views (I have, for example, shifted significantly away from anarchism). Likewise:

  • I was much less surprised by the fallout of "Russiagate" (or whatever we're calling the stuff with Trump and Russia and Mueller) than many other people on the Left.
  • I very quickly stopped listening to [some Russiagate conspiracy theorists whose names I can't remember because it's been so long] because I noticed how they would keep making predictions but only refer back to the predictions that panned out. I'd originally started listening because I kept hearing about all the stuff they got right, and without rationalism I might have continued to listen to them and just forget about their failed predictions.
  • I spend very little time on Culture War / "flashy" political stuff, because the rationalist mindset helped me realize how much I was being sucked into that stuff vs. getting anything concrete out of it. That idea isn't exactly unique to rationalism, but without rationalism I don't know if I would have recognized the problem in myself, you know?

Also, one more, because I was in a rush last time: rationalism helped me bring my migraines down from "incredibly debilitating for an entire day, and sometimes two days" to "mildly debilitating for part of one day, and sometimes just mildly annoying." It gave me an "if I have a problem, experiment with possible solutions and carefully track the result of each one" mindset. Not only have I found some great ways to deal with my migraines (and I know they're responsible, because every now and then I fail to do things properly and my migraine ends up about as bad as it was before), but I was also able to stop doing some things that other folks felt worked for them but turned out to be useless for me.
