Winning doesn't need to flow through increases in rationality

post by Michel (MichelJusten) · 2023-06-02T12:05:26.615Z · LW · GW · 3 comments

(My first LessWrong post. Written quickly, in the spirit of putting more of my ideas out there and seeing how they meet reality.) 

Consider that there may be more rewarding pursuits than marginally increasing your rationality toolbox.

I get the sense that a lot of people in the rationality community think that others should dedicate themselves to the pursuit of becoming more rational, or that they themselves need to become more rational. 

"If only they had read the Sequences, then they wouldn't have wasted their time on that class"; 

"if only I had used the internal double crux [LW · GW] move, then I wouldn't have made X mistake."

There is probably some truth here, but I think it's possible to be too fixated on increasing rationality, or to hold the view that every shortcoming would best have been solved by more engagement with rationality. 

Rationality is systematized winning [? · GW], and winning doesn't need to flow through increases in rationality.

. . . 

Take Alice as an example. She's motivated to reduce suffering in the world as much as possible; an effective altruist, if you will. 

As she first orients to her goal of reducing suffering, Alice may[1] benefit a lot from engaging with rationality content. Scout mindset, some CFAR techniques, and some of the Sequences will set her up to see the world more clearly and make wiser decisions in the future.

But after some initial orientation to rationality, and further reading when she feels like it, Alice may better achieve her goals by working on other multipliers [EA · GW] than by engaging with more rationality content. 

For example, rather than reading the Sequences, Alice may be helped more by...

Yes, perfect rationality would empower Alice far more than any of these gains. But that doesn't imply that Alice should dedicate herself to rationality. Increases in rationality are bounded, probably have diminishing returns, and Alice may just be one of those people who doesn't find that engaging with rationality content helps her that much. And that's fine.

As a community, I think we should be cautious of overemphasizing rationality engagement as the way to win. 

  1. ^

    This depends on Alice's existing familiarity with rationality practices, or her trait inclination towards rationality.

3 comments


comment by the gears to ascension (lahwran) · 2023-06-02T15:29:07.714Z · LW(p) · GW(p)

Certainly true for practical effectiveness. For epistemic skill building, though, I think practicing prediction in difficult technical domains is more useful than not, and studying epistemics can be very useful there.

Replies from: Viliam
comment by Viliam · 2023-06-03T20:10:29.144Z · LW(p) · GW(p)

"Addressing any lurking mental health issues" might also be useful for epistemic skills. For example, you might realize that you have an unconscious reason to flinch away from certain thoughts.

comment by Vladimir_Nesov · 2023-06-02T13:47:28.075Z · LW(p) · GW(p)

There is truth or calibrated credence or knowing what "good" means or carefully optimizing goodness. Then there are methods that are more or less effective at helping with attaining these things. If you happen to be practicing the better methods, then to the extent they really are effective, you become better at finding truth or calibrated credence or at developing goodness.

And then there is rationality, which is aspiration towards those methods that are better at this [LW · GW]. Practicing good methods is sufficient to get results: if the methods actually exist and you actually practice them, the methods screen off the influence of rationality on object-level results. But being on the lookout for useful methods, and for understanding of their effectiveness, is what systematically guides their attainment, and that's rationality.