Trying to be rational for the wrong reasons

post by Viliam · 2024-08-20T16:18:06.385Z · LW · GW · 8 comments


Rationalists are people who have an irrational preference for rationality.

This may sound silly, but when you think about it, it couldn't be any other way. I am not saying that all reasons in favor of rationality are irrational -- in fact, there are many rational reasons to be rational! It's just that "rational reasons to be rational" is a circular argument that is not going to impress anyone who doesn't already care about rationality for some other reason.

So when there is a debate like "but wouldn't the right kind of self-deception be more instrumentally useful than perfectly calibrated rationality? do you care more about rationality or about winning?", well... you can make good arguments for both sides...

On one hand, yes, if your goal is to maximize your utility function U then "maximizing U by any means necessary" is by definition ≥ "maximizing U using rationality". On the other hand, if you take a step back, how would you know whether your approach X actually maximizes U, if you gave up on rationality? The self-deception that you chose instrumentally as a part of strategy X could as a side effect bias your estimates about how much U you really get by following X... but there may be ways to deflect this counter-argument.
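To put the first half of that argument in symbols (a sketch; here $S$ stands for the set of all strategies available to the agent and $S_R \subseteq S$ for the subset a rationalist would consider):

$$\max_{s \in S} \mathbb{E}[U \mid s] \;\ge\; \max_{s \in S_R} \mathbb{E}[U \mid s] \qquad \text{because } S_R \subseteq S.$$

The second half of the argument says that the left-hand maximum is only as good as your estimate of $\mathbb{E}[U \mid s]$, and a self-deceived agent may estimate it badly. Some possible deflections: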

1) Doublethink. Simultaneously keep two models of reality: one rational, the other optimized by the former for winning. There are some shaky assumptions here. It may be computationally impossible for a human to keep two separate models of reality, to make sure that it is the former that nudges the latter (rather than the other way round, or both nudging each other), and that it is the latter (rather than a mix of both) that influences System 1. But this sounds like a nirvana fallacy: the people who choose rationality over doublethink are not doing rationality perfectly either! So let's compare the average human doublethink against the average human rationality (instead of a hypothetical perfect rationality). Now it is not so clear that rationality wins.

2) Multiple agents. Imagine a father who wants his son to win as much as possible. The father could be a perfect rationalist, while raising his son to believe the optimal mix of rationality and self-serving bullshit. Here the objections against self-deception do not apply; the father is not deceiving himself about anything. (We could make a different objection, that the son will not be able to provide the same kind of service to his children. But that's moving the goalposts.)

3) Split time. Become a perfect rationalist first, then design the perfect plan for brainwashing yourself into someone who wins more (at the cost of losing some rationality), then brainwash yourself. Most of the objections you could make against this idea can be answered with: yeah, assume that the original perfect rationalist considered this possibility and adjusted their plans accordingly. Yes, in some Everett branches something completely unexpected might happen, in exactly such a way that the original rationalist could have prevented a disaster but the brainwashed person no longer can. But again, compare the average outcomes. The small probability of a disaster might be an acceptable price to pay for a large probability of winning more.

Frankly, "if you are no longer a rationalist, you cannot be sure that you are doing the optimal thing" was never my true rejection. I am quite aware that I am not as rational as I could be, so I am not doing the optimal thing anyway. And I don't even think that the outcome "you are doing the optimal thing, and you think that you are doing the optimal thing, but because you have some incorrect beliefs, you don't have a justified true belief about doing the right thing" is somehow tragic; that sounds like something too abstract to care about, assuming that the optimal thing actually happens regardless.

My true rejection is more like this: I have an irrational preference for things like truth and reason (probably a side effect of mild autism). You provide an argument that is maybe correct, maybe incorrect; I am not really sure. From my perspective, what takes away the temptation is that your strategy requires that I give up a lot of what I actually care about, now, forever, with certainty... and in return maybe get some other value (possibly much greater) in some unspecified future, assuming that your reasoning is correct, and that I can execute your proposed strategy correctly. This simply does not sound like a good deal.

But the deal might be more balanced for someone who does not care about rationality. Then it's just two strategies supported by similar-sounding, very abstract arguments. And you are going to make some mistakes no matter which one you choose, and in both cases an unlucky mistake might ruin everything. There is too much noise to make a solid argument for either side.

...which is why I consider "arguing that rationality is better than optimal self-deception" a waste of time; despite the fact that I made my choice and feel strongly about it. The arguments in favor of rationality are either circular (on the meta level), or irrational.

8 comments

Comments sorted by top scores.

comment by Protagoras · 2024-08-20T20:37:53.358Z · LW(p) · GW(p)

I remember Bas van Fraassen (probably quoting or paraphrasing someone else, but I remember van Fraassen's version) saying that the requirements for finding truth were, in decreasing order of importance, luck, courage, and technique (and this surely applies to most endeavours, not just the search for truth). But although technique comes last, it's the one you have the most control over, so it makes sense to focus your attention there, even though its effect is the smallest. Of course, he is, like me, a philosopher, so perhaps we just share your bias toward caring about rationality.

Replies from: None
comment by [deleted] · 2024-08-22T13:24:01.962Z · LW(p) · GW(p)

the requirements for finding truth were, in decreasing order of importance, luck, courage, and technique (and this surely applies to most endeavours, not just the search for truth)

Perhaps this might be the order of importance of these factors in the quest to find any particular truth, but in the aggregate, I would expect technique (i.e., basic principles of rationality that tell you what truth is, what it should imply, how to look for it, what can justifiably change your view about it, etc.) to be the most important one in the long run. This is mostly because it is the one that scales best when the world around us changes such that there is a greater supply of information out there from which important insights can be drawn.

comment by ChristianKl · 2024-08-22T11:19:14.741Z · LW(p) · GW(p)

This assumes that believing self-serving bullshit makes you more effective at achieving your goals.

It's frequently argued that startup founders should irrationally strongly believe in the success of their companies, and yet Elon Musk and Jeff Bezos, who both thought their companies were more likely to fail than to succeed, are some of the richest men on earth.

Self-deception leads to blindspots, and those are frequently a problem for effective action. If you deceive yourself about the reasons your startup might fail, you might be worse at dealing with those reasons.

Replies from: Viliam
comment by Viliam · 2024-08-22T12:59:31.217Z · LW(p) · GW(p)

That's an interesting fact and a good point!

But feel free to replace "self-serving bullshit" with whatever other specific deviation from rationality people may propose as a plan to win more at life.

Replies from: ChristianKl
comment by ChristianKl · 2024-08-22T13:27:10.710Z · LW(p) · GW(p)

It seems to me like most deviations people propose have clear short-term gains, but their costs are less obvious.

It might be that in a world with accurate lie detectors there are specific deviations that actually improve people's ability to win, but I'm not sure we currently live in a world where specific deviations from rationality provide net benefits.

I also like to quote from Baron's textbook on rationality (which is also how Eliezer defined rationality):

The best kind of thinking, which we shall call rational thinking, is whatever kind of thinking best helps people achieve their goals. If it should turn out that following the rules of formal logic leads to eternal happiness, then it is “rational thinking” to follow the laws of logic (assuming that we all want eternal happiness). If it should turn out, on the other hand, that carefully violating the laws of logic at every turn leads to eternal happiness, then it is these violations that we shall call “rational.”

When I argue that certain kinds of thinking are “most rational,” I mean that these help people achieve their goals. Such arguments could be wrong. If so, some other sort of thinking is most rational.

Replies from: Viliam
comment by Viliam · 2024-08-22T14:04:29.579Z · LW(p) · GW(p)

This theoretical argument seems to have the problem that we are not perfectly rational anyway. A similar problem exists in consequentialism: in theory, you should choose the action that has the best possible outcomes, but in practice, this would require you to have perfect knowledge about everything, which you don't have. So you need to try to guess the probabilities of various plans going wrong, and to face your cognitive biases. And then you get less elegant reasoning, such as: "this plan seems like it would have wonderful consequences... but the outside view says that everyone I know who tried this in the past turned out to be deluded and caused a lot of harm... but I am smarter than them, and know more about rationality and biases... but maybe I shouldn't do it anyway...".

The problem with the textbook example is that things sometimes do not conclusively "turn out" this way or that; instead we just get a lot of weak evidence that, taken together, mostly points in one direction, but maybe that is just a consequence of some bias, or maybe there is a 99% chance that something is true but the consequences are horrible if we get it wrong (e.g. the "black swan" situations). Perhaps some day we will know the exact amount of rationality that produces the best outcome for a human in the 2020s, but I need to make some choices now.

Replies from: ChristianKl
comment by ChristianKl · 2024-08-22T18:53:19.326Z · LW(p) · GW(p)

There's a question of what we mean by specific words. The talk about cognitive biases comes out of behavioral economics, and in economics the rational actor is one who makes utility-maximizing choices. As such, a person with a high amount of rationality is, in economic terms, a utility maximizer.

Talking about "the exact amount of rationality" isn't that useful in that regard. 

If you instead ask what amount of using the scout mindset produces the best outcome for a human in the 2020s, you have a much more concrete question. You might additionally split that into the question of whether having the scout mindset internally is useful and whether displaying it in an externally visible way is useful.

In a heavily political environment, a cynical person who says all the bullshit that helps them get ahead, but internally knows it's bullshit, might be able to navigate better than the deluded believer. There's a reason why Venkatesh Rao calls the people who fully believe the bullshit "clueless" and puts them at the bottom of the hierarchy.

A stereotypical programmer who sees themselves as valuing rationality and the truth might speak up in a project meeting, focusing on what truly matters for the business, while being ignorant of important truths about the political effects of speaking up. If you model that situation as the programmer being "rational" and the people saying the political bullshit at the meeting being "irrational", you are going to have a poor understanding of the situation.

comment by bushsoul · 2024-08-21T00:44:27.040Z · LW(p) · GW(p)

"Reason is, and ought only to be, the slave of the passions" - David Hume