[SEQ RERUN] Your Rationality is My Business

post by badger · 2011-06-01T14:04:57.022Z · LW · GW · Legacy · 7 comments

Today's post, Your Rationality is My Business, was originally published on April 15, 2007. A summary (from the LW wiki):

As humans, we have an interest in the future of human civilization, including the human pursuit of truth. That makes your rationality my business. However, calling out others as wrong can be a dangerous action. Some turn to relativism to avoid it, but this is too extreme. Disagreements should be met with experiments and arguments, not ignored or met with violence and edicts.


Discuss the post here (rather than in the comments of the original post).

This post is part of a series rerunning Eliezer Yudkowsky's old posts so those interested can (re-)read and discuss them. The previous post was New Improved Lottery, and you can use the sequence_reruns tag or rss feed to follow the rest of the series.

Sequence reruns are a community-driven effort. You can participate by re-reading the sequence post, discussing it, posting the next day's sequence reruns post, summarizing forthcoming articles on the wiki, or creating exercises. Go here for more details, or to discuss the Sequence Reruns.

7 comments

Comments sorted by top scores.

comment by XiXiDu · 2011-06-02T11:28:05.330Z · LW(p) · GW(p)

Playing the lottery has nothing to do with truth. If someone is biased, you can call them objectively wrong. A bias is an epistemic confusion. But if one does recognize the cognitive factors leading to certain decisions, and chooses to embrace them as valuable, one is not biased.

I believe that it is right and proper for me, as a human being, to have an interest in the future, and what human civilization becomes in the future.

It's not right or wrong. Your interests are not subject to epistemic criticism. If you care more strongly to play the lottery than to save human beings, that's neither right nor wrong, as long as you are not confused about what you are doing and its consequences.

If someone else does not care about the future of humanity, while you do, you might call their values instrumentally wrong in relation to your own goals, or even those of most humans. But if you do not mention the context in which someone's values are wrong, you engage in malicious persuasion by signaling that their goals are wrong in an absolute, epistemic sense.

If they were really selfish, they would get on with making money instead of arguing passionately with others.

That sentence is completely confused. If selfishness meant not caring to influence other people in any way, how could one earn money, and how would one spend it? Selfishness means caring more strongly about one's own well-being and goals than about those of others.

If it could be shown that caring about others is instrumental in reaching selfish goals, then even the rational selfish would engage in altruism.

If one's goal is to correct other people, arguing is no less a selfish activity than earning money is for someone who cares about money. And even if one doesn't care about correcting other people in and of itself, doing so can be instrumental.

comment by AlphaOmega · 2011-06-02T05:24:03.008Z · LW(p) · GW(p)

To summarize: people should use rationality to decide arguments instead of a) killing each other or b) forbidding all judgment about who's right or wrong.

Just for the sake of argument: a) what is rationality? and b) what is so sacred about it that it should be the arbiter of all truth? I.e. why exactly do you worship the god of reason?

Replies from: beriukay
comment by beriukay · 2011-06-02T12:06:14.510Z · LW(p) · GW(p)

I think the point of the post is a bit more meta than "people should use rationality [...]". More like: "I am allowed to think that there is a right way for people to think".

I like your first question. Not having had much of a chance to taboo my words, and not wanting to get lost in a maze of words, I would describe it thus:

It is noticing a challenge. Then poking and prodding the challenge with tools until you have some ideas for how it works. Then using the ideas to poke and prod it in ways that will eliminate the wrong ideas. What remains should be ideas that you aren't sure about. You can compare those ideas with ideas you already held before the challenge, and the ones that disagree need to be tested some more. With enough effort, eventually you will have gotten rid of all the ideas that obviously don't pass your tests, and you will have a collection of ideas that you have tested that don't disagree with one another. Or ones that are clear on how they disagree, since there will always be open questions.

In the spirit of The 5-Second Level, I will name three concrete examples:

-Noticing my computer fails to wake up properly from hibernation, I note that I recently replaced my video cards, so I update video card drivers. That doesn't fix the problem, so I google it with words that I think will turn up the best results and find that some people had this problem after flashing the firmware on the video card, but that doing an RMA fixed their problem. I begin the process of RMAing, but then notice that the broader class of problems is associated with driver issues. I find a driver updater program that seems trustworthy, and find that I have 2 out-of-date drivers associated with a couple of my hard drives. I update them, and find that the computer no longer has the problem I complained about. I stop looking.

-There is a loud, distracting, unknown, short-lived beep at work, that promises to bother me all shift. I start a timer when a beep goes off and stop it when it repeats, and find that it goes off exactly every 5 minutes. I set my alarm to go off in 4 minutes, and go stand near where my best guess is at the proper time. The beep goes off, and I revise my best guess for beep source as a computer that I can't log into. I turn off the computer. There is another beep about 5 minutes later, so I set the timer, and guess again where it is. Eventually I find out there is a phone in a box with a new voicemail message.

-I'm arguing with a friend about science being the only way to know the world. He keeps drawing examples revolving around the fallibility of books and naive appeals to authority. I keep drawing examples revolving around being right and being able to show why you are right. Eventually I stumbled upon this realization and dispelled the source of our disagreement by conceding that books aren't the only way to learn (and that they aren't the best way in many cases), and that an illiterate welder who can point out where an engineer's blueprints are structurally unsound is using science, not proving that science is inferior to experience.

As you can tell, I see it as a process of weeding out the crap. If you can think of a better way to get rid of crap, I'm all ears. There's nothing sacred about the process. I'm not sure about calling it 'the arbiter of truth'. It seems as strange to me as if someone asked "Why is f(x) the arbiter for the value of sin(x)?" There are no gods here. No sentient beings that decide that the answer is right. In fact, I should stress that rationality does not guarantee that you end up at the right place, even if you did everything correctly.

Replies from: AlphaOmega
comment by AlphaOmega · 2011-06-02T19:17:16.238Z · LW(p) · GW(p)

So rationality is desirable because it gets rid of crap? What if crap makes me happy? What if my entire culture is based on crap?

Is there a paper here that addresses this meta-question of "why be rational?" I can think of many reasons, but mostly it seems to come down to this: rationality confers power. Bertrand Russell called Western thought "power thought," which seems pretty accurate. Rationality is good because it wins, and eliminates the competition. I haven't had any conversations with any pre-rational Stone Agers lately, though they were once common in my neighborhood. Did they lose a philosophical debate with rationalists, or were they simply exterminated?

So it seems to me that the lasting appeal of irrationality, spirituality, religion, etc. is that, for some strange reason, people aren't quite comfortable worshipping this Terminator-like god of reason.

EDIT: I take it from the response that people here don't want to discuss this meta-question? Is rationality perhaps a sacred cow which plays a role similar to God in other faiths?

Replies from: ArisKatsaris, beriukay, Richard_Kennaway
comment by ArisKatsaris · 2011-06-03T08:39:08.578Z · LW(p) · GW(p)

I take it from the response that people here don't want to discuss this meta-question?

I certainly wouldn't mind discussing it, just not with someone who is behaving like a rude jerk and uses trollish attempts to annoy people into discussing it.

"Terminator-like god of reason"? Seriously? And in every past post of yours you seem to be attributing characteristics to people that you ought to know they don't have: "Oh, you ought to support banning birth control, then"; "Oh, you are like genocidal criminals, then"; "Oh, you're like megalomaniacal villains, then".

So, no, no "sacred cows" here, not for me at least. Just a lack of desire on my part to engage in conversation with someone as unpleasant as you currently are.

comment by beriukay · 2011-06-03T06:07:53.382Z · LW(p) · GW(p)

What if crap makes me happy? What if my entire culture is based on crap?

Then I guess you have a decision to make: do you want to be happy, or do you want all the things you care about to have the best chance possible of working out how you want them to? Personally, having a computer not work, or having intermittent beeping for eight hours, or pointlessly arguing DON'T make me happier, and if I can fix such problems I will. But if you want to stew in the unexamined life, allowing other people or situations to control you because you feel like thinking makes you unhappy, then I can only hope that you make as little of an impact on the world as possible before you go happily into the grave.

comment by Richard_Kennaway · 2011-06-03T13:31:43.272Z · LW(p) · GW(p)

Is there a paper here that addresses this meta-question of "why be rational?"

Yes.

Short version: What do you want? That is your reason to be rational.

I can think of many reasons, but mostly it seems to come down to this: rationality confers power. Bertrand Russell called Western thought "power thought," which seems pretty accurate.

That seems accurate to you, because power is what you want. You have said this explicitly yourself: "Well I just want to rule the world." Because power is what you want, you assume that it is what everyone else wants. So when you read that rationality "wins", you interpret winning as "defeating other people". That is only "winning", in the sense of the slogan "rationality wins", if what you want is to subjugate or exterminate "the competition". You see everyone as "the competition", and your solution is to take over the world. Bertrand Russell gives you delightful cold prickles (I'm sure you have no use for warm fuzzies) because you hear in him something you want to hear: everyone is everyone's enemy. (BTW, here's some context for that phrase of his. Plato's totalitarian dream, 1931 edition.)

So it seems to me that the lasting appeal of irrationality, spirituality, religion, etc. is that, for some strange reason, people aren't quite comfortable worshipping this Terminator-like god of reason.

You are the would-be Terminator God, the self-styled AlphaOmega.

How's that going? What do you do when you're not reading LessWrong and being pissed on? Not that LW karma means anything in the greater scheme of things, but you keep coming back for more. It is said that he who would be Pope must think of nothing else. How much more so for he who would rule the world!