Counter-irrationality
post by Spectral_Dragon · 2012-02-09T20:04:26.597Z · LW · GW · Legacy · 16 comments
I don't think anyone here is perfect, though a lot of what's posted here is simply brilliant logic. As humans, we're automatically biased for or against something, at times for the wrong things and reasons. Having no bias at all isn't something we're equipped for, and to me it sounds quite dull. Some things we simply have to take on belief, since with our limited lifespans we could never learn everything.
Anyway, on to my actual point: how often do you realise you're biased for the wrong reasons, and that you're arguing more to win than to be right? What irrationality can you point out, and how do you fight it? This could apply to you or to those around you; it doesn't matter, since no one's perfect. But as usual, we should at least try to do better.
Right now I'm trying to figure out how best to deal with unreasonable demands. A theological debate eventually came down to "Well, if science can't do X, it is flawed, and we should accept that some questions are unanswerable", and I'm working out the most efficient counter-argument (I'm leaning towards "that's exactly why we should try to find out whether it really is impossible"). Anyway, share your thoughts on how to be efficiently rational. If you try to help me, I'd rather have questions than answers; I value the process of rationality more than straight-up answers, frustrating as that is.
Basically: Flaws you encounter and how to fight them efficiently, maybe [Bayesian Judo style](http://lesswrong.com/lw/i5/bayesian_judo/).
16 comments
comment by moridinamael · 2012-02-09T21:08:27.850Z · LW(p) · GW(p)
As to your first paragraph, one of the more liberating things about internalizing the Sequences and inculcating myself with Less Wrong memes is that I've come to hold very few opinions strongly. When you move yourself to a sufficiently morally-relativistic framework you stop identifying with your opinions. Instead of saying there are things you "just have to believe", I would say there are things that it is instrumentally rational to behave as if you believe.
Regarding your second paragraph: I find that I simply get into fewer arguments. Because I have let go of most of the opinions that typically weigh people down and make them respond at an emotional level to being contradicted, every conversation that includes a disagreement becomes a joint truth-seeking venture. Instead of arguments, I have "discussions." If the other party in the discussion is not interested in truth-seeking but is instead interested in being right, I just stop humoring them and change the subject. If they are someone I can be honest with, I will point out that they seem to have an irrational bias regarding the topic in question.
It seems like your first mistake was getting involved in a theological debate. Science is flawed, but "religion" doesn't even have enough predictive power to be falsified. I would step back at least one level and urge you to ask yourself what your objectives are in participating in such a confused discussion in the first place. I myself have indulged in totally stupid internet arguments, which I can only attribute to a sort of perverse pique that strikes me at times, but I generally admit that I'm already failing to be rational just by participating.
Replies from: orthonormal, Spectral_Dragon, Manfred
↑ comment by orthonormal · 2012-02-10T05:07:11.524Z · LW(p) · GW(p)
> When you move yourself to a sufficiently morally-relativistic framework you stop identifying with your opinions.
"Stop identifying with your opinions" is a classic Less Wrong idea, but moral relativism is not.
Replies from: faul_sname
↑ comment by faul_sname · 2012-02-11T00:06:22.678Z · LW(p) · GW(p)
Perhaps it's not an explicitly stated idea, but it's probably a fairly common one.
Replies from: orthonormal, Vladimir_Nesov
↑ comment by orthonormal · 2012-02-11T23:59:36.416Z · LW(p) · GW(p)
Sure, there are plenty of moral relativists here; but given that Eliezer's metaethics sequence explicitly contrasts itself with traditional moral relativism and that Luke's moral reductionism sequence holds out several levels of the objective-subjective distinction (to each of which moral relativism gives the subjective answer) as open questions (pending the rest of the sequence), I'd say that the Less Wrong consensus is "reject naive moral realism" rather than "embrace moral relativism".
↑ comment by Vladimir_Nesov · 2012-02-12T16:16:14.943Z · LW(p) · GW(p)
You should perhaps unpack what you mean by the label "moral relativism" at this point.
Replies from: faul_sname
↑ comment by faul_sname · 2012-02-12T20:07:39.924Z · LW(p) · GW(p)
Yeah, I definitely should. What I was trying to say was that most LWers think that morality is an aspect of minds, not an aspect of the outside world (no universal morality). I think I misunderstood the term, though; reading the Wikipedia entry this time, it seems that moral relativism rejects a common human morality. It appears I was using the word wrong.
Replies from: Vladimir_Nesov
↑ comment by Vladimir_Nesov · 2012-02-12T20:42:49.866Z · LW(p) · GW(p)
Then you're probably right about this being a standard position on LW, but you used wrong/misleading terminology. Rejection of universal morality might be a suitable description, though there are fine points it doesn't capture, such as morality being "subjectively objective": everyone has their own "personally-objective" morality they can't alter in any way, so there is a possibility of getting it wrong and value in figuring out what it is.
("Being an aspect of mind" also runs into problems, since there's no clear dividing line that makes things other than your own mind absolutely useless in figuring out what (your) morality is.)
↑ comment by Spectral_Dragon · 2012-02-09T23:10:03.408Z · LW(p) · GW(p)
That's really insightful. Lately I have been getting into a few more debates, religious and not, because of a decreased tolerance for flawed ideas. I got stuck in a "That which can be destroyed by the truth should be" loop, convinced I was right, in a discussion with my mother, actually.
I'm still trying to figure out how to become more truth-seeking, but it's hard since I'm not nearly rational enough. I wonder what the best way to act is if I don't want debates or discussions but still feel compelled to hint at my own opinions. For example, a friend thought the best way to make things better was to pray for me, which sparked a pretty heated argument, something I didn't want.
I'll just try to get my head out of my arse, but I still find it frustrating how obviously wrong people (including me) can be.
What sequence would you recommend if I repeatedly approach this from the wrong angle?
Replies from: Randaly, moridinamael
↑ comment by Randaly · 2012-02-10T00:42:22.541Z · LW(p) · GW(p)
Does an external link work instead? Because I found Paul Graham's essay Keep Your Identity Small to make the point a bit more succinctly.
Replies from: Spectral_Dragon
↑ comment by Spectral_Dragon · 2012-02-10T23:08:00.868Z · LW(p) · GW(p)
Definitely helpful, much appreciated!
↑ comment by moridinamael · 2012-02-10T00:38:10.604Z · LW(p) · GW(p)
The Reductionism Sequence has been the most important for me, in terms of how I would assess its impact on my mental processes. In particular, I think it helps you see what other people are doing wrong, so that you can respond to their errors in a non-confrontational manner. Spending a lot of time essentially meditating on the concepts underlying dissolving the question has really changed how I see things and how I deal with disagreements with other people.
I'm not claiming to be some paragon of perfect rationality here; I still lose my patience sometimes, but it's a process.
Replies from: Spectral_Dragon
↑ comment by Spectral_Dragon · 2012-02-10T23:12:07.490Z · LW(p) · GW(p)
An attempt to be more rational, then? Thanks, I think I need to reread that anyway, that and a few others. It'll require some work, sure, but few things in life are easy. It's a start anyway, cheers! I think I'll do a bit better in... a few weeks, once I've mulled it over.
comment by Shmi (shminux) · 2012-02-09T21:54:06.981Z · LW(p) · GW(p)
Ever since I started frequenting this forum, I am much more aware of when I am trying to win an argument rather than come up with the best possible model or solution.
That does not mean I immediately recite the appropriate litany and switch to "rational" mode, because sometimes winning with a flawed argument is the rational thing to do. If you doubt that, you have never bought a used car.
However, when finding out the truth (or a reasonable facsimile thereof) is the rational thing to do, I make an effort to distance myself from "ownership" of my original argument. This is one of the hardest things for me to do, and probably for most people. I probably fail much too often; I can certainly see others failing much too often, and I should not expect to be that much better.
Now, on the surface, your reply should be along the lines of "You're Entitled to Arguments, But Not (That Particular) Proof". In practice, though, people generally don't abandon a flawed line of reasoning; they resolve their cognitive dissonance in other ways.
The Bayesian Judo example worked mostly because EY is so smart and had thought through the issue countless times before, so the believer on the other side was unprepared and completely outmatched. Plus, the believer made a bad move and was not prepared to cut his losses by abandoning it right away (e.g. "I do not care what you call an AI; if it is not created by God, it's not an intelligence to me", which is nice and circular).
So, what should you do in a debate like that, provided that you are irrational enough to get sucked into one? First, you should probably recognize by now that the particular argument you mentioned is simply a feint, and so are all other arguments. Your opponent has no intention of changing his or her mind (and neither do you, I suspect), so both of you start with your respective bottom lines already written. How do you know when you are not in an honest argument? The moment you catch yourself thinking "My argument failed, I need to come up with another one", it is a sure sign that you are being irrational.
So, if you honestly want to come to an agreement with someone who does not subscribe to the LW idea of rationality, what do you do?
One much-popularized approach (and a negotiation tactic) is to "think like your enemy". If you are unable to accurately model your opponent, you will not be able to affect their thinking. EY once attributed his improbable success in getting a simulated UFAI out of the box to thinking like a UFAI would. How do you know that you have an adequate model? You will be able to come up with their arguments as fast as they do, and those arguments will make perfect sense within that model. This is very hard, because people generally suck at running virtual machines of other people's minds without letting the VM bleed through to the host and vice versa, corrupting one or both of them.
Assuming you are finally at a point where you could argue the other side of the debate as well as your opponent does, what's next? Now that you are intimately familiar with your surroundings, you have a better chance of noticing the imperfections and pointing them out, since you are not emotionally invested in keeping this particular VM running. Pick at the imperfections and see if you can fix them from inside the VM, because your opponent sure would try that first. Once you are sure that there is no adequate fix for a flaw, you can point it out (or, better yet, guide them to discovering it on their own) and let them stew on it... unless your opponent has thought it through and has a counter that you missed. Don't push your point, because that makes people defensive. Minds are not changed overnight.
Pushing this one step further, you should practice by constructing a VM of yourself first, so you can go through the same process dispassionately and make sure that your own boat is watertight. For example, the simulation argument has been known to convert the occasional atheist into an agnostic.
Well, this should be enough preaching for now. Hope some of it makes sense.
Replies from: Spectral_Dragon
↑ comment by Spectral_Dragon · 2012-02-09T23:21:15.007Z · LW(p) · GW(p)
Of course I'm not changing my mind, I'm right!
All sarcasm aside, thank you; this is, for me, a much-needed change in perspective. I'll keep it in mind next time I'm tempted into an argument rather than into being rational. I still don't like the idea of flawed arguments, but I guess I'll need to consider things further before I form a reply, for maximum benefit and rationality in the longer run. Your preaching is appreciated.
comment by billswift · 2012-02-09T22:06:43.990Z · LW(p) · GW(p)
I agree with moridinamael, but you also seem to be making a second mistake in your first paragraph:
> we're automatically biased for or against something, at times for the wrong things and reasons.
You appear to be conflating social/political biases (i.e., for or against something) with cognitive biases, which directly influence how we think about things. Cognitive biases, to the extent that they can be described as "for or against", would be "for or against" particular ways of thinking. There is some overlap, which is why I only said you "seem to be making a second mistake"; from the rest of your post it looks like you are, but a charitable reading says maybe not.
Your third paragraph describes what looks like a version of the "argument from ignorance" fallacy combined with a "default to God" (a version of the God of the Gaps). I don't think there really is a general counter-argument to this. George Smith, in one of his essays in Atheism, Ayn Rand, and Other Heresies, suggested avoiding arguments like this unless you are using them to try to convince third-party observers.