[LINK] Why taking ideas seriously is probably a bad thing to do

post by David_Gerard · 2013-01-05T23:37:55.909Z · LW · GW · Legacy · 39 comments

Yvain's blog: Epistemic learned helplessness.

A friend in business recently complained about his hiring pool, saying that he couldn't find people with the basic skill of believing arguments. That is, if you have a valid argument for something, then you should accept the conclusion. Even if the conclusion is unpopular, or inconvenient, or you don't like it. He told me a good portion of the point of CfAR was to either find or create people who would believe something after it had been proven to them.

And I nodded my head, because it sounded reasonable enough, and it wasn't until a few hours later that I thought about it again and went "Wait, no, that would be the worst idea ever."

I don't think I'm overselling myself too much to expect that I could argue circles around the average high school dropout. Like I mean that on almost any topic, given almost any position, I could totally demolish her and make her look like an idiot. Reduce her to some form of "Look, everything you say fits together and I can't explain why you're wrong, I just know you are!" Or, more plausibly, "Shut up I don't want to talk about this!"

39 comments

Comments sorted by top scores.

comment by Qiaochu_Yuan · 2013-01-05T23:49:01.904Z · LW(p) · GW(p)

My summary / take: believing arguments if you're below a certain level of rationality makes you susceptible to bad epistemic luck. Status quo bias inoculates you against this. This seems closely related to Reason as memetic immune disorder.

Replies from: None, private_messaging, torekp
comment by [deleted] · 2013-01-06T07:27:30.393Z · LW(p) · GW(p)

.

comment by private_messaging · 2013-01-06T20:01:45.593Z · LW(p) · GW(p)

Mistakenly believing that you're above that level of rationality is, then, really bad.

Replies from: Qiaochu_Yuan
comment by Qiaochu_Yuan · 2013-01-06T21:49:31.113Z · LW(p) · GW(p)

Yep. That is the primary reason I haven't yet signed up for cryonics: I can't tell if I want to because I actually think it's a good idea or because I just believe things that people I like say.

Replies from: David_Gerard
comment by David_Gerard · 2013-01-06T21:52:56.174Z · LW(p) · GW(p)

You and I and lots of people will sign up for cryonics when it's the normal thing to do in our social group.

comment by torekp · 2013-01-13T16:01:55.997Z · LW(p) · GW(p)

Bad epistemic luck, and also adverse selection. Once you become known as persuadable, you tend to attract some unsavory characters.

comment by someonewrongonthenet · 2013-01-08T08:29:16.025Z · LW(p) · GW(p)

"I can't explain why you are wrong. But, honestly, I can't fully explain why you are right either. Until I can, I'm going to go with the trusty old combination of tradition and gut instinct."

When I understand a valid argument, I do believe it. I don't have a choice in the matter. I'm not sure if anyone has a choice in the matter, although some people may be better at ignoring the inner voice that requires you to act in accordance with your beliefs.

But I think there is a big difference between understanding an argument and being unable to counter it. Accepting arguments which you are merely unable to counter opens you up to all kinds of manipulation.

I don't think I'm overselling myself to think that I could wield questions accurately enough to knock down false arguments given by someone who is significantly smarter than me, so long as they aren't allowed to fabricate evidence.

The key is that one must recognize when one has not yet understood something. I think most of the posts on lesswrong are devoted to honing this very skill, even if it is not explicitly mentioned. I'd go so far as to say that a rationalist's self-evaluation essentially comes down to how far she trusts herself to accurately assign certainty to statements.

I suppose I'm saying the same things as Yvain in different words... here is my phrasing, which I think is better: be careful about accepting beliefs. But please do take your beliefs seriously.

comment by Vaniver · 2013-01-06T02:56:32.595Z · LW(p) · GW(p)

So, I just spent a few hours today rereading Moldbug, and am amused by the relevance of these paragraphs (from here):

[W]e might say that whether they teach the truth or not, churches are just a bad idea, period. People should think for themselves. They should not have thoughts broadcast into a little antenna in the back of the skull. Therefore, the state should separate itself from the church, just because a good state should separate itself from all evil things.

But fortunately or unfortunately, there is no kingdom of philosophers. Most people do not think for themselves, should not think for themselves, and cannot be expected to think for themselves. They do exactly what they should be doing, and trust others to work out the large philosophical truths of the world for them. This trust may be well-placed or not, but surely this mechanism of delegation is an essential aspect of human society - at least with the humans we have now.

Replies from: someonewrongonthenet
comment by someonewrongonthenet · 2013-01-08T07:52:46.036Z · LW(p) · GW(p)

To derail slightly, this is a great point that I repeatedly try to emphasize to fervent anti-theists.

For most people, organized religion is the closest brush with philosophy that they will ever have. Currently, there is no other social institution that makes people ponder what it means to be good, or to seek truths beyond the practical matters of every day life. College education comes the closest, but not everyone gets that privilege.

I've got to say, though - with the disclaimer that this is the only post of his I've read - Moldbug's post is entertaining, but it has a pronounced pseudo-intellectual feel about it. Pretty writing, but he's tying a lot of separate concepts together into one big picture from very sparse premises.

Although, I suppose that is the state of most political commentary. It's all hollow all the way through, until you get into specifics and data.

comment by buybuydandavis · 2013-01-06T03:18:16.774Z · LW(p) · GW(p)

"'Cos when their eloquence escapes you
Their logic ties you up and rapes you"

It's a problem. Besides the fact that most arguments are rationalizations and not motivations for a position, you can see why people aren't convinced by arguments. There are smarty pants on both sides of every issue with arguments most can't refute. Argumentation just isn't a reliable means for them to come to the truth.

comment by Oligopsony · 2013-01-06T13:48:26.959Z · LW(p) · GW(p)

One thing I've noticed is that in nearly any controversy where the adherents of the heterodox position show signs of basic mental stability, the arguments for heterodoxy are stronger than the arguments for orthodoxy. In the rare cases where this is not true - for instance, creationism - I can take this as a strong indicator for orthodoxy (at least against the particular heresy in question). But how am I to take the general pattern? Should I be more skeptical of orthodoxy in general - of the likelihood of truth coming to orthodoxy given the standards of public truth evaluation which now prevail - or more trusting of it - given that heterodox positions appear to be stronger regardless of context, and are thus likely stronger for reasons other than their truth?

My rough conclusion is that I should either look for me-specific biases in this matter, or else regard orthodoxy with greater skepticism in matters I have not yet investigated and with greater trust in matters I have investigated than the strength of arguments would otherwise lead me to believe. But I haven't thought this through fully.

Replies from: satt, roystgnr, DanArmak, army1987
comment by satt · 2013-01-06T20:04:52.367Z · LW(p) · GW(p)

One thing I've noticed is that in nearly any controversy where the adherents of the heterodox position show signs of basic mental stability, the arguments for heterodoxy are stronger than the arguments for orthodoxy.

Is this true? A priori I could see this go either way, and my personal experiences don't add much evidence here (I can't recall many controversies where I've probed deeply enough to conclusively weigh orthodoxy against heterodoxy).

A weaker statement I'm more sure of: the arguments for orthodoxy one hears from most people are weaker than the arguments for heterodoxy, because most people have little reason to actually look up whatever factual basis the orthodoxy might have. (I've seen someone make this point somewhere on Yvain's blog but can't remember who.) For example, I haven't bothered to look up the precise scientific arguments that'd justify my belief in plate tectonics, but a shrinking earth theorist probably has, if only to launch a counterattack on them. (Corollary: I'd have a good chance of losing an argument with a shrinking earth theorist, even though plate tectonics is, well, true.)

Replies from: Eugine_Nier
comment by Eugine_Nier · 2013-01-07T20:59:59.207Z · LW(p) · GW(p)

Of course, this means the supporters of orthodoxy are in the worst position to judge when they should be updating their position based on new evidence.

comment by roystgnr · 2013-01-07T18:12:16.381Z · LW(p) · GW(p)

You'll want to read an earlier Yvain blog post, then, explaining "many reasons to expect arguments for socially dominant beliefs (which correlate highly with truth) to be worse than the arguments for fringe beliefs (which probably correlate highly with falsehood)".

Replies from: ewbrownv
comment by ewbrownv · 2013-01-08T23:16:23.559Z · LW(p) · GW(p)

Why would you expect the social dominance of a belief to correlate with truth? Except in the most trivial cases, society has no particular mechanism that selects for true beliefs in preference to false ones.

The Darwinian competition of memes selects strongly for those that provide psychological benefits, or are politically useful, or serve the self-interest of large segments of the population. But truth is only relevant if the opponents of a belief can easily and unambiguously disprove it, which is only possible in rare cases.

Replies from: Eugine_Nier
comment by Eugine_Nier · 2013-01-09T22:42:41.173Z · LW(p) · GW(p)

Or if the damage caused by acting on a bad model of reality is worse than the signaling benefit of the false belief.

comment by DanArmak · 2013-01-06T19:29:37.059Z · LW(p) · GW(p)

If the arguments for orthodoxy are stronger, then you dismiss contrarians entirely: they are obviously wrong! So do other people, so you don't get to hear about them to begin with. And so do most potential contrarians themselves.

So by selection effect, we mostly see contrarian arguments which at least appear to be better than the orthodoxy.

comment by A1987dM (army1987) · 2013-01-07T13:13:11.259Z · LW(p) · GW(p)

One thing I've noticed is that in nearly any controversy where the adherents of the heterodox position show signs of basic mental stability, the arguments for heterodoxy are stronger than the arguments for orthodoxy.

See this and this.

I think it's a version of Berkson's paradox: if a position is both heterodox and not supported by any strong arguments, it's very unlikely that people with “basic mental stability” will embrace it in the first place. See also: “The Majority Is Always Wrong” by EY.
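A toy simulation (just an illustrative sketch of the selection effect, with made-up numbers) makes the point: if argument strength and orthodoxy are generated independently, but weakly argued heresies never become visible at all, then the heterodox positions you actually encounter will look better argued on average.

```python
import random

random.seed(0)

def simulate(n=100_000, visibility_threshold=0.7):
    """Toy model: argument strength and orthodoxy are independent,
    but a heterodox position is only visible if its arguments are strong."""
    visible_orthodox, visible_heterodox = [], []
    for _ in range(n):
        strength = random.random()        # persuasiveness of the arguments
        orthodox = random.random() < 0.5  # whether the position is mainstream
        if orthodox:
            visible_orthodox.append(strength)      # orthodoxy is visible regardless
        elif strength > visibility_threshold:
            visible_heterodox.append(strength)     # weakly argued heresies never surface
    return (sum(visible_orthodox) / len(visible_orthodox),
            sum(visible_heterodox) / len(visible_heterodox))

ortho_mean, hetero_mean = simulate()
print(f"mean argument strength, visible orthodox positions:  {ortho_mean:.2f}")
print(f"mean argument strength, visible heterodox positions: {hetero_mean:.2f}")
# Conditioning on visibility makes heterodox positions look better argued,
# even though strength was independent of orthodoxy by construction.
```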

comment by David_Gerard · 2013-02-25T08:22:05.606Z · LW(p) · GW(p)

The real problem is the phrase "the skill of taking ideas seriously" - by which they do not mean "can deftly sling remarkable quantities of hypotheticals and work out what they would imply", but "being moved to action by the ideas."

The trouble is that there is a name for this in the normal world - it is the defect of "gullibility" or "being easily led".

If CFAR selects for people prone to this defect - it really, really isn't a "skill" - you will be actively selecting for people who will add 2+2+2 ... and get 666.

This may be a problem.

comment by duckduckMOO · 2013-01-08T01:46:26.640Z · LW(p) · GW(p)

The writer says "If you insist on telling me anyway, I will nod, say that your argument makes complete sense..." despite knowing perfectly well they can't tell if the argument makes sense or not.

If, even knowing specifically in this case that you can't tell if an argument is correct or not, you feel the need to announce that "your argument makes complete sense" your problem is that you believe things without understanding them. Fixing that bad habit might remove the need to not take arguments seriously.

Replies from: atorm
comment by atorm · 2013-01-10T05:29:40.331Z · LW(p) · GW(p)

"Your argument makes complete sense" may be a polite way of saying "I don't see any obvious holes and am not willing to look for the non-obvious ones. Please stop talking to me now."

comment by Risto_Saarelma · 2013-01-07T18:22:59.933Z · LW(p) · GW(p)

Why no one wants their brain cut into pieces and preserved chemically, squishy pieces in a jar style?

They don't? I do. Well, after I'm clinically dead, preferably. And it's an actual proposed alternative tech to cryonics. I don't think people who are betting on the pattern theory of identity being correct when they go for cryonics care much about how much the physical brain substrate gets sliced and diced during preservation, as long as the information about its structure remains reconstructible.

There are a bunch of cryonicists who are adamantly opposed to anything that messes with the biological brain staying as a single intact body though.

comment by gwern · 2013-01-07T16:27:02.544Z · LW(p) · GW(p)

It'd be a normal thing if water didn't crystallize even at very high cooling rates. World not being convenient, you can cut brain into pieces and store in fixatives like good ol formaldehyde, or you can freeze it whole with parts vitrifying after being damaged by solvents and parts getting shredded into pieces by ice and everything cracking apart.

Can you name any 'normal' thing at all where people invest a good sum of money for their personal benefit based on highly uncertain projections of continued scientific progress because the expected value seems good?

Because otherwise, I don't think the more convenient world in which water doesn't crystallize would look very different...

comment by TimS · 2013-01-06T00:18:58.897Z · LW(p) · GW(p)

From the article:

If I know that a false argument sounds just as convincing as a true argument, argument convincingness provides no evidence either way, and I should ignore it and stick with my prior.

That's true, but it's just a restatement of your ignorance of a topic. When one is sufficiently ignorant of a topic, one isn't capable of evaluating the arguments.

But Yvain suggests that continued education left him unable to differentiate the quality of arguments. How much of that was because he was reading only nonsense? Reading competing Timecube-quality arguments on a particular topic doesn't add to one's understanding - but so what? That doesn't imply that learning how to recognize good arguments is a strange quality - one can still aspire to become better at it, and reasonably expect to achieve that goal.

In short, unwillingness to take ideas seriously sounds like a terrible idea. Unwillingness to take bad ideas seriously is worthwhile, but skipping over the mechanisms for filtering good ideas from bad leaves me confused about the point of the post.

Replies from: Qiaochu_Yuan
comment by Qiaochu_Yuan · 2013-01-06T00:55:43.281Z · LW(p) · GW(p)

skipping over the mechanisms for filtering good ideas from bad leaves me confused about the point of the post.

The point of the post is that most people, in most domains, should not trust that they are good at filtering good ideas from bad.

Replies from: Academian, private_messaging
comment by Academian · 2013-01-07T18:46:50.428Z · LW(p) · GW(p)

And the point of CFAR is to help people become better at filtering good ideas from bad. It is plainly not to produce people who automatically believe the best verbal argument anyone presents to them without regard for what filters that argument has been through, or what incentives the Skilled Arguer might have to utter the Very Convincing Argument for X instead of the Very Very Convincing Argument for Y. And certainly not to have people ignore their instincts; e.g. CFAR constantly recommends Thinking Fast and Slow by Kahneman, and teaches exercises to extract more information from emotional and physical senses.

comment by private_messaging · 2013-01-09T01:15:53.619Z · LW(p) · GW(p)

The point of the post is that most people, in most domains, should not trust that they are good at filtering good ideas from bad.

Or good courses from bad courses. People should rely more on empirical evidence; that is to say, they need more empiricism and less rationalism. E.g. here, in the rationalist community (quoting verbatim from the article linked on the About page): "Epistemic rationality is about forming true beliefs, about getting the map in your head to accurately reflect the territory of the world. We can measure epistemic rationality by comparing the rules of logic and probability theory to the way that a person actually updates their beliefs.", whereas just about anyone else would measure that kind of thing by predicting something hidden and then checking for correctness, which is more empiricist than rationalist.

comment by brilee · 2013-01-06T00:15:47.253Z · LW(p) · GW(p)

Excellent post by Yvain... your excerpt really doesn't do it justice.

comment by Academian · 2013-01-06T15:44:56.263Z · LW(p) · GW(p)

testing this symbol: ∃

Replies from: Kawoomba
comment by Kawoomba · 2013-01-06T15:50:16.013Z · LW(p) · GW(p)

There's a sandbox you can use for such things: below the box in which you write a new comment, click "Show help", and there's a link taking you there on the bottom right.

comment by gwern · 2013-01-07T18:14:22.438Z · LW(p) · GW(p)

I correct my assertion; some damage, such as loss of consciousness, may begin within a minute or two, but the more extreme, permanent levels of damage take a bit longer:

In severe cases it is extremely important to act quickly. Brain cells are very sensitive to reduced oxygen levels. Once deprived of oxygen they will begin to die off within five minutes.

-- http://en.wikipedia.org/wiki/Cerebral_hypoxia

The longest human survival without breathing is 80 minutes.

If you're referring to Anna Bågenholm, you're wrong; she survived in an air pocket and did not freeze but was hypothermic. Hypothermic techniques are already used in medicine, with no visible uptick in cryonics support.

How badly would the brain have to be shredded at microscale until cryonicists wouldn't sign up?

I don't think anyone bothers past a day or so post-death, by which point decay processes have set in.

Why no one would cut brain into pieces and preserve it chemically, squishy pieces in a jar style?

Why would you do that? We don't know where the exact crossing line is, so every additional level of degradation and poor preservation increases the chance of failure.

If you mean chemopreservation or plastination, the answer is, I think, historical convenience: fast freezing and then vitrification were developed long before fast versions of either of the former. Existing techniques of chemopreservation or plastination still don't scale to an entire brain the way cooling can; although Darwin's been working on a proposal for plastination+cryonics, and the Brain Preservation Prize should be getting evidence allowing direct comparison, so 'brain in a jar' methods may yet work out. (Cold comfort for anyone who already has died or will soon die, however.)

comment by gwern · 2013-01-07T17:45:58.765Z · LW(p) · GW(p)

Not what I asked, and I suspect that it's not very likely because you would still have problems reheating everything and avoiding anoxia - if the brain dies in a minute or two without any oxygen, icing a living person sounds quite dicey.

More importantly: that point is basically 'well if cryonics already worked, then maybe it'd be more popular'. Yes, one would rather hope so! But that would be a very convenient world indeed, and tells us nothing.

comment by gwern · 2013-01-07T18:32:55.239Z · LW(p) · GW(p)

I'm not sure what insane tangent you are running off in.

And you were doing so well up to that point. I guess a leopard can't change its spots. One final minor bit: I think ALCOR actually switched from freezing to vitrification in 2001, not 2000.

comment by rasputin · 2013-01-05T23:56:17.763Z · LW(p) · GW(p)

That's a strawman argument. You're implying your idea is the only one she'd ever hear. If that same high school dropout (not sure why she has to be a girl you feminist) were to hear more ideas on the subject, with her opinion conforming with the winning argument, eventually she'd land on the correct one. I'm sure there are a few accepted incorrect scientific theories running around at the moment, but we have to accept them while they're the best we have in order to come to the next logical conclusion.

Replies from: rasputin
comment by rasputin · 2013-01-06T00:23:32.591Z · LW(p) · GW(p)

How did this merit negative points?

Replies from: benelliott, Jayson_Virissimo, TimS
comment by benelliott · 2013-01-06T00:29:38.988Z · LW(p) · GW(p)

That's not the whole argument, there's a lot more if you actually follow the link. Hence you were bashing a strawman (albeit understandably so).

comment by Jayson_Virissimo · 2013-01-06T00:28:54.043Z · LW(p) · GW(p)

Try using the principle of charity next time.

comment by TimS · 2013-01-06T00:43:39.063Z · LW(p) · GW(p)

Also, feminism is a boo light with some round these parts - particularly with a weak accusation like the one you made.