Epistemic Luck

post by Alicorn · 2010-02-08T00:02:50.262Z · LW · GW · Legacy · 133 comments

Who we learn from and with can profoundly influence our beliefs. There's no obvious way to compensate.  Is it time to panic?

During one of my epistemology classes, my professor admitted (I can't recall the context) that his opinions on the topic would probably be different had he attended a different graduate school.

What a peculiar thing for an epistemologist to admit!

Of course, on the one hand, he's almost certainly right.  Schools have their cultures, their traditional views, their favorite literature providers, their set of available teachers.  These have a decided enough effect that I've heard "X was a student of Y" used to mean "X holds views basically like Y's".  And everybody knows this.  And people still show a distinct trend of agreeing with their teachers' views, even the most controversial - not an unbroken trend, but still an obvious one.  So it's not at all unlikely that, yes, had the professor gone to a different graduate school, he'd believe something else about his subject, and he's not making a mistake in so acknowledging...

But on the other hand... but... but...

But how can he say that, and look so undubiously at the views he picked up this way?  Surely the truth about knowledge and justification isn't correlated with which school you went to - even a little bit!  Surely he knows that!

And he does - and so do I, and it doesn't stop it from happening.  I even identified a quale associated with the inexorable slide towards a consensus position, which made for some interesting introspection, but averted no change of mind.  Because what are you supposed to do - resolutely hold to whatever intuitions you walked in with, never mind the coaxing and arguing and ever-so-reasonable persuasions of the environment in which you are steeped?  That won't do, and not only because it obviates the education.  The truth isn't anticorrelated with the school you go to, either!

Even if everyone collectively attempted this stubbornness only to the exact degree needed to remove the statistical connection between teachers' views and their students', it's still not truth-tracking.  An analogy: suppose you give a standardized English language test, determine that Hispanics are doing disproportionately well on it, figure out that this is because many speak Romance languages and do well with Latinate words, and deflate Hispanic scores to even out the demographics of the test results.  This might give you a racially balanced outcome, but on an individual level, it will unfairly hurt some monolingual Anglophone Hispanics, and help some Francophone test-takers - it will not do as much as you'd hope to improve the skill-tracking ability of the test.  Similarly, flattening the impact of teaching on student views won't salvage truth-tracking of student views as though this trend never existed; it'll just yield the same high-level statistics you'd get if that bias weren't operating.
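(A minimal simulation of the analogy, with entirely made-up numbers, illustrates the point: subtracting the group-level gap evens out the demographics, but the adjusted score tracks individual skill essentially no better than the raw one, because the language bonus is still sitting in there as noise.)

```python
import random

# Toy version of the test analogy; every number here is made up for illustration.
random.seed(0)

def test_taker(group):
    skill = random.gauss(0, 1)
    p_romance = {"hispanic": 0.8, "anglo": 0.0, "francophone": 1.0}[group]
    bonus = 0.5 if random.random() < p_romance else 0.0   # Latinate-vocabulary boost
    return {"group": group, "skill": skill, "score": skill + bonus}

takers = [test_taker(g) for g in ("hispanic", "anglo", "francophone") for _ in range(10000)]

# Deflate every Hispanic score by the Hispanic-vs-Anglo gap in group means.
mean = lambda xs: sum(xs) / len(xs)
gap = (mean([t["score"] for t in takers if t["group"] == "hispanic"])
       - mean([t["score"] for t in takers if t["group"] == "anglo"]))
for t in takers:
    t["adjusted"] = t["score"] - (gap if t["group"] == "hispanic" else 0.0)

def corr(xs, ys):
    mx, my = mean(xs), mean(ys)
    cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
    sx = mean([(x - mx) ** 2 for x in xs]) ** 0.5
    sy = mean([(y - my) ** 2 for y in ys]) ** 0.5
    return cov / (sx * sy)

skill = [t["skill"] for t in takers]
print(corr(skill, [t["score"] for t in takers]))     # how well the raw score tracks skill
print(corr(skill, [t["adjusted"] for t in takers]))  # barely different after the adjustment
```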

Lots of biases still live in your head doing their thing even when you know about them.  This one, though, puts you in an awfully weird epistemic situation.  It's almost like the opposite of belief in belief - disbelief in belief.  "This is true, but my situation made me more prone than I should have been to believe it and my belief is therefore suspect.  But dang, that argument my teacher explained to me sure was sound-looking!  I must just be lucky - those poor saps with other teachers have it wrong!  But of course I would think that..."

It is possible, to an extent, to reduce the risk here - you can surround yourself with cognitively diverse peers and teachers, even if only in unofficial capacities.  But even then, whom you spend the most time with, whom you get along with best, whose style of thought "clicks" most with yours, and - due to competing biases - whoever agrees with you already will have more of an effect than the others.  In practice, you can't sit yourself in a controlled environment and expose yourself to pure and perfect argument and evidence (without allowing accidental leanings to creep in via the order in which you read it, either).

I'm not even sure if it's right to assign a higher confidence to beliefs that you happen to have maintained - absent special effort - in contravention of the general agreement.  It seems to me that people have trains of thought that just seem more natural to them than others.  (Was I the only one disconcerted by Eliezer announcing high confidence in Bayesianism in the same post as a statement that he was probably "born that way"?)  This isn't even a highly reliable way for you to learn things about yourself, let alone the rest of the world: unless there's a special reason your intuitions - and not those of people who think differently - should be truth-tracking, these beliefs are likely to represent where your brain just happens to clamp down really hard on something and resist group pressure and that inexorable slide.

133 comments

Comments sorted by top scores.

comment by Alex Flint (alexflint) · 2010-02-09T12:27:14.933Z · LW(p) · GW(p)

Try this: Choose a book that you expect to disagree with and read it from start to finish over several weeks. See what impact it has on you. I tried this and felt my beliefs changing despite none of the arguments being convincing. It seemed to peter out a few weeks after I finished the book. I hypothesize that in an extended experiment we could actually brainwash ourselves to the point of holding some radically different views.

Replies from: mattnewport, Normal_Anomaly, Viliam_Bur
comment by mattnewport · 2010-02-09T17:10:48.561Z · LW(p) · GW(p)

The psychology research I'm aware of suggests the opposite effect if anything - reading opposing views tends to make you more sure of your original views. This is a feature of confirmation bias.

Replies from: Sticky
comment by Sticky · 2010-02-12T22:37:26.628Z · LW(p) · GW(p)

The study described in the link only exposed the subject to a single article. The effect might be different for different amounts of exposure.

In my own experience this seems to be the case. When I briefly read politically opposing blogs, I find them so obviously stupid that I'm amazed anyone could take the other side seriously. But when I spend a long while doing it, I find my views moderating and sometimes even crossing over, despite not being convinced by any of their actual arguments, and I begin to be embarrassed by figures I normally admire, even though most of what I find directed against them are mere pejoratives. Then afterward the effect wears off. I could be unusually easily led, but I've heard of enough other similar experiences that I doubt it.

comment by Normal_Anomaly · 2011-07-01T19:30:43.135Z · LW(p) · GW(p)

This is actually quite useful, and it's how I got to the political views I have today. I started out very liberal, having been raised by liberals. I realized that this was a problem for the reasons in the OP, and also that I was biased by my need to identify with the group. To break myself out of these thought patterns, I read The Fountainhead with as open and accepting a mind as possible. It didn't contain much in the way of rational argument, but it shook up my belief system enough that I no longer feel the same tribal attachment to any political party. This in turn let me form an opinion on each issue based on evidence, and I now hold some opinions that would be called "liberal" and others that would be called "conservative".

Replies from: alexflint
comment by Alex Flint (alexflint) · 2011-07-01T23:25:21.112Z · LW(p) · GW(p)

Oddly enough, the book that prompted my post was Atlas Shrugged :)

Replies from: CBHacking
comment by CBHacking · 2014-11-02T13:07:41.047Z · LW(p) · GW(p)

Can't say I'm surprised, since I was about to mention my own reaction to Atlas Shrugged. I have continuously, near-obsessively dissected the book (both in terms of the rationality of its arguments and the quality of writing it contains)... and I still find my views changing, or at least my initial reactions changing, the more of it I read. It's a very odd experience. I have no idea what would have happened if I'd started reading it younger (I'm 28) and less aware of the way that politics and business proceed in the real world.

With that said, I think the effect is a net positive. I now see more of the stuff that Objectivists object to in the everyday world - it grabs my attention much more to hear somebody say something like "well, he really needed the job" - but it doesn't seem to have interfered with my ability to analyze the situation (for example, when multiple candidates are sufficiently qualified and no other differences are significant, it is the most productive thing to give a job to the qualified person who needs it most). Picking apart the places where Rand is wrong, or at least fails to make a convincing argument, has both equipped me to argue against those viewpoints when expressed by others, and has heightened my ability to see the places where she's right.

Bringing this back on topic, though, I'm not sure how parallel the scenarios (reading a book by choice but with the conscious intention of exploring the author's ideas and biases vs. picking up biases by accident from a teacher) really are. Part of that may be that I do not automatically associate a book with the personhood of the author, the way I associate a class with the personhood of the teacher (indeed, I have to constantly consciously remind myself of Rand's own experiences to even begin to comprehend some of her arguments; I have never had to similarly remind myself more than once when dealing with a person in the flesh). I certainly internalize lessons and viewpoints much more in person than I do from a text.

Relatedly, I need to get myself to some presentations and/or workshops on rationality, as I'm new and still find many of the concepts that I am trying to learn are... slippery, in a way that things I learned from a "real person" almost never are. Of course, the fact that I'm trying to become more rational, while I am in no way trying to become Objectivist, may make a big difference. Too many axes for the data that I have, I think, though further analysis may show otherwise.

comment by Viliam_Bur · 2011-10-07T10:41:39.344Z · LW(p) · GW(p)

I hypothesize that in an extended experiment we could actually brainwash ourselves to the point of holding some radically different views.

This is the first of Lifton's eight criteria for thought reform -- one is systematically exposed to only one side of the evidence, and isolated from the other.

To make this brainwashing more efficient, there are additional techniques:

Live among people who will make you feel ashamed for not believing in X. These people should love you as a person, but hate any non-X-ness. Expose your doubts about X in front of the group -- it will help them understand and modify your mind processes.

Develop a group-specific jargon -- if you make your pro-X arguments using words that an outsider does not understand, then the outsider cannot refute these arguments.

If you ever experience or remember something that disagrees with X, be a good Bayesian and remember that with probability at least epsilon, your experience or memory is wrong. On the other hand, assign prior probability 1 to X. This way, whatever evidence you have, it is perfectly rational to believe in X.
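(To make the last point concrete, here is a one-function Bayes calculation with made-up likelihoods: once the prior on X is exactly 1, the posterior is 1 no matter how strongly the evidence points the other way.)

```python
def posterior(prior_x, p_evidence_given_x, p_evidence_given_not_x):
    """Bayes' rule: P(X | E) = P(E | X) P(X) / P(E)."""
    p_evidence = p_evidence_given_x * prior_x + p_evidence_given_not_x * (1 - prior_x)
    return p_evidence_given_x * prior_x / p_evidence

# Evidence that is 100x more likely if X is false than if X is true.
print(posterior(0.9, 0.001, 0.1))  # ~0.08 -- a merely confident prior gets crushed
print(posterior(1.0, 0.001, 0.1))  # 1.0  -- a prior of 1 never moves
```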

comment by CronoDAS · 2010-02-08T09:53:55.767Z · LW(p) · GW(p)

"Isn't it lucky that we happened to be born into the one true faith?"

In my case, I got a lot of my atheism from my father; I haven't had the kind of "crisis of faith" that many other aspiring rationalists said they had. (My mom is a non-practicing Catholic.) I was practically born into this culture, so I get worried about this on occasion.

Replies from: jhuffman, roland
comment by jhuffman · 2010-02-08T22:36:00.986Z · LW(p) · GW(p)

It would be interesting to know how many people in the US who are raised as atheists adopt a practicing faith in their adolescence or adulthood that they maintain for at least ten years; versus people raised as practicing theists who become atheists and remain so for at least ten years.

comment by roland · 2010-02-09T19:54:40.471Z · LW(p) · GW(p)

I was practically born into this culture, so I get worried about this on occasion.

Upvote for this. I get really annoyed by atheists that keep criticizing religious people without being aware of their own irrationality.

Replies from: AllanCrossman, ciphergoth, Kaj_Sotala, AndyWood, MrHen
comment by AllanCrossman · 2010-02-09T23:39:13.045Z · LW(p) · GW(p)

Do you also get annoyed by people who don't believe in ghosts who criticize people who do without being aware of their own irrationality?

Replies from: roland
comment by roland · 2010-02-10T00:08:25.318Z · LW(p) · GW(p)

No, because I don't read/hear from these people; I've never met an aghostist.

Replies from: mattnewport
comment by mattnewport · 2010-02-10T00:09:55.343Z · LW(p) · GW(p)

You've never met someone who doesn't believe in ghosts?

Replies from: roland
comment by roland · 2010-02-10T00:58:07.396Z · LW(p) · GW(p)

I should have clarified better. I usually don't meet people who make a big fuss about being aghostists and ridiculing ghostists and how irrational it is to be a ghostist and then enumerate all the pedophiles that are ghostists and how much money is stolen by ghostists and that ghostists fly planes into buildings and ghostists are the ones who are responsible for all kind of violence and human suffering and etc... etc... etc...

EDIT: Consider this slogan: "Science flies you to the moon. Religion flies you into buildings." It was suggested for use in the bus campaign. There is just so much wrong with this; I hope I don't have to explain what, and that you can figure it out by yourself.

Replies from: ciphergoth, AllanCrossman
comment by Paul Crowley (ciphergoth) · 2010-02-10T15:54:49.646Z · LW(p) · GW(p)

I prefer a slight variant: "Maybe science can make jet aeroplanes or tall buildings, but it takes religion to bring these things together."

Replies from: MrHen, roland
comment by MrHen · 2010-02-10T22:44:36.549Z · LW(p) · GW(p)

Oh! Haha, I finally got it. :P

comment by roland · 2010-02-10T22:04:35.484Z · LW(p) · GW(p)

You see, there is nothing in religion about bringing those things together; sure, religious people can do that, but so can atheists.

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2010-02-10T22:37:34.973Z · LW(p) · GW(p)

If you're interested in this question, I recommend Sam Harris's The End of Faith.

comment by AllanCrossman · 2010-02-10T15:40:27.321Z · LW(p) · GW(p)

Nobody bothers to make a fuss about ghostists because ghostism isn't particularly important.

Replies from: MrHen
comment by MrHen · 2010-02-10T16:23:54.103Z · LW(p) · GW(p)

I agree, but this comment is vapid unless you offer a reason why ghostism isn't particularly important.

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2010-02-10T16:42:14.549Z · LW(p) · GW(p)

No, that isn't so; it suffices that as a matter of fact ghostism doesn't wield very much political or other thought-shaping power in the countries in which we live.

Of course if we lived in countries where people get executed for being witches, we might have different priorities.

Replies from: MrHen
comment by MrHen · 2010-02-10T17:23:59.504Z · LW(p) · GW(p)

Details are tasty and good. A comment like Allan's ends conversations and there is nothing more to learn afterward.

A comment like yours can lead into useful conversations about the specific differences between ghostism and theism and the wonderful followup question: Is there something other than theism that qualifies as important but people don't make a big fuss over?

To ask that question we need to know the details about what is important.

Of course, if no one wants to ask questions, that is fair enough. But I consider those comments/discussions vapid.

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2010-02-10T17:41:43.437Z · LW(p) · GW(p)

Hang on, there's a huge gap between "vapid" and "doesn't spark the particular discussion I'm interested in having". I think the things you raise would indeed be interesting to discuss, but AllanCrossman's comment is a specific and sufficient answer to the specific question that roland asks - "what justifies the decision to put more work into attacking theism than ghostism".

It's sufficient because no-one disputes the factual accuracy of the answer.

Replies from: MrHen
comment by MrHen · 2010-02-10T17:59:40.414Z · LW(p) · GW(p)

Hmm... when I looked up vapid in my dictionary I got this:

offering nothing that is stimulating or challenging

Looking at other dictionaries it seems like it can also mean lacking life or tedious. I was going for more the former use than the latter use.

So I agree with you. Replace "vapid" with "boring" and you'll have more of what I was aiming for. "Boring" was too weak so I amped it up with "vapid," but apparently that was too strong. But whatever. It's not important.

Allan, no insult was meant. Your comment is fine.

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2010-02-10T19:10:59.471Z · LW(p) · GW(p)

I definitely would like to discourage you from castigating commentators for failing to entertain you. It's enough that their comments advance the argument, surely?

Replies from: MrHen
comment by MrHen · 2010-02-10T19:18:40.078Z · LW(p) · GW(p)

Fair enough.

comment by Paul Crowley (ciphergoth) · 2010-02-10T15:57:08.298Z · LW(p) · GW(p)

This is just the "tu quoque" fallacy. Read also No One Can Exempt You From Rationality's Laws.

comment by Kaj_Sotala · 2010-02-09T20:20:09.926Z · LW(p) · GW(p)

This got pretty heavily downvoted (-4 points before I gave it an upvote), but I think it does have a good point. It doesn't need to mean (as I suppose the downvoters assumed) that religion and atheism would be on an equal footing. Rather, it can be taken as a reminder of the fact that a person's atheism isn't yet enough to show that they're actually sane. See science as attire and related posts.

Replies from: roland
comment by roland · 2010-02-09T22:49:42.823Z · LW(p) · GW(p)

Exactly! And using your words I would add that a person's atheism isn't yet enough to show that they're actually saner/more rational than a religious person.

comment by MrHen · 2010-02-09T19:59:34.827Z · LW(p) · GW(p)

I don't understand what you are saying.

Replies from: bgrah449, roland
comment by bgrah449 · 2010-02-09T20:31:35.301Z · LW(p) · GW(p)

Here are some metaphors I use; if they're bogus, someone please crush them.

Imagine a city with a slum. People ask why the police don't clean up the slum. The police know that if they come in and break up the slum, they'll decentralize the crime - better to keep it fenced in, under watchful eyes, than run wild.

I see a lot of people working roughly with the model that religion is an infection. Sometimes atheism is presented as antibiotics, but regardless of what the prescription is, there seems to be an impression that religion is some kind of foreign force, where once religious belief is removed, there has been clean removal, like surgically removing a boil, instead of getting some of the boil while the infection spreads through the bloodstream.

Religion is an easy target, but removing it, I think, has a tendency to take an army in a fortress and turn it into an army of nomadic assassins, everywhere and nowhere, decentralized and pervasive.

Replies from: MrHen, tut
comment by MrHen · 2010-02-09T20:39:45.697Z · LW(p) · GW(p)

If religion is an infection, then removing the infection would solve the problem.

What you are describing is that something is causing the infection of religion. In this case, cure the cause and the infection goes away. Rationality is making the promise of curing the cause of infection, not just dressing up the infection and sending the diseased on their way.

To drift this backward into your police analogy, if you could get rid of the crime then the slum would disappear. If this doesn't make perfect sense then the analogy is broken.

Replies from: bgrah449
comment by bgrah449 · 2010-02-09T20:40:43.964Z · LW(p) · GW(p)

It does make sense. I think it's as likely to get rid of crime as it is to get rid of the cause of religion.

Replies from: MrHen
comment by MrHen · 2010-02-09T20:43:30.718Z · LW(p) · GW(p)

Then when you ask the police to clean up the slums they will respond by saying, "We are," instead of, "But that will make it harder to fight the disease!"

comment by tut · 2010-02-09T20:40:50.513Z · LW(p) · GW(p)

... take an army in a fortress and turn it into an army of nomadic assassins ...

And if I were a medieval commander, then I'd certainly prefer fighting a tribe of nomads to fighting an army in a castle.

comment by roland · 2010-02-09T22:40:59.701Z · LW(p) · GW(p)

CronoDAS expressed some self-concern about his POV. In contrast, I notice that a lot of atheists have a self-righteous, arrogant attitude. I have already heard one suggest that we would make the world a better place by removing religion. I think the problem here is that religion is more of a symptom, a product of irrationality, and being an atheist doesn't necessarily mean that you are more rational.

So the solution would rather be increasing rationality instead of attacking particular beliefs.

Replies from: MrHen
comment by MrHen · 2010-02-09T22:52:10.035Z · LW(p) · GW(p)

Okay. That makes sense. I read your first comment as strongly implying that atheists (or possibly atheists that criticize religious people) are irrational. This isn't even close to what you meant, so I am glad I asked.

Replies from: roland
comment by roland · 2010-02-09T22:59:24.048Z · LW(p) · GW(p)

Hmmm. Did you really mean to say that atheists are rational?

Replies from: MrHen
comment by MrHen · 2010-02-09T23:02:05.666Z · LW(p) · GW(p)

No, but never mind. The point is that I am glad I asked what you meant, because I wasn't even close to guessing correctly.

comment by RichardChappell · 2010-02-08T03:17:47.948Z · LW(p) · GW(p)

Surely the truth about knowledge and justification isn't correlated with which school you went to

It seems pretty likely that there is some correlation. (Suppose, without loss of generality, that some kind of epistemic externalism is true. Then some schools -- the ones where externalism is popular -- correlate with the production of true beliefs about epistemology.) The problem is just that we don't know in advance which schools actually have it right.

Perhaps what you mean to suggest is that going to school X isn't a sensitive method of belief formation. Even if what it taught was false (like school Y), you would still end up believing it (just as the students of Y do).

Then again, one could say much the same thing about being born into a religious cult (or a different historical epoch). I do consider myself lucky to have avoided such an epistemically stunted upbringing. But this kind of luck does not in any way undermine my present beliefs.

In Meta-Coherence vs. Humble Convictions, I suggest that what matters is how you assess the alternative epistemic position. If you really think it is just as well-informed and well-supported as your own, then this should undermine your present beliefs. Otherwise, it need not. To consider oneself the beneficiary of epistemic luck is not necessarily irresponsible. (Indeed, it's arguably necessary to avoid radical skepticism!)

Replies from: Alicorn, Unknowns
comment by Alicorn · 2010-02-08T03:27:29.981Z · LW(p) · GW(p)

Perhaps it would have been more accurate to say that your choice of school does not causally interact with the truth of its pet theory.

Replies from: RichardChappell
comment by RichardChappell · 2010-02-08T03:32:21.782Z · LW(p) · GW(p)

Philosophical truths don't seem like the kinds of things that would causally interact with anything. (They don't have causal powers, do they?)

ETA: why is this being downvoted?

Replies from: SilasBarta, Alicorn
comment by SilasBarta · 2010-02-08T04:24:11.929Z · LW(p) · GW(p)

Right, they would inferentially interact with them. Causal map:

Universe's structure (plus some other stuff) --> Philosophical Truth

From observing the universe, you should make (some) change to your estimates of philosophical truths, but the truths don't cause the universe -- just the reverse.

What Alicorn was saying, I think, is that there's no "my choice of school" node that points to (i.e. is a cause of) Philosophical Truth. Rather, such a node would at best point to "my beliefs".

(And ideally, you'd want the universe's structure to be a cause of your school's theories...)

ETA: Related: Argument screens off authority. Silas summary: truth causally flows to proxies for the truth, sometimes through multiple intermediate stages. E.g. truth of a position causes good arguments for it, which cause experts to believe it, which cause good schools to teach it. But the closer you are to truth in the causal chain, the more you have screened off and made irrelevant.
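(A toy numerical version of the "screens off" claim, with made-up probabilities: model Truth -> Argument -> Expert as a chain, and once you condition on the argument, the expert's opinion carries no further information about the truth.)

```python
from itertools import product

# Toy chain Truth -> Argument -> Expert, all binary; every probability is made up.
p_truth = 0.5
p_arg_given_truth = {True: 0.9, False: 0.2}    # good arguments are likelier for true positions
p_expert_given_arg = {True: 0.8, False: 0.3}   # the expert responds only to the argument

def joint(t, a, e):
    p = p_truth if t else 1 - p_truth
    p *= p_arg_given_truth[t] if a else 1 - p_arg_given_truth[t]
    p *= p_expert_given_arg[a] if e else 1 - p_expert_given_arg[a]
    return p

def p_truth_given(**obs):
    worlds = [dict(zip("tae", w)) for w in product([True, False], repeat=3)]
    match = [w for w in worlds if all(w[k] == v for k, v in obs.items())]
    return (sum(joint(**w) for w in match if w["t"])
            / sum(joint(**w) for w in match))

print(p_truth_given(a=True))          # ~0.82: P(truth | good argument)
print(p_truth_given(a=True, e=True))  # ~0.82: the expert adds nothing once you have the argument
print(p_truth_given(e=True))          # ~0.65: the expert alone is a weaker proxy
```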

Replies from: RichardChappell
comment by RichardChappell · 2010-02-08T14:50:29.299Z · LW(p) · GW(p)

What Alicorn was saying, I think, is that there's no "my choice of school" node that points to (i.e. is a cause of) Philosophical Truth. Rather, such a node would at best point to "my beliefs".

Again, how does the 'my choice of school' node here differ from the 'my not being born into a cult' node? The latter doesn't cause philosophical truths either. (Strictly speaking nothing does: only contingent things have causes, and philosophical truths aren't contingent on how things turn out. But let's put that aside for now.) What it does is provide me with habits of thought that do a better job of producing true beliefs than the mental habits I would have acquired if born into a cult. But then different schools of philosophy teach different habits of thought too (that's why they reach different conclusions). The flaws in the other schools of thought are much less obvious than the flaws found in cults, but that's just a difference in degree...

Replies from: SilasBarta
comment by SilasBarta · 2010-02-08T18:08:13.041Z · LW(p) · GW(p)

Again, how does the 'my choice of school' node here differ from the 'my not being born into a cult' node? The latter doesn't cause philosophical truths either.

Right, it doesn't. But they're still going to be inferentially connected (d-connected in Judea Pearl's terminology) because both a) your beliefs (if formed through a reliable process), and b) philosophical truths, will be caused by the same source.

And just a terminology issue: I was being a bit sloppy here, I admit. "X causes Y", in the sense I was using it, means "the state of X is a cause of the state of Y". So it would be technically correct but confusing to say, "Eating unhealthy foods causes long life", because it means "Whether you eat unhealthy foods is a causal factor in whether you have a long life".

(Strictly speaking nothing does: only contingent things have causes, and philosophical truths aren't contingent on how things turn out. But let's put that aside for now.)

Yes, I assumed that's how philosophers define the terms, but a) I don't find such a category useful because b) of all the instances where philosophers had to revise their first-principles derivations based on subtle assumptions about how the universe works.

What [my education in philosophy] does is provide me with habits of thought that do a better job of producing true beliefs than the mental habits I would have acquired if born into a cult. But then different schools of philosophy teach different habits of thought too (that's why they reach different conclusions). The flaws in the other schools of thought are much less obvious than the flaws found in cults, but that's just a difference in degree...

I actually agree. Still, to the extent that they do converge on reliable truth finding mechanisms, they should converge on the same truth-finding mechanisms. And one's admission that one's own truth-finding mechanism is so heavily school-dependent would indeed be quite worrisome, as it indicates insufficient critical analysis of what one was taught.

Of course, merely being critical is insufficient (someone who said so in this discussion was rightfully modded down for such a simplistic solution). I would say that you additionally have to check that the things you learn are multiply and deeply connected to the rest of your model of the world, and not just some "dangling node", immune to the onslaught of evidence from other fields.

Replies from: RichardChappell, RichardChappell
comment by RichardChappell · 2010-02-08T18:51:21.548Z · LW(p) · GW(p)

I don't find such a category [the 'non-contingent'] useful because b) of all the instances where philosophers had to revise their first-principles derivations based on subtle assumptions about how the universe works.

This sounds like a metaphysics-epistemology confusion (or 'territory-map confusion', as folks around here might call it). It's true that empirical information can cause us to revise our 'a priori' beliefs. (Most obviously, looking at reality can be a useful corrective for failures of imagination.) But it doesn't follow that the propositions themselves are contingent.

Indeed, it's easy to prove that there are necessary truths: just conditionalize out the contingencies until you reach bedrock. That is, take some contingent truth P, and some complete description C of the circumstances in which P would be true. Then the conditional "if C then P" is itself non-contingent.

comment by RichardChappell · 2010-02-08T18:37:39.382Z · LW(p) · GW(p)

Still, to the extent that they do converge on reliable truth finding mechanisms, they should converge on the same truth-finding mechanisms. And one's admission that one's own truth-finding mechanism is so heavily school-dependent would indeed be quite worrisome, as it indicates insufficient critical analysis of what one was taught.

Not sure how this engages with my challenge. The idea is that different schools might not all be converging on "reliable truth finding mechanisms". Maybe only one is, and the rest are like (non-obvious) cults, in respect of their (non-obvious) unreliability. [I'm not suggesting that this is actually the case, but just that it's a possibility that we need to consider, in order to tighten the arguments being presented here.] As the cult analogy shows, the contingency of our beliefs on our educational environment does not entail "insufficiently critical analysis of what one was taught". So I'm wanting you guys to fill in the missing premises.

comment by Alicorn · 2010-02-08T03:33:48.195Z · LW(p) · GW(p)

Well, ideally, they'd interact on some level with the arguments in their favor.

comment by Unknowns · 2010-02-08T10:31:13.728Z · LW(p) · GW(p)

The problem with this is that the alternative epistemic position also thinks that your position is not as well-informed and well-supported as their own. Are they justified as well?

Replies from: RobinHanson, RichardChappell
comment by RobinHanson · 2010-02-08T20:09:46.646Z · LW(p) · GW(p)

Yes, the key issue is not so much whether on a first analysis you came to think those other folks are not as well informed as you, but whether you would have thought that if you had been taught by them. The issue is how to overcome the numerous easy habits of assuming that what you were taught must have been better. Once you see that on a simple first analysis you would each think the other less informed, you must realize that the problem is harder than you had realized and you need to re-evaluate your reasons for so easily thinking they are wrong and you are right. Until you can find a style of analysis that would have convinced you, had you grown up among them, to convert to this side, it is hard to believe you've overcome this bias.

Replies from: bgrah449, RichardChappell
comment by bgrah449 · 2010-02-08T20:19:29.111Z · LW(p) · GW(p)

Robin Hanson just ended a post with the phrase "overcome bias." This feels momentous, like theme music should be playing.

Replies from: roland, orthonormal
comment by roland · 2010-02-09T20:11:04.559Z · LW(p) · GW(p)

May I suggest the following?

http://www.youtube.com/watch?v=cSZ55X3X4pk

comment by RichardChappell · 2010-02-08T20:45:05.891Z · LW(p) · GW(p)

The issue is how to overcome the numerous easy habits of assuming that what you were taught must have been better.

Well, that's one issue. But I was addressing a different -- more theoretical -- issue, namely, whether acknowledging the contingency of one's beliefs (i.e. that one would have believed differently if raised differently) necessarily undermines epistemic justification.

(Recall the distinction between third-personal 'accounts' of rational justification and first-personal 'instruction manuals'.)

Replies from: RobinHanson
comment by RobinHanson · 2010-02-08T20:57:10.068Z · LW(p) · GW(p)

"Necessarily" is an extremely strong claim, making it overwhelming likely that such a claim is false. So why ever would that be an interesting issue? And to me, first-person instruction manuals seem obviously more important than third-person "accounts".

Replies from: RichardChappell
comment by RichardChappell · 2010-02-09T00:20:57.520Z · LW(p) · GW(p)

I get the impression that many (even most) of the commenters here think that acknowledged contingency thereby undermines a belief. But if you agree with me that this is much too quick, then we face the interesting problem of specifying exactly when acknowledged contingency undermines justification.

I don't know what you mean by "important". I would agree that the instruction manual question is obviously of greater practical importance, e.g. for those whose interest in the theory of rationality is merely instrumental. But to come up with an account of epistemic justification seems of equal or greater theoretical importance, to philosophers and others who have an intrinsic interest in the topic.

It's also worth noting that the theoretical task could help inform the practical one. For example, the post on 'skepticism and default trust' (linked in my original comment) argues that some self-acknowledged 'epistemic luck' is necessary to avoid radical skepticism. This suggests a practical conclusion: if you hope to acquire any knowledge at all, your instruction manual will need to avoid being too averse to this outcome.

Replies from: RobinHanson
comment by RobinHanson · 2010-02-09T03:46:42.399Z · LW(p) · GW(p)

The vast majority of claims people make in ordinary language are best interpreted as on-average-tendency or all-else-equal claims; it almost never makes sense to interpret them as logical necessities. Why should this particular case be any different?

comment by RichardChappell · 2010-02-08T14:20:59.694Z · LW(p) · GW(p)

Well, they might be just as internally consistent (in some weak, subjective, sense). But if this kind of internal consistency or ratification suffices for justification, then there's no "epistemic luck" involved after all. Both believers might know full well that their own views are self-endorsing.

I was instead thinking that self-ratifying principles were necessary for full justification. On top of that, it may just be a brute epistemic fact which of (say) occamism and anti-occamism is really justified. Then two people might have formally similar beliefs, and each sticks to their own guns in light of the other's disagreement (which they view as a product of the other's epistemic stunting), and yet only one of the two is actually right (justified) to do so. But that's because only one of the two views was really justifiable in the first place: the actual disagreement may play no (or little) essential role, on this way of looking at things.

For further background, see my discussion of Personal Bias and Peer Disagreement.

comment by JGWeissman · 2010-02-08T00:39:55.525Z · LW(p) · GW(p)

This reminds me of Where Recursive Justification Hits Bottom, where Eliezer explains that an agent that uses Occamian priors and Bayes' rule, when evaluating whether this is effective, would assign an Occamian prior to the theory that it is effective and update using Bayes' rule to account for its successes and failures.

If you learn and adopt a mode of epistemology at one school, that is what you will use to evaluate a competing mode advocated by another school.

comment by billswift · 2010-02-08T09:24:56.408Z · LW(p) · GW(p)

Signaling is a factor, but there is another issue. That is the choice of what to work on. Academics and academic departments, no less than individuals, have to make choices of where to focus their time and resources - no one has unlimited time and attention and choices have to be made. So one person, or school, may focus on cognitive biases and skimp on memory biases, or the other way around. This choice of what is important is actually very strongly affected by signaling issues - much more so than signaling affects particular beliefs, I think.

Replies from: roland, MBlume
comment by roland · 2010-02-09T19:56:28.945Z · LW(p) · GW(p)

Hmmm. If I understood correctly Alicorn was talking more about the situation where people are talking/working on the same thing and still have different beliefs.

comment by MBlume · 2010-02-08T23:21:22.046Z · LW(p) · GW(p)

This excuses having different maps in the sense that my map may be very blurry in the areas in which I don't work. On the other hand, I don't think it at all excuses incompatible maps.

Replies from: billswift
comment by billswift · 2010-02-09T10:14:54.807Z · LW(p) · GW(p)

It doesn't excuse them, but it does explain them - since each still has the default, evolved/social map in the unexplored areas. And they will have until the new knowledge penetrates enough to become the newer default, at least among academics.

comment by Karl_Smith · 2010-02-12T19:23:32.955Z · LW(p) · GW(p)

I see some problems here but it doesn't seem quite as intractable as Alicorn suggests.

If your beliefs are highly correlated with those of your teachers then you need to immerse yourself in the best arguments of the opposing side. If you notice that you are not changing your mind very often then you have a deeper problem.

To give a few related examples: one of the things that gives me confidence in my major belief structure is that I am an Atheist Capitalist. But, as a child, I was raised and immersed in Atheist Communism. I rejected the communism but not the Atheism. At least in that small set, my parents/early teachers were only 50% right in their basic belief structure, and that doesn't sound too unlikely.

On the other hand I have been troubled by the extent to which I have become more sensitive to liberal arguments over the past 2 years. My social and professional circle is overwhelmingly liberal. It is unlikely that this does not have an effect on my beliefs.

To compensate I am attempting to immerse myself in more conservative blogs.

Now of course there is no way to be sure that the balancing act is working. However, if we take as a starting point that errors among well informed people are randomly distributed then as a rough approximation your adherence to the beliefs of your community should be proportional to the number of intellectuals who hold those same beliefs.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2010-02-12T19:29:14.095Z · LW(p) · GW(p)

I've also noticed "liberals" making more sense, but I attribute this to smart people abandoning conservative groups and jumping ship to liberal ones. This may mean that "conservative" policies are being under-argued.

Replies from: inklesspen
comment by inklesspen · 2010-02-16T06:43:34.753Z · LW(p) · GW(p)

There may also be a limit to how wisely one can argue that spending money on wars while cutting taxes for the wealthy is sound economic policy.

Does any viewpoint have a right to survive in spite of being wrong?

Replies from: DanielLC
comment by DanielLC · 2010-09-05T03:23:07.794Z · LW(p) · GW(p)

But if they're wrong, then they'd have always been wrong, and Karl should have just been liberal, rather than becoming more so when surrounded by liberals.

Replies from: danlowlite
comment by danlowlite · 2010-12-08T14:28:49.112Z · LW(p) · GW(p)

But that means that people cannot change their mind and realize when they are wrong.

comment by orthonormal · 2010-02-11T06:15:26.716Z · LW(p) · GW(p)

Having been sincerely religious as a young adult has its drawbacks for my present self (e.g. lost time and effort), but one positive effect is that I'm not as worried about this, because I've felt what it's like to find that my entire worldview fails on its own terms. (Basically, I eventually came to realize that it was completely out of character for the deity I knew— or any deity derived from the scriptures and theologies I professed— to create a universe and humanity completely un-optimized along any recognizable moral axis. But I digress.)

I was lucky enough to have a starting point with an intellectual and philosophical tradition (rather than, say, the extremes of evangelical Christianity), so I can't credit myself too much. But unlike people who haven't ever had to fundamentally change their minds about the world, I can point to a set of evidence that would make me switch to another worldview, and I can be reasonably confident that I'd do so if given that evidence. (It would generally take more evidence for me than for a pure Bayesian, but working inefficiently is better than refusing to work at all.)

comment by MrHen · 2010-02-08T18:17:11.068Z · LW(p) · GW(p)

Can you think of another behavior pattern that is more accurate than this?

Assuming that someone isn't going to hold a belief they know to be false, they are teaching you perceived truth. Why wouldn't you adopt those beliefs? If you, the student, possessed the ability to denounce those beliefs in a manner fitting rational discussion, it seems likely that the master would have been able to do so as well.

This isn't to say all masters are all right all the time, but what else do we have to go on until we go into the territory ourselves? These people went ahead and brought back a map. Different trailblazers saw the territory differently and now they get to bicker about which has a more accurate map.

When learning from people, I assume we copy portions from their map. They are lined up next to my discoveries and observations and are assimilated as best as possible. At some point, when I wander through that territory, I get to compare my map to what I see.

Am I wrong or missing something? Or is that the whole point of the post? Or... ?

Lots of biases still live in your head doing their thing even when you know about them. This one, though, puts you in an awfully weird epistemic situation. It's almost like the opposite of belief in belief - disbelief in belief. "This is true, but my situation made me more prone than I should have been to believe it and my belief is therefore suspect. But dang, that argument my teacher explained to me sure was sound-looking! I must just be lucky - those poor saps with other teachers have it wrong! But of course I would think that..."

Emphasis mine. I don't see how the emphasized part is a bias issue. I don't see how thinking about how a certain area of your map was colored causes the weird epistemic situation. I think the problem is this:

I must just be lucky - those poor saps with other teachers have it wrong!

A predictor is right or wrong. Your teachers are irrelevant once the belief is in your map. Treating beliefs transported from a teacher as a lucky event has a few problems:

  • It implies that your teacher is your teacher because you were lucky. Choose your teacher better.
  • It implies that teacher A and teacher B have the same map quality of a particular territory but they disagree. This is irrelevant until you get to look at the competing maps or the territory involved. Your teacher is still the best guess you have. Why would your map look like anything else?
  • It implies that you cannot change your (or your unlucky friend's) beliefs as new evidence or more maps become available.

But all of this seems obvious and you address these points elsewhere in your post.

So perhaps you are saying that once you believe something, it becomes entrenched and now has an unfair advantage? I certainly understand this. As far as I can tell, this is one of the problems that rationality was supposed to address. Is the teacher scenario merely a special case?

comment by dclayh · 2010-02-08T01:20:04.127Z · LW(p) · GW(p)

These have a decided enough effect that I've heard "X was a student of Y" used to mean "X holds views basically like Y's".

It still does. The question is open to what extent this is because students tend to choose advisers whose views they already agree with.

Replies from: Alicorn
comment by Alicorn · 2010-02-08T01:22:26.087Z · LW(p) · GW(p)

I don't deny that some selection effect is likely present, but there's only so much choice to be had in a smallish department, especially since you also have to sort by area of interest.

Replies from: dclayh
comment by dclayh · 2010-02-08T01:25:47.430Z · LW(p) · GW(p)

True, although the student is generally picking the department also.

Replies from: Alicorn
comment by Alicorn · 2010-02-08T01:31:10.757Z · LW(p) · GW(p)

I don't know about most people, but I went to the one that happened to accept me.

Replies from: sark
comment by sark · 2010-02-08T12:41:22.709Z · LW(p) · GW(p)

You select them. They select you. Selection effect in both cases.

Replies from: RobinZ
comment by RobinZ · 2010-02-09T02:21:06.418Z · LW(p) · GW(p)

"the one that happened to accept me" doesn't sound like a criterion that is strongly correlated with "the one whose views accord most closely with mine".

Replies from: Alicorn, sark
comment by Alicorn · 2010-02-09T02:28:40.632Z · LW(p) · GW(p)

Well, I did do my entrance essay on an article by a philosopher who is inordinately popular at the department I just abandoned. I was talking about how he was wrong, but I did communicate that I thought he was worth reading and writing about.

comment by sark · 2010-02-09T11:15:38.792Z · LW(p) · GW(p)

It doesn't, but the point was about influence vs. selection effects, not different kinds of selection effects.

Replies from: RobinZ
comment by RobinZ · 2010-02-10T15:15:36.184Z · LW(p) · GW(p)

It's still luck - if I choose a car based on whoever happens to drive past with a "For Sale" sign in the window, the street I'm standing on isn't a very good determinator of the quality of automobile.

Replies from: MrHen
comment by MrHen · 2010-02-10T15:40:30.653Z · LW(p) · GW(p)

It can, depending on what city you live in. I don't know if this is at all related to your point, but the street you find a car for sale can be a great indicator of its quality.

Namely, what neighborhood is the car in? Is the street notorious for certain things? A personal example: I would never buy a car I found for sale on Hwy 14. I would buy a car I found for sale on Hwy 110, but only outside of the Loop.

Replies from: RobinZ
comment by RobinZ · 2010-02-11T03:10:24.574Z · LW(p) · GW(p)

Yeah, I'll give you that one. I was imagining choosing the street by, say, walking out your front door, not by the expected quality of automobile.

comment by thomblake · 2010-02-08T14:48:10.931Z · LW(p) · GW(p)

Hmm... This might actually be a major breakthrough of sorts. When I have time, I'll delve into the literature to see what folks have to say about it. On the surface, this seems related to the notion that we just need a good-enough theory to make progress towards a better theory.

It might be the case that folks in different departments are climbing the same mountain from different sides; if people in department A are advocating Big Theory p, and people in department B are advocating Big Theory q, and p and q are both approximately correct, then we shouldn't be surprised that they both have proponents.

comment by aausch · 2010-02-08T06:10:38.194Z · LW(p) · GW(p)

Are you sure you have the causality of it right? I always thought of graduate schools as selectors/filters for certain kinds of intelligences and pov, rather than causators.

Replies from: grouchymusicologist, roland
comment by grouchymusicologist · 2010-02-08T14:40:21.256Z · LW(p) · GW(p)

You mean, you don't think that any belief-formation occurs during people's graduate educations? Even if you don't think grad school is all it's cracked up to be, that can't be quite right.

Replies from: aausch
comment by aausch · 2010-02-08T16:43:33.772Z · LW(p) · GW(p)

Not belief formation, but change. I think it unusual for a grad student to significantly change beliefs. So I think of the process more as a refinement, or sharpening, of existing beliefs - the formation of more fine grained beliefs that build on existing ones, rather than any major changes in the existing set.

Replies from: grouchymusicologist, bogdanb
comment by grouchymusicologist · 2010-02-08T22:25:42.147Z · LW(p) · GW(p)

Fair enough, in cases where students enter grad school with a moderately robust inclination toward some particular orientation on some important issue. I think I differ from you in my estimation of the frequency of these kinds of events:

(1) New grad student has some pre-existing but vague ideas regarding the issue; these are quickly replaced by the prevailing position at the graduate institution. Reasons for this could very well include increasing intellectual maturity, deeper/more complete study of issues at the graduate level, and the sense that it's a graduate student's responsibility to have positions on major scholarly issues (as opposed to undergraduates more frequently just being set up as "observers" of scholarly controversies).

(2) New grad student enters school quite literally without any inkling that some particular scholarly controversy exists, and upon being made aware of it, takes up the prevailing position at the graduate institution.

Anecdotally, I find both of these to be pretty common scenarios, whereas I don't think you do. This could be related to the fields I work in (music history and theory).

Replies from: aausch
comment by aausch · 2010-02-09T06:01:14.896Z · LW(p) · GW(p)

Yes, I think you've nailed it. I have a romanticized view of things - where students pick fields/schools after some research, and avoid grad school if they don't get where they want to be.

Maybe I've just been avoiding the territory. :P

comment by bogdanb · 2010-02-08T19:29:21.520Z · LW(p) · GW(p)

This makes intuitive sense; the two processes seem mutually reinforcing:

X prefers (vaguely) point of view A over B. X goes to university Y because it's known (to X, vaguely) to favor A over B. X's preference of A over B is sharpened because of being exposed to preference of A over B in his environment (via confirmation bias and the like).

In other words, given that somebody leaned to an option in a “neutral” environment, it is intuitive that when exposed to an environment that favored that option they will lean harder toward it.

[ETA:] In the post's terms, this means that if the two views (A & B above) are equally "wrong" or "right" (i.e., approximations of the truth with comparable accuracy), the effect doesn't make your opinion significantly more or less suspect. However, if one is closer to the truth, it would polarize "knowledge quality": those with good initial "guesses" will be dragged closer to the truth, the others further away.

It seems to me that the latter situation is not as bad as (I suspect) it sounds: it should make it easier for outsiders (those not leaning towards either view yet) to "spot the difference". That's because instead of a continuum from worse to better views, the views would congeal into an opposition of a "bad" one and a "good" one (in respect of how accurately they approximate the truth).

Replies from: aausch
comment by aausch · 2010-02-08T19:52:42.499Z · LW(p) · GW(p)

Also, look at it from the point of view of the professors. If you have a list of students to choose from, which ones are you most likely to be interested in working with?

comment by roland · 2010-02-09T20:05:18.879Z · LW(p) · GW(p)

I think this is a great point. I read about E.T. Jaynes that he had the option to do his PhD in physics with Bohr, but he declined because he knew that his POV was in contradiction with Bohr's, and therefore he chose to do it with someone else (I don't remember with whom). My respect for Jaynes increased tremendously after I read this.

Replies from: PhilGoetz
comment by PhilGoetz · 2010-02-09T20:10:30.237Z · LW(p) · GW(p)

Why did contradicting Bohr raise your opinion of someone tremendously?

Replies from: roland, aausch
comment by roland · 2010-02-09T22:46:30.276Z · LW(p) · GW(p)

Well, AFAIK Bohr was considered one of the greatest physicists of his time, and a lot of students would be vying to be able to do their PhD with him. So not only did Jaynes not care about his status, but he also thought with his own brain and came to the conclusion that Bohr had a flawed POV.

comment by aausch · 2010-02-09T22:11:54.445Z · LW(p) · GW(p)

I read that to mean that the respect was for Jaynes' integrity, and not for his specific point of view.

comment by G-Max · 2015-01-14T12:00:33.475Z · LW(p) · GW(p)

"But dang, that argument my teacher explained to me sure was sound-looking! I must just be lucky - those poor saps with other teachers have it wrong!"

This is actually something I've been wondering about regarding the disproportionate overlap between libertarianism and anthropogenic global warming skepticism. I'd like to think that this disproportionate overlap is because both views stem from a rational and objective assessment of the available data, but at the same time, I can't deny that anthropogenic global warming would throw a monkey wrench into libertarian philosophy if it was real, so being skeptical of it saves us from doing an awful lot of mental gymnastics...

comment by DanielLC · 2010-09-05T03:31:17.232Z · LW(p) · GW(p)

Learning the arguments of opposing viewpoints is helpful, but if you're being truly rational, it shouldn't be necessary to remove bias.

I figure that if you know there are other groups with different beliefs, they are likely to have heard arguments you haven't. You should assume there are arguments, and use them to adjust your beliefs. If you later hear such arguments, and they make more sense then you thought, then that should make you believe them more. If they make less sense, even if they're still good arguments and would only help convince you if you hadn't already assumed their existence, you should believe them less.

On the other hand, if you've studied it significantly, you've probably heard pretty much all the best arguments on each side, and the problems are just due to being irrational. I guess you should try to fight that by trying to remain skeptical of the opinions of people near you, regardless of whether they're more or less normal than what you believe.

comment by JamesAndrix · 2010-02-08T16:42:27.584Z · LW(p) · GW(p)

Thinking out loud: So if you're French, trying to figure out what the true English way to say something would be, you don't want to... you want to specifically avoid the kinds of mistakes French speakers (you) tend to make because they are French. What a native English speaker might call Frenglish. You have access to native Spanish and Japanese speakers; you know they are making mistakes, and you know that some of their mistakes are made because of their native language.

One source of information is more fluent French speakers: they have overcome some obstacles unique to Francophones, and they can point them out in less fluent speakers. This may give a pattern of errors indicating the kind of errors Francophones tend to make in learning English (especially if you can compare to known early errors in Spanish/Japanese speakers).

So, first look at what the near-fluent speakers of all nationalities agree on as English. Then look at the mistakes made by learners of various nationalities. This will give you a fairly empirical picture of what a strong French accent looks like. That should give you some hints about what a weak [slight] French accent would look like. Correct for that, and you'll be closer to native English.

comment by wiresnips · 2010-02-08T15:20:20.290Z · LW(p) · GW(p)

Widening the spread of your mentors should reduce this bias, as long as you didn't choose mentors that agree with each other. Obviously, there isn't really enough time to be taught from a wide enough sample of perspectives to properly eliminate it.

comment by Vladimir_Nesov · 2010-02-08T10:57:16.889Z · LW(p) · GW(p)

It's almost like the opposite of belief in belief - disbelief in belief.

More like belief in disbelief.

comment by Daniel_Burfoot · 2010-02-08T05:15:41.518Z · LW(p) · GW(p)

During one of my epistemology classes, my professor admitted (I can't recall the context) that his opinions on the topic would probably be different had he attended a different graduate school.

I read this as an admission that modern academic philosophy has nothing whatever to do with the search for truth and everything to do with status-seeking, signalling, and affiliation games. But at least he was being sort of honest.

Replies from: Mitchell_Porter, komponisto
comment by Mitchell_Porter · 2010-02-08T08:47:44.588Z · LW(p) · GW(p)

Concern with status can actually foster truth-seeking, and not just interfere with it. You can be the star who discovered something new, you can expose someone else's status as undeserved, and simply maintaining your own status (in a truth-seeking community) can motivate you to take extra care with what you say and do. The social emotions are not inevitably a source of noise and error.

(And by the way, your own comment is an intemperate denunciation. I hope that a little sober reflection would lead you to conclude that maybe modern academic philosophers do have a professional interest in truth after all, and that even if they are collectively doing it badly, your particular diagnosis is factually wrong.)

Replies from: Daniel_Burfoot, wedrifid
comment by Daniel_Burfoot · 2010-02-08T15:01:57.423Z · LW(p) · GW(p)

And by the way, your own comment is an intemperate denunciation

This I admit happily. The only question is whether or not my intemperance is justified. Consider how absurd it would be for a professor of physics to admit that his opinion regarding a problem in physics would be different if he had attended a different graduate school.

maybe modern academic philosophers do have a professional interest in truth after all, and that even if they are collectively doing it badly, your particular diagnosis is factually wrong.

Imagine I had said "Soviet communism has nothing whatever to do with feeding the people and everything to do with status-seeking, coalition games, and power politics". You could as easily complain that Soviet communists do have a professional interest in feeding the people, even if they were doing it badly, and that my diagnosis was factually wrong.

Replies from: thomblake, Alicorn, Mitchell_Porter
comment by thomblake · 2010-02-08T15:15:39.426Z · LW(p) · GW(p)

The only question is whether or not my intemperance is justified.

If you think your intemperance is justified, then you're using the word "intemperance" wrong.

Consider how absurd it would be for a professor of physics to admit that his opinion regarding a problem in physics would be different if he had attended a different graduate school.

I would be no more surprised by this claim than the one about the epistemologist. Should I be? Do proponents of string theory tend to come from the same schools?

Replies from: byrnema
comment by byrnema · 2010-02-08T16:17:08.254Z · LW(p) · GW(p)

I would be no more surprised by this claim than the one about the epistemologist. Should I be? Do proponents of string theory tend to come from the same schools?

I think this is pushing the argument too much. I'm sure you see his point that physics is different than philosophy.

You can still make interesting counter-arguments (for example, string theory is one -- another is that even if you agree about the solution, you can have different philosophies about the correct approach), but I think it muddies the waters to pretend you don't know what he's talking about in the first place.

I'm choosing this as an example only -- I haven't noticed you doing this before, but I've noticed it in threads with people arguing with me, and I'd like to have a name for it so I can call them on it. Mainly, it makes the argument extremely inefficient. In his next thread, Daniel Burfoot might imagine he needs to go into more detail about the differences between physics and philosophy (which would be tedious) or just recognize that he was parried and ignore it.

Replies from: thomblake
comment by thomblake · 2010-02-08T18:01:13.853Z · LW(p) · GW(p)

I think this is pushing the argument too much. I'm sure you see his point that physics is different than philosophy.

The claim was that it would be "absurd" if a professor of physics admitted that his opinion regarding a "problem" would be different if he attended a different school.

I fail to see what would be absurd about it. I take the word "problem" here to mean something that doesn't have a settled answer in the field, so I had to reach as far out as string theory to find something accessible to the non-physicist. I honestly don't see in what relevant way physics is different from philosophy here. I likely would've had the same reaction regarding any academic field.

Replies from: SilasBarta, byrnema
comment by SilasBarta · 2010-02-08T18:22:02.620Z · LW(p) · GW(p)

I agree with your point: something doesn't become a "problem" in physics unless observational evidence favoring one view over another is so hard to find that the question can't be quickly resolved. So the things left that count as problems are the very ones where different experts can reasonably hold different views, and one will see a stronger case made for a particular view in schools that support it.

On the opposite end, problems in philosophy that actually get solved are then spun off into other fields and so no longer count as philosophy. What we now call "physics" was at one time "natural philosophy".

Still, I think there's a difference in that philosophy hasn't been spinning off productive scientific research programs in the past few decades. But I don't think that was Daniel_Burfoot's point.

comment by byrnema · 2010-02-08T18:13:00.801Z · LW(p) · GW(p)

I see -- the fallacy was mine: the typical mind fallacy. I thought of 'problem' as something that, in physics, would already be solved and straightforward.

comment by Alicorn · 2010-02-08T15:12:59.753Z · LW(p) · GW(p)

Consider how absurd it would be for a professor of physics to admit that his opinion regarding a problem in physics would be different if he had attended a different graduate school.

It's a weird thing for anyone to admit; the epistemologist is just in a particularly awkward position. I spoke of philosophy because that's what I have experience with; but I wouldn't be at all surprised to find that similar trends of students following teachers hold in other disciplines.

comment by Mitchell_Porter · 2010-02-10T05:25:03.318Z · LW(p) · GW(p)

OK then. On what basis did you form this opinion - that university philosophers are not seeking truth? What's your evidence, your argument?

Replies from: Daniel_Burfoot
comment by Daniel_Burfoot · 2010-02-12T09:49:13.782Z · LW(p) · GW(p)

I thought about this for a while and came up with a tight argument:

First, note that it is implausible to claim that academic philosophers are the only people doing philosophy. There are certainly private individuals engaged in the search for philosophical truth as well - Nassim Nicholas Taleb and Eliezer Yudkowsky are two such individuals whose names spring immediately to mind; there are doubtless many others. To claim that only philosophy professors can do good philosophy would be like claiming that only literature professors can write good literature.

Given that there are many non-academic philosophers out there, if we assume that academic philosophers (and philosophy journal editors) are disinterested truth-seekers who are not motivated by political and status concerns, then we should expect to see articles written by non-academics published frequently in philosophy journals, or at the very least we should expect academic philosophers to frequently cite the non-academics.

So, not knowing the actual state of things, since I don't read many philosophy journals, I will expose my theory (academic philosophy is not about truth-seeking) to falsification by predicting that philosophy journals almost never publish articles by non-academics (i.e. someone without a university affiliation and a PhD), and that academic philosophers very rarely cite work done by non-academics.

Replies from: Mitchell_Porter, CarlShulman, Jack
comment by Mitchell_Porter · 2010-02-13T08:20:56.375Z · LW(p) · GW(p)

Let's try inverting your central deduction:

Given that there are many academic philosophers out there, if we assume that non-academic philosophers are disinterested truth-seekers who are not motivated by political and status concerns, then we should expect to see frequent collaborations between non-academics and academics, or at the very least we should expect non-academic philosophers to frequently cite the academics.

A very analogous argument to yours would allow us to conclude that non-academic philosophy is not about truth-seeking.

Replies from: Tyrrell_McAllister
comment by Tyrrell_McAllister · 2010-02-13T09:32:56.170Z · LW(p) · GW(p)

[. . .] or at the very least we should expect non-academic philosophers to frequently cite the academics.

A very analogous argument to yours would allow us to conclude that non-academic philosophy is not about truth-seeking.

But we do see non-academics citing academics. Non-academic amateurs will refer to the likes of Quine, Russell, or Searle.

Replies from: Mitchell_Porter
comment by Mitchell_Porter · 2010-02-13T10:15:24.496Z · LW(p) · GW(p)

It goes the other way too; Taleb and Yudkowsky are not completely ignored by academia. Nonetheless, the insularity of academic intellectuals and the disdain for academia of non-academic intellectuals are real phenomena. There is a symmetry to the situation, but Dan wants to draw an asymmetric conclusion.

comment by CarlShulman · 2010-02-12T12:03:49.229Z · LW(p) · GW(p)

Philosophy journals use blind (anonymous) review, although of course this is imperfect. Journals are actually the least status-influenced of the major venues for academic philosophy (compared to books, workshops, and teaching elite students).

comment by Jack · 2010-02-12T09:56:00.889Z · LW(p) · GW(p)

I didn't realize journals of theoretical physics, biology, cognitive science and history were publishing a lot of non-academics.

Replies from: ciphergoth, CarlShulman, gregconen
comment by Paul Crowley (ciphergoth) · 2010-02-12T11:28:28.540Z · LW(p) · GW(p)

Cryptography conferences have published at least some articles from non-academics: both (or all four, depending on what you count) of my publications at the least.

Replies from: Jack
comment by Jack · 2010-02-12T11:40:06.383Z · LW(p) · GW(p)

Interesting. Do you work in a related field in private industry? I assume fields like pharmacology and chemistry publish a lot of non-academics because there is so much corporate research.

Replies from: ciphergoth
comment by Paul Crowley (ciphergoth) · 2010-02-12T11:54:09.267Z · LW(p) · GW(p)

No, I'm pretty much a dilettante, a coder who takes an amateur interest in these things, though my employers are usually in favour and will, for example, pay expenses to go to conferences. I haven't done much, but here's what I've done if you're interested.

comment by CarlShulman · 2010-02-12T12:09:18.228Z · LW(p) · GW(p)

Actually, in biology quite a lot of articles are published by people in industry (e.g. pharmaceutical and biotech companies). Greg Cochran (a physicist by training, with no academic affiliation) publishes evolutionary biology articles in good journals, attends conferences, etc. You see more non-academics publishing when there are more people with the relevant skillsets outside academia, and fewer otherwise.

comment by gregconen · 2010-02-12T11:57:27.439Z · LW(p) · GW(p)

It's not like no status seeking occurs in those fields.

comment by wedrifid · 2010-02-08T13:04:56.574Z · LW(p) · GW(p)

And by the way, your own comment is an intemperate denunciation. I hope that a little sober reflection would lead you to conclude that maybe modern academic philosophers do have a professional interest in truth after all, and that even if they are collectively doing it badly, your particular diagnosis is factually wrong.

I totally agree. I think Daniel will sit down and realize that 'remarkably little' would have been more accurate than 'nothing whatever', and would also have marginalized most of the potential backlash.

comment by komponisto · 2010-02-08T05:26:55.268Z · LW(p) · GW(p)

I read this as an admission that modern academic philosophy has nothing whatever to do with the search for truth and everything to do with status-seeking, signalling, and affiliation games.

Hansonian cynics may be inclined to argue that one could just as well substitute almost any other discipline for "philosophy" in that sentence.

comment by Psilence · 2010-02-09T13:55:26.988Z · LW(p) · GW(p)

This is good. It seems (to me) to mean that the LessWrong community is starting to "get the hang" of the importance of explanation...

By that I mean that a person who found themselves in the state of being "very" "intelligent" might, at the exact moment they realized their intelligence had been the result of what we call "insights" -- a working out of the problem in a way independent of the presuppositions inherent in the overwhelming bias of the problem as stated --

concurrently understand that the simple act of "explaining" how it might look to another person trapped further down the scale would cause them to agree, and hence to look upwards, seeking the thing we had described for their benefit.

tl;dr in this case: explanation IS intelligence, as much as insight if not more. "Misunderstood geniuses" is an oxymoron, no matter what level of genius you find yourself at.

Replies from: MrHen, PhilGoetz
comment by MrHen · 2010-02-09T18:24:38.578Z · LW(p) · GW(p)

Obviously this concept breaks down when considering the nature of communication. If I were a genius and moved to Kyrgyzstan I would never be understood, because I don't know Kyrgyz or Russian.

More relevantly, I would expect that the ability to explain things has a stronger correlation with the ability to model other minds than it does with intelligence.

comment by PhilGoetz · 2010-02-09T18:11:03.132Z · LW(p) · GW(p)

"Misunderstood geniuses" is an oxymoron

An oxymoron, or a tautology?

Replies from: MrHen
comment by MrHen · 2010-02-09T18:17:29.650Z · LW(p) · GW(p)

Or just a stupid cached thought.

Replies from: AndyWood
comment by AndyWood · 2010-02-09T19:01:12.072Z · LW(p) · GW(p)

Keep in mind, though, that a "stupid cached thought" is not stupid merely because it is cached. It looks to me like this might be a confusion between "cached thought" and "short handle that refers to a more detailed, commonly understood situation", both here and in your earlier response to me in this thread.

Replies from: linkhyrule5, MrHen
comment by linkhyrule5 · 2013-07-30T18:48:46.350Z · LW(p) · GW(p)

Ironically, this means you need to avoid the cached thought "cached thoughts are stupid" and actually think about the problem.

comment by MrHen · 2010-02-09T19:35:35.073Z · LW(p) · GW(p)

In this case, I think it is (a) stupid and (b) a cached thought.

The earlier response that you linked to was mostly me realizing that my knee-jerk reaction was the one I tend to have to cached thoughts: "Think for yourself" just didn't mean anything to me in that context, and so it triggered that reaction. I wasn't trying to give the impression that I thought "Think for yourself" was stupid.

In retrospect, I suppose saying "reject out of hand" was a bit harsh...

Hopefully that clarifies something I muddled up earlier. :P

comment by AndyWood · 2010-02-08T03:11:29.766Z · LW(p) · GW(p)

There's no obvious way to compensate.

Think for yourself.

I know it sounds trite, but really, any time I might spend expounding on the theme of socialized learning would only yield fancier ways of saying pretty much that.

Replies from: MrHen
comment by MrHen · 2010-02-08T18:19:42.528Z · LW(p) · GW(p)

Fancier ways are more convincing than a sentence I hear all the time from people who have wildly varying beliefs about socialized learning.

Replies from: AndyWood
comment by AndyWood · 2010-02-08T19:44:21.155Z · LW(p) · GW(p)

But I find that simpler admonitions, such as the above, serve me much better in real life.

My point is simply that if you want to know what's correct, you should scrutinize every idea hard, with every correctness-checking tool you have available. The content of the idea should carry much more weight in your evaluation than the source of the idea, whether it's your favorite author, your teacher, a highly-regarded figure, your best friend, or your hated enemy. It may not be possible to eliminate the effects of affiliation entirely, but I think it's a good goal.

Replies from: MrHen
comment by MrHen · 2010-02-08T19:48:40.066Z · LW(p) · GW(p)

I agree with this but tend to reject the phrase, "think for yourself," out of hand because it doesn't mean anything without clarification.

Coincidentally, I just read Cached Thoughts. That may explain my negative reaction.

I don't really have anything to add to the point you made.