The Litany Against Gurus

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-12-18T20:11:02.000Z · LW · GW · Legacy · 36 comments

I am your hero!
I am your master!
Learn my arts,
Seek my way.

Learn as I learned,
Seek as I sought.

Envy me!
Aim at me!
Rival me!
Transcend me!

Look back,
Smile,
And then
Eyes front!

I was never your city,
Just a stretch of your road.

 

Part of the Politics Is the Mind-Killer subsequence of How To Actually Change Your Mind

Next post: "Politics and Awful Art"

Previous post: "Rationality and the English Language"

36 comments

Comments sorted by oldest first, as this post is from before comment nesting was available (around 2009-02-27).

comment by RobinHanson · 2007-12-18T20:15:57.000Z · LW(p) · GW(p)

Are people biased on average to follow someone else, rather than to make their own path? It is not obvious to me. Yes, many great failings have come from groups dedicatedly following a leader. But surely many other failings have come from groups not dedicatedly following a leader.

comment by Nick_Tarleton · 2007-12-18T20:30:26.000Z · LW(p) · GW(p)

These recent posts will be very useful to point to next time someone accuses Singularitarians of being a cult.

comment by Zubon · 2007-12-18T20:57:06.000Z · LW(p) · GW(p)

Or, Nick, a great source of irony for those people. "For a site called 'Overcoming Bias'..."

comment by ben_mathes · 2007-12-18T21:10:04.000Z · LW(p) · GW(p)

I suspect that people tend towards following rather than leading, much in the way that wolf packs have leaders and followers.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-12-18T21:15:52.000Z · LW(p) · GW(p)

Robin, this verse is about following someone else - just following with intent to overtake, rather than following with intent to worship. When is it ever appropriate to do the latter?

The Litany describes a dilemma that should only appear in arts, not sciences. In a true science with many contributors, you wouldn't follow any single hero, unless you thought some portion of their work had been left undone.

Nick, that is not their purpose.

comment by Nick_Tarleton · 2007-12-18T21:22:59.000Z · LW(p) · GW(p)

I know, but they can serve that purpose - how many actual cult leaders write about how to avoid becoming a cult?

comment by Chris · 2007-12-18T22:00:32.000Z · LW(p) · GW(p)

What about the Guru who wrote 'Why work towards the Singularity'? It is a text with a distinctly Messianic feel. Or, to be more generous, a Promethean feel. While it is true that Hom Sap has a nasty itch to create anything that can be created, regardless, there's no need for such pseudo-valuations as the following:

"If there's a Singularity effort that has a strong vision of this future and supports projects that explicitly focus on transhuman technologies such as brain-computer interfaces and self-improving Artificial Intelligence, then humanity may succeed in making the transition to this future a few years earlier, saving millions of people who would have otherwise died. Around the world, the planetary death rate is around fifty-five million people per year (UN statistics) - 150,000 lives per day, 6,000 lives per hour. These deaths are not just premature but perhaps actually unnecessary. At the very least, the amount of lost lifespan is far more than modern statistics would suggest."

Who says that continuing the lives of us dull old farts, to the inevitable detriment of the unborn, has any positive value? I'd say that's monstrous. The transhuman AI may be an unavoidable consequence of our Luciferian inclination to meddle. That doesn't mean it's a cause. Any chance of it becoming a cult?

comment by CarlShulman · 2007-12-18T22:26:50.000Z · LW(p) · GW(p)

"Envy me! Aim at me! Rival me! Transcend me!"

Eliezer,

Can you name 3 people who have transcended you in particular areas of rationality, and those areas? How about Spearman's g? Capacity/willpower for altruistic self-sacrifice? Conscientiousness? Tendency not to be overconfident about disastrous philosophical errors? Philosophical creativity? Mathematical creativity? Same questions with respect to 'rivaled.'

Also, your use of poetry and talk of the 'Way of Rationality' seems to be counter-signaling.

Nick,

Plenty of religious and political organizations accuse outsiders and heretics of various kinds of bias and irrationality, and 'apply' the same criteria to themselves. The problem is that they do so in a biased fashion.

Replies from: Swimmer963
comment by Swimmer963 (Miranda Dixon-Luinenburg) (Swimmer963) · 2011-03-05T18:54:58.543Z · LW(p) · GW(p)

"Also, your use of poetry and talk of the 'Way of Rationality' seems to be counter-signaling."

In what sense is this counter-signaling?

comment by Nick_Tarleton · 2007-12-18T23:29:10.000Z · LW(p) · GW(p)

I realize it's not likely to convince someone who's already committed to seeing Singularitarianism as a cult, but it might help someone who's relatively unfamiliar with the territory and getting a slight cultish feel.

comment by Roko · 2007-12-19T00:13:52.000Z · LW(p) · GW(p)

@Chris: "Who says that continuing the lives of us dull old farts, to the inevitable detriment of the unborn, has any positive value ?"

I hear this argument against life extension and transhuman technologies over and over, and I think it is the absolute height of hypocrisy. Why? Well, if you care so much about the unborn ( = potential people, of whom there are infinitely many), then why aren't you eagerly campaigning for the immediate colonization of the solar system, followed by the galaxy? Remember, there are always more potential people left to be realized, and the best way of realizing them is by continually increasing the rate at which new people come into existence.

Surprisingly enough, the creation of a safe and powerful AI is probably the most effective way of accomplishing this increase in new-people-creation that will benefit the unborn. Chris, if you're really interested in the rights of potential persons [as I am], you should wholeheartedly support and work towards positive, safe technological acceleration.

comment by Tiiba2 · 2007-12-19T01:21:49.000Z · LW(p) · GW(p)

I don't know about you guys, but if there was only one country in the entire universe, I'd rather it be Monaco than Congo.

comment by Caledonian2 · 2007-12-19T02:43:26.000Z · LW(p) · GW(p)

"Why? Well, if you care so much about the unborn ( = potential people, of whom there are infinitely many), then why aren't you eagerly campaigning for the immediate colonization of the solar system, followed by the galaxy? Remember, there are always more potential people left to be realized, and the best way of realizing them is by continually increasing the rate at which new people come into existence."

'Unborn' does not equal 'potential people'. I see little point in trying to exhaustively explore the space of potential people. But given that there will be people coming after us, I fail to see the purpose in extending the lives of this generation at their expense.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-12-19T03:21:23.000Z · LW(p) · GW(p)

"Can you name 3 people who have transcended you in particular areas of rationality, and those areas? How about Spearman's g? Capacity/willpower for altruistic self-sacrifice? Conscientiousness? Tendency not to be overconfident about disastrous philosophical errors? Philosophical creativity? Mathematical creativity? Same questions with respect to 'rivaled.'"

Daniel Kahneman undoubtedly knows more about heuristics and biases than I do; E. T. Jaynes was superior in manipulating and applying Bayesian calculus; Robyn Dawes has taught more students of rationality; Von Neumann was probably brighter than I am; Gandhi endured more for less; Edison put in longer hours; Epicurus seemed pretty skeptical; if Siddhārtha Gautama was a real person, he was one hell of an imaginative philosopher; Conway has probably created more math than I've learned.

comment by Goplat · 2007-12-19T05:30:09.000Z · LW(p) · GW(p)

Caledonian: If a longer lifespan is bad, surely a shorter one must be good? It would be a pretty unlikely coincidence if the current average of 67.2 years just happened to be morally optimal - and even if that coincidence were true now, it won't be for much longer.

So if you really believe what you're saying, then stop extending your own life; go kill yourself and knock that number down a notch. But there's the rub - you don't really believe it, and I'd bet that as soon as radical life extension comes on the market you'll go for it even while ridiculing others doing the same.

You just use irrational ideas as a way of sounding "cool", because there are all too many moronic humans who eat that crap up, thinking that anything must be right if it goes against the "establishment". If a poll was done, probably more than half of all people would claim to be non-conformists.

Obviously you're not going to find very many such idiots here, so who's your real audience? Do you show your posts to all your (nominally) progress-hating friends, gushing over how you dealt such a huge blow to The Man? Or are you such a sad, pathetic creature that you do all this purely to prove your own coolness to yourself?

comment by CarlShulman · 2007-12-19T06:34:32.000Z · LW(p) · GW(p)

Eliezer,

That wasn't a very strong signal of non-guru status. Six out of those nine people are dead (why choose the dead?) and can't condemn your ideas or compete for current authority with you, making for a less informative signal of non-guru status. You praise Kahneman for academic knowledge of heuristics and biases, but notably not for actually overcoming bias. Mentioning Dawes' total output of students, given his line of work and greater age, is very different from praising his ability to actually convey rationality.

A guru could say those things and still consistently claim to be the most generally intelligent and personally rational do-gooder currently living on the planet Earth, a view which is false for most. Are you ready to explicitly reject that proposition with respect to yourself? To say that people who do not agree with you on some important matters of fact and of value (e.g. relating to your work), and who might hinder your accumulation of supporters and resources, are your rivals or superiors in general rationality? To specify significant ways in which you have been persistently (and harmfully on balance) more biased than interlocutors concerned with rationality like Nick or Robin?

You could easily address such questions in a much more informative fashion than in the list above.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-12-19T06:48:59.000Z · LW(p) · GW(p)

Carl, I'm not trying to signal non-guru status. If I was trying to signal non-guru status, I wouldn't write verse! But a verse to remember and repeat as a mantra might be useful to someone trying to resist the slide into cultishness ("I was never your city" keeps going through my own mind). I have little compunction about "looking like a guru" if it conveys information nicely. So long as I'm not actually a guru.

Regarding the rest of your question, I acknowledge no superior in my own specialty, and would be expected to have many superiors anywhere outside my own specialty.

comment by CarlShulman · 2007-12-19T07:43:23.000Z · LW(p) · GW(p)

"I have little compunction about "looking like a guru" if it conveys information nicely. So long as I'm not actually a guru." You're also in good company on verse with the MIT AI koans:

"A novice was trying to fix a broken Lisp machine by turning the power off and on. Knight, seeing what the student was doing, spoke sternly: "You cannot fix a machine by just power-cycling it with no understanding of what is going wrong." Knight turned the machine off and on. The machine worked." http://en.wikipedia.org/wiki/Hacker_koan

comment by Ben_Jones · 2007-12-19T10:43:44.000Z · LW(p) · GW(p)

"So long as I'm not actually a guru."

Why not? The word guru doesn't necessarily have negative connotations for me. This isn't a gripe over lexical definitions. I think most of the nine people listed could have been described as a guru - the last certainly was. They had devoted followers and imparted their knowledge to them. A guru does not a cult create - that honour's reserved for those who aren't comfortable with the prospect of being usurped or overtaken.

I have limitless admiration for someone who can teach all they know, and look on with nothing but pride as their protégés go on to surpass their achievements. I know I'd have trouble with that.

comment by Caledonian2 · 2007-12-19T14:50:13.000Z · LW(p) · GW(p)

"But a verse to remember and repeat as a mantra might be useful to someone trying to resist the slide into cultishness"

I suspect that ritualistic behavior is unlikely to aid in resisting the slide into cultishness. Quite the opposite.

Perhaps you should study the people who are exposed to teachings that seem to favor cultishness, find value in the teachings, but do not enter the cult.

comment by LG · 2007-12-19T15:04:34.000Z · LW(p) · GW(p)

Maybe I'll take up the mantle of adversary, Eli, when the circumstances are right. You are far ahead, but I think I can catch up. Who else will learn and overtake, instead of idly chatting?

comment by HighlyAmused · 2007-12-19T18:01:03.000Z · LW(p) · GW(p)

"Von Neumann was probably brighter than I am;"

That one made me chuckle ... Good for you Eliezer! I do enjoy your posts. But that comment cracked me up. So I can only presume it was in jest, of course. It would be a somewhat ironic attempt at modesty to compare yourself to one of the greatest minds in history.

comment by JulianMorrison · 2007-12-19T18:26:49.000Z · LW(p) · GW(p)

Caledonian: ritual behavior is only cultish if the ritual reinforces non-thought. Given that humans are so obviously born hungry for ritual, I'd be inclined to think rational/scientific culture is making far too little use of it, and more would be better. If anything, starving yourself of ritual will make you easy prey for cults.

Eliezer: your PDF in the previous post changed the way I think about AI as a concept, stripping off much anthropomorphism. So to that extent you do get to be a guru to me, at least until I get good enough to make advances of my own ;-P

comment by Chris · 2007-12-19T19:04:32.000Z · LW(p) · GW(p)

Couldn't resist adding a complaint about the abuse of the term 'guru' as a term of... abuse. It represents, in fact, an exponent of a perfectly respectable form of expertise transmission in non-rational domains. Drift into abuse of authority by such an exponent is perhaps more likely because the method relies on authority rather than argument, but that doesn't mean that the concept is invalid, or indeed that there is any other method possible in those domains.

comment by Chris · 2007-12-19T21:24:35.000Z · LW(p) · GW(p)

Goplat, can't answer for Caledonian, but as I'm pretty sad & pathetic myself, I'll take a stab. The unborn represent variety and potentiality. More of the same represents sterility. Sure, I'd like to live 500 productive & happy years, but am in my better moments conscious that with present biotechnology this is unlikely. With SIAI-improved biotechnology, who knows? However, my totally uninformed intuition is that however super-productive & long-lived the ultra-new curly-wurly chromosomes that my friendly neighbourhood SIAI will give me are, they would do better (in accordance with their interest) endowing them on the young of the species.

Your argument that we now are happy living 80 years where our ancestors were lucky to make 40 is pertinent, but adding years after 40 still doesn't increase the productive lifespan of a mathematician. Jesus died at 30 (or was it 33?). Mother Teresa was doing productive caring work into advanced old age. So perhaps youth = creativity, age = caring. A 'self-improving' AI would surely privilege the first option. For better or for worse.

Personally I'm for balance, and am all for the increase of life expectancy at a rate which is compatible with human capacity to adapt. I wrote a piece on the impossibility of a 'Friendly' SIAI which I may inflict on the world someday.

comment by Chris · 2007-12-19T21:26:48.000Z · LW(p) · GW(p)

Just had a response to Goplat rejected as spam. Wonder what the biases built into the new antispam filter are?

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-12-19T22:09:49.000Z · LW(p) · GW(p)

"Von Neumann was probably brighter than I am;"

"That one made me chuckle ... Good for you Eliezer! I do enjoy your posts. But that comment cracked me up. So I can only presume it was in jest, of course. It would be a somewhat ironic attempt at modesty to compare yourself to one of the greatest minds in history."

Carl asked for someone with superior Spearman's g, which is more widely known as g-factor. Not "least upper bound", just "superior".

Spearman's g is tricky. It's easy for me to see that Jaynes is better at Bayesian calculus than I am, but that doesn't mean I can infer that Jaynes was doing it through superior g-factor (nor that he wasn't).

Traditional IQ tests sensibly and reliably measure a range of around 60-140. Richard Feynman's measured IQ was 137, but you have to translate that as "outside the range of the IQ test", not "80 IQ points dumber than Marilyn vos Savant".

There have been attempts to devise measures of "genius IQ" but I'm not impressed with any of their validation measures, and in any case, I haven't taken any.

Von Neumann was famous as a genius who scared other geniuses. I still added the qualifier "probably" because I don't actually know that von Neumann did his stuff via g-factor per se, rather than, say, by working so hard that he scared other hardworking mathematicians. It does seem likely that von Neumann had one of the highest Spearman's-g of the 20th century, but it's not certain. Anyone above a certain range tends to specialize in modes of cognition, and they do what they do by choosing tasks that fit their peculiar genius, not necessarily by being generally "better" than other geniuses in any directly comparable sense. Was Einstein smarter than Newton? I don't know; they applied different kinds of genius. So I picked von Neumann as the archetype - his genius wasn't necessarily the most effectively applied of the twentieth century, but he comes to mind as having a damned high g-factor.

If you just say "smart", or something like that, then you're really asking after a sort of generalized status ranking, in which case merely to compare oneself to von Neumann would be an act of great social audacity. Perhaps this is what made you laugh? But Carl didn't ask about life accomplishment or social status, he asked about Spearman's g, which is a very specific request about a characteristic that's very hard to infer above the IQ 140 range.

Replies from: DanielLC
comment by DanielLC · 2011-11-28T07:20:34.716Z · LW(p) · GW(p)

Are you talking to yourself, or is there something wrong with the name on this post?

Replies from: Zack_M_Davis
comment by Zack_M_Davis · 2011-11-28T07:46:36.898Z · LW(p) · GW(p)

One presumes that the second paragraph ought to be italicized, to indicate that it is a quotation of HighlyAmused's earlier comment. Interestingly, the Internet Archive's record of this thread as it appeared on its old home at Overcoming Bias does display the italics correctly; it is certainly odd that the move to Less Wrong should result in such an idiosyncratic error.

comment by Caledonian2 · 2007-12-20T00:13:34.000Z · LW(p) · GW(p)

"Traditional IQ tests sensibly and reliably measure a range of around 60-140. Richard Feynman's measured IQ was 137, but you have to translate that as 'outside the range of the IQ test'"

No. It is far more likely that the qualities that made Feynman a genius were not those that were measured by IQ tests.

It's not a matter of his intellect being outside of a range. Intellect has a dimensionality far greater than IQ tests measure, period.

comment by JulianMorrison · 2007-12-20T00:37:52.000Z · LW(p) · GW(p)

I have a suspicion that very high IQ is like comparing cheetahs to dogs. The dog isn't worse, he's just less of a specialist. High IQ means using the same wetware differently. More computation effort is devoted to a particular range of tasks. When you get into the ultra-genius range, you are actually starting to chip away at features used by the rest of the system. A narrow focus on one mode of cognition is unavoidable.

comment by anonymous20 · 2007-12-20T03:30:52.000Z · LW(p) · GW(p)

Eliezer or Carl:

Is reading "GENERAL INTELLIGENCE," OBJECTIVELY DETERMINED AND MEASURED (http://psychclassics.yorku.ca/Spearman/) the best way to understand what you mean by "Spearman's g?"

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-12-20T03:54:29.000Z · LW(p) · GW(p)

Anon, Spearman's original paper is pretty old. Try Wikipedia, or search Gene Expression, or this page seems to have a lot of resources. Jensen had a nice intro paper at Psycoloquy, but Psycoloquy seems to be down at the moment.

comment by Peter_de_Blanc · 2007-12-20T05:43:41.000Z · LW(p) · GW(p)

Chris:

I'd agree that mathematicians probably peak somewhere around 40. I can see two things contributing to the subsequent decline: mental degeneration, and an increasingly irrelevant skill set; if your specialty is narrow enough, you may have solved all the easy problems in your microfield. I expect life extension technologies to fix the former problem. For the latter problem, Feynman recommends changing fields every 7 years. Of course, he's Feynman; maybe 10 years is better for us ordinary folks.

comment by Colombi · 2014-02-20T05:26:30.473Z · LW(p) · GW(p)

Is there a hidden meaning to this? I only grasp the exterior feel of this shiny poem.

comment by SeanMCoincon · 2014-07-31T22:22:52.205Z · LW(p) · GW(p)

"...Although, do please make the check out to 'Cash'."