To Lead, You Must Stand Up

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-12-29T06:38:47.000Z · LW · GW · Legacy · 31 comments

Followup to: Lonely Dissent

True story:  In July, I attended a certain Silicon Valley event.  I was not an organizer, or a speaker, or in any other wise involved on an official level; just an attendee.  It was an evening event, and after the main presentations were done, much of the audience hung around talking... and talking... and talking...  Finally the event organizer began dimming the lights and turning them back up again.  And the crowd still stayed; no one left.  So the organizer dimmed the lights and turned them up some more.  And lo, the people continued talking.

I walked over to the event organizer, standing by the light switches, and said, "Are you hinting for people to leave?"  And he said, "Yes.  In fact [the host company] says we've got to get out of here now - the building needs to close down."

I nodded.

I walked over to the exit.

I shouted, "LISTEN UP, EVERYONE!  WE'VE GOT TO GO!  OUR TIME HERE HAS PASSED!  YOU CAN TALK OUTSIDE IF YOU LIKE!  NOW FOLLOW ME... TO FREEDOM!"

I turned.

I marched out the door.

And everyone followed.

I expect there were at least two or three CEOs in that Silicon Valley crowd.  It didn't lack for potential leaders.  Why was it left to me to lead the CEOs to freedom?

Well, what was in it for them to perform that service to the group?  It wasn't their problem.  I'm in the habit of doing work I see being left undone; but this doesn't appear to be a common habit.

So why didn't some aspiring would-be future-CEO take the opportunity to distinguish themselves by acting the part of the leader?  I bet at least five people in that Silicon Valley crowd had recently read a business book on leadership...

But it's terribly embarrassing to stand up in front of a crowd.  What if the crowd hadn't followed me?  What if I'd turned and marched out the door, and been left looking like a complete fool?  Oh nos!  Oh horrors!

While I have sometimes pretended to wisdom, I have never pretended to solemnity.  I wasn't worried about looking silly, because heck, I am silly.  It runs in the Yudkowsky family.  There is a difference between being serious and being solemn.

As for daring to stand out in the crowd, to have everyone staring at me - that was a feature of grade school.  The first time I gave a presentation - the first time I ever climbed onto a stage in front of a couple of hundred people to talk about the Singularity - I briefly thought to myself:  "I bet most people would be experiencing 'stage fright' about now.  But that wouldn't be helpful, so I'm not going to go there."

I expect that a majority of my readers like to think of themselves as having strong leadership qualities.  Well, maybe you do, and maybe you don't.  But you'll never get a chance to express those leadership qualities if you're too embarrassed to call attention to yourself, to stand up in front of the crowd and have all eyes turn to you.  To lead the pack, you must be willing to leave the pack.

31 comments

Comments sorted by oldest first, as this post is from before comment nesting was available (around 2009-02-27).

comment by Nominull2 · 2007-12-29T06:48:40.000Z · LW(p) · GW(p)

Hey everybody, let's post our self-perceptions regarding our leadership abilities so that Eliezer can get some feedback as to his expectations!

As for me, I couldn't lead a starving man to a buffet.

comment by Carl_Shulman · 2007-12-29T07:20:22.000Z · LW(p) · GW(p)

People not performing an altruistic service to the organizers in clearing the building isn't very surprising. I find deciding on which restaurant/venue/entertainment a group should go to much more interesting. Often everyone in a group will repeatedly defer, until deadweight costs in time eat away a hefty chunk of the benefit of the excursion. That could be a real public good problem, or just a market failure to capture a private good.

comment by Toli_G. · 2007-12-29T07:45:45.000Z · LW(p) · GW(p)

Whoever has the strongest frame, whoever has the most certainty of their reality, wins. If your voice had wavered, if you had looked back in doubt to see if the people were actually following you, they would have been less willing to comply. But because you assumed authority, they assumed you had it and felt pressure to conform. It's one of those beautiful and counterintuitive things in life, but not a surprising one.

comment by Vladimir_Nesov2 · 2007-12-29T11:58:47.000Z · LW(p) · GW(p)

Eliezer,

Your emphasis on leadership in this context seems strange: it was in no one's interest to leave, so the biased decision was to follow you, not the hesitation in choosing to lead others outside.

comment by Chris · 2007-12-29T13:37:36.000Z · LW(p) · GW(p)

By that time everyone knew it was time to leave, having seen the lights repeatedly dimmed, but they were comfortable in the hall, and as long as no individual could be blamed for the antisocial act of staying, they would stay. Nevertheless their discomfort level was rising. Your action precipitated the decision, just as seeding a supersaturated solution precipitates crystallisation. It's another example of an unstable group equilibrium just waiting to be disturbed, like the lonely dissenter in a group where the majority have private doubts. If the lights hadn't previously been repeatedly dimmed, the group might well not have followed you.

comment by Overcoming_Cryonics · 2007-12-29T17:10:49.000Z · LW(p) · GW(p)

Eliezer, enough with your nonsense about cryonicism, life-extensionism, trans-humanism, and the singularity. These things have nothing to do with overcoming bias. They are just your arbitrary beliefs.

comment by Carl_Shulman · 2007-12-29T17:29:51.000Z · LW(p) · GW(p)

OC,

How would you define 'arbitrary belief?' The phrase brings to mind a 'belief' that the next toss of two fair dice will come up snake eyes.

comment by Unknown · 2007-12-29T18:24:37.000Z · LW(p) · GW(p)

I think everybody realizes that many of Eliezer's posts have little to do with overcoming bias. But they're very interesting anyway: compare the number of comments on his posts with the comments on other posts. So it doesn't seem very fair to attack him for that reason.

comment by Chris · 2007-12-29T19:12:29.000Z · LW(p) · GW(p)

OC, in the immortal words of Paul Simon: "A Bayesian estimates the priors he wants to estimate / And disregards the rest... / Lih lih lih..."

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-12-29T20:00:49.000Z · LW(p) · GW(p)

Rationality for the sake of rationality ends up as an arbitrary morality. If you do not have something more important in your eyes than "being rational", then you cannot learn from your success or failure; you will simply keep whatever definition of "rationality" you started with, because it is more important to you to be "rational" than to succeed.

No one puts a desperate effort (isshoukenmei) into rationality unless more than their own life is at stake. Leaving the pack is scarier than risking your life; that is why there are more motorcycle riders than rationalists.

I therefore decline to entertain the notion that I should provide you with some kind of idealized pure rationality which does not mention the goals that drive me as a rationalist. A pure desire to be "rational" is not the source of the drive to throw away old conceptions of "rationality" and invent better ones.

comment by Vladimir_Nesov2 · 2007-12-29T20:24:12.000Z · LW(p) · GW(p)

OC: Eliezer, enough with your nonsense about cryonicism, life-extensionism, trans-humanism, and the singularity. These things have nothing to do with overcoming bias. They are just your arbitrary beliefs.

I guess it's the other way around: the point of most of the questions raised by Eliezer is to take a debiased look at controversial issues such as those you list, in order to hopefully build a solid case for sensible versions of them. For example, existing articles can point at fallacies in your assertions: you assume cryonics, etc. to be separate magisteria outside the domain of rationality and argue from the apparent absurdity of these issues.

comment by Overcoming_Cryonics · 2007-12-29T20:28:35.000Z · LW(p) · GW(p)

Carl Shulman, I will define an arbitrary belief as something presupposed on faith and not exposed to testing.

My neighbor believes scratching off a lottery ticket with his lucky coin will improve his odds of winning. My neighbor's daughter believes Santa Claus delivered her presents last week. And some people believe in cryonicism, life-extensionism, trans-humanism, and the singularity.

comment by Doug_S. · 2007-12-29T21:56:49.000Z · LW(p) · GW(p)

My neighbor's daughter believes Santa Claus delivered her presents last week.

Considering that everyone she knows will tell her that Santa Claus did indeed deliver those presents, that she can meet Santa Claus (or his representative) in person at a local mall, and that his travel is tracked in real time by the United States Department of Defense, I'd say your neighbor's daughter's belief is quite reasonable.

Your average 6-year-old is presented with far more evidence in favor of the Santa Claus hypothesis than the average adult is given in favor of the hypothesis that Jesus rose from the dead or that Lee Harvey Oswald acted alone...

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-12-29T22:02:01.000Z · LW(p) · GW(p)

If you pursue rationality as a means to some end, you will reject it if it ever comes into conflict with your deeper priority.

Never happens unless you get your concept of "rationality" wrong. How exactly does knowingly being irrational actually help you to build a space shuttle? Or a steam shovel? Or cure a disease? Only people with mistaken ideas about "rationality" will find themselves believing that "irrationality" will help.

In like sense, you are far better off wanting to know whether the sky is blue or green, than wanting to know The Truth About The Sky. "Truth" and "rationality" only work as top-level goals if you phrase them exactly correctly on the first try; otherwise you're a lot better off as a rationalist if your real goal is to build a steam shovel, because then you can notice what works and what doesn't.

This is why progress from the ancient Greeks was driven by empirical discovery rather than philosophy. Philosophy just compares the current conception of "rationality" to itself.

See Why truth?, Doublethink, and the nameless virtue.

comment by Overcoming_Cryonics · 2007-12-29T22:20:40.000Z · LW(p) · GW(p)

I just think some on this blog religiously accept cryonicism, life-extensionism, trans-humanism, and the singularity. If they truly overcame bias I suspect they would be less overconfident in these beliefs.

Eliezer writes, "I therefore decline to entertain the notion that I should provide you with some kind of idealized pure rationality which does not mention the goals that drive me as a rationalist." Translation: Bias is allowed but only in service of Eliezer's cryonicism/life-extensionism/trans-humanism/singularity religion.

comment by Caledonian2 · 2007-12-29T22:30:15.000Z · LW(p) · GW(p)
How exactly does knowingly being irrational actually help you to build a space shuttle?

If being rational requires behaviors and choices incompatible with constructing the space shuttle, irrationality is the only way to reach the goal of building it.

You can see this all the time in politics. If logical arguments help support the goals that people are trying to forward, they'll use them. If logic works against them, they will ignore it - and try their hardest to distract everyone else from it.

Tell us: what was the rational purpose of the space shuttle in the first place?

"Truth" and "rationality" only work as top-level goals

Bottom-level. The bottom goal is the support for everything else. You can take off the top without disturbing anything beneath it, but changing anything lower-down affects everything above it.

'Rationality' isn't a fundamental goal. It's a means toward an end: the end, the only one that actually matters. Only as you move up through the goaltree does rationality become the goal we strive for.

In most cases, people only want as little truth as is necessary to accomplish their goals. Their actual goals, not what they claim to want or even what they think they want.

Eliezer will probably just pretend that all of those things [cryo etc.] are justified without ever suggesting that it could even be otherwise. Alternate strategies involve claiming that anyone who doesn't accept that believing those things is justified is irrational, putting forward logically invalid arguments that 'prove' belief in those things is justified, and trying to claim their validity as axiomatic.

[EY: I've compacted five Caledonian comments into one comment. See OB policy on commenting frequency. I'm letting C get away with his sixth comment below, but if he keeps it up I'll start deleting outright. If you want to post more, C, then wait.]

Ah, I see posts are now being edited without being marked to show that they have changed. Splendid!

comment by Vladimir_Nesov2 · 2007-12-29T22:44:34.000Z · LW(p) · GW(p)

Caledonian, I think you are confusing goals with truths. If truth is that the goal consists in certain things, rationality doesn't oppose it in any way. It is merely a tool that optimizes performance, not an arbitrary moral constraint.

comment by Overcoming_Cryonics · 2007-12-29T22:48:19.000Z · LW(p) · GW(p)

Eliezer writes, "I therefore decline to entertain the notion that I should provide you with some kind of idealized pure rationality which does not mention the goals that drive me as a rationalist."

Translation: Bias is allowed but only in service of Eliezer's cryonicism/life-extensionism/trans-humanism/singularity religion.

comment by Caledonian2 · 2007-12-29T23:35:35.000Z · LW(p) · GW(p)
If truth is that the goal consists in certain things, rationality doesn't oppose it in any way.

I'm going to presume that was meant to be "consists of", because I can't make the sentence parse as-is.

If you want to actually accomplish a goal, you must confirm that the "certain things" that the goal consists of are both possible and mutually compatible. If you don't check that the goal is really attainable, then you don't really hold it: claiming you're seeking that end is part of your goaltree.

A person who never bothers to analyze whether his goals are attainable, and instead repeatedly asserts that they are so, isn't actually trying to accomplish those things. Their assertion is just a means towards some unknown, actual goal or goals.

comment by Tom_McCabe2 · 2007-12-29T23:48:12.000Z · LW(p) · GW(p)

"Only people with mistaken ideas about "rationality" will find themselves believing that "irrationality" will help."

Conjecture: Take two arbitrary, identical optimization processes. Pick some verifiable piece of information A. Give the first process the belief "A", and the second process the belief "~A". The average end utility of the first process will always be equal to or greater than the average end utility of the second process.

Attempted disproof: Suppose that both processes were programmed by naive human rationalists, who give the processes the supergoal of correcting mistaken beliefs. Also suppose that the universe is finite, and has a finite complexity. Both processes will improve their models of the universe, in accordance with the laws of rationality, until they eventually reach an asymptote due to the finite amount of modelable information. Because the utility is determined by the amount of new information that had to be added to the model, the second process will have a higher end utility, as it had to add the information "A" to correct the false information "~A".
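
A toy simulation can make both the conjecture and the attempted disproof concrete. The sketch below is purely illustrative and is not taken from the comment: the noisy-evidence update rule, the per-round reward, and the "corrections made" utility are all invented assumptions. With utility defined over the task, the agent initialized with the true belief A does at least as well on average; with utility defined as the amount of correction performed, the agent initialized with ~A scores higher, which is exactly the loophole the attempted disproof exploits.

```python
# Purely illustrative sketch (assumptions, not the commenter's model):
# two identical agents update a belief about a binary fact A from noisy
# evidence; one starts out believing A (true), the other ~A (false).
import random

random.seed(0)

TRUE_A = True        # the verifiable piece of information "A"
STEPS = 20           # number of observation/action rounds
ACCURACY = 0.8       # each observation reports A correctly 80% of the time


def run_agent(initial_belief: bool) -> dict:
    """Act on the current belief each round, then update on a noisy observation."""
    belief = initial_belief
    reward = 0.0
    corrections = 0
    for _ in range(STEPS):
        if belief == TRUE_A:          # task reward: 1 per round spent acting on the truth
            reward += 1.0
        observed = TRUE_A if random.random() < ACCURACY else (not TRUE_A)
        if observed != belief:        # naive update: adopt the latest observation
            belief = observed
            corrections += 1
    return {"task": reward, "corrections": float(corrections)}


def average(initial_belief: bool, key: str, trials: int = 5000) -> float:
    return sum(run_agent(initial_belief)[key] for _ in range(trials)) / trials


if __name__ == "__main__":
    # Conjecture: with utility defined over the task, the true-belief start
    # is at least as good on average.
    print("task utility, started with A :", average(True, "task"))
    print("task utility, started with ~A:", average(False, "task"))
    # Attempted disproof: reward "information added to correct false beliefs"
    # (corrections made) and the false-belief start comes out ahead.
    print("corrections, started with A  :", average(True, "corrections"))
    print("corrections, started with ~A :", average(False, "corrections"))
```

Under these assumptions the task-utility averages favor (or at worst equal) the agent that started with the truth, while the correction count is higher for the agent that started out wrong, matching the structure of the attempted disproof.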

comment by Carl_Shulman · 2007-12-30T00:23:31.000Z · LW(p) · GW(p)

Caledonian,

Why not accompany your assertions of irrationality and unexamined views with some substantive supporting arguments and create your own anonymous blog?

comment by Nominull2 · 2007-12-30T00:23:50.000Z · LW(p) · GW(p)

Nobody followed my lead. I guess that means my belief about my leadership abilities is well-founded!

comment by Overcoming_Cryonics · 2007-12-30T00:33:27.000Z · LW(p) · GW(p)

Eliezer Yudkowsky, if you're going to enforce the comments policy then you should also self-enforce the overcoming bias posting policy instead of using posts to blithely proselytize your cryonicism/life-extensionism/trans-humanism/singularity religion.

comment by GreedyAlgorithm · 2007-12-30T01:13:40.000Z · LW(p) · GW(p)

Tom: What actually happens under your scenario is that the naive human rationalists frantically try to undo their work when they realize that the optimization processes keep reprogramming themselves to adopt the mistaken beliefs that are easiest to correct. :D

comment by Caledonian2 · 2007-12-30T01:39:35.000Z · LW(p) · GW(p)
"Only people with mistaken ideas about "rationality" will find themselves believing that "irrationality" will help."

Because of computational restrictions, humans can easily maintain two mutually-contradictory beliefs, as long as those beliefs are never allowed to interact with each other or their necessary consequences.

Imagine a person who had the goal of avoiding the necessity of recognizing that two of their beliefs were incompatible. This goal cannot be explicitly stated, because if the person recognizes that they're avoiding the realization, they'd automatically have failed. They MUST behave irrationally in order to meet their goal. The sub-module of their mind directly connected with this goal is behaving rationally in drawing the rest of the mind into delusion and madness, because that's the only way the goal can be met, but the whole mind can only be described as acting contrary to reason.

I can recognize this truth, that the goal in question can only be reached through irrational thinking. It's an insane, irrational goal, and only an irrational person would desire it.

Only a person with mistaken ideas about 'rationality' would make the claim you just did. Rationality is not compatible with all possible goals, and irrationality is a necessary precondition for some goals to be met. Your denial of this reality is... irrational.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2007-12-30T01:51:41.000Z · LW(p) · GW(p)

Eliezer Yudkowsky, if you're going to enforce the comments policy then you should also self-enforce the overcoming bias posting policy instead of using posts to blithely proselytize your cryonicism/life-extensionism/trans-humanism/singularity religion.

OC, I'm not clear on what stated Overcoming Bias policy you could possibly be referring to. However, I will schedule a post for tomorrow (Sunday) on this topic, which, I confess, seems entirely unrelated to the subject of this particular post.

comment by CronoDAS · 2011-03-05T03:37:24.650Z · LW(p) · GW(p)

Maybe they didn't lead people out because they were too busy talking? ;)

comment by TraderJoe · 2012-04-26T10:56:41.509Z · LW(p) · GW(p)

Each individual conference guest had the opportunity to do something which was For The Greater Good [lead people out], with some personal risk [drawing attention to himself, embarrassment in the unlikely event that nobody follows him] and very little to personally gain [the chance to not be in a building he probably preferred to be in].

A better question is "why didn't the organiser do what you did?" And my guess is that he found the embarrassment factor high, high enough to override the gain [he was one of the few people who did stand to gain by doing what you did].

In which case, everyone acted perfectly rationally, assuming you treat aversion to embarrassment as rational. You're to be saluted for not being especially averse to embarrassment, but I don't think this is a question of rationality or of leadership.

Side point: +2 awesomeness for "FOLLOW ME... TO FREEDOM!"

comment by Odinn · 2013-03-06T10:06:30.604Z · LW(p) · GW(p)

Reading the comments, I saw it somehow turned into a discussion on whether or not Eliezer Yudkowsky elects biased favor for cryonics, transhumanism, etc. Didn't read far enough to see anyone hurl any accusations of Nazism or Hitler-likeness, but I'll weigh in and say that I'm new to LessWrong and enjoy a good amount of Eliezer's articles and find them to be good tools for learning clear thought, and I also have almost no familiarity with any of his theories (or opinions as it may be) that fall outside the scope of heuristics, fallacies, statistics or decision. So far I've only managed to read a smattering of Bayesian statistics and Feynman (still struggling with both), but I would consider the whole thing a wasted effort if I elected any human to a level beyond question. If I read Eliezer's articles on the Affect Heuristic and think "I'll just accept this as true because Mr Yudkowsky says it's true. Phew! Thank goodness someone smarter than me did all that heavy thinking for me" then CLEARLY I need to reread it.

comment by Ahmed_Amine_Ramdani · 2013-05-31T00:30:25.590Z · LW(p) · GW(p)

EY, would you believe that my social life has greatly benefited from my pulling exactly this kind of stunt on a regular basis, and that HPJEV's example was a huge influence towards that?

Also, from now on, I'll be posting under my own name on LW.

Counterpoint to your post: the nail that sticks out gets hammered down. Leading and standing out means painting oneself as a target, so one needs to make sure that one's followers have one's back when it really counts. Even then, it might not be enough.

comment by momom2 (amaury-lorin) · 2021-07-10T10:58:45.299Z · LW(p) · GW(p)

It's regrettable that so little of the conversation in the comments was about the post itself, because (1) I have found such discussions under other posts to be very insightful, and (2) I was disappointed that so few comments were useful.

As someone else mentioned, friends arguing about where to have dinner often eats up significant amounts of time. Based on my personal experience, people will often feel grateful if you take responsibility, because they care less about the meal than about not infringing on others' meal preferences. Thankfully, if they are so respectful that they will do that, they will also probably be open to fixing the issue.

More generally, to supplement Yudkowsky's argument that people should be more willing to stand up, here are a couple of ways to help you do it:
- Make a quick estimate of expected utility: little to no ill consequences if you fail, and a big payoff if you succeed.
- The payoff is not only in actually achieving something or in acquiring social status, but also in knowing your reasoning was stronger than your instinct.
- It will benefit other people greatly. Do it for them!
- It will make it easier to take this kind of decision in the future.