Yudkowsky's brain is the pinnacle of evolution

post by Yudkowsky_is_awesome · 2015-08-24T20:56:02.328Z · score: -28 (107 votes) · LW · GW · Legacy · 37 comments

Here's a simple problem: there is a runaway trolley barreling down the railway tracks. Ahead, on the tracks, there are 3^^^3 people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person, Eliezer Yudkowsky, on the side track. You have two options: (1) Do nothing, and the trolley kills the 3^^^3 people on the main track. (2) Pull the lever, diverting the trolley onto the side track where it will kill Yudkowsky. Which is the correct choice?

The answer:

Imagine two ant philosophers talking to each other. "Imagine," they said, "some being with such intense consciousness, intellect, and emotion that it would be morally better to destroy an entire ant colony than to let that being suffer so much as a sprained ankle."

Humans are such a being. I would rather see an entire ant colony destroyed than have a human suffer so much as a sprained ankle. And this isn't just human chauvinism either - I can support my feelings on this issue by pointing out how much stronger feelings, preferences, and experiences humans have than ants do.

How does this relate to the trolley problem? There exists a creature as far beyond us ordinary humans as we are beyond ants, and I think we would all agree that its preferences are vastly more important than those of humans.

Yudkowsky will save the world, not just because he's the one who happens to be making the effort, but because he's the only one who can make the effort.

The world was on its way to doom until September 11, 1979, a date that will later be declared a national holiday and will replace Christmas as the biggest holiday. This was, of course, the day when the most important being that has ever existed or will ever exist was born.

Yudkowsky did for the field of AI risk what Newton did for the field of physics. There was literally no research on AI risk done on the scale of what Yudkowsky has done in the 2000s. The same can be said of the field of ethics: ethics was an open problem in philosophy for thousands of years, but Plato, Aristotle, and Kant don't really compare to the wisest person who has ever existed. Yudkowsky has come closer to solving ethics than anyone before him. Yudkowsky is what turned our world away from certain extinction and towards utopia.

We all know that Yudkowsky has an IQ so high that it's unmeasurable, so basically something higher than 200. After Yudkowsky receives the Nobel Prize in Literature on the strength of his Hugo Award recognition, a special council will be organized to study his intellect, and we will finally know how many orders of magnitude higher Yudkowsky's IQ is than that of the most intelligent people in history.

Unless Yudkowsky's brain FOOMs before then, MIRI will eventually build an FAI with the help of Yudkowsky's extraordinary intelligence. When that FAI uses the coherent extrapolated volition of humanity to decide what to do, it will eventually conclude that the best thing to do is to tile the whole universe with copies of Eliezer Yudkowsky's brain. In fact, in the process of computing this CEV, even Yudkowsky's harshest critics will reach such an understanding of Yudkowsky's extraordinary nature that they will beg and cry for the tiling to start as soon as possible, and there will be mass suicides as people rush to give away their resources and the atoms of their bodies for Yudkowsky's brains. As we all know, Yudkowsky is an incredibly humble man, so he will be the last person to protest this course of events, but even he, with his vast intellect, will understand and accept that it is truly the best thing to do.


Comments sorted by top scores.

comment by Alicorn · 2015-08-24T21:33:44.437Z · score: 21 (24 votes) · LW · GW

My attempts at gauging sincerity are thrown off by the fact that you can spell his name correctly.

comment by David_Bolin · 2015-08-24T21:37:00.693Z · score: 5 (4 votes) · LW · GW

The post would have to be toned down quite a bit in order to appear to be possibly sincere.

comment by Alicorn · 2015-08-24T21:40:47.995Z · score: 10 (9 votes) · LW · GW

I'm just used to the detractors misspelling or abbreviating "Yudkowsky", so this was jarring.

comment by Houshalter · 2015-08-24T21:41:04.576Z · score: 5 (4 votes) · LW · GW

I don't think that comment was sincere.

comment by Dorikka · 2015-08-25T04:33:55.495Z · score: 4 (3 votes) · LW · GW

shrug The pdf for sincerity looks bimodal to me.

comment by RichardKennaway · 2015-08-24T22:47:54.621Z · score: 3 (2 votes) · LW · GW

It is certainly not intended seriously, but it is also certainly not intended as friendly joking.

comment by Good_Burning_Plastic · 2015-08-26T16:01:02.492Z · score: 3 (4 votes) · LW · GW

I took it to be in the spirit of Eliezer Yudkowsky Facts.

comment by RichardKennaway · 2015-08-26T20:52:57.210Z · score: 5 (4 votes) · LW · GW

That is exactly the thing I took it to not be in the spirit of.

comment by philh · 2015-08-24T23:04:32.322Z · score: 11 (14 votes) · LW · GW

Obviously you should pull the lever. Eliezer Yudkowsky knows better than to go wandering around on train tracks. You're probably imagining Him.

comment by NancyLebovitz · 2015-08-25T14:21:26.487Z · score: 8 (9 votes) · LW · GW

For what it's worth, I thought it was funny if snarky, and a pretty competent parody. It's not as funny as Alicorn's comment about spelling Yudkowsky's name correctly, though.

comment by gjm · 2015-08-25T16:19:10.028Z · score: 6 (9 votes) · LW · GW

I found it unfunny and unpleasant because it's (1) entirely devoid of subtlety, (2) mean-spirited (the underlying message is something between "Yudkowsky is breathtakingly arrogant" and "LW people are gullible hero-worshipping fools", right?), (3) unnecessary because so far as I can see the sort of hero-worship this is mocking is nonexistent on LW and Eliezer, while doubtless arrogant, isn't close to that arrogant, and (4) boring because there's nothing in it but the one-note point-and-laugh parodying.

(I suppose I should qualify #4 a bit. The framing in terms of a trolley problem with 3^^^3 people on one side of it is very slightly amusing.)

comment by TAG · 2019-05-27T12:34:38.787Z · score: 1 (1 votes) · LW · GW

I can recall physicists being told they are wrong because they disagree with Yudkowsky... what's that if not hero worship?

comment by knb · 2015-08-25T01:19:06.454Z · score: 7 (6 votes) · LW · GW

I'm surprised to see this at 45% positive. I wonder if someone is mass-upvoting this, or if people are just upvoting it as a satire. If it is a concerted effort to mass-upvote, what is the point? To make Less Wrong seem crazy?

comment by gwern · 2015-08-25T15:01:19.097Z · score: 4 (5 votes) · LW · GW

I'm pretty sure it's vote manipulation. I downvoted both comments when I came across them in the comment feed, but by the time I saw this post, they had gone from -2 to +2. Gaining 4 net upvotes that fast is well beyond what a few LWers with a broken sense of satire could manage.

comment by welp · 2015-08-26T13:28:33.739Z · score: 3 (2 votes) · LW · GW

I suspect mass-upvoting. Look at the number of upvotes they've previously got for comments of empty praise.

comment by Viliam · 2015-08-25T08:58:22.296Z · score: 2 (1 votes) · LW · GW

"If it is a concerted effort to mass-upvote, what is the point?"

To demonstrate how easy it is to game the votes?

comment by polymathwannabe · 2015-08-25T15:11:36.761Z · score: 3 (4 votes) · LW · GW

What, we have our own Sad Puppies now?

comment by gjm · 2015-08-25T16:31:50.311Z · score: 5 (4 votes) · LW · GW

I don't think I believe the sockpuppet hypothesis for why this post and Y_i_a's comment on it have a bunch of upvotes.

  • Main post: -18, 38% => either +28/−46 or +29/−47.
  • Comment: -11, 37% => +16/−27.
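The reconstruction above can be sketched in code. This is a minimal illustration, assuming the interface shows a percentage rounded to the nearest integer; the function name and the rounding convention are my own guesses:

```python
def plausible_votes(net, pct, max_up=200):
    """Return (up, down) pairs consistent with a net score and a
    rounded '% positive' figure, i.e. up - down == net and
    round(100 * up / (up + down)) == pct."""
    matches = []
    for up in range(max_up):
        down = up - net          # net = up - down
        if down < 0 or up + down == 0:
            continue
        if round(100 * up / (up + down)) == pct:
            matches.append((up, down))
    return matches

# Main post at -18 net, 38% positive:
print(plausible_votes(-18, 38))
```

Depending on how the percentage is rounded, a couple of neighbouring pairs also fit, which is why the figures above are given as "either/or".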

The numbers of upvotes are very different in the two cases. If Y_i_a is using a load of socks then it's hard to see why s/he wouldn't use all the socks for both. You'd expect something like the same number of upvotes and downvotes for the original post and the comment.

On the other hand, if it's just that readers like/dislike this sort of thing in roughly 3:5 proportions, you'd get what we see here: the original post and its comments are both at about the same %positive despite quite different numbers of votes in each case.

This isn't a terribly strong argument, for all kinds of reasons. E.g., you might think that people who get as far as reading the comment would have a different like:dislike ratio from ones who just saw the original post. Maybe Y_i_a has a drawerful of socks but for some reason is happy being at about 3/8 positive. Etc. But I think the most likely thing is just that a substantial fraction of readers liked this.

comment by philh · 2015-08-26T09:52:07.210Z · score: 2 (1 votes) · LW · GW

When I first saw the post, it was at +6. (I don't remember the % or how old it was.) It seems unlikely to me that something with a 38% approval rate would ever hit +6, though there are hypotheses other than Y_i_a sockpuppets. (E.g. sockpuppets used to downvote, or different demographics encountering it at different times.)

comment by David_Bolin · 2015-08-26T13:17:41.028Z · score: 4 (3 votes) · LW · GW

I've seen this kind of thing happen before, and I don't think it's a question of demographics or sockpuppets. Basically I think a bunch of people upvoted it because they thought it was funny, then after there were more comments, other people more thoughtfully downvoted it because they saw (especially after reading more of the comments) that it was a bad idea.

So my theory is that it was a question of differences in timing and in whether or not other people had already commented.

comment by Dorikka · 2015-08-25T04:31:14.007Z · score: 3 (2 votes) · LW · GW

What is this, and why is it here?

(Original response was remarkably vehement, rather like I found a pile of cow dung sitting on my keyboard. Interesting.)

comment by Liam Goddard · 2019-05-26T17:27:04.693Z · score: 1 (1 votes) · LW · GW

You do realize that other people work on AI? Sure, Eliezer might be the most important, but he is not the only member of MIRI's team. I'd definitely sacrifice several people to save him, but nowhere near 3^^^3. Eliezer's death would delay the Singularity, not stop it entirely, and certainly not destroy the world.

comment by James_Miller · 2015-08-24T23:31:33.285Z · score: 1 (8 votes) · LW · GW


comment by polymathwannabe · 2015-08-25T00:06:14.408Z · score: 7 (6 votes) · LW · GW

Not now, please.

comment by Dorikka · 2015-08-25T04:32:07.556Z · score: 6 (5 votes) · LW · GW

This is the most tantalizing thread on the page.

comment by faul_sname · 2015-08-26T08:24:41.633Z · score: -2 (1 votes) · LW · GW

It was a memetic hazard.

(not really)

comment by kingmaker · 2015-08-27T15:14:44.028Z · score: 0 (3 votes) · LW · GW

Goddamn, I thought I was unpopular

comment by shminux · 2015-08-24T22:15:41.017Z · score: 0 (15 votes) · LW · GW

You are too lukewarm in your praise. And you forgot to mention that everyone should immediately start donating all their income to his cause, to hasten the arrival of the FAI. The basilisk will get you for being so apathetic.

comment by polymathwannabe · 2015-08-24T23:13:10.825Z · score: -2 (5 votes) · LW · GW

I say,

99.8% likely this is an upset outsider baiting for reactions in order to gauge our degree of cultishness.

0.1% likely this a sincere believer.

0.1% likely this is Eliezer messing with our heads.

comment by Alicorn · 2015-08-24T23:45:17.629Z · score: 10 (11 votes) · LW · GW

I feel like "in order to gauge our cultishness" is too specific/conjunctive for that much of your probability mass.

comment by knb · 2015-08-25T01:05:47.719Z · score: 8 (7 votes) · LW · GW

Yeah, it just seems like a low-effort troll to me.

comment by DanielLC · 2015-08-25T04:02:08.491Z · score: 6 (6 votes) · LW · GW

That adds up to 100%. You need to leave room for other things, like they're trolling us for the fun of it.

comment by Fluttershy · 2015-08-24T21:50:35.242Z · score: -6 (33 votes) · LW · GW

Upvoted because I approve of being nice, and because affirming LW members is something we should do more often.

comment by Tem42 · 2015-08-24T22:14:10.471Z · score: -7 (10 votes) · LW · GW

In the contest of Yudkowsky vs. Ethics Trolley, Yudkowsky always wins. Pull the lever, and let God sort it out.

comment by Yudkowsky_is_awesome · 2015-08-24T21:27:44.601Z · score: -20 (53 votes) · LW · GW

A question: should we make a weekly Yudkowsky appreciation thread in the spirit of this thread? I mean, he's the founder of this community, so he deserves at least that much.

comment by skeptical_lurker · 2015-08-24T21:51:45.831Z · score: 7 (6 votes) · LW · GW

This has to be a mass sockpuppet upvote thing, right?

comment by Alicorn · 2015-08-24T21:52:49.638Z · score: 5 (4 votes) · LW · GW

Yeah, probably.