Normal Ending: Last Tears (6/8)

post by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-02-04T08:45:35.000Z · LW · GW · Legacy · 68 comments

(Part 6 of 8 in "Three Worlds Collide")

Today was the day.

The streets of ancient Earth were crowded to overbursting with people looking up at the sky, faces crowded up against windows.

Waiting for their sorrows to end.

Akon was looking down at their faces, from the balcony of a room in a well-guarded hotel.  There were many who wished to initiate violence against him, which was understandable.  Fear showed on most of the faces in the crowd, rage in some; a very few were smiling, and Akon suspected they might have simply given up on holding themselves together.  Akon wondered what his own face looked like, right now.

The streets were less crowded than they might have been, only a few weeks earlier.

No one had told the Superhappies about that part.  They'd sent an ambassadorial ship "in case you have any urgent requests we can help with", arriving hard on the heels of the Impossible.  That ship had not been given any of the encryption keys to the human Net, nor allowed to land.  It had made the Superhappies extremely suspicious, and the ambassadorial ship had disgorged a horde of tiny daughters to observe the rest of the human starline network -

But if the Superhappies knew, they would have tried to stop it.  Somehow.

That was a price that no one was willing to include in the bargain, no matter what.  There had to be that - alternative.

A quarter of the Impossible Possible World's crew had committed suicide, when the pact and its price became known.  Others, Akon thought, had waited only to be with their families.  The percentage on Earth... would probably be larger.  The government, what was left of it, had refused to publish statistics.  All you saw was the bodies being carried out of the apartments - in plain, unmarked boxes, in case the Superhappy ship was using optical surveillance.

Akon swallowed.  The fear was already drying his own throat, the fear of changing, of becoming something else that wasn't quite him.  He understood the urge to end that fear, at any price.  And yet at the same time, he didn't, couldn't understand the suicides.  Was being dead a smaller change?  To die was not to leave the world, not to escape somewhere else; it was the simultaneous change of every piece of yourself into nothing.

Many parents had made that choice for their children.  The government had tried to stop it.  The Superhappies weren't going to like it, when they found out.  And it wasn't right, when the children themselves wouldn't be so afraid of a world without pain.  It wasn't as if the parents and children were going somewhere together.  The government had done its best, issued orders, threatened confiscations - but there was only so much you could do to coerce someone who was going to die anyway.

So more often than not, they carried away the mother's body with her daughter's, the father with the son.

The survivors, Akon knew, would regret that far more vehemently, once they were closer to the Superhappy point of view.

Just as they would regret not eating the tiny bodies of the infants.

A hiss went up from the crowd, the intake of a thousand breaths.  Akon looked up, and he saw in the sky the cloud of ships, dispersing from the direction of the Sun and the Huygens starline.  Even at this distance they twinkled faintly.  Akon guessed - and as one ship grew closer, he knew that he was right - that the Superhappy ships were no longer things of pulsating ugliness, but gently shifting iridescent crystal, designs that both a human and a Babyeater would find beautiful.  The Superhappies had been swift to follow through on their own part of the bargain.  Their new aesthetic senses would already be an intersection of three worlds' tastes.

The ship drew closer, overhead.  It was quieter in the air than even the most efficient human ships, twinkling brightly and silently; the way that someone might imagine a star in the night sky would look close up, if they had no idea of the truth.

The ship stopped, hovering above the roads, between the buildings.

Other bright ships, still searching for their destinations, slid by overhead like shooting stars.

Long, graceful iridescent tendrils extended from the ship, down toward the crowd.  One of them came toward his own balcony, and Akon saw that it was marked with the curves of a door.

The crowd didn't break, didn't run, didn't panic.  The screams failed to spread, as the strong hugged the weak and comforted them.  That was something to be proud of, in the last moments of the old humanity.

The tendril reaching for Akon halted just before him.  The door marked at its end dilated open.

And wasn't it strange, now, the crowd was looking up at him.

Akon took a deep breath.  He was afraid, but -

There wasn't much point in standing here, going on being afraid, experiencing futile disutility.

He stepped through the door, into a neat and well-lighted transparent capsule.

The door slid shut again.

Without a lurch, without a sound, the capsule moved up toward the alien ship.

One last time, Akon thought of all his fear, of the sick feeling in his stomach and the burning that was becoming a pain in his throat.  He pinched himself on the arm, hard, very hard, and felt the warning signal telling him to stop.

Goodbye, Akon thought; and the tears began falling down his cheek, as though that one silent word had, for the very last time, broken his heart.

 

 

 

 

 

 

 

 

And he lived happily ever after.

68 comments

Comments sorted by oldest first, as this post is from before comment nesting was available (around 2009-02-27).

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-02-04T08:50:30.000Z · LW(p) · GW(p)

This is the original ending I had planned for Three Worlds Collide.

After writing it, it seemed even more awful than I had expected; and I began thinking that it would be better to detonate Sol and fragment the human starline network, guaranteeing that, whatever happened in the future, true humans would continue somewhere.

Then I realized I didn't have to destroy the Earth - that, like so many other stories I'd read, my very own plot had a loophole. (I might have realized earlier, if I'd written part 5 before part 6, but the pieces were not written in order.)

Tomorrow the True Ending will appear, since it was indeed guessed in the comments yesterday.

If anyone wonders why the Normal Ending didn't go the way of the True Ending - it could be because the Superhappy ambassador ship got there too quickly and would have been powerful enough to prevent it. Or it could be because the highest decision-makers of humankind, like Akon himself, decided that the Superhappy procedure was the categorically best way to resolve such conflicts between species. The story does not say.

Replies from: AaronAgassi, PhilGoetz, wobster109, sboo
comment by AaronAgassi · 2011-02-14T13:33:18.896Z · LW(p) · GW(p)

In the spirit of true Soft Science Fiction, it seems more plausible that once they gained an understanding of human interaction, the Superhappies would simply make a technological gift of their communications modality, and allow social change to take its course. The end result might be much the same, with the Confessor feeling progressively more alienated as events unfold.

As for the Baby Eaters, quite frankly, they'd likely be sadists. There is plenty of precedent in human societies for sadism as a value, one way or another. But that might pose a conundrum even for the Superhappies.

comment by PhilGoetz · 2011-06-28T20:06:48.481Z · LW(p) · GW(p)

I like this original ending better; it's more thought-provoking (which is almost a synonym for more disturbing). And I'd like to see this submitted and published in a print SF magazine, likely Analog.

comment by wobster109 · 2011-08-20T07:08:41.845Z · LW(p) · GW(p)

I returned home from Rationality Camp just a couple of days ago, and by my best estimate about half the participants prefer this ending; a non-trivial portion of the rationalists I encounter elsewhere prefer it as well. What I am saying is: other than the mass suicides, it is not immediately obvious that this original ending is "awful" in any way.

Replies from: AndrewH
comment by AndrewH · 2012-01-29T02:48:12.944Z · LW(p) · GW(p)

Other than the mass suicides...

And including the mass suicides? Remember that in this story, 6 billion people become 1 in a million, and over 25% of people died in this branch of the story. Destroying Huygens resulted in 15 billion deaths.

As they say, shut up and multiply.

comment by sboo · 2014-04-22T03:58:30.913Z · LW(p) · GW(p)

"... must relinquish bodily pain, embarrassment, and romantic troubles."

that's worse than letting billions of children be tortured to death every year. that's worse than dying from a supernova. that's worse than dying from mass suicide. that's worse than dying because you can't have sex with geniuses to gain their minds and thus avert the cause of death that you die from.

you really think existence without pain is that bad? you really think they are not "true humans"?

what about the 3WC humans? are they not "true humans" either? only us?

what about those with CIP? what about cold people? are they not "true humans"?

do you think there should be less but non-zero pain in our minds? how much?

ignore the loophole. explain why this superhappy ending is worse than the supernova ending.

literally unbelievable.

Replies from: somervta
comment by somervta · 2014-04-22T05:39:44.309Z · LW(p) · GW(p)

that's worse than letting billions of children be tortured to death every year. that's worse than dying from a supernova.

No? The story explicitly rejects this. It is only because the Superhappies can deal with the Babyeaters on their own, and because solutions to the human problem do not prevent this, that the story is resolved in other ways.

that's worse than dying from mass suicide.

I don't see the story as advocating this - Akon does not suicide, for example. The problem is not that the value difference between human life before and after the change is larger than the negative value of death. It is that this difference in value, multiplied by the entire human race and its potential future members, is so large as to outweigh a comparatively tiny number of deaths. I'm not sure that is true, but it is the position of those in the story.

you really think existence without pain is that bad? you really think they are not "true humans"?

I don't think he thinks that. I think he (Eliezer_2009) thinks they have lost something important, some aspect of their humanity - but that doesn't mean they are completely inhuman.

comment by Steve_Rayhawk · 2009-02-04T09:11:41.000Z · LW(p) · GW(p)

If those are the two endings, then that definition procedure for the term "True Ending" was not very meta-ethically instructive.

comment by Manuel_Mörtelmaier · 2009-02-04T11:41:41.000Z · LW(p) · GW(p)

Awww... I was so looking forward to a heartwarming description of a family feast some years later where Akon delivers a toast to the SHs for "finally restoring the natural order of things."

HAPPY ENDING.

Replies from: Eliezer_Yudkowsky
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-06-08T05:12:08.666Z · LW(p) · GW(p)

Upon due reflection, I have edited the story somewhat to include the final line, in accordance with your suggestion.

Replies from: Baughn
comment by Baughn · 2010-06-29T13:23:08.979Z · LW(p) · GW(p)

And never before has that sentence looked quite so horrifying.

Replies from: Sara, Reeee, Reeee
comment by Sara · 2011-08-26T17:05:51.383Z · LW(p) · GW(p)

Seconded. For some reason, it makes this version of the ending feel an order of magnitude more horrific than it otherwise would have.

comment by Reeee · 2017-03-21T02:56:18.961Z · LW(p) · GW(p)

Now, I can't help but look at the normal ending as the preferable one. Along with the aesthetic redesign of the ships, I would think there was quite possibly a merging of the races in the process; whether this has happened by this point in the story is not something I can guess at, but it would be inevitable whether it has or not (or perhaps I misread something here, and simple modification, not outright merging, is actually all that took place)...

... I'd have to wonder which aspects of the babyeater nature and society that could be considered positive have been merged with the superhappies: a profound sense of tribal duty (arguably already present in the superhappies, but more starkly expressed in the babyeaters); a very strong willingness to sacrifice one's own pleasure for the perceived good of the tribe and the whole (no more hiding from negative empathic emissions behind the superhappy confessors, or at least not quite as much); I'm sure there's more. At first glance, it looked to me like the superhappies basically ate their brains for their knowledge, but after a week of consideration: they would, just as much, no longer be superhappies in the end.

What do they get from humans? Deception? Big beefy arms on the ship? I'm unable to say, because I have difficulty separating my current perception of humanity from the evolved society in this story, but some constants stay true. Is it not a sort of evolution? A macrocosm of wanting to unite all people of differing perspectives and backgrounds for a shared goal, for the greater good of the whole? If you sat a human of today down next to our early ancestors, given the same backgrounds, would they be the same, or somehow different?

I know I'm far from the smartest person in the room, but the original ending seems to me a win, and the true ending a failure. Blowing up the star and dooming all those people who had little to no say in the matter strikes me as more harmful and staggeringly less productive. The people in the first ending who committed suicide chose that for themselves, after choosing for their children; that was their decision entirely, based on a principle of what it means to be human rather than what it means to be a sentient being. (Which is why ending one, imo, is less wrong than ending two, where a handful of people make that choice, based on their own opinion of what it means to be human, for everyone who could have chosen to opt out themselves.) Just wanted to say my wrong-thinking piece, because it's been nagging me for a week.

comment by Reeee · 2017-03-21T05:01:19.863Z · LW(p) · GW(p)

Just wanted to add that it was a really thought-provoking and fun read. By "failure" I did not mean failure on the part of the author - it's his story - but on the part of humanity. Sorry to double post; you probably won't see more from me. I just found this a compelling read.

comment by infotropism · 2009-02-04T12:00:07.000Z · LW(p) · GW(p)

How long will the superhappy-human-babyeater conglomerate last? How many other species will they meet in the universe? How arbitrary can and will the aesthetics, morality, and utilities of those new species be? If they are arbitrary enough, and enough of them are met, what will the resulting succession of compromises look like?

Depending on how many goals, values, etc. are more or less universal - and some perhaps would be, since after all, most if not all of those species will have come into being through evolution in the same universe - those are the only things that'll remain, the only values and particularities. As the rest is arbitrary, the averaging will probably cancel any subtlety out.

The longer this goes on, the more monomaniacal and bland the resulting compromise will become. In the end, you'll have something like orgasmium, for maybe a handful of values that were shared by a majority of those species. The rest, noise. Would that be ok?

comment by [deleted] · 2009-02-04T13:04:45.000Z · LW(p) · GW(p)

Up next: Three Worlds/Unlimited Blade Works!

I hope the Confessor gets more face time. He's so badass.

comment by Svein_Ove2 · 2009-02-04T13:16:22.000Z · LW(p) · GW(p)

I have my own doubts, but I don't think it would have exactly that effect.

Remember, the Superhappys actually adopted some of the values of the humans and baby-eaters; it seems to be a volume-conserving operation, not set-intersection. Not, I think, that that makes it very much better.

comment by Abigail · 2009-02-04T13:30:56.000Z · LW(p) · GW(p)

I have a very strong personal motivation for making the moral assertion, "Diversity is good". I am transsexual, often meet people who have never met a TS before and am rarely in a group which is majority TS. Yet, I do believe in it as a moral requirement. If we are all the same, we all have the same blind spots. If we are all different, we see different things, and this is good, and interesting, and in our interests.

I rather hope that any more powerful alien race we meet will also value diversity as a moral good. I believe it is a moral good even when, as during the Scramble for Africa, almost no one, or no one at all, believed it.

Replies from: None, FiftyTwo
comment by [deleted] · 2011-01-25T11:50:41.898Z · LW(p) · GW(p)

Upvoted.

I think we are a long way from genuinely being able to value diversity among societies. The universalistic impulse to convert the infidels or enlighten the other is still very strong.

I hope we will allow a diverse range of minds to exist. And consequently I hope that humans will someday be ok with humanity branching off into several societies with different values. I value genetic and cultural diversity quite a bit.

comment by FiftyTwo · 2011-03-31T01:58:01.642Z · LW(p) · GW(p)

While I agree with the value of diversity in general and with your points for it, I disagree that it is a good in itself. Consider the ways in which we find it morally acceptable to limit diversity and, by extension, individual freedom. We limit the free choices of many people, the most relevant example here being child abusers. We don't value the diversity of a society which contains the viewpoints of child abusers anywhere near as highly as we value a society where children are not abused.

The difference with the super-happies is that they are not just limiting humanity's ability to harm one another and to harm its children, but its ability to harm itself. Analogously, we prevent people from committing suicide in most cases, prevent access to certain drugs, and so on; whether this is moral is a separate question.

A classical Mill-style liberal would say that an individual can be restricted from actions that affect only themselves only when they are either irrational or not in possession of all of the facts (e.g. a child or a mentally ill person is considered irrational, and we prevent people from accidentally harming themselves through ignorance).

So are the super-happies behaving morally under this remit? Assuming they consider us rational then they are not. A better solution would be to allow all of humanity the option to turn their pain on and off, and either prevent all children being born or prevent children feeling pain before emotional maturity. That would allow individuals to make a rational choice between super-happy and pain/pleasure ways of life, and humanity as a whole could absorb the information and gradually change.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-02-04T13:41:15.000Z · LW(p) · GW(p)

I don't expect humanity to ever encounter any aliens - I would guess that the explanation for the Fermi Paradox is that life is rare, but I can easily see how a civilization built out of colliding values could continue to occupy the fun border between complexity and chaos. If one of the contributing species valued that sort of thing, and the others didn't object.

Replies from: RussellThor
comment by RussellThor · 2012-10-22T23:20:23.689Z · LW(p) · GW(p)

It could be the case that civilization always goes down something like the super happy route, but without such rationality. Rather than getting disappointed about not achieving space travel, they just turn off such disappointment. There would be no reason for ambition; you can just give yourself the feeling of satisfied ambition without actually achieving anything. Once you have access to your own source code, perhaps things always end up that way.

Replies from: PrometheanFaun
comment by PrometheanFaun · 2013-05-25T03:05:53.151Z · LW(p) · GW(p)

No. I personally exhibit a viable human idiogenetic strain which places no value on comfort or pleasure as end-goals - a living counterexample. I try to adhere to the essence of the dictum of life as closely as possible: survive and explore. I'd expect that to be a more enduring invariant shared by distinct lineages than a fear of pain.

Though if humanity were a species whose agents truly couldn't resist merging thoughts in every moment - and we very nearly are - I wouldn't exist. But that still only speaks of humanity.

comment by Steve_Rayhawk · 2009-02-04T14:16:40.000Z · LW(p) · GW(p)

Or, wait... To find the plot hole that permits the other ending takes searching. If no commenter had recognized that they preferred the other ending strongly enough, they would not have searched deeply enough. Was the meta-ethics test only that?

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-02-04T14:20:33.000Z · LW(p) · GW(p)

Steve, there's no incredibly deep metaethical lesson in the fact that I, as author, decided that the second ending would only be the True one "that actually happened" if a reader thought of it. I just wanted to take advantage of the blog format to offer a choice a bit more interactive than picking "1" or "2".

The most important advice you can offer to a rationalist is to avoid motivated skepticism; the second most important advice is not to overcomplicate things. Not everything I do is incredibly deep. Some things, sure, and even some things that aren't obvious at a first glance, but not everything.

On the other hand, no one has decoded the names of the ships yet, so I guess there's also something to be said for looking deeper.

comment by Emiezer_Shirouski · 2009-02-04T14:23:08.000Z · LW(p) · GW(p)

I am the core of my mind.
Belief is my body and choice is my blood.
I have revised over a thousand judgments.
Unaware of fear
Nor aware of hope.
Have withstood pain to update many times
Waiting for truth's arrival.
This is the one uncertain path.
My whole life has been...
Unlimited Bayes Works!
comment by Steve_Rayhawk · 2009-02-04T14:40:12.000Z · LW(p) · GW(p)

My whole life has been...

Should read:

So, as I strive...

(The original is idiomatic and hard to filk cleanly.)

comment by Anonymous48 · 2009-02-04T15:03:44.000Z · LW(p) · GW(p)

It's rather funny to see this ending described as awful by Eliezer, who, at the same time, endorses things such as: "In my head I have an image of the parliament of volitional shadows of the human species, negotiating a la Nick Bostrom. The male shadows and the female shadows are pretty much agreed that (real) men need to be able to better read female minds; but since this is a satisfaction of a relatively more 'female' desire - making men more what women wish they were - the male shadows ask in return that the sex-drive mismatch be handled more by increasing the female sex drive, and less by decreasing male desire..."

So, intraspecies convergence of values is somehow ok, but interspecies isn't?

comment by michael_vassar3 · 2009-02-04T15:21:27.000Z · LW(p) · GW(p)

The trouble is that some years later Akon is not a super-happy baby-eating human but rather a hodge-podge of zillions of values. The super-happy population or resources can double in 35 hrs at current tech. Their tech advances much faster than human tech does at current population. This is their first encounter at current tech and population but in a year they will probably encounter and mix with over 2^240 new species!

More practically, severing the human starline system, in addition to being a cliche, seems very positive-values-utilitarian and very anti-CEV, in that it imposes a decision to maintain disunion - and thus the continued existence of true humans - upon all future human generations. I see the appeal, but it doesn't seem to minimize the ratio of bad human worlds to good human worlds in a big universe. Really, I can't seem to look away from the instant-doom implications of a big universe with superluminal travel and exponentially growing populations of finite starting density.

comment by Kaj_Sotala · 2009-02-04T15:24:34.000Z · LW(p) · GW(p)

I don't see this ending as awful at all, except of course for the suicides. But a quarter of the ship's crew, with even higher rates among the general population? That strikes me as unrealistically high. For most people, it takes a lot to be pushed over the edge.

I also note that this is part 6. That means either that the true ending is in two parts, or that there'll be Something Completely Different as part eight, maybe an "author's comments" or some such.

comment by Urizen · 2009-02-04T15:26:04.000Z · LW(p) · GW(p)

Not everything I do is incredibly deep. Some things, sure, and even some things that aren't obvious at a first glance, but not everything.

Sometimes, a cigar is just a cigar...

comment by Martin4 · 2009-02-04T15:50:39.000Z · LW(p) · GW(p)

How can the superhappies not see THAT happening?

Martin

comment by Peter_de_Blanc · 2009-02-04T16:39:04.000Z · LW(p) · GW(p)

Mike said: Really I can't seem to look away from the instant doom implications of a big universe with superluminal travel and exponentially growing populations of finite starting density.

Maybe the universe itself grows exponentially faster than populations of life.

comment by Faré · 2009-02-04T16:51:58.000Z · LW(p) · GW(p)

25% suicide rate? Over something completely abstract that they haven't felt yet?

You didn't tell us about humans having been overcome by some weird Death Cult.

But, now it makes sense why they would give power to the Confessor.

Obviously, in this fantasy of a would-be-immortal 21st-century abstract thinker, your immortal 21st-century abstract thinkers are worshipped as gods. And unhappily, they were told too much about Masada and other Kool-Aid when they were young.

There comes your Judeo-Christian upbringing again, in addition to the intellectual masturbation.

Eliezer - get a life! The worst thing that ever happened to your intelligence was to be disconnected from reality by too early success.

comment by Nebu_Pookins · 2009-02-04T17:02:03.000Z · LW(p) · GW(p)

"Just as they would regret not eating the tiny bodies of the infants." is one of the more moving passages I've read in a long time. Well done Eliezer.

comment by John_Maxwell2 · 2009-02-04T18:09:09.000Z · LW(p) · GW(p)

Why did the SuperHappies adopt the Babyeaters' ethics? I thought that they exterminated them. Or is 6/8 an alternative to 5/8 instead of its sequel?

It might be better to number the sections 1, 2, 3, 4, 5A, 6A, 5B, 6B.

comment by Ian_Maxwell · 2009-02-04T19:02:23.000Z · LW(p) · GW(p)

Has anyone else noticed that in this particular 'compromise', the superhappies don't seem to be actually sacrificing anything?

I mean, their highest values are being ultra super happy and having sex all the time, and they still get to do that. It's not as if they wanted not to create literature or eat hundreds of pseudochildren. Whereas humans will no longer get to feel frustrated or exhausted, and babyeaters will no longer get to eat real children.

I don't think the superhappies are quite as fair-minded as Akon thought. They agreed to take on traits of humanity and babyeating in an attempt to placate everyone, not because it was a fair trade.

comment by Furcas · 2009-02-04T19:13:50.000Z · LW(p) · GW(p)

Sure. To the Superhappies, letting a sentient being experience pain or discomfort is evil. Since they're the strongest, why would they willingly do something they consider to be evil?

Akon isn't entirely wrong, though. The Superhappies could have transformed humanity and the Babyeaters without changing themselves or their way of life in the slightest, and no one would have been able to stop them. But they didn't. That does show a certain degree of fair-mindedness that humans probably wouldn't have shown had they been in the same position.

comment by ad2 · 2009-02-04T20:01:24.000Z · LW(p) · GW(p)

The Superhappies could have transformed humanity and the Babyeaters without changing themselves or their way of life in the slightest, and no one would have been able to stop them.

Why would I care about whether the Superhappies change themselves to appreciate literature or beauty? What I want is for them to not change me.

All their "fair-mindedness" does is guarantee that I will be changed again, also against my will, the next time they encounter strangers.

Replies from: complexmeme
comment by complexmeme · 2012-12-26T16:27:57.307Z · LW(p) · GW(p)

that I will be changed again, also against my will, the next time

The next time, it presumably wouldn't be against your will, due to the first set of changes.

comment by Richard4 · 2009-02-04T20:29:38.000Z · LW(p) · GW(p)

If we sufficiently value episodes of aesthetic appreciation (in general, not only when done by us), etc., then the "compromise" could be a net positive, even from the perspective of our current values.

(But perhaps the point is that our values are in fact not so agent-neutral.)

comment by Anonymous_Coward4 · 2009-02-04T22:32:57.000Z · LW(p) · GW(p)

Regarding ship names in the koan....

Babyeaters: http://en.wikipedia.org/wiki/Midshipman's_Hope. Haven't read, just decoded from the name in the story.

But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring. It should be known to many people on this thread but it's been about 10 years since I last read it. Asimov, The Gods Themselves.

Anonymous.

Replies from: Tamfang
comment by Tamfang · 2010-08-14T03:37:19.778Z · LW(p) · GW(p)

Ah! I read it as "Sailor's Heart's Desire", with no particular significance.

comment by a_soulless_automaton · 2009-02-04T22:48:34.000Z · LW(p) · GW(p)

There seems to be a fairly large contingent of humanity who regard self-determination as the most significant terminal value, with roughly the same single-mindedness that the Babyeaters bring to baby eating, including a willingness to sacrifice every other moral value to a very large degree in its favor. I assume many of the suicides fell into this group.

While not universal among humanity as baby eating is among the Babyeaters, the concept should have been fairly explicit in at least some of the cultural material transmitted. I wonder, were the Superhappies being willfully oblivious to this value, considering the extent to which they willingly violate it?

comment by Ryan · 2009-02-04T22:51:30.000Z · LW(p) · GW(p)

I'm with Kaj Sotala in not finding this ending to be awful.

The prospect of never feeling pain again would not in the least disturb me. Oh, I may 'enjoy' the pain of a good workout, but only because I believe it will help to reduce or postpone more pain later on.

The babyeating is weird, but we are talking about being transformed to want to do that, not being forced to do something we would actually find disgusting.

What's the trouble there? I don't regret that my past self was unable to forever prevent my current self from enjoying Brussels sprouts.

Replies from: DragonRIderIL
comment by DragonRIderIL · 2009-09-10T21:11:17.034Z · LW(p) · GW(p)

I admit this is my first time here, and my first reading of a lot of this stuff. Very interesting, and food for thought. I would like to point out, though, that regardless of how the superhappy people do it, pain in all its myriad forms is helpful, needful, and a survival issue for Homo sapiens. Pain is the body's way of letting us know that our carrier is having an issue, whether it is the mild pain of a rash or the severe pain of a burn. It is our mechanism to tell us there is an issue that needs attention. Emotional pain is a bit more complicated, but I submit that it too is a survival trait: remorse, sadness, pain at the loss of a companion or a loved one all provide feedback for behaviour modification. It is our pain that defines and measures our pleasure. Our brains and our bodies are all wired with symmetry... the symmetry of opposites... What is beauty without ugliness? What is sweet without sour? While I think that these things might be able to stand on their own, I absolutely believe that each would lose some of its intrinsic quality without its opposite to compare it to....

To quote a quote... the difference between bad and worse is far more evident than that between good and better....

As for the premise that there is a difference between being forced and being altered to like something... I am sorry, but I have to disagree... If one does not voluntarily submit to the modification, then it is merely force of a different nature... one of the reasons brainwashing is banned.... If I tell you that I can create a prosthetic limb that you can use just like your natural one, save that it is 5x superior, and you elect to make the change, that is one thing... but for me to make the replacement regardless of your personal desire, then alter you to LIKE it... that is FORCE; in fact, that is the ultimate force... I enforce my will on you even to the point of you liking it...

And I further submit that such a fundamental change in our metabolism, DNA, and brain wiring would effectively be the same as genocide... as the human race would simply cease to exist...

On another note... as for the comment that we as a species will never meet aliens, I would like to say a couple of things. First of all, I distrust absolutes... At 52, I have found that absolutes are far too malleable... I remember when protons, neutrons, and electrons were the smallest units of matter... absolutely... not so much any more... Further, I would submit that IF there is a god (or a race of beings that kick-started life here, or some architect that had a hand in it), I cannot see that entity being satisfied with a single experiment. That entity would be doing it from hubris, or from curiosity, or perhaps simply because it could. Having done it once for any of those reasons simply means that it would be done many times... And a creature/race/intelligence that is capable of engineering something as complex as Homo sapiens would be able to create a ton of different viable life forms... and would. If it is all random chance that we are here (which I find hard to swallow), then the same argument applies... the cosmos is a HUGE place, so huge that we simply cannot comprehend how huge it is... we can come close with some serious analogies, like a muon on the ocean floor representing our corner of just our galaxy out of millions of galaxies, and... oops, there we go again... incomprehensible... So given even random chance, and the size of everything, and the amount of matter out there, I would say that statistically even infinitesimal possibilities become a certainty given the size of the pool... 1/10 of 1/10 of one percent of 10^666666666666666666 is a huge number all by itself... LOL

comment by simon2 · 2009-02-04T23:59:54.000Z · LW(p) · GW(p)

John Maxwell:

No, they are simply implementing the original plan by force.

When I originally read part 5, I jumped to the same conclusion you did, based presumably on my prior expectations of what a reasonable being would do. But then I read nyu2's comment which assumed the opposite and went back to look at what the text actually said, and it seemed to support that interpretation.

comment by simon2 · 2009-02-05T00:53:54.000Z · LW(p) · GW(p)

Actually, I'm not sure if that's what I thought about their intentions towards the babyeaters, but I at least didn't originally expect them to still intend to modify themselves and humanity.

comment by simon2 · 2009-02-05T00:54:49.000Z · LW(p) · GW(p)

...with babyeater values.

comment by infotropism · 2009-02-05T01:34:23.000Z · LW(p) · GW(p)

"But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring."

Wouldn't that be The Player of Games by Banks? Would kinda make sense, no?

comment by Brian_Macker · 2009-02-05T02:02:57.000Z · LW(p) · GW(p)

"Why did the SuperHappies adopt the Babyeater's ethics? I thought that they exterminated them."

They only exterminated the one ship so that it wouldn't blow up the star.

comment by Martin4 · 2009-02-05T04:21:41.000Z · LW(p) · GW(p)

Regarding the ship names: Impossible Possible World would point to Heinlein's The Number of the Beast.

comment by Doug_S. · 2009-02-05T06:04:50.000Z · LW(p) · GW(p)

"But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring."

The story you're thinking of is The Gods Themselves by Isaac Asimov, the middle section of which stars the aliens you describe.

comment by Daniel_Franke · 2009-02-05T06:36:33.000Z · LW(p) · GW(p)

Hmm, I just noticed that there's a slight contradiction here:

"I know. Believe me, I know. Only youth can Administrate. That is the pact of immortality."

Then how is it possible for there to be such a person as a Lord Administrator, if the title takes 100 years to obtain? While a civilization of immortals would obviously redefine its concept of youth, it seems like a stretch to call a centenarian young if 500 is still considered mind-bogglingly old.

comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2009-02-05T10:41:50.000Z · LW(p) · GW(p)

Daniel, is it a stretch to call a 20-year-old young if you would be impressed to meet a 100-year-old? Though the actual relation would be more like "Akon is 30 years old, the Confessor is a 90-year-old survivor of a famous catastrophe."

comment by Anonymous_Coward4 · 2009-02-05T15:15:02.000Z · LW(p) · GW(p)
"But I'm having trouble figuring out the superhappys. I can think of a story with rational and emotional protagonists, a plot device relating to a 'charged particle', and the story is centered around a solar explosion (or risk of one). That story happens to involve 3 alien genders (rational, emotional, parental) who merge together to produce offspring."
The story you're thinking of is The Gods Themselves by Isaac Asimov, the middle section of which stars the aliens you describe.

Yes, I believe I already identified the story in the final sentence of my post. But thanks anyway for clarifying it for those that didn't keep reading till the end :-)

Anonymous.

comment by Nominull3 · 2009-02-05T15:42:13.000Z · LW(p) · GW(p)

"Normal" End? I don't know what sort of visual novels you've been reading, but it's rare to see a Bad End worse than the death of humanity.

comment by Aaron_D._Ball · 2009-02-05T17:31:59.000Z · LW(p) · GW(p)

"Ion" Banks was cute. I'm finally catching up on this series days late, so it's astonishing that nobody else got that one. (But that's the only one I got.)

comment by Nebu_Pookins · 2009-02-06T05:46:05.000Z · LW(p) · GW(p)

Why would I care about whether the Superhappies change themselves to appreciate literature or beauty? What I want is for them to not change me.

The bargain that the Superhappies are offering is to change you less than if they had just changed you by force. I'm guessing that if the humans didn't agree to the deal, the Superhappies would have either exterminated the humans completely or converted them completely to superhappy values.

The benefit of Superhappies changing themselves to appreciate literature and beauty is that when they convert you, you get to keep the part of you that appreciated literature and beauty.

All their "fair-mindedness" does is guarantee that I will be changed again, also against my will, the next time they encounter strangers.

Actually, it won't be against your will, because by then you will have the same values as them (you're all merged now, remember?).

comment by rickc · 2012-08-17T22:06:16.886Z · LW(p) · GW(p)

First-time commenter.

Perhaps I missed this in a comment to a previous part, but I don't see why we have to assume the super-happies are honoring the original plan. If their negotiations with the baby-eaters failed, the SH owe the BE nothing. They have no reason not to forcibly modify the BE, and, consequently, no reason to alter themselves or the humans to eat babies. (They could have also simply wiped out the BE, but genocide seems like a worse solution than "fixing" the BEs.)

Replies from: kybernetikos
comment by kybernetikos · 2013-07-16T21:51:37.460Z · LW(p) · GW(p)

The point is that they are the kind of species to deal with situations like this in a more or less fair-minded way. That will stand them in good stead in future difficult negotiations with other aliens.

comment by [deleted] · 2012-11-04T23:07:42.896Z · LW(p) · GW(p)

I see somewhat of an analogy now between this and Clarke's Cradle: the theme of a very physically and abruptly changing humanity.

What I do wonder, though, is why, in this entire story, nobody ever seriously considered the option of just leaving each other be. Live and let live and all that. It seemed rather obvious to me. Three species meet, exchange what they will, and go their own separate ways. After all, morality is subjective, and any species that understands the prisoner's dilemma should understand that as well. All they had to do was walk away.

Replies from: Baughn, fractalman
comment by Baughn · 2013-02-15T16:32:59.749Z · LW(p) · GW(p)

Well. Morality may be subjective, but morality encodes preferences over states of the universe.

Yours may discount states that are far away and don't reach you; mine doesn't, so I'm with the Lord Pilot here. It would be impossible for me to satisfy my sense of ethics without doing something about the Babyeaters, even if that requires splitting humanity.

comment by fractalman · 2013-06-08T07:05:14.341Z · LW(p) · GW(p)

because both humans and super-happies agree: baby-eating needs to STOP ASAP!

comment by Zephyr1011 · 2013-07-07T13:59:19.509Z · LW(p) · GW(p)

Honestly, I think I prefer this ending over the other one

comment by EndlessStrategy · 2013-12-11T00:16:51.949Z · LW(p) · GW(p)

25% of the population suicided? I'm sorry, but that just seems... extremely unrealistic. Like it was tacked on to cement this as the bad ending.

comment by yasm · 2018-12-26T16:29:02.293Z · LW(p) · GW(p)

I much prefer this ending, though I have my reservations. Happiness, for all of its desirability, is a kind of cognitive bias all on its own. People who are happy are reluctant to let go of it, even partially, and even if rationally it would be preferable - say, by extending many times over the happiness of their descendants millions of years in the future.

If the Superhappies were able to make rational decisions despite the saturated happiness, maybe through the Kiritsugus, then I don't see any loss whatsoever for the humans or Superhappies - after all three races merged together, they'd be able to sensibly decide whether the baby-eating custom was worth keeping and whether pain and strife were a necessary component of what makes a human being human.

That's the thing about a scenario that solves all conflicts while retaining all information and rationality - all the sub-optimal aspects of the outcome can be revised later. If superhappy-humans decided that they were no longer human and that there being no humans in the Universe was sadder than there being some, they could very well create some again. Honestly, what is there to be afraid of?

comment by AaronAgassi · 2023-05-31T04:24:38.858Z · LW(p) · GW(p)

Much tell, little show.