Comments

Comment by ad2 on The Pascal's Wager Fallacy Fallacy · 2009-03-19T21:24:48.000Z

"I don't have to tell you that it's easier to get a Singularity that goes horribly wrong than one that goes just right."

Don't the acceleration-of-history arguments suggest that there will be another singularity a century or so after the next one? And another one shortly after that, and so on?

What are the chances that they will all go exactly right for us?
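
A back-of-the-envelope version of this worry, under the purely illustrative assumption that each transition independently goes right with some fixed probability $p < 1$:

$$P(\text{all } n \text{ transitions go right}) = p^n \longrightarrow 0 \quad \text{as } n \to \infty.$$

Even a generous $p = 0.9$ leaves only $0.9^{10} \approx 0.35$, roughly one chance in three, of ten such transitions in a row all going right.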

Comment by ad2 on On Not Having an Advance Abyssal Plan · 2009-02-24T19:49:31.000Z

"Years ago at the Singularity Institute, the Board was entertaining a proposal to expand somewhat. I wasn't sure our funding was able to support the expansion, so I insisted that - if we started running out of money - we decide in advance who got fired and what got shut down, in what order. Even over the electronic aether, you could hear the uncomfortable silence. …

"People are really, really reluctant to plan in advance for the abyss. But what good reason is there not to? How can you be worse off from knowing in advance what you'll do in the worse cases?"

I don't suppose you can. But the process of deciding in advance can cause a lot of trouble. Somebody would have to argue in favour of firing, say, Eliezer Yudkowsky first, rather than anybody else. Then you might have to go on working with the person who made that argument. Perhaps after having argued, yourself, that he should be fired first instead.

Perhaps it would have been easier to decide in advance that everyone should take a pay cut...

Short version: people try to avoid hard choices because they are hard. If a choice will not have to be implemented for a long time, if ever, there is therefore a lot of pressure to defer making it. After all, defer it long enough and you may never have to make it at all.

Comment by ad2 on Epilogue: Atonement (8/8) · 2009-02-06T18:02:56.000Z

"It's somehow depressing that in this story, a former rapist dirtbag saves the world."

Why is that depressing?

"And if the good and decent officer who pressed that button had needed to walk up to a man, a woman, a child, and slit their throats one at a time, he would have broken long before he killed seventy thousand people."

I have my doubts about that. If he could do it seven times, he could do it seventy thousand times. Since when has it been harder for a killer to kill again?

Comment by ad2 on True Ending: Sacrificial Fire (7/8) · 2009-02-05T19:29:30.000Z

"Sure, I would turn this down if it were simply offered as a gift. But I really, really, cannot see preferring the death of fifteen billion people over it."

How many humans are there not on Huygens?

Comment by ad2 on Normal Ending: Last Tears (6/8) · 2009-02-04T20:01:24.000Z

"The Superhappies could have transformed humanity and the Babyeaters without changing themselves or their way of life in the slightest, and no one would have been able to stop them."

Why would I care about whether the Superhappies change themselves to appreciate literature or beauty? What I want is for them to not change me.

All their "fair-mindedness" does is guarantee that I will be changed again, also against my will, the next time they encounter strangers.

Comment by ad2 on Interlude with the Confessor (4/8) · 2009-02-02T18:19:47.000Z

"Um," Akon said. He was trying not to smile. "I'm trying to visualize what sort of disaster could have been caused by too much nonconsensual sex -"

"Akon obviously does not regard the idea of nonconsensual sex with much distaste. So why would he want it banned?"

I think the important question is: Why does he not regard the idea of nonconsensual sex with much distaste?

(I can't help but think of The Forever War, where military custom and law require consent to any request for sex.)

Comment by ad2 on The Super Happy People (3/8) · 2009-02-01T22:26:58.000Z

Aleksei, children are rarely enthusiastic about the idea of leaving their parents. Why would they trust the Super Happy People?

"And you should understand, humankind, that when a child anywhere suffers pain and calls for it to stop, then we will answer that call if it requires sixty-five thousand five hundred and thirty-six ships."

How would they hear it? They did not even know that humanity existed until just now, still less could they have heard any human child calling for help. All they have to do is not go looking for miserable children, and they will not find any, or feel their suffering.

On a related note: "For whatever reason, you currently permit the existence of suffering which our species has eliminated. Bodily pain, embarrassment, and romantic troubles are still known among you. Your existence, therefore, is shared by us as pain."

What did the Super Happy People do with every previous species they have encountered?

Comment by ad2 on Total Nano Domination · 2008-11-27T20:35:37.000Z

"There was never a Manhattan moment when a computing advantage temporarily gave one country a supreme military advantage, like the US and its atomic bombs for that brief instant at the end of WW2."

Did atomic bombs give the US "a supreme military advantage" at the end of WW2?

If Japan had got the bomb in late 1945 instead of the US, could it have conquered the world? Or Panama, if it were the sole nuclear power in 1945?

If not, then did possession of the bomb give "a supreme military advantage"?

Comment by ad2 on Mundane Magic · 2008-10-31T20:15:25.000Z

How many wielders of the Ultimate Power have been killed by humble microbes?

Comment by ad2 on Living in Many Worlds · 2008-06-06T17:40:00.000Z

"the straightforward and unavoidable prediction of quantum mechanics."

Newtonian mechanics makes many straightforward and unavoidable predictions which do not happen to be true. I assume that no one has ever tested this prediction, or you would have given the test results to back up your assertion.

Just a thought.