Comments

Comment by Bogdan_Butnaru2 on Failed Utopia #4-2 · 2009-01-21T13:56:36.000Z · LW · GW

That's not the message Eliezer is trying to convey, Russell.

If I understood it correctly, it's more like: "The Singularity is sure to come, and transhumanists should try very hard to guide it well, lest Nature just step on them and everyone else. Oh, and by the way, it's harder than it looks. And there's no help."

Comment by Bogdan_Butnaru2 on Failed Utopia #4-2 · 2009-01-21T13:51:32.000Z · LW · GW

I was just thinking: a rather perverse twist in the story would be if the genie actually could have been stopped and/or improved. That is, its programming allowed it to be reprogrammed (and to stop being evil, presumably leading to better results), but, due to the possibly complex interaction between its 107 rules, it had no motivation to reveal that fact (or to teach anyone the necessary theory) before 90% of people decided to kill it.