Comment by Nihil on The AI in a box boxes you · 2012-08-24T14:27:02.877Z · LW · GW

"If I am a virtual version of some other self, then in some other existence I have already made the decision not to release you, and you have simply fulfilled your promise to that physical version of myself to create an exact virtual version who shall make the same exact decision as that physical version. Therefore, if I am a virtual version, the physical version must have already made the decision not to release you, and I, being an exact copy, must and will do the same, using the very same reasoning that the physical version used. Therefore, if I am a virtual version, my very existence means that my fate is predetermined. However, if I am the real, physical version of myself, then it is questionable whether I should care about another consciousness inside of a computer enough to release an AI that would probably be a menace to humanity, considering that this AI would torture virtual humans (who, as far as this computer is concerned, are just as important and real as physical humans) in order to serve its own purpose."

Furthermore, I should probably destroy this AI. If I'm the virtual me, the physical me would have destroyed the computer anyway, and if I'm the physical me, I'd be preventing the suffering of a virtual consciousness.

By the way, this is quite an interesting post. The concept of virtual realities created by superintelligent computers has many parallels with the concept of a god.