Comments

Comment by Birgitte on Not Taking Over the World · 2008-12-16T03:26:32.000Z · LW · GW

Eliezer: Let's say that someone walks up to you and grants you unlimited power.

Let's not exaggerate. A singleton AI wielding nanotech is not unlimited power; it is merely a Big Huge Stick with which to apply pressure to the universe. It may be the biggest stick around, but it's still operating under the very real limitations of physics, and every inch of potential control comes at the cost of additional invasiveness.

Probably the closest we could come to unlimited power would be pulling everything except the AI into a simulation and allowing arbitrary amounts of computation between each tick.

Billy Brown: If you give each individual whatever they want you’ve just destroyed every variety of collectivism or traditionalism on the planet, and those who valued those philosophies will curse you.

It's probably not the worst tradeoff, being cursed only by those who feel their values should take precedence over those of other people.