If I ran the zoo

post by Optimization Process · 2024-01-05T05:14:57.631Z

(I was musing about what it means for an incoherent lump of meat to "have preferences," and thought it might be illuminating to consider what I'd do if I were, approximately, God. It, uh, hasn't been illuminating yet, but it's been entertaining and still seems at least potentially fruitful.)

Problem statement

You suddenly become omnipotent! Except, you can only do things that you understand in sufficient detail that you could accomplish them by micromanaging all the atoms involved. And, what the heck, assume you have effortless access to infinite computational power.

What do you do?

For concreteness, here are some interventions you might try:

This being LessWrong, you'll probably quickly hit on some way to use ten billion sped-up simulated geniuses to speedrun AI alignment, build a friendly superintelligence, and delegate your Godlike power to it. But the purpose of this thought experiment is to elucidate your preferences, which that strategy -- though very reasonable! -- dodges.

What I'd do

Object level

Just, like, the obvious. Slay Famine, Pestilence, and War. Stop accidents from happening. Scrap the solar system for parts and give everybody ultra-customizable space habitats connected by teleportation booths. (All this can be micromanaged by zillions of zillions of simulated clones of me.)

Let people opt out, obviously, in whole or in part.

There are still, to be clear, important wishes I can't grant, such as "make me smarter" or "make my memory not degrade as I age" or "help me and my partner solve this relationship snarl."

Meta level
