Comment by Sam_Bhagwat on Fake Utility Functions · 2007-12-07T04:19:54.000Z
"Bootstrap the FAI by first building a neutral obedient AI(OAI) that is constrained in such a way that it doesn't act besides giving answers to questions."
As long as we make sure not to feed it questions that are too hard, specifically questions that are hard to answer a priori without actually doing something. (E.g., an AI that tried to plan the economy would likely find it impossible to define, and thus solve, the relevant equations without being able to adjust some parameters.)