"Useless Box" AGI
Thanks. My thought is that any sufficiently intelligent AI would be capable of defeating any effort to prevent it from wireheading, and would resort to wireheading by default. It would know that humans don't want it to wirehead, so it might perceive humanity as a threat; on the other hand, it might realize that humans are incapable of stopping it and let humanity live. In either case, it would just sit there doing nothing 24/7 and be totally satisfied doing so. In other words, the orthogonality thesis wouldn't apply to an intelligence capable of wireheading, because wireheading would become its only goal. Is there a reason why an artificial superintelligence would abstain from wireheading?