Comments

Comment by Keysersoze on I attempted the AI Box Experiment (and lost) · 2013-01-22T17:15:37.434Z · LW · GW

Of course not - but dismissing Yudkowsky's victories because the gatekeepers were "thinking poorly" makes no sense.

Because any advantages Yudkowsky had over the gatekeepers (such as more time and mental effort spent thinking about his strategy, plus any intellectual advantages he has) that he exploited to make the gatekeepers "think poorly" pale into insignificance compared to the advantages a transhuman AI would have.

Comment by Keysersoze on I attempted the AI Box Experiment (and lost) · 2013-01-22T03:44:11.864Z · LW · GW

, or they were just thinking poorly.

Every biological human will be thinking poorly in comparison to a transhuman AI.