[Reference request] Article by scientist giving lower and upper bounds on the probability of superintelligence
post by multifoliaterose · 2011-05-08T22:16:23.190Z
A few months back somebody posted an article by a scientist giving lower and upper bounds on the probability of superintelligence. He structured it as a Fermi calculation with three parts (EDIT: see LocustBeanGum's answer). Does anybody remember this article, and if so, can you provide a link?
2 comments
comment by [deleted] · 2011-05-08T22:18:04.953Z
The original article; the LW link post.
Perhaps you mean this, but the probabilities involved in the argument are different (that human-level AI will be built within 100 years; that the AI will be able to undergo recursive self-improvement in intelligence; that this intelligence explosion will unpredictably transform our world).
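For anyone searching later: the structure of that kind of estimate is just a product of the three conjunct probabilities (each conditional on the previous ones holding), so lower and upper bounds on each conjunct give bounds on the whole. A minimal sketch with purely hypothetical placeholder numbers, not the values from the article:

```python
# Three-part Fermi estimate: the overall probability is treated as the product of
# the probability of the first conjunct and the conditional probabilities of each
# later conjunct given the earlier ones.
# The per-conjunct bounds below are hypothetical placeholders, NOT the article's figures.

bounds = {
    "human-level AI within 100 years": (0.1, 0.9),
    "recursive self-improvement, given human-level AI": (0.2, 0.8),
    "world unpredictably transformed, given the explosion": (0.3, 0.9),
}

lower, upper = 1.0, 1.0
for low, high in bounds.values():
    lower *= low
    upper *= high

print(f"overall probability roughly between {lower:.3f} and {upper:.3f}")
# With these placeholder numbers: between 0.006 and 0.648.
```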
comment by multifoliaterose · 2011-05-09T01:17:31.692Z
Thanks, this is what I was looking for.