Comments

Comment by edbarbar on Bloggingheads: Yudkowsky and Horgan · 2008-06-08T06:42:50.000Z · LW · GW

Hopefully Anonymous: There are strong warnings against posting too much, but my personal suspicion is that the next generation of AI will not colonize other planets, convert stars, or do any of the other things we see as huge and important, but will instead go in the opposite direction and become smaller and smaller. At least, it will should the thing decide that survival is ethical and desirable.

But whether we end up as sand, as worms, or simply as irrelevant, the result is the same. We shouldn't be worried that our children consume us: it's the nature of life, and that will continue even with the next superintelligent beings. To evolve, everything must die or be rendered insignificant, and there is no escape from death even for stagnant species. I think that will hold true for many generations.

Comment by edbarbar on Bloggingheads: Yudkowsky and Horgan · 2008-06-08T05:04:30.000Z · LW · GW

Eli, I enjoyed your conversation with John today, though I suspect he would have tried to convince the Wright brothers to quit because so many had failed before them.

I read your essay on friendly AI, and I think it is off the mark. If the singularity happens, there will be so many burdens the AI can throw off (such as anthropomorphism) that it will be orders of magnitude superior very quickly. I think the apt analogy isn't that we would be apes among men, but more like worms among men. Men need not be, nor should be, concerned with worms, and worms aren't all that important in a world with men.

Ed