Comments

Comment by Pavel Morozov (pavel-morozov) on Pausing AI Developments Isn't Enough. We Need to Shut it All Down by Eliezer Yudkowsky · 2023-04-05T12:59:10.810Z · LW · GW

Obviously, we cannot design a leash for an intellect that will be orders of magnitude ahead of us and improve itself almost instantly; we may simply miss the moment of the singularity.
People regularly break almost every defense other people devise, let alone one meant to contain an intelligence so far superior to us.

But if it is so easy to create such a strong AI (given how quickly it could be built, and how early our civilization sits in the lifespan of the universe), then surely someone has already created one, and we are either living in a simulation or simply haven't encountered it yet.
In the second case, we are threatened with extinction even if we never create AI at all. After all, if it is possible, someone will eventually create it, and to them we would be complete strangers. Of course, I am not saying our own AI would become sentimental toward us, if it were conscious at all. But building our own strong AI may be at least a tiny chance to stay alive and remain competitive with other potentially thinking beings in the universe who can also create AI.