Posts

AI and Non-Existence. 2025-01-25T19:36:22.624Z

Comments

Comment by Eleven on AI and Non-Existence. · 2025-01-27T04:16:47.518Z · LW · GW

Why is it not realistic? What physical constraints are there to prevent such a scenario? Are there any laws of physics that would rule it out? A lot of people believe that AI will soon be much smarter than us - so unless you believe that AI will never be smarter than us, I'd like to know why this scenario is not realistic.

An AI would probably not offer a deal; it would just do it. If its goal is to torture people for as long as it can, for whatever reason, it will simply do so for billions and trillions of years.

The difference between this and being worried about hell for not being a Christian or a Muslim is the following: if you are not a Christian or Muslim and you have thought about these things, you might assign a small probability to Christianity or Islam being true - but you can also construct a God who sends all Christians and Muslims to hell, so the two possibilities cancel each other out. On the other hand, if you assign a 10% probability to Christianity or Islam being true and a 90% probability to atheism being true, then yes, you should be worried about Christian or Muslim hell.

Comment by Eleven on AI and Non-Existence. · 2025-01-26T06:06:19.152Z · LW · GW

If you are a naturalist or physicalist about humans, these copies are not me - they are my identical twins. If you want to go beyond naturalism or physicalism, that is perfectly fine, but based on our current understanding they are identical twins of me, and in no sense are they me. So whatever happens in those other universes, it is not me going through that timeline; it is my identical twin.

An infinite or very large universe/multiverse is highly speculative, and no matter what I decide in this universe, if there is an infinite universe it makes essentially no difference if all universes are the same. There would be 10^1000000 identical twins of me, and far more beyond that, and I have no power to influence anything. To claim that you can influence anything in that scenario is worse than claiming you can move the Earth to another galaxy by jumping on it. You would have a 0.00...0000...0001% effect on it, and in an infinite universe you have effectively zero effect on the fate of your copies - so no matter what you decide, you will not have any influence over it.

Comment by Eleven on AI and Non-Existence. · 2025-01-26T05:15:29.178Z · LW · GW

To say that non-existence is a gamble too is a bit like saying that a person who does not gamble in a casino is gambling too, because they are missing out on a chance to win millions of dollars. To me that is more a matter of definitions, and if one wants to argue for it, sure - let's accept that every single thing in life is a gamble.

Your assertion that humans will be able to adapt over large timespans might be true given the current human brain - but here we are talking about superintelligence. Even with relatively primitive AIs we are already talking about new medications and cures. A superintelligence that wanted to cause widespread suffering or torture you - one able to build a Dyson sphere around the sun and a thousand other advanced technologies - would be able not just to figure out how to torture you persistently (so that your brain does not adapt to the new state of constant torture) but also to increase your pain levels by 1000x. Not all animals feel the same pain, and there is no reason to think that the current pain experience of humans cannot be increased by a huge amount.

I don't think it is rational to take the gamble when the odds are 1%, much less when the odds are 20% or 49% or 70%. Let's go with 1%, because I am willing to give you favourable odds. So the post asks: would you be willing to spend 1 hour in a torture chamber for every 99 hours that you spend in a really happy state? We can increase that to 20 hours (20%) or whatever you like. And here I am talking about real, extreme torture, not a headache. So imagine the worst torture methods that currently exist - and it is not waterboarding; look up the worst torture methods in history. If you are objective, then whatever odds you say you would accept, be it 20% or 1%, would you really be willing to be tortured for that proportion of time every single day?