A pessimistic view of quantum immortality
post by anotheruser · 2011-10-27T18:04:38.606Z · LW · GW · Legacy · 36 comments
You have probably read about the idea of quantum immortality before. The basic idea seems to be that, as anything that can happen does happen (assuming either that the many-worlds interpretation of quantum theory is true, or that there is an infinite number of parallel universes wherein "you" exist) and it is impossible to remember your own death, every living thing is immortal.
Take a game of Russian roulette as an example: In those universes in which you die, you are no longer alive enough to care about that fact, leaving as relevant only those universes in which you survive. This would make playing Russian roulette for money a valid financial strategy, by the way.
However, I think that this view ignores a very important fact: death is not binary. You are not simply either alive or dead, but may exist in various intermediate forms of suffering and reduced cognitive abilities. This means that what actually happens when you play Russian roulette is the following:
In those universes in which you win, everything is fine. In those in which you lose, however, you now have a gaping head wound. I assume that this hurts a lot, at least in those instances where you still have enough mental capacity to actually feel anything. Due to some fluke, however (remember that absolutely all possible scenarios happen), you may still be alive and in a lot of pain. Most instances of you will then die from blood loss or something similar, but for every timestep afterwards there will always be an infinite number of universes wherein you continue to live, in most of them in complete agony.
The instances of you in the other worlds that were never shot will be blissfully unaware of this fact.
Now consider that you will also reach such a state of perpetual-agony-close-to-death-but-never-quite-reaching-it in everyday life. In fact, an infinite number of alternate "you"s, having split off from your Everett branch just a second ago, are now suffering through this.
The ratio of "you"s in your current state to those in the one described above is very high, as the probability of continued survival in such a state for any length of time is infinitesimal, but it is not zero. Consider, however, that this ratio decreases massively as you age, and that virtually all instances of you will be in such a state 200 years from now unless immortality is achieved.
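To make the shape of that claim concrete, here is a deliberately crude sketch. It is not part of the original argument, and every hazard rate in it is an invented placeholder, chosen only so that ordinary survival becomes essentially impossible past roughly a normal lifespan while the barely-alive state can linger with a tiny but nonzero probability each year.

```python
# A toy Markov-chain sketch of the ratio argument above; every number is
# invented purely for illustration, not a real mortality model.
# States per surviving instance: "healthy" and "critical" (barely alive, in
# agony). We track branch measure year by year and then ask: of the measure
# that is still alive after 200 years, how much sits in the critical state?

def surviving_measure(years=200):
    healthy, critical = 1.0, 0.0
    for year in range(years):
        # Hypothetical per-year hazards: a Gompertz-like death rate that makes
        # ordinary survival essentially impossible after ~80 more years, a small
        # chance of dropping into the critical state, and a (generously large)
        # 1% chance that a critical instance lingers another year.
        p_die = min(1.0, 0.001 * 1.09 ** year)
        p_to_critical = min(0.002, 1.0 - p_die)
        p_critical_lingers = 0.01

        healthy, critical = (
            healthy * (1.0 - p_die - p_to_critical),
            healthy * p_to_critical + critical * p_critical_lingers,
        )

    alive = healthy + critical
    print(f"total surviving measure: {alive:.2e}")
    print(f"fraction of survivors in the critical state: {critical / alive:.4f}")

surviving_measure()
```

Run as written, the surviving measure after 200 years is astronomically small, and essentially all of it is in the critical state, which is the qualitative point being made here.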
There is one bright spot, however:
As time marches on, the continued prolongation of your suffering/death-that-will-not-come becomes increasingly unlikely. Therefore, it will eventually be overtaken by the probability that the universes wherein you still persist also contain an entity (an AI?) that is both capable and willing to rescue you. Assuming you still care and haven't gone insane already.
Another interesting thing: The above does not apply if your potential death is very sudden (so you won't feel it) and thorough (so there is an extremely low chance of survival). This means that while playing Russian roulette for money is unreasonable, playing Russian roulette for money using nukes instead of a revolver is entirely reasonable and recommended :-)
36 comments
comment by Vladimir_Nesov · 2011-10-27T20:22:34.117Z · LW(p) · GW(p)
The basic idea seems to be that, as anything that can happen does happen and it is impossible to remember your own death, every living thing is immortal.
Note that this statement talks about a ritual of cognition, not about the world: it talks about what one can remember, but obviously it's possible to infer that in some circumstances you'll die, or that in counterfactuals following different past events you've died. So this kind of "immortality" is an artefact of an artificial limitation on the ways of conceptualizing the real world, one that can easily be lifted and thus shown to be not about an actual property of the world.
Replies from: DanielVarga, FeepingCreature
↑ comment by DanielVarga · 2011-10-29T00:47:51.621Z · LW(p) · GW(p)
Please expand this into a top-level post that we can link to whenever somebody starts to talk about quantum immortality.
↑ comment by FeepingCreature · 2012-08-24T09:34:20.512Z · LW(p) · GW(p)
I think it goes deeper than that. Basically, a purely egoistic being would want to optimize their predictions for futures in which they live, giving rise to QI. However, any being would want others to optimize their predictions for universes in which it lives, so that it can trust those others to not screw it over in universes where the others die. The stability point is thus caring about any world where your social context (i.e. the people who you need to trust you) survives.
QI is not a statement of anticipation, it's a weighing of futures, and a quite egocentric one at that. The logic falls away almost as soon as you care about anybody else.
[edit] Prediction: if this holds in general, people with families ought to be more accepting of their death.
Replies from: Vladimir_Nesov
↑ comment by Vladimir_Nesov · 2012-08-24T13:02:36.344Z · LW(p) · GW(p)
In many possible worlds or specifically possible futures, there is a period of time where you live, and then an event of dying. If you care about yourself, you care about this event, you want to control it, and so you want to take these possible worlds into account. There might even be possible futures where you start out not being alive and then you are alive in them (revived somehow). You'd want to take these into account as well.
comment by ArisKatsaris · 2011-10-28T01:47:55.140Z · LW(p) · GW(p)
Quantum immortality treats awareness and death the same way Zeno treats Achilles and the Tortoise. By focusing only on the multitude of positions of Achilles before he reaches the tortoise, one can imagine that Achilles never reaches it, just as by focusing on the multitude of alternate positions of awareness before death, one can imagine that one will never die.
I'm guessing that Quantum Immortality is simply false for the exact same reasons that Achilles reaches the Tortoise nonetheless.
But even if its argument had been valid, I'd imagine that the pain, boredom, and time-awareness centers of your brain would be destroyed in the vast majority of worlds, so you wouldn't be "suffering" anything, you'd just be waiting until revival or final death.
comment by HonoreDB · 2011-10-27T18:37:05.882Z · LW(p) · GW(p)
Hell of a scary afterlife you got in this multiverse, Missy...
--The Finale of the Ultimate Meta Mega Crossover
Replies from: Eliezer_Yudkowsky
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2012-08-15T03:00:19.462Z · LW(p) · GW(p)
Hence why the memetic hazard warning page advises signing up for cryonics after reading.
comment by Lapsed_Lurker · 2011-10-27T18:12:21.744Z · LW(p) · GW(p)
When playing with nukes, wouldn't that leave a lot of copies of you dying in agony from radiation poisoning or something when the nuke fizzles? Got any data on the reliability of nukes?
Replies from: pedanterrific, anotheruser
↑ comment by pedanterrific · 2011-10-27T18:24:27.521Z · LW(p) · GW(p)
This post made me spend entirely too much time thinking about the most foolproof ways of committing suicide.
Replies from: shminux, Lapsed_Lurker
↑ comment by Shmi (shminux) · 2011-10-27T18:48:41.649Z · LW(p) · GW(p)
No such thing as foolproof.
The MWI version of Murphy's law states: "Anything that can go wrong will go wrong, at least in some of the worlds."
Specifically, given that an attempted suicide is a classical event, the number of "quantum splits" required for it to happen is at least of the order of Avogadro's number. There is no way to create a classical device reliable enough to ensure that the odds of its malfunction are less than 1 in 10^23.
On the bright side, you can probably decrease the odds of malfunction to that of the background noise, i.e. your device would be as unlikely to malfunction as to be destroyed by a stray meteor strike, or by any other of the improbable events plaguing Wile E. Coyote.
Replies from: pedanterrific
↑ comment by pedanterrific · 2011-10-27T18:53:39.326Z · LW(p) · GW(p)
Yes. This is why thinking about it at all is probably too much - I keep coming up with ways it could go horribly wrong.
↑ comment by Lapsed_Lurker · 2011-10-27T18:39:07.045Z · LW(p) · GW(p)
I hope that that thinking does you no harm. I know there have been moments in my life when I might have pressed a 'cease to exist' button if I'd had one :(
Replies from: anotheruser, pedanterrific
↑ comment by anotheruser · 2011-10-27T18:53:04.572Z · LW(p) · GW(p)
You would only continue to exist in those instances in which you didn't press the button, and since ceasing to exist has no side effects like pain, you could never remember having pressed the button in any instance. The only result this would have had is that the more depressed instances of you would have been more likely to press the button, which would mean that you would, ironically, actually be happier in total, as the less happy instances would have disappeared.
I wonder if that line of reasoning could be applied? Hover your hand over the detonator of a nuke in front of you. All instances that walk away will necessarily be happy enough not to want to cease to exist. Thus, a nuke would make you a happier person :-)
disclaimer: The logic of the above paragraph may be intentionally flawed for the sake of sheer weirdness.
Replies from: pedanterrific
↑ comment by pedanterrific · 2011-10-27T18:56:04.265Z · LW(p) · GW(p)
All instances that walk away will necessarily be happy enough not to want to cease to exist. Thus, a nuke would make you a happier person :-)
You are not nearly pessimistic enough. It could, for instance, make you a more cowardly person, or a person more prone to sudden stroke or heart attack, or any number of other things.
↑ comment by pedanterrific · 2011-10-27T18:49:46.786Z · LW(p) · GW(p)
This topic has actually come up recently.
↑ comment by anotheruser · 2011-10-27T18:36:27.580Z · LW(p) · GW(p)
I was thinking that you would be standing directly next to the nuke.
comment by Osmium_Penguin · 2011-10-28T17:13:47.944Z · LW(p) · GW(p)
I generally resolve this issue with the observation that the awareness of misery takes quite a lot of coherent brainpower. By the time my perceptions are 200 years old, I suspect that they won't be running on a substrate capable of very much computational power — that is, once I pass a certain (theoretically calculable) maximum decrepitude, any remaining personal awareness is more likely to live in a Boltzmann brain than in my current body.
You see, after the vast majority of possible worlds perceive that I am dead, how likely is it that I will still have enough working nerves to accept any new sensory input, including pain? How likely is it that I'll be able to maintain enough memories to preserve a link to my 2011-era self? How likely is it that my awareness, running on a dying brain, will process thoughts at even a fraction of my current rate?
I suspect that after death, I'll quickly drift into an awareness that's so dreamlike, solipsistic, and time-lapsed that it's a bit iffy calling me an awareness at all. I may last until the end of time, but I won't see or do anything very interesting while I'm there. And no worries about the universe clotting with ghosts: as my entropy increases, I'll quickly become mathematically indistinguishable from everyone else, just as one molecule of hydrogen is very like another.
Quantum immortality is pretty certainly real, but it also has to add up to normality.
Replies from: diegocaleiro
↑ comment by diegocaleiro · 2013-02-08T19:28:12.178Z · LW(p) · GW(p)
I'm glad to see this view expressed.
comment by Vladimir_Nesov · 2011-10-27T20:23:24.869Z · LW(p) · GW(p)
For previous (fruitful) discussion of quantum immortality, see this post (linked from wiki page for Quantum immortality).
comment by lessdazed · 2011-10-28T08:40:26.881Z · LW(p) · GW(p)
Take a game of Russian roulette... In those in which you lose, however, you now have a gaping head wound... This means that while playing Russian roulette for money is unreasonable, playing Russian roulette for money using nukes instead of a revolver is entirely reasonable and recommended :-)
I consider the title of this post misleading.
comment by DavidPlumpton · 2011-10-27T19:50:38.543Z · LW(p) · GW(p)
It's not just suicide attempts. We should also consider aging. Imagine 1000 years from now. You are still alive, but no other human has ever lived past 130 or so. It would be time for you (the you in that Many Worlds branch) to conclude that Many Worlds is true and you're in for a bumpy eternity.
I could not follow why living longer raised the chance that the universe would contain an AI that would save you, however.
Replies from: khafra, Manfred
↑ comment by khafra · 2011-10-27T20:21:32.906Z · LW(p) · GW(p)
Assuming friendly AI is possible for a civilization like ours to develop, during every hour that such a civilization exists there is epsilon chance that it will be developed. Add 3^^^3 or so of those epsilons up, and you eventually get a pretty good chance.
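A minimal numeric sketch of that accumulation, with an entirely arbitrary per-hour probability standing in for epsilon (it is not an estimate of anything):

```python
# Cumulative chance of at least one "rescue" event over many hours,
# for a fixed, made-up per-hour probability.
eps = 1e-9  # hypothetical chance per hour that a capable, willing rescuer appears

for hours in (10**6, 10**9, 10**12):
    p_at_least_once = 1 - (1 - eps) ** hours
    print(f"{hours:>14,} hours -> P(rescue ever) ~ {p_at_least_once:.6f}")
```

However tiny the per-hour chance, enough hours of continued existence push the cumulative probability toward 1.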
BTW, the original post is why Michael Vassar called quantum immortality the most horrifying idea he's ever had to take seriously. I'm hoping for something like Hanson's Mangled Worlds.
↑ comment by Manfred · 2011-10-27T20:19:22.264Z · LW(p) · GW(p)
The chance that you would live to 1000 given many-worlds is exactly identical to the chance that you would live to 1000 given any other valid interpretation of quantum mechanics. So living to 1000 (or surviving death-traps, etc.) is not evidence for or against an interpretation of quantum mechanics.
Replies from: thelittledoctor, DavidPlumpton, NancyLebovitz
↑ comment by thelittledoctor · 2011-10-28T02:55:30.347Z · LW(p) · GW(p)
The chances of survival are the same, but the chances of observing one's own survival are hugely different (1 vs epsilon), so it'd be pretty strong evidence in favor of many-worlds.
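A rough sketch of the update being described here, simply taking the "1 vs epsilon" framing at face value (whether that framing is legitimate is exactly what the rest of this thread disputes); the prior and the epsilon value are placeholders:

```python
# Bayes update on "I observe my own survival" under the 1-vs-epsilon framing.
prior_mwi = 0.5           # placeholder prior credence in many-worlds
p_obs_given_mwi = 1.0     # you always find yourself in a surviving branch
p_obs_given_other = 1e-6  # "epsilon": chance of surviving many trigger pulls

posterior_mwi = (p_obs_given_mwi * prior_mwi) / (
    p_obs_given_mwi * prior_mwi + p_obs_given_other * (1 - prior_mwi)
)
print(f"posterior credence in many-worlds: {posterior_mwi:.6f}")  # ~0.999999
```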
↑ comment by DavidPlumpton · 2011-10-28T03:03:50.602Z · LW(p) · GW(p)
If the Copenhagen interpretation was real then Russian Roulette would get you soon enough. But if Many Worlds is true then all other observers see you die with normal frequency, but you perceive your existence continuing 100% of the time (but your head may be bleeding/brains still thinking while splattered on the wall, etc.).
Actually, I'm still trying to wrap my brains around that last part (ha ha). What if you die, but are spontaneously recreated a billion years later, does that count? I can't figure out a way to tell the difference...
Replies from: Manfred, ArisKatsaris
↑ comment by Manfred · 2011-10-28T03:18:38.657Z · LW(p) · GW(p)
If the Copenhagen interpretation was real then Russian Roulette would get you soon enough.
Only probably. Bullet could quantum tunnel through my head, for example. I don't know if you understand the original quantum suicide thought experiment very thoroughly.
you perceive your existence continuing 100% of the time (but your head may be bleeding/brains still thinking while splattered on the wall, etc.).
Nope. I have a naturalistic definition of "me" - if my brain is splattered on the wall, that's the end of that story. But that does not mean quantum immortality can't satisfy any definition of "me" I choose to use - see quantum tunneling, above.
↑ comment by ArisKatsaris · 2011-10-28T03:36:38.933Z · LW(p) · GW(p)
but you perceive your existence continuing 100% of the time
Do you think that your existence and your perception of your existence are ontologically fundamental things that either are or aren't, that either continue or don't continue?
What if the bullet destroys your memories, but not the perception of your existence? Are "you" still continuing? What if the bullet destroys the pattern of your self-perception, so that you still have your memories, but the way you process thought and memory is sufficiently different that you don't feel like you're the same person as you were before the bullet?
The "Quantum Immortality" fallacy depends on treating people's existence as an ontologically fundamental thing that somehow "continues", dragging along memory and sense of self-awareness both. And probably your tastes in music and your movie preferences as well, it's convenient like that.
↑ comment by NancyLebovitz · 2011-10-27T21:47:15.808Z · LW(p) · GW(p)
Wouldn't you be more likely to live to be 1000 in universes with anti-aging tech?
Replies from: Manfred
↑ comment by Manfred · 2011-10-27T23:04:39.626Z · LW(p) · GW(p)
Anti-aging tech is also not correlated with interpretation of quantum mechanics.
Replies from: NancyLebovitz
↑ comment by NancyLebovitz · 2011-10-28T12:23:30.539Z · LW(p) · GW(p)
True. I was thinking about the odds of being miserable, and I think they're getting overestimated.
After all, just barely hanging on in great pain is a fragile condition. Living in a branch where good anti-aging tech is feasible, and there are many 1000-year old people, means that you're healthy-- much more likely to live to see the next day.
Replies from: Manfred
comment by JenniferRM · 2011-10-28T00:32:44.442Z · LW(p) · GW(p)
Another obligatory link is probably Divided By Infinity :-)
comment by czeslaw · 2011-12-19T08:33:02.918Z · LW(p) · GW(p)
In the classical world, information can be copied and deleted at will. In the quantum world, however, the conservation of quantum information means that information can be neither created nor destroyed. This concept stems from two fundamental theorems of quantum mechanics: the no-cloning theorem and the no-deleting theorem. http://www.physorg.com/news/2011-03-quantum-no-hiding-theorem-experimentally.html It means that our consciousness is written elsewhere in space due to a certain program. If somebody knows this program, he can revive that consciousness.
comment by antigonus · 2011-10-27T20:49:44.277Z · LW(p) · GW(p)
I think this is an important topic. Quantum immortality is Very Bad if it turns out playing quantum Russian roulette is the right thing to do.
Although in fairness, you probably shouldn't expect to be in agony forever. After a certain point in time, you should expect most of your surviving, conscious branches to have lost as much brain function as possible while still remaining alive and conscious. I assume that means you'd lose inessential things like sensory perception or sanity.