Near-mode cryonics: A thought experiment
post by Mati_Roy (MathieuRoy) · 2023-04-09T22:21:19.704Z
This is a question post.
(x-post: LessDead)
Set-up: In a world that’s otherwise the same, an anthropomorphic God comes to you with a gun and a check and offers you the following deal, which you may accept or refuse. (God can see the future perfectly and is honest.)
Dilemma: God will give you a check for X USD if you agree to be immediately and instantly killed iff the current cryonics protocol is sufficient to preserve identity.
Alternative formulation: God kills you iff the following is true: if you were signed up for cryonics and died this year, you would be reanimated at some point in the future.
Question: What’s the minimum X for which you would accept this deal? What probability of dying are you trading this for?
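One way to connect the two answers (a sketch under an assumption not in the post, namely a dollar value V you place on your remaining life): if p is your credence that the current protocol preserves identity, a simple expected-value calculation says to accept whenever X > p × V, so the minimum X you’d accept reveals p ≈ X/V. For instance, X = 100,000 USD with an assumed V = 2,000,000 USD corresponds to p = 5%, the numbers used in the comparison below.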
Comparison:
Say your answers are 100,000 USD and 5%. Taking the deal then gives you:
- 100% chance of gaining 100k USD
- 5% chance of dying (right now, instead of ~0%)
If you die and get cryopreserved:
- 100% chance of losing 100k USD (i.e. a typical cost for state-of-the-art cryonics)
- 5% chance of living (instead of 0%)
If you die and don’t get cryopreserved:
- 100% chance of saving 100k USD (by not paying for cryonics)
- 5% higher chance of dying (100% instead of 95%)
In this sense, taking the deal is similar to not getting cryopreserved (and vice versa).
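To make the arithmetic above concrete, here is a minimal sketch in Python. The value_of_life figure is my assumption for illustration; the check amount, probability, and cryonics cost come from the example above.

```python
# Illustrative expected-value comparison for the thought experiment.
# value_of_life is an assumed figure, not something the post specifies.

value_of_life = 2_000_000  # assumed dollar value placed on one's remaining life
p_cryo_works = 0.05        # credence that the current protocol preserves identity
check = 100_000            # God's check, X (from the example above)
cryo_cost = 100_000        # typical cost of state-of-the-art cryonics (per the post)

# Taking God's deal: gain the check for sure, die with probability p_cryo_works.
ev_deal = check - p_cryo_works * value_of_life

# Dying while signed up for cryonics: pay the cost for sure,
# live again with probability p_cryo_works.
ev_cryo = -cryo_cost + p_cryo_works * value_of_life

print(f"EV of taking the deal:          {ev_deal:+,.0f} USD")
print(f"EV of dying signed up for cryo: {ev_cryo:+,.0f} USD")
```

With these inputs both expected values come out to exactly zero: the stated X and probability sit at the indifference point, and the two gambles are mirror images of each other, which is the post's point.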
Disanalogies:
There are significant disanalogies, of course:
- The type of life you would live
- How long your life would be if you didn’t die
- The social cost of signing up for cryonics
I still think this might be a useful thought experiment for bringing this dilemma into a more near-mode way of thinking. It also reverses the default option.
Answers
2 comments
comment by Jiro · 2023-04-11T00:06:00.911Z
For anyone who thinks that cryonics is Pascal's mugging, this thought experiment amounts to "if you don't want to accept Pascal's mugging, how about I construct an opposite Pascal's mugging? I gotcha now--if you reject the first Pascal's mugging, you have to accept this one!" Cryonics is a Pascal's mugging with a small chance of a large benefit and a large chance of a smaller loss. This version is a small chance of a large harm and a large chance of a smaller benefit.
Pascal's mugging is a bad deal either way. I won't accept a very small X in this version merely because I think that the chance of success of cryonics is small, even if the calculation works out that way--the proper response to Pascal's mugging is to not calculate.
comment by Dagon · 2023-04-12T14:00:34.067Z
This is a very complicated setup that is hard (for me) to internalize, meaning it doesn't evoke new framings and doesn't serve any purpose (again, for me) as a thought experiment. That it's stated with multiple levels of if and iff makes it harder, but I don't have any suggestions (except maybe a flowchart or decision matrix) for improving it.
It also doesn't help that it mixes dollars and life, asking for a dollar value against the possibility (even if infinitesimal) of death. Nor that it supposes an omniscient god, which, if I had sufficient evidence to believe in it in the first place, would SERIOUSLY change my expectations and probabilities.
And my underlying probability IS infinitesimal that I will die in such a way that current cryopreservation systems (tech, legal, and social) actually lead to my future resurrection.