[LINK] Why Cryonics Makes Sense - Wait But Why
post by Paul · 2016-03-25T11:41:00.574Z · LW · GW · Legacy · 29 comments
Wait But Why published an article on cryonics:
http://waitbutwhy.com/2016/03/cryonics.html
29 comments
Comments sorted by top scores.
comment by James_Miller · 2016-03-25T15:50:34.442Z · LW(p) · GW(p)
Great article. I wonder if it will increase cryonics memberships?
↑ comment by Elo · 2016-03-26T09:39:15.574Z · LW(p) · GW(p)
I expect it will.
↑ comment by Soothsilver · 2016-04-05T06:02:36.814Z · LW(p) · GW(p)
Apparently you were right.
From Alcor:
"Some of you may be already be familiar with Tim Urban’s remarkable blog, Wait But Why. You might be among the 336,693 subscribers to Tim’s blog, or you might just have come across one of his stunning detailed and clever posts, such as on procrastination, the genius of Elon Musk, The AI Revolution, or Putting Time in Perspective.
A few days ago, Tim posted what is possibly the single best piece ever written on cryonics. Warning: It is long and, once you start reading it, you will find it hard to stop. Please use it to persuade your non-cryonicist friends and relatives! The blog post has already generated a surge in visits to Alcor.org, in people engaging Marji in online chat, and in serious requests for membership information packets. You can find it at Wait But Why: Why Cryonics Makes Sense."
↑ comment by abcd efgh (abcd-efgh) · 2020-06-12T02:29:46.939Z · LW(p) · GW(p)
I was wondering about this and emailed Alcor's leader, Mr. Max More. Apparently this article alone was responsible for over 25 memberships.
comment by qmotus · 2016-03-25T16:39:20.647Z · LW(p) · GW(p)
I suppose the article does a good job answering some of the common objections, but I still think the most important thing that's stopping people from signing up is the fact that they just don't care: after all, life sucks, but at least then you die.
That said, there is one argument that I find kind of powerful that articles like this don't usually touch on (for somewhat obvious reasons): the point made in, for example, the preface to the finale of the Ultimate Meta Mega Crossover, that if we actually live in an infinite multiverse/many-worlds/nested simulverse/etc, we may be bound to find ourselves resurrected by someone eventually anyways, and cryonics could be a way to try to make sure that someone is friendly.
I'm not really sure what to make of that argument though. I wonder if there's anybody who's signed up because of reasons like that, despite not having any interest in cryonics in general?
↑ comment by HungryHobo · 2016-03-28T16:12:28.376Z · LW(p) · GW(p)
"could be a way to try to make sure that someone is friendly."
I don't believe in nested simulverse etc., but I feel I should point out that even if some of those things were true, waking up one way does not preclude waking up one or more of the other ways in addition to that.
↑ comment by qmotus · 2016-03-29T12:53:14.489Z · LW(p) · GW(p)
I don't believe in nested simulverse etc
You mean none of what I mentioned? Why not?
but I feel I should point out that even if some of those things were true, waking up one way does not preclude waking up one or more of the other ways in addition to that.
You're right. I should have said "make it more likely", not "make sure".
↑ comment by HungryHobo · 2016-03-30T13:29:59.842Z · LW(p) · GW(p)
Why not?
Same reason I don't believe in god. As yet we have ~zero evidence for being in a simulation.
You're right. I should have said "make it more likely", not "make sure".
Your odds of waking up in the hands of someone extremely unfriendly are unchanged. You're just making it more likely that one fork of yourself might wake up in friendly hands.
↑ comment by qmotus · 2016-03-31T07:49:30.157Z · LW(p) · GW(p)
As yet we have ~zero evidence for being in a simulation.
We have evidence (albeit no "smoking-gun evidence") for eternal inflation; we have evidence for a flat and thus infinite universe; and string theory is right now our best guess at what the theory of everything is like. These all predict a multiverse where everything possible happens, and where somebody should thus be expected to simulate you.
Your odds of waking up in the hands of someone extremely unfriendly are unchanged. You're just making it more likely that one fork of yourself might wake up in friendly hands.
Well, I think that qualifies. Our language is a bit inadequate for discussing situations with multiple future selves.
↑ comment by HungryHobo · 2016-03-31T11:04:13.689Z · LW(p) · GW(p)
I find that about as convincing as "if you see a watch there must be a watchmaker" style arguments.
A number of ways have been theorized to test whether we're in various kinds of simulation, and so far they've all turned up negative.
String theory is famously bad at being usable to predict even mundane things, even if it is elegant, and "flat" is not the same as "infinite".
↑ comment by qmotus · 2016-03-31T15:38:22.114Z · LW(p) · GW(p)
I find that about as convincing as "if you see a watch there must be a watchmaker" style arguments.
I don't see the similarity here.
A number of ways have been theorized to test whether we're in various kinds of simulation, and so far they've all turned up negative.
Oh?
String theory is famously bad at being usable to predict even mundane things, even if it is elegant, and "flat" is not the same as "infinite".
It basically makes no new testable predictions right now. Doesn't mean that it won't do so in the future. (I have no opinion about string theory myself, but a lot of physicists do see it as promising. Some don't. As far as I know, we currently know of no good alternative that's less weird.)
↑ comment by Douglas_Knight · 2016-03-27T17:46:07.769Z · LW(p) · GW(p)
By "the preface" do you mean the "memetic hazard warnings"?
Concepts contained in this story may cause SAN Checking in any mind not inherently stable at the third level of stress. Story may cause extreme existential confusion. Story is insane. The author recommends that anyone reading this story sign up with Alcor or the Cryonics Institute to have their brain preserved after death for later revival under controlled conditions. Readers not already familiar with this author should be warned that he is not bluffing.
I don't think that is claiming that it is a rational response to claims about the world.
we may be bound to find ourselves resurrected by someone eventually anyways, and cryonics could be a way to try to make sure that someone is friendly.
This is a quantum immortality argument. If you actually believe in quantum immortality, you have bigger problems. Here is Eliezer offering cryonics as a solution to those, too.
↑ comment by qmotus · 2016-03-27T20:25:20.521Z · LW(p) · GW(p)
By "the preface" do you mean the "memetic hazard warnings"?
Yes.
I don't think that is claiming that it is a rational response to claims about the world.
I don't get this. I see a very straightforward claim that cryonics is a rational response. What do you mean?
This is a quantum immortality argument. If you actually believe in quantum immortality, you have bigger problems. Here is Eliezer offering cryonics as a solution to those, too.
I've read that as well. It's the same argument, essentially (quantum immortality doesn't actually have much to do with MWI in particular). Basically, Eliezer is saying that quantum immortality is probably true, it could be very bad, and we should sign up for cryonics as a precaution.
↑ comment by [deleted] · 2016-03-27T09:11:02.489Z · LW(p) · GW(p)
Why would someone make major decisions based on metaphysical interpretations of quantum physics that are lacking experimental verifiability? That seems like a poor life choice.
↑ comment by FeepingCreature · 2016-03-27T22:42:07.928Z · LW(p) · GW(p)
Tegmark 4 is not related to quantum physics. Quantum physics does not give an avenue for rescue simulations; in fact, it makes them harder.
As a simulationist, you can somewhat salvage traditional notions of fear if you retreat into a full-on absurdist framework where the point of your existence is to give a good showing to the simulating universes; alternately, risk avoidance is a good Schelling point for a high score. Furthermore, no matter how much utility you will be able to attain in Simulationist Heaven, this is your single shot to attain utility on Earth, and you shouldn't waste it.
It does take the sting off death though, and may well be maladaptive in that sense. That said, it seems plausible a lot of simulating universes would end up with a "don't rescue suicides" policy, purely out of a TDT desire to avoid the infinite-suicidal-regress loop.
I am continuously amused by how catholic this cosmology ends up by sheer logic.
↑ comment by qmotus · 2016-03-29T12:58:25.778Z · LW(p) · GW(p)
you can somewhat salvage traditional notions of fear ... Simulationist Heaven ... It does take the sting off death though
I find the often prevalent optimism on LW regarding this a bit strange. Frankly, I find this resurrection stuff quite terrifying myself.
I am continuously amused by how catholic this cosmology ends up by sheer logic.
Yeah. It does make me wonder if we should take a much more critical stance towards the premises that lead us to it. Sure enough, the universe is under no obligation to make any sense to us; but isn't it still a bit suspicious that it's turning out to be kind of bat-shit insane?
↑ comment by Lumifer · 2016-03-28T00:47:52.913Z · LW(p) · GW(p)
Why would someone make major decisions based on metaphysical interpretations of quantum physics that are lacking experimental verifiability?
As opposed to the usual "I've had a few beers and it seemed like a good idea at the time"..? X-)
↑ comment by qmotus · 2016-03-27T14:15:10.220Z · LW(p) · GW(p)
Perhaps you shouldn't. That said, it is recommended by Eliezer Yudkowsky, and his words often weigh quite heavily here.
I don't necessarily agree that lacking experimental verifiability means that we shouldn't take something into account when making decisions, if we have enough reasons to think that it's true nevertheless.
↑ comment by Menilik · 2016-04-03T22:49:26.854Z · LW(p) · GW(p)
This article made me think about how, since really the beginning of time, each generation of people has come up with some completely rational (at the time) argument for why they will get into the afterlife/heaven.
The Egyptians tried really hard to preserve the body using the best science they had available at the time. It kinda worked, because they never suffered from the 'second death' (but I guess just not in the way they hoped).
And that thinking forced me to have a good long think about whether cryonics is just the same as previous beliefs or whether it's something new. I did a video on it (no, I don't want to spam you guys), but this forum has the kind of people I really want to engage with: people who can both agree with and question something at the same time. So here's my take on what cryonics actually is: https://www.youtube.com/watch?v=etRz6qjVXs0
comment by Error · 2016-03-25T15:37:31.944Z · LW(p) · GW(p)
Heh. Damn, beat me to it.
I find this interesting because Tim doesn't seem to come from the LW-sphere, but still clicked on arguments that I typically associate with LW-type people. That may say more about what I'm exposed to than anything else, though.
↑ comment by SquirrelInHell · 2016-03-26T12:44:00.006Z · LW(p) · GW(p)
He explicitly quotes Eliezer, e.g.:
Most cryonicists have a hunch that you can survive cryopreservation intact (cryonicist Eliezer Yudkowsky argues that “successful cryonics preserves anything about you that is preserved by going to sleep at night and waking up the next morning”) but they also admit that this is yet another variable they’re not sure about. You might even want to consider this a fifth “If” to add onto our list: If what seems to be a revived me is actually me…
and also mentions at the end that he has noticed Eliezer's writings commonly turn out to have something in common with what he wants to write about.
↑ comment by Error · 2016-03-27T14:42:21.835Z · LW(p) · GW(p)
I think maybe we just interpreted it differently. That reads to me like someone on the outside coming in, not someone on the inside going out.
↑ comment by SquirrelInHell · 2016-03-28T13:02:53.686Z · LW(p) · GW(p)
My interpretation was the same as yours, and I never said anything that contradicts it. I just provided some relevant information.
You might have exhibited a tendency to assume that all arguments attack your position by default?
↑ comment by Error · 2016-03-29T01:49:01.440Z · LW(p) · GW(p)
Possibly.
I'm going to stick to possibly, though after a moment's mental grappling I realized that if I answer 'yes' to your question I'm acknowledging its truth, while if I answer 'no' I'm demonstrating its truth... If that was intentional, well played. :-P
↑ comment by SquirrelInHell · 2016-03-29T01:53:51.801Z · LW(p) · GW(p)
Herherher. This is why I always thought I would fit the role of an evil mastermind.