Roughly you

post by JDR · 2016-04-21

Since, like everyone, I generalise from single examples, I expect most people have some older relative or friend who they feel has added some wisdom to their life - some small pieces of information which seem to have pervasively wormed their way into more of their cognitive algorithms than one would expect, colouring and informing perceptions and decisions. For me, that person is my grandfather. Over his now 92 years he has given me gems such as "always cut a pear before you peel it" (make quick checks of the value of success before committing to time-consuming projects) and, whenever someone says "that's never happened before", finishing their sentence with "said the old man when his donkey died" (just because something hasn't happened before doesn't mean it wasn't totally predictable).

Recently, though, I've been thinking about something else he has said, admittedly in mock seriousness: "If I lose my mind, you should take me out back and shoot me". We wouldn't, and he wouldn't expect us to, but it's what he has said.

The reason I've been thinking of this darker quotation is that I've been spending a lot of time with people who have "lost their minds" in the way that he means. I am a medical student on a rotation in old age psychiatry, so I have been talking to patients, most of whom have some level of dementia, often layered with psychotic conditions such as intractable schizophrenia; some of them increasingly can't remember their own pasts, let alone their recent present. They can become fixed in untrue beliefs, their emotional range can become limited, or they can lose the motivation to complete even simple tasks.

It can be scary. In some ways, such illness represents death by degrees. These people can remain happy and have a good quality of life, but it's certain that they are not entirely the people they once were. In fact, this is a question we have asked relatives when deciding whether someone is suffering from early dementia: "Overall, in the way she behaves, does this seem like your mother to you? Is this how your mother acts?". Sometimes, the answer is "No, it's like she is a different person", or "Only some of the time". It's a process of personality-approximation, blurring, abridging and changing the mind to create something not quite the same. What my grandfather fears is becoming a rough estimate of himself - though again, for some, that re-drawn person might be perfectly happy with who they are when they arrive.

Why is this of interest to LessWrong? I think it is because quite a few people here (me included) have at least thought about bidding to live forever using things like cryonics and maybe brain-download. These things could work at some point; but what if they don't work perfectly? What if the people of the future can recover some of the information from a frozen brain, but not all of it? What if we had to leave out a few crucial memories, a few talents, maybe 60 points of IQ? Or even more subtle things - it's been written a few times that the entirety of who a person is resides in their brain, but that's probably not entirely true: the brain is influenced by the body, and aspects of your personality are probably shaped by how sensitive your adrenals are, the amount of fat you have, and even the community of bacteria in your intestines. Even a perfect neural computer-you wouldn't have these things; the difference would be subtle, but the created immortal agent wouldn't completely be you, as you are now. Somehow, though, missing my precise levels of testosterone seems an acceptable compromise for the rest of my personality living forever, while missing the memory of my childhood, half my intelligence or my ability to change my opinion would leave me a lot less sure.

So here's the question I want to ask, to see what people think: If I offered you partial immortality - immortality for just part of you - how rough an approximation of "you" would you be willing to accept?

Comments

comment by gjm · 2016-04-21

I don't think it's just a matter of closeness of approximation.

  • I would feel much better about being replaced by an approximation with an extra 20 IQ points, than by an approximation with 20 fewer.
  • I would feel much better about being replaced by an approximation with extra memories of things that never actually happened to me, than by an approximation with lots of memories missing.
  • I would feel much better about being replaced by an approximation that had gained a keen interest in (say) cricket, than by an approximation that had lost its interest in (say) classical music.

The common theme here is that gains are better than losses, which is a bit content-free given that that's how we choose what to call a gain and what to call a loss. (But not wholly content-free. Think of it like this: I would be happier about becoming a part of someone "bigger" than I would be about having only a part of me survive.)

  • If some version of me is still alive in 50 years and is substantially different in personality, opinions, skills, etc., I think I would feel rather better about it if those changes happened gradually than if they happened suddenly; and better if they happened organically in response to external events, new evidence, etc., than if they happened because someone rewrote my personality.

I fear this is mostly because "normal" things feel less threatening than "weird" things, and gradual mostly-endogenous changes are more normal than sudden exogenous ones.

As for "partial immortality" ... I think James_Miller has a point, which I will make more explicitly as follows: if you are going to grant anyone immortality, then good for them (unless their existence is horrible), so I would be glad about it even if they bore no relation to me at all. Perhaps the actual question is more "how much like me would this immortal being have to be in order for it to feel as if I am benefiting?". I'm not sure that's a question I should be trying to answer -- it seems to be encouraging a wrong way of thinking about personal identity.

comment by woodchopper · 2016-04-25

Why would something that is not, atom for atom, exactly what you are now be 'you'?

comment by DanArmak · 2016-04-23

If the future me is similar enough to present me, I want him to live. If he's too dissimilar to count as me, I have nothing against him living (although I may not want to spend a lot of money on it). The only reason I would want someone not to live is if that someone had a poor quality of life. If I had an option to sign up for cryonics, I would examine the clauses that (try to) determine under what conditions and with what expectations the frozen people are revived.

comment by Elo · 2016-04-21

In brief: above 90% for me personally.

I also wonder what fraction of an approximation of "you" having a child - or a few - would be.

comment by Gleb_Tsipursky · 2016-04-21

bidding to live forever using things like cryonics and maybe brain-download. These things could work at some point; but what if they don't work perfectly? What if the people of the future can recover some of the information from a frozen brain, but not all of it?

I want to challenge this premise in relation to cryonics. Given that technology will improve, and that you indicate you don't want to be one of the first ones whose information is recovered, there's a high probability that things will work out pretty well. After all, the kind of future where they choose to unfreeze people is one that's resource-rich - otherwise, why add more mouths to feed? So any future where you're likely to wake up is likely a good one.

comment by Dustin · 2016-04-21

I expect most people have some older relative or friend who they feel has added some wisdom to their life

Interesting, I expect this to not be the case...but my confidence in my expectation is weak.

If I offered you partial immortality - immortality for just part of you - how rough an approximation of "you" would you be willing to accept?

I think this question hides a pretty fundamental assumption: that we can talk coherently about "a rough approximation of 'you'" at all.

If I'm missing 90% of my memories but take no hit to my IQ, I'd definitely accept that. But I'm not even sure the distinction between memories and IQ makes any sense. Would I accept coming back with an IQ of 60? Well, I don't think most people with an IQ of 60 want to die, so yes.

I think the only thing I can say with certainty is that if the state I come back in feels like it has some continuity with the current me and this future state does not want to exist, then I do not want to come back in that state.

I'm not even sure how coherent it is to say that last bit. For example, if you were to ask me what my wishes are should I develop Alzheimer's...I'm not positive that I have any claim over the disposition of this future being who shares some sort of continuity of physical existence with the me of now. To make claims about what should or should not be done to me at that point feels a little wrong. On the other hand, I am forced to make guesses at what future states of me would prefer the me of now to do, so that I can make decisions about what to do now.

Is it possible to exist in a state where it's impossible for me to decide whether or not I want to exist in that state, while it still makes a difference whether or not I exist in that state? A rock, as far as I know, cannot make such decisions, but then I don't think it makes any difference whether the rock exists or not. A worm doesn't seem to be able to make any decisions in a manner that carries any important weight for me, and I don't think it makes any difference whether it exists or not. A me with an IQ of 60 seems like it can make decisions about whether or not it exists. When it comes to a state of me with no ability to decide whether or not I want to exist...I have no idea whether that me should or should not exist. I also have no idea whether it's even coherent to call that state a state of me.

Anyway, I'm just typing out a stream of thoughts without any coherent philosophy backing them. Which isn't to say that I haven't attempted to tackle the question; I'm just not smart enough to come to a satisfying answer.

comment by James_Miller · 2016-04-21

Anything >0 so long as life was better than non-existence for my successor.