Continuity in Uploading

post by Error · 2014-01-17T22:57:22.853Z · LW · GW · Legacy · 88 comments

I don't acknowledge an upload as "me" in any meaningful sense of the term; if I copied my brain to a computer and my body was then destroyed, I would still think of that as death and would try to avoid it.

A thought struck me a few minutes ago that seems like it might get around that, though. Suppose that rather than copying my brain, I adjoined it to some external computer in a kind of reverse-Ebborian act; electrically connecting my synapses to a big block of computrons that I can consciously perform I/O to. Over the course of life and improved tech, that block expands until, as a percentage, most of my thought processes are going on in the machine-part of me. Eventually my meat brain dies -- but the silicon part of me lives on. I think I would probably still consider that "me" in a meaningful sense. Intuitively I feel like I should treat it as the equivalent of minor brain damage.

Obviously, one could shorten the period of dual-life arbitrarily and I can't point to a specific line where expanded-then-contracted-consciousness turns into copying-then-death. The line that immediately comes to mind is "whenever I start to feel like the technological expansion of my mind is no longer an external module, but the main component," but that feels like unjustified punting.

I'm curious what other people think, particularly those that share my position on destructive uploads.

---

Edited to add:

Solipsist asked me for the reasoning behind my position on destructive uploads, which led to this additional train of thought:

Compare a destructive upload to non-destructive. Copy my mind to a machine non-destructively, and I still identify with meat-me. You could let machine-me run for a day, or a week, or a year, and only then kill off meat-me. I don't like that option and would be confused by someone who did. Destructive uploads feel like the limit of that case, where the time interval approaches zero and I am killed and copied in the same moment. As with the case outlined above, I don't see a crossed line where it stops being death and starts being transition.

An expand-contract with interval zero is effectively a destructive upload. So is a copy-kill with interval zero. So the two appear to be mirror images, with a discontinuity at the limit. Approach destructive uploads from the copy-then-kill side, and it feels clearly like death. Approach them from the expand-then-contract side, and it feels like continuous identity. Yet at the limit between them they turn into the same operation.
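
One way to make the symmetry explicit (illustrative notation only, not anything from the original argument):

```latex
% t    = length of the overlap interval
% C(t) = copy, then kill the original after time t
% E(t) = expand into the machine over time t, then the meat brain dies at t
% J    = the intuitive identity judgment
%
% For every t > 0:  J(C(t)) = death,  J(E(t)) = survival,
% yet both families converge to the same operation:
%
%   \lim_{t \to 0^{+}} C(t) \;=\; \lim_{t \to 0^{+}} E(t) \;=\; \text{destructive upload}
%
% so J is discontinuous exactly at that shared limit.
```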

88 comments

Comments sorted by top scores.

comment by scientism · 2014-01-18T02:15:50.926Z · LW(p) · GW(p)

I agree that uploading is copying-then-death. I think you're basically correct with your thought experiment, but your worries about vagueness are unfounded. The appropriate question is: what counts as death? Consider the following two scenarios: 1. A copy of you is stored on a supercomputer and you're then obliterated in a furnace. 2. A procedure is being performed on your brain, you're awake the entire time, and you remain coherent throughout. In scenario 1 we have a paradigmatic example of death: obliteration in a furnace. In scenario 2 we have a paradigmatic example of surviving an operation without harm. I would say that, if the procedure in 2 involves replacing all or part of your brain, whether it is performed swiftly or slowly is unimportant. Moreover, even if you lost consciousness, it would not be death; people can lose consciousness without any harm coming to them.

Note that you can adjust the first scenario - say, by insisting that the copy is made at the instant of death or that the copying process is destructive as it transpires, or whatever - but the scenario still could go as described. That is, we are supposed to believe that the copy is a continuation of the person despite the possibility of inserting paradigmatic examples of death into the process. This is a clear case of simply stipulating that 'death' (and 'survival') should mean something entirely different. You can't hold that you're still speaking about survival, when you insist on surviving any number of paradigmatic cases of death (such as being obliterated in a furnace). There are no forms of death - as we ordinarily conceive of death - that cannot be inserted into the uploading scenario. So we have here as clear a case of something that cannot count as survival as is possible to have. Anybody who argues otherwise is not arguing for survival, but stipulating a new meaning for the words 'survival', 'death', etc. That's fine, but they're still dead, they're just not 'dead'.

I think this realisation makes understanding something like the brain transplant you describe a little easier. For we can say that we are living so long as we don't undergo anything that would count as dying (which is just to say that we don't die). There's nothing mysterious about this. We don't need to go looking for the one part of the body that maintains our identity under transformation, or start reifying information into a pseudo-soul, or whatever. We just need to ensure whatever we do doesn't count as death (as ordinarily conceived). Now, in the case of undergoing an operation, there are clear guidelines. We need to maintain viability. I cannot do certain things to you and keep you alive, unless I perform certain interventions. So I think the answer is quite simple: I can do anything to you - make any change - as long as I can keep you alive throughout the process. I can replace your whole brain, as long as you remain viable throughout the process, and you'll still be alive at the end of it. You will, of course, be 'brain dead' unless I maintain certain features of your nervous system too. But this isn't mysterious either; it's just that I need to maintain certain structural features of your nervous system to avoid permanent loss of faculties (such as motor control, memory, etc). Replacement with an artificial nervous system is likewise unproblematic, as long as it maintains these important faculties.

A lot of the confusion here comes from unnecessary reification. For example, the fact that the nervous system must be kept structurally intact to maintain certain faculties does not mean that it somehow 'contains' those faculties. You can replace it at will, so long as you can keep the patient alive. The person is not 'in' the structure (or the material), but the structure is a prerequisite for maintaining certain faculties. The common mistake here is thinking that we must be the structure (or pattern) if we're not the material, but neither claim makes sense.

Alternatively, say you have a major part of your brain replaced, and the match is not exact. Somebody might, for example, point out that your personality has changed. Horrified, you might wonder, "Am I still me?" But this question is clearly absurd. There is no sense in which you could ask if you are still you. Nor can you coherently ask, "Did I die on the operating table?" Now, you might ask whether you merely came into existence on the operating table, after the original died, etc. But this, too, is nonsense. It assumes a reified concept of "self" or "identity." There is nothing you can "lose" that would count as a prior version of you 'dying' and your being born anew (whether slightly different or not). Of course, there are such things as irreversible mental degradation, dementia, etc. These are tragic, and we rightfully speak of a loss of identity, but there'd be no such tragedy in a temporary bout of dementia. A temporary loss of identity is not a loss of identity followed by the gaining of a new identity; it's a behavioural aberration. A temporary loss of identity with a change in temperament when one recovers is, likewise, unproblematic in this sense; we undergo changes in temperament regardless. Of course, extreme change can bring with it questions of loss of identity, but this is no more problematic for our scenario than an operation gone wrong. "He never fully recovered from his operation," we might say. Sad, yes, but this type of thing happens even outside of thought experiments.

Replies from: jpaulson, torekp
comment by Jonathan Paulson (jpaulson) · 2014-01-24T08:11:38.894Z · LW(p) · GW(p)

You are dodging the question by appealing to the dictionary. The dictionary will not prove for you that identity is tied to your body, which is the issue at hand (not "whether your body dies as the result of copying-then-death", which, as you point out, is trivial).

comment by torekp · 2014-01-20T02:33:27.691Z · LW(p) · GW(p)

All true, but it just strengthens the case for what you call "stipulating a new meaning for the words 'survival', 'death', etc". Or perhaps, making up new words to replace those. Contemplating cases like these makes me realize that I have stopped caring about 'death' in its old exact meaning. In some scenarios "this will kill you" becomes a mere technicality.

Replies from: scientism
comment by scientism · 2014-01-20T19:17:37.004Z · LW(p) · GW(p)

Mere stipulation secures very little, though. Consider the following scenario: I start wearing a medallion around my neck and stipulate that, so long as this medallion survives intact, I am to be considered alive, regardless of what befalls me. This is essentially equivalent to what you'd be doing in stipulating survival in the uploading scenario. You'd secure 'survival', perhaps, but the would-be uploader has a lot more work to do. You need also to stipulate that when the upload says "On my 6th birthday..." he's referring to your 6th birthday, etc. I think this project will prove much more difficult. In general, these sorts of uploading scenarios rely on the notion of something being "transferred" from the person to the upload, and it's this that secures identity and hence reference. But if you're willing to concede that nothing is transferred - that identity isn't transferrable - then you've got a lot of work to do in order to make the uploading scenario consistent. You've got to introduce revised versions of concepts of identity, memory, self-reference, etc. Doing so consistently is likely a formidable task.

I should have said this about the artificial brain transplant scenario too. While I think the scenario makes sense, it doesn't secure all the traditional science fiction consequences. So having an artificial brain doesn't automatically imply you can be "resleeved" if your body is destroyed, etc. Such scenarios tend to involve transferrable identity, which I'm denying. You can't migrate to a server and live a purely software existence; you're not now "in" the software. You can see the problems of reference in this scenario. For example, say you had a robot on Mars with an artificial brain with the same specifications as your own. You want to visit Mars, so you figure you'll just transfer the software running on your artificial brain to the robot and wake up on Mars. But again, this assumes identity is transferrable in some sense, which it is not. But you might think that this doesn't matter. You don't care if it's you on Mars, you'll just send your software and bring it back, and then you'll have the memories of being on Mars. This is where problems of reference come in, because "When I was on Mars..." would be false. You'd have at best a set of false memories. This might not seem like a problem, you'll just compartmentalise the memories, etc. But say the robot fell in love on Mars. Can you truly compartmentalise that? Memories aren't images you have stored away that you can examine dispassionately, they're bound up with who you are, what you do, etc. You would surely gain deeply confused feelings about another person, engage in irrational behaviour, etc. This would be causing yourself a kind of harm; introducing a kind of mental illness.

Now, say you simply begin stipulating "by 'I' I mean...", etc, until you've consistently rejiggered the whole conceptual scheme to get the kind of outcome the uploader wants. Could you really do this without serious consequences for basic notions of welfare, value, etc? I find this hard to believe. The fact that the Mars scenario abuts issues of value and welfare suggests that introducing new meanings here would also involve stipulating new meanings for these concepts. This then leads to a potential contradiction: it might not be rationally possible to engage in this kind of revisionary task. That is, from your current position, performing such a radical revision would probably count as harmful, damaging to welfare, identity destroying, etc. What does this say about the status of the revisionary project? Perhaps the revisionist would say, "From my revisionary perspective, nothing I have done is harmful." But for everyone else, he is quite mad. Although I don't have a knockdown argument against it, I wonder if this sort of revisionary project is possible at all, given the strangeness of having two such unconnected bubbles of rationality.

Replies from: torekp
comment by torekp · 2014-01-24T02:13:54.235Z · LW(p) · GW(p)

Now, say you simply begin stipulating "by 'I' I mean...", etc, until you've consistently rejiggered the whole conceptual scheme to get the kind of outcome the uploader wants. Could you really do this without serious consequences for basic notions of welfare, value, etc?

No, and that is the point. There are serious drawbacks of the usual notions of welfare, at least in the high-tech future we are discussing, and they need serious correcting. Although, as I mentioned earlier, coining new words for the new concepts would probably facilitate communication better, especially when revisionaries and conservatives converse. So maybe "Yi" could be the pronoun for "miy" branching future, in which Yi go to Mars as well as staying home, to be merged later. There is no contradiction, either: my welfare is what I thought I cared about in a certain constellation of cares, but now Yi realize that was a mistake. Misconceptions of what we truly desire or like are, of course, par for the course for human beings; and so are corrections of those conceptions.

comment by mwengler · 2014-01-20T22:27:16.465Z · LW(p) · GW(p)

Upload shmupload.

Let's remove unnecessary complications and consider the more essential question. You are knocked out. While unconscious, a particle-for-particle copy of you is made with all identical energy levels, momenta, spins, colors, flavors, and any other quantum states associated with any of the particles in your body. The only difference is that all the particles in the new copy are 3 m to the east of the particles in the original. The unconscious copies are placed someplace nice and revived approximately simultaneously.

Pretty obviously, neither copy feels like it is the other copy. Pretty obviously, each copy presumes it is the original, or presumes it has a 50:50 chance of being the original, but each copy thinks the same as the other, at least initially.

Further, suppose this process were done to you without your knowledge, and the original you were destroyed while still unconscious. When the copy came to, it would have no inkling that it was not you or that you were dead. It would think it was you and that you were not dead.

Is there really any important difference between your existence now, and one in which your physical body was replaced by a particle-for-particle copy every so often? To WHOM or WHAT is that difference experienced?

My own conclusion is that the continuity of my life is something of an illusion, and that my distaste for dying is primarily something bred into me by evolution; it is not hard to imagine how such a distaste would provide a significant survival advantage over humans without that trait.

I don't quite know what to do with this point of view, but one thing I don't do is pay money now to have my severed head frozen after I legally die. It is enough that I carry through on the more straightforward survival-instinct-type things that evolution has stuck me with.

Replies from: None, polarix
comment by [deleted] · 2014-01-21T07:53:32.150Z · LW(p) · GW(p)

Is there really any important difference between your existence now, and one in which your physical body was replaced by a particle-for-particle copy every so often? To WHOM or WHAT is that difference experienced?

Yes, it is of importance to the me right here, right now, in the present. Under one interpretation I wake up in the other room. In the other I do not - it is some other doppelgänger which shares my memories but whose experiences I do not get to have.

If I somehow find myself in the room with my clone, it's true that there's no way short of checking external evidence like security footage or somesuch to determine which is the real me. That is true. But that is a statement about my knowledge, not the world as it exists. The map is not the territory.

If I were to wake up in the other room with the clone nearby, it no longer matters which one of us is the original. He isn't me. He is a separate person that just happens to share all of the same memories and motivations that I have. I want to say that I wouldn't even give this copy of me the time of day, but that would be rhetorical. In some ventures he would be my greatest friend, in others my worst enemy. (Interestingly, I could accurately tell which right now by applying decision theory to variants of the prisoner's dilemma.) But even when I choose to interfere in his affairs, it is not for directly self-serving reasons - I help him for the same reason I'd help a really close friend, and I hurt him for the same reason I'd hinder a competitor.
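
(A minimal sketch of that prisoner's-dilemma point, with made-up payoffs: against an exact copy running the same decision procedure, only the matching-move outcomes are reachable, so the calculation comes out to cooperation.)

```python
# Hypothetical one-shot prisoner's dilemma payoffs (my payoff only).
PAYOFFS = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def move_against_exact_copy():
    # The clone runs the same decision procedure on the same inputs,
    # so its move necessarily equals mine; only (C, C) and (D, D) can occur.
    return max(["C", "D"], key=lambda m: PAYOFFS[(m, m)])

print(move_against_exact_copy())  # -> "C": cooperate with the clone
```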

The truth has real implications for the me that does exist, in the here and now. Do I spend not-insignificant sums of money on life insurance to cover cryonic preservation for me and my family, thereby foregoing other opportunities? Do I consider assisted suicide and cryonic preservation when I am diagnosed with a terminal or debilitating disease of the brain? Do I stipulate revival instead of uploading in my cryonics contract, knowing that it might mean never being revived if the technology cannot be developed before my brain deteriorates too much? Do I continue to spend time debating this philosophical point with other people on the Internet, in the hope that they too choose revival and there is safety in numbers?

Replies from: jpaulson
comment by Jonathan Paulson (jpaulson) · 2014-01-24T08:05:01.612Z · LW(p) · GW(p)

Under one interpretation I wake up in the other room. In the other I do not - it is some other doppelgänger which shares my memories but whose experiences I do not get to have.

I don't understand how to distinguish "the clone is you" from "the clone is a copy of you". Those seem like identical statements, in that the world where you continue living and the world where the clone replaces you are identical, atom for atom. Do you disagree? Or do you think there can be a distinction between identical worlds? If so, what is it?

He isn't me. He is a separate person that just happens to share all of the same memories and motivations that I have.

In the same sense, future-you isn't you either. But you are willing to expend resources for future-you. What is the distinction?

comment by polarix · 2014-01-25T19:43:21.229Z · LW(p) · GW(p)

Is there really any important difference between your existence now, and one in which your physical body was replaced by a particle-for-particle copy every so often? To WHOM or WHAT is that difference experienced?

Yes, to the universe as witnessed by an outside observer, and to the law. The important difference is haecceity, which to a naive inside view is currently meaningless, but to any intuitive observer or reflective agent becomes relevant. Objective history exists; it's just that we humans-within-universe simply cannot access it.

There are countless unknowns about the universe that we know about. There are also almost certainly unknown unknowns. Haecceity is currently essential to the sense of self, but in a more broadly aware context perhaps its value could be concretized and rationalized.

comment by PDH · 2014-01-18T15:12:47.634Z · LW(p) · GW(p)

It's not the book, it's the story.

Moby Dick is not a single physical manuscript somewhere. If I buy Moby Dick I'm buying one of millions of copies of it that have been printed out over the years. It's still Moby Dick because Moby Dick is the words, characters, events etc. of the story and that is all preserved via copying.

A slight difference with this analogy is that Moby Dick isn't constantly changing as it ages, gaining new memories and whatnot. So imagine that Melville got halfway through his epic and then ran out of space in the notebook that I want you to also imagine he was writing it in. So we have a notebook that contains the first half of Moby Dick (presumably, this is a pretty big notebook). Then he finishes it off in a second notebook.

Some time later he pulls a George Lucas and completely changes his mind about where his story was going ("Kill off Ahab? What was I thinking?") and writes a new version of the story where they go into a profitable, if ethically dubious, whaling business with rather more success than in the first version. This is then written up in a third notebook. Now we have three notebooks, the last two of which are both legitimate continuations of the first, carrying on from the exact same point at which the first notebook was ended.

There is no interesting sense in which one of these is some privileged original, as Eliezer puts it. If you can't get your head around that and want to say that, no, the published (in real life) version is the 'real' one, imagine that the published version was actually the third notebook. There is no equivalent of publication for identity that could confer 'realness' onto a copy. In real life, neither of them is notebook 1, but they're both continuations of that story.

If Will Riker discovers that he was non-destructively copied by the Transporter and that there's another version of him running around, he will likely think, 'I don't acknowledge this guy as 'me' in any meaningful sense.' The other guy will think the same thing. Neither of them are the same person they were before they stepped into the Transporter. In fact, you are not the same person you were a few seconds ago, either.

Identify yourself as the book and your concept of identity has big problems with or without uploading. Start by reconciling that notion with things like quantum physics or even simple human ageing and you will find enough challenges to be getting on with without bringing future technology into it.

But you are not some collection of particles somewhere. You are the story, not the book. It's just that you are a story that is still in the process of being written. Uploading is no different from putting Moby Dick on a Kindle. If there's still a meat version of you running around then that is also a copy, also divergent from the original. The 'original' is (or was) you as you were when the copy was made.

Replies from: randallsquared, None
comment by randallsquared · 2014-01-20T00:09:50.754Z · LW(p) · GW(p)

Moby Dick is not a single physical manuscript somewhere.

"Moby Dick" can refer either to a specific object, or to a set. Your argument is that people are like a set, and Error's argument is that they are like an object (or a process, possibly; that's my own view). Conflating sets and objects assumes the conclusion.

Replies from: PDH
comment by PDH · 2014-01-21T06:14:54.266Z · LW(p) · GW(p)

I'm not conflating them, I'm distinguishing between them. It's because they're already conflated that we're having this problem. I'm explicitly saying that the substrate is not what's important here.

But this works both ways: what is the non-question-begging argument that observer slices can only be regarded as older versions of previous slices in the case that the latter and the former are both running on meat-based substrates? As far as I can see, you have to just presuppose that view to say that an upload's observer slice doesn't count as a legitimate continuation.

I don't want to get drawn into a game of burden-of-proof tennis because I don't think that we disagree on any relevant physical facts. It's more that my definition of identity just is something like an internally-forward-flowing, indistinguishable-from-the-inside sequence of observer slices and the definition that other people are pushing just...isn't.

All I can say, really, is that I think that Error and Mark et al are demanding an overly strong moment-to-moment connection between observer slices for their conception of identity. My view is easier to reconcile with things like quantum physics, ageing, revived comatose patients etc. and that is the sort of thing I appeal to by way of support.

Replies from: randallsquared, None
comment by randallsquared · 2014-01-22T05:03:29.412Z · LW(p) · GW(p)

It's more that my definition of identity just is something like an internally-forward-flowing, indistinguishable-from-the-inside sequence of observer slices and the definition that other people are pushing just...isn't.

Hm. Does "internally-forward-flowing" mean that stateA is a (primary? major? efficient? not sure if there's a technical term here) cause of stateB, or does it mean only that internally, stateB remembers "being" stateA?

If the former, then I think you and I actually agree.

comment by [deleted] · 2014-01-21T08:04:27.560Z · LW(p) · GW(p)

All I can say, really, is that I think that Error and Mark et al are demanding an overly strong moment-to-moment connection between observer slices for their conception of identity. My view is easier to reconcile with things like quantum physics, ageing, revived comatose patients etc. and that is the sort of thing I appeal to by way of support.

Details please.

Quantum physics? Max Tegmark does this subject better than I have the time to. And btw, he's on our side.

Aging? Don't see the connection. You seem to argue that information patterns are identity, but information patterns change greatly as you age. Mark at age 12, the troubled teenager, is very different than Mark at age 29, the responsible father of two. But I think most people would argue they are the same person, just at two separate points in time. Why?

Comatose patients? Connection please? I am not aware what objective data you are pointing to on this.

Replies from: PDH
comment by PDH · 2014-01-21T16:49:00.401Z · LW(p) · GW(p)

I'm like a third of the way through that Tegmark paper and I agree with it so far as I understand it, but I don't see how it contradicts my view here. He claims that consciousness is a state of matter, i.e. a pattern of information. You can make a table out of a variety of materials; what matters is how the materials are arranged (and obviously brains are a lot more complicated than tables, but it's what they can do by virtue of their arrangement, in terms of the computations they can perform etc., that matters). To Tegmark (and I think to me, as well) consciousness is what certain kinds of information processing feel like from the inside. Which is pretty much exactly what I'm saying here (that is equivalent to the story in my Moby Dick analogy). If the information processing is indistinguishable from the inside and internally forward-flowing, in the sense that the resulting observer slice is a continuation of a previous one to the same degree as in meat-based humans, then mission accomplished. The upload was successful.

Aging? Don't see the connection. You seem to argue that information patterns are identity, but information patterns change greatly as you age. Mark at age 12, the troubled teenager, is very different than Mark at age 29, the responsible father of two. But I think most people would argue they are the same person, just at two separate points in time. Why?

I hold that Mark at age 29 is a legitimate continuation of Mark at age 12, but I also hold that this is true of Mark the upload, age 29. Neither is made of the same particles, nor do they have the same mental states, as Mark, age 12, so I don't see why one is privileged with respect to the other. I actually make this same point, with almost the same example, in support of my position that non-meat-based future observer slices are just as valid as meat-based ones.

As for comatose patients, some possible objections that someone could make to my view are that it doesn't constitute a legitimate continuation of someone's conscious narrative if there is a significant interruption to that narrative, if significant time has passed between observer slices or if the later observer slice is running on a significantly different substrate. However, someone revived from a coma after ten years, say, ought to still be regarded as the same person even though there has been a massive discontinuity in their conscious narrative, ten years have passed between observer slices and, even on classical physics, every single one of the particles of which they were composed prior to the coma has now been replaced, meaning they are now literally running on a different substrate.

comment by [deleted] · 2014-01-18T17:43:54.621Z · LW(p) · GW(p)

Changing the definition doesn't resolve the underlying issue...

Replies from: PDH
comment by PDH · 2014-01-19T00:03:14.868Z · LW(p) · GW(p)

It does if the underlying issue is not actually an issue unless you choose certain, in my opinion inadequate, definitions of the key terms. I can't force you not to do that. I can point out that it has implications for things like going to sleep that you probably wouldn't like, I can try my best to help resolve the confusions that I believe have generated those definitions in the first place, and I can try to flesh out, with tools like analogy, what I consider to be a more useful way of thinking about identity. Unfortunately, all of these things could potentially open me up to the charge of changing definitions, but if that's the case I can only plead guilty, because that's the appropriate response in situations where the debate happens to turn on the definitions of the relevant terms.

Error wrote that, in the case of non-destructive copying, he doesn't consider the upload to be a legitimate continuation of the copied entity, but he does consider the flesh-and-blood, 'meat' version still walking around to be exactly that. I guess the intuition here is that this case effectively settles the question of identity because you would have a flesh-and-blood human who would have first-hand knowledge that it was the real one ("How can he be me? I'm here!").

I totally get that intuition. I can see how to most people it would be just obvious that the Machine-Version of Error is not the Meat-Version of Error. It's because it's not!

The problem is that neither of those entities are the thing that was copied. What was copied was Error as he was at a particular moment. The Meat-Version isn't that. The Meat-Version is not made of the same particles, nor does he have the same mental states. The Meat Version is a legitimate continuation of the old Meat Version but so is the Machine Version.

I remember having my photograph taken at the seaside when I was a child. When I look at the child in that photograph now I regard myself as the same person. I know we're not made of the same particles, I know that I have memories of events that he hasn't experienced yet, knowledge that he doesn't have, a completely different personality...On my definition of identity, however, I get to call him 'me.' I can consistently point at this photograph and say, 'that was me, when I was a child.'

What I want to know is, how can someone who rejects this view of identity point at a picture of himself as a child and say the same thing without opening the door for a future upload to look at a photograph of him right now (i.e. before the upload) and say, 'that was me, when I was made of meat'?

Replies from: None
comment by [deleted] · 2014-01-19T06:13:13.656Z · LW(p) · GW(p)

Just because the machine version remembers what the meat version did doesn't mean the conscious meat version didn't die in the uploading process. Nothing you have said negates the death + birth interpretation. Your definitions are still missing the point.

Replies from: Leonhart
comment by Leonhart · 2014-01-19T13:55:22.504Z · LW(p) · GW(p)

Sure, something particular happens to the meat version. But (it is asserted) that thing happens to you all the time anyway and nobody cares. So the objection is to you wasting the nice short code "death" on such an unimportant process. This is a way in which words can be wrong.

Replies from: None
comment by [deleted] · 2014-01-19T18:59:28.289Z · LW(p) · GW(p)

Strawman assertion. Even while unconscious / asleep / anesthetized there is still a jumbled assortment of interdependent interactions going on in my brain. Those don't get broken apart and dematerialized "all the time" the way they do when being destructively uploaded.

Replies from: Leonhart
comment by Leonhart · 2014-01-19T20:53:38.504Z · LW(p) · GW(p)

That's not what strawman means. If you think what I've said is irrelevant or misses your point, say that.

As I said to Error, above, I'm referring to this.

Yes, uploading brains is going to be incredibly difficult and possibly impossible; and if any kind of upload process is sufficiently noisy or imperfect, that surely could result in something better described as a death-and-creation than a continuation. For the purpose of the argument, I thought we were assuming a solved, accurate upload process.

Replies from: None
comment by [deleted] · 2014-01-20T00:48:12.133Z · LW(p) · GW(p)

When you introduce something which is irrelevant and misses the point, and then use that to dismiss an argument, yes that is a strawman.

Back to the original issue, the "upload" scenario is usually expressed in the form: (1) somehow scan the brain to sufficient resolution, and (2) create a computer simulation using that data. Even if the scan and simulation were absolutely perfect, better than quantum physics actually allows, it still would be death-and-creation under the OP's framework.

I can't tell from your post if you are including "slowly transition brain into electronic medium" under the category of "uploading", but that is usually grouped under intelligence augmentation, and I don't know any material reductionist who thinks that would be a death-and-creation.

comment by jaime2000 · 2014-01-19T15:38:47.047Z · LW(p) · GW(p)

How do you feel about the continuous uploading procedure described in "Staring into the Singularity"?

comment by shminux · 2014-01-18T05:37:04.575Z · LW(p) · GW(p)

I don't acknowledge an upload as "me" in any meaningful sense of the term

What about if your mind is uploaded, then downloaded into your clone previously grown without any brain functions? Would you consider the new meat you as "you"?

Replies from: None
comment by [deleted] · 2014-01-18T09:41:39.956Z · LW(p) · GW(p)

What about if your mind is uploaded, then downloaded into your clone previously grown without any brain functions? Would you consider the new meat you as "you"?

Why would he? I predict he clearly would not, since he was already dead. What point are you trying to make?

Replies from: Error, shminux
comment by Error · 2014-01-20T17:43:38.861Z · LW(p) · GW(p)

Upvoted for correct prediction.

comment by shminux · 2014-01-18T21:24:24.399Z · LW(p) · GW(p)

My next question would have been about how Error feels about Star Trek-style transporters (which temporarily convert a person into a plasma beam).

Replies from: Aleksander
comment by Aleksander · 2014-01-19T17:21:50.718Z · LW(p) · GW(p)

And he in turn might respond by asking how you feel about thinking like a dinosaur.

Replies from: Error
comment by Error · 2014-01-20T17:46:33.885Z · LW(p) · GW(p)

I would not be enthusiastic about Star Trek transporters, no. And yes, I would like to know how shminux feels about thinking like a dinosaur; that does seem to capture my intuitions rather well.

Replies from: shminux
comment by shminux · 2014-01-20T19:17:50.080Z · LW(p) · GW(p)

First, I would like to acknowledge that ready access to cloning tech would require a significant reevaluation of metaethics.

Second, dealing with lost ACKs is the least of the worries, and the show's premise resulted from a poorly constructed communication protocol. So this particular issue can be solved technologically. For example: mandatory induced unconsciousness, similar to general anesthesia, for the duration of the transport and success-confirmation process, to prevent the subject from anxiously waiting for positive confirmation if the initial ACK is lost.
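
(A toy sketch of what such a protocol might look like; the class, function names, and retry policy below are all invented for illustration.)

```python
import random

class Link:
    """Toy channel whose acknowledgement is sometimes lost in transit."""
    def __init__(self, ack_loss_rate=0.3):
        self.ack_loss_rate = ack_loss_rate

    def send_pattern(self, pattern):
        pass  # assume the scanned pattern itself always arrives intact

    def wait_for_ack(self):
        return random.random() > self.ack_loss_rate  # ACK may be dropped

def transport(pattern, link, max_retries=5):
    """The subject is sedated before this call and stays unconscious for the
    whole exchange, so a lost ACK means a retry rather than an anxious wait."""
    for _ in range(max_retries):
        link.send_pattern(pattern)
        if link.wait_for_ack():
            return "confirmed: destination revives the copy"
    return "aborted: origin revives the still-unconscious original"

print(transport("scanned-brain-state", link=Link()))
```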

Just to have a taste of the real ethical issues of cloning, note that many forms of utilitarianism mandate immediately creating as many clones as possible as long as their lives are at least "barely worth celebrating", in Eliezer's words. Another issue is cloning the most useful individuals at the expense of the quality of life of the least useful. Refer to your favorite transhumanist sci-fi for more details and examples.

I recall a humorous one of Lem's Ijon Tichy stories (can't find a link ATM), where on one of the planets under constant heavy meteorite bombardment the mandatory logging and cloning tech was used as a routine way to revive the victims, replacing fatalities with minor inconveniences.

Finally, your and Mark_Friedenbach's aversion to radical versions of suspended animation is so foreign to me that I have trouble steelmanning your position.

Replies from: Aleksander
comment by Aleksander · 2014-01-20T23:50:03.926Z · LW(p) · GW(p)

I recall a humorous one of Lem's Ijon Tichy stories (can't find a link ATM), where on one of the planets under constant heavy meteorite bombardment the mandatory logging and cloning tech was used as a routine way to revive the victims, replacing fatalities with minor inconveniences.

It's the Twenty-Third Voyage in Star Diaries.

comment by trist · 2014-01-18T02:16:23.192Z · LW(p) · GW(p)

I find the whole question less confusing when viewed from the other direction. After the upload, the uploaded you will view the current you as its past. If the upload is nondestructive, the non-uploaded you will also.

Replies from: None
comment by [deleted] · 2014-01-18T09:44:23.071Z · LW(p) · GW(p)

What if I rewire your neurons so you think you're Donald Trump? Would that make you Donald Trump? If Mr. Trump died in a tragic boating accident tomorrow, could his family rest easy knowing that he didn't actually experience death, but lives on in you?

Replies from: trist, TheOtherDave, Leonhart
comment by trist · 2014-01-18T21:07:43.862Z · LW(p) · GW(p)

If you rewrite my neurons such that I have all of Donald Trump's memories (or connections) and none of my own, yes. If you only rewrite my name, no, for I would still identify with the memories. There's lots of space between those where I'm partially me and partially him, and I would hazard to forward-identify with beings in proportion to how much of my current memories they retain, possibly diluted by their additional memories.

Replies from: None
comment by [deleted] · 2014-01-21T06:49:42.071Z · LW(p) · GW(p)

Ok, what if - like Eternal Sunshine of the Spotless Mind - I slowly, over a period of time, eliminate your memories? Then maybe - like Dark City - I go in and insert new memories, maybe generic, maybe taken from someone else. This can be done either quickly or slowly, if it matters.

This future continuation of your current self will have nothing other than a causal & computational connection to your current identity. No common memories whatsoever. Would you expect to experience what this future person experiences?

Replies from: TheOtherDave
comment by TheOtherDave · 2014-01-21T15:58:44.475Z · LW(p) · GW(p)

Would you expect to experience what this future person experiences?

Based on your other comments, I infer that you consider this question entirely different from the question "Are you willing to consider this future person you?" Confirm?

Replies from: None
comment by [deleted] · 2014-01-21T16:17:46.032Z · LW(p) · GW(p)

Correct.

Replies from: TheOtherDave
comment by TheOtherDave · 2014-01-21T16:22:38.440Z · LW(p) · GW(p)

Cool, thanks. Given that, and answering for my own part: I'm not sure what any person at any time would possibly ever observe differentially in one case or the other, so I honestly have no idea what I'd be expecting or not expecting in this case. That is, I don't know what the question means, and I'm not sure it means anything at all.

Replies from: None
comment by [deleted] · 2014-01-21T20:44:02.296Z · LW(p) · GW(p)

That's fair enough. You got the point with your first comment, which was to point out that issues of memory-identity and continuous-experience-identity are / could be separate.

Replies from: TheOtherDave
comment by TheOtherDave · 2014-01-21T21:03:14.670Z · LW(p) · GW(p)

Perhaps I understand more than I think I do, then.

It seems to me that what I'm saying here is precisely that those issues can't be separated, because they predict the same sets of observations. The world in which identity is a function of memory is in all observable ways indistinguishable from the world in which identity is a function of continuous experience. Or, for that matter, of cell lineages or geographical location or numerological equivalence.

Replies from: None
comment by [deleted] · 2014-01-22T05:14:13.355Z · LW(p) · GW(p)

And I'm saying that external observations are not all that matters. Indeed it feels odd to me to hold that view when the phenomenon under consideration is subjective experience itself.

Replies from: TheOtherDave
comment by TheOtherDave · 2014-01-22T14:40:23.519Z · LW(p) · GW(p)

I didn't say "external observations".
I said "observations."

If you engage with what I actually said, does it feel any less odd?

Replies from: None
comment by [deleted] · 2014-01-22T18:40:19.380Z · LW(p) · GW(p)

You said "predict the same set of observations", which I implicitly took to mean "tell me something I can witness to update my beliefs about which theory is correct," to which the answer is: there is nothing you - necessarily external - can witness to know whether my upload is death-and-creation or continuation. I alone am privy to that experience (continuation or oblivion), although the recorded memory is the same in either case, so there's no way the clone could tell you afterward.

You could use a model of consciousness and a record of events to infer which outcome occurred. And that's the root issue here, we have different models of consciousness and therefore make different inferences.

Replies from: TheOtherDave
comment by TheOtherDave · 2014-01-22T19:23:04.744Z · LW(p) · GW(p)

You keep insisting on inserting that "external" into my comment, just as if I had said it, when I didn't. So let me back up a little and try to be clearer.

Suppose the future continuation you describe of my current self (let's label him "D" for convenience) comes to exist in the year 2034.

Suppose D reads this exchange in the archives of LessWrong, and happens to idly wonder whether they, themselves, are in fact the same person who participated in LessWrong under the username TheOtherDave back in January of 2014, but subsequently went through the process you describe.

"Am I the same person as TheOtherDave?" D asks. "Is TheOtherDave having my experiences?"

What ought D expect to differentially observe if the answer were "yes" vs. "no"? This is not a question about external observations, as there's no external observer to make any such observations. It's simply a question about observations.

And as I said initially, it seems clear to me that no such differentially expected observations exist... not just no external observations, but no observations period. As you say, it's just a question about models -- specifically, what model of identity D uses.

Similarly, whether I expect to be the same person experiencing what D experiences is a question about what model of identity I use.

And if D and I disagree on the matter, neither of us is wrong, because it's not the sort of question we can be wrong about. We're "not even wrong," as the saying goes. We simply have different models of identity, and there's no actual territory for those models to refer to. There's no fact of the matter.

Similarly, if I decide that you and I are really the same person, even though I know we don't share any memories or physical cells or etc., because I have a model of identity that doesn't depend on any of that stuff... well, I'm not even wrong about that.

Replies from: None
comment by [deleted] · 2014-01-23T04:36:07.950Z · LW(p) · GW(p)

When TheOtherDave walks into the destructive uploader, either he wakes up in a computer or he ceases to exist and experiences no more. Not being able to experimentally determine what happened afterwards doesn't change the fact that one of those descriptions matches what you experience and the other does not.

Replies from: TheOtherDave
comment by TheOtherDave · 2014-01-23T15:52:47.904Z · LW(p) · GW(p)

What do I experience in the first case that fails to match what I experience in the other?

That is, if TheOtherDave walks into the destructive uploader and X wakes up in a computer, how does X answer the question "Am I TheOtherDave?"

Again, I'm not talking about experimental determination. I'm talking about experience. You say that one description matches my experience and the other doesn't... awesome! What experiences should I expect X to have in each case?

It sounds like your answer is that X will reliably have exactly the same experiences in each case, and so will every other experience-haver in the universe, but in one case they're wrong and in the other they're right.

Which, OK, if that's your answer, I'll drop the subject there, because you're invoking an understanding of what it means to be wrong and right about which I am profoundly indifferent.

Is that your answer?

Replies from: None
comment by [deleted] · 2014-01-24T10:52:37.318Z · LW(p) · GW(p)

how does X answer the question "Am I TheOtherDave?"

This is so completely unrelated to what I am talking about. Completely out of left field. How the upload/clone answers or fails to answer the question "Am I TheOtherDave?" is irrelevant to the question at hand: what did TheOtherDave experience when he walked into the destructive uploader?

I've rephrased this as many times as I know how, but apparently I'm not getting through. I give up; this is my last reply.

Replies from: TheOtherDave
comment by TheOtherDave · 2014-01-24T14:51:52.807Z · LW(p) · GW(p)

OK.

comment by TheOtherDave · 2014-01-18T17:03:24.701Z · LW(p) · GW(p)

Of course not. But what does thinking you're Donald Trump have to do with it? The question at hand is not about who I think I am, but what properties I have.

Replies from: None
comment by [deleted] · 2014-01-18T17:41:46.715Z · LW(p) · GW(p)

No, the question at issue here is continuity of experience, and the subjective experience (or rather lack thereof) when it is terminated - death.

Replies from: TheOtherDave
comment by TheOtherDave · 2014-01-18T18:38:42.728Z · LW(p) · GW(p)

Ah, OK. You confused me by bringing up trist thinking they were Donald Trump, which seemed unrelated.

For my own part, I'm not sure why I should care about the nominal difference between two things with identical properties, regardless of how continuous their subjective experience is/has been, and regardless of whether one of them is me.

But I acknowledge that some people do care about that. And not just for subjective experience... some people care about the difference between an original artwork and a perfectly identical copy of it, for example, because the continuity of the original's existence is important to them, even though they don't posit the artwork has subjective experiences.

That's fine... people value what they value. For my own part, I don't value continuity in that sense very much at all.

comment by Leonhart · 2014-01-18T10:21:48.665Z · LW(p) · GW(p)

Taboo "think". If you rewire my neurons* to give me the false propositional belief that I am Donald Trump, then no. If you rewire my neurons to an exact copy of Donald Trump's, then yes.

And, yes, they could, to the exact same degree that they would accept a miraculously-resuscitated Trump who was amnesiac about the previous day leading up to the boating accident, and also looked totally different now. But this is a looser requirement. There could be a whole bunch of threshold people who would be recognised by my family as a valid continuation of me, but who I could not have anticipated becoming.

*and any other skull-stuff that bears on the problem

comment by RowanE · 2014-01-20T13:20:34.750Z · LW(p) · GW(p)

I think the expansion and contraction model, as you've described it, would probably also result in my death. The being that includes the computer and myself would be a new being, of which I am now a component. When the meaty component dies, this is my death, even though there is now a being who perceives itself to be a continuation of me. This being is, in many ways, a continuation of me, just not in the way that I care about most.

I'm not completely sure of this, of course, but anywhere I'm not sure whether I'll die or not I prefer to lean heavily towards not dying.

comment by CAE_Jones · 2014-01-18T10:50:00.560Z · LW(p) · GW(p)

What's the point of uploading if we have an AI with all the skills and knowledge of everyone not information-theoretically dead at the time of its creation?

I have no idea how to argue with the ideas about consciousness/identity/experience/whatever that make uploading seem like it could qualify as avoiding death. It occurs to me, though, that those same ideas sorta make uploading individuals pointless. If strong AI doesn't happen, why not just upload the most useful bits of people's brainstates and work out how to combine them into some collective that is not one person, but has the knowledge and skills? Why install, say, Yudkowsky.wbe and Immortalbob.wbe, when you could install them as patches to grandfather_of_all_knowledge.mbe, effectively getting (Yudkowsky|Immortalbob)?

Assume that any properties of a brain that can be quantified get quantified, and where the variables match up, the or process takes the maximum. So if brain A is better at math than brain B, brain A|B uses A's math ability. If brain M is male and brain F is female, though, brain M|F will learn from both perspectives (and hopefully be stronger than the originals for it).

So the benefits of a group, with none of the drawbacks, all crammed into one metabrain.
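
(A minimal sketch of the "take the maximum where the variables match up" combination described above; the trait names and numbers are invented.)

```python
def or_combine(brain_a, brain_b):
    """Combine two quantified brain-state profiles: wherever the same
    variable appears in both, the merged brain takes the maximum value."""
    merged = dict(brain_a)
    for trait, value in brain_b.items():
        merged[trait] = max(merged.get(trait, value), value)
    return merged

yudkowsky = {"math": 9, "writing": 8}
immortalbob = {"math": 6, "writing": 7, "piloting": 5}
print(or_combine(yudkowsky, immortalbob))
# -> {'math': 9, 'writing': 8, 'piloting': 5}
```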

Replies from: jpaulson, RowanE
comment by Jonathan Paulson (jpaulson) · 2014-01-24T08:13:54.998Z · LW(p) · GW(p)

Because I want to be alive. I don't just want humanity to have the benefit of my skills and knowledge.

Replies from: CAE_Jones
comment by CAE_Jones · 2014-01-24T09:42:05.606Z · LW(p) · GW(p)

Because I want to be alive. I don't just want humanity to have the benefit of my skills and knowledge.

When I read this in the recent comments list, I at first thought it was a position against uploading. Then I read the other recent comments and realized it was probably a reply to me.

I get the impression that no one has a functional definition of what continuity of identity means, yet destructive copies (uploads, teleports, etc.) appear to be overwhelmingly considered as preserving it at least as much as sleep does. I find this confusing, but the only supporting argument I've found is Eliezer's "Identity is not in individual atoms", which is a bit disingenuous, in that uploads are almost certainly not going to be precise quantum-state replicators.

(I'd make a poll here, but my last attempt went poorly and it doesn't appear to be standard markup, so I don't know where I'd test it.)

What probability would you assign to each of these as continuing personal identity?

  1. Sleep.
  2. Puberty.
  3. The typical human experience over 1-5 years.
  4. Gradual replacement of biological brain matter with artificial substitutes.
  5. Brain-state copying (uploads, teleportation).
  6. Brain-state melding (Brain Omega = Brain A | Brain B | Brain n).

Replies from: jpaulson
comment by Jonathan Paulson (jpaulson) · 2014-01-25T02:35:39.931Z · LW(p) · GW(p)

1) 1.0
2) 1.0
3) 1.0
4) It depends on the artificial substitutes :) If they faithfully replicate brain function (whatever that means), 1.0.
5) Again, if the process is faithful, 1.0.
6) It really depends. For example, if you drop all my memories, 0.0. If you keep an electronic copy of my brain on the same network as several other brains, 1.0. In between: in between.

(Yes, I know 1.0 probabilities are silly. I don't have enough sig-figs of accuracy for the true value :)

comment by RowanE · 2014-01-20T13:02:25.222Z · LW(p) · GW(p)

I don't think most people who believe uploading qualifies as avoiding death would also agree that adding a fraction of a person's brainstate to an overmind would also qualify as avoiding death.

comment by Thomas · 2014-01-18T11:56:37.953Z · LW(p) · GW(p)

The simplest way to understand all this is to look at others as your coincarnations.

All the paradoxes go away. What remains, though, is a memetic hazard.

comment by Leonhart · 2014-01-18T10:06:32.913Z · LW(p) · GW(p)

Copy my mind to a machine non-destructively, and I still identify with meat-me. You could let machine-me run for a day, or a week, or a year, and only then kill off meat-me. I don't like that option and would be confused by someone who did.

This is just bizarre. If the point is to preserve continuity, why on earth would you let the copy run independently and diverge? Of course it won't then represent a continuation of experience from the point at which meat-you was later killed.

The point of the destructive upload is precisely so that you-now can anticipate continuing only as the upload. It's essentially a charitable act! I don't want any me to have the experience "dammit, I'm still the meat one".

Replies from: Error, TheOtherDave
comment by Error · 2014-01-18T15:49:40.802Z · LW(p) · GW(p)

The point of the destructive upload is precisely so that you-now can anticipate continuing only as the upload.

Except I don't anticipate continuing only as the upload; I anticipate being dead. Uploaded-me will remember incorrectly anticipating the same, but uploaded-me was not the one doing the anticipating.

I am an instance of a class, not the class itself.

Actually, tabooing "I", since it seems to be getting in the way: This instance of the class anticipates that this instance will be dead, and has a problem with that even if other instance(s) of the class remain.

Replies from: DanielLC, Leonhart, Dentin
comment by DanielLC · 2014-01-19T23:17:37.108Z · LW(p) · GW(p)

This instance

That's the same as "I".

There is one instance. It forks. There are two instances. If you claim that one of them was the original instance, you are using "I".

I'd say that past!you, upload!you, and meat!you are three distinct instances of the class you. Thinking you're going to die means that past!you does not believe that there is a future!you.

Replies from: Error
comment by Error · 2014-01-20T17:32:52.156Z · LW(p) · GW(p)

That's the same as "I".

Well, yes. What I was trying to do was avert a confusion where "I" might refer to an instance (meat brain, silicon RAM copy #224) or might refer to a class (the abstract computation they're both running), by specifying the intended meaning. That is the point of tabooing, right?

I'd say that past!you, upload!you, and meat!you are three distinct instances of the class you.

Thanks; this seems to be the source of disagreement. I see two instances, not three, the second forking at the moment of upload and running the same program with a new PID. I don't think we'll find common ground here, and I'm not sure I'm up to defending my position on the subject. I just found the consequences of that position interesting to explore.

comment by Leonhart · 2014-01-19T13:47:38.956Z · LW(p) · GW(p)

Uploaded-me will remember incorrectly anticipating the same, but uploaded-me was not the one doing the anticipating.

Fine, but we're still talking past each other, because I think there is no sense in which dead meat-you "was" the one doing the anticipating that is not also true of live upload-you.

I am an instance of a class, not the class itself.

So the whole point of this, as I understood it, was that the universe doesn't support persistent instances in the way you want it to.

You could follow e.g. Mitchell Porter (as far as I understood him) and claim that there's a particular quantum doohickey that does support real fundamental continuity of stuff. Do you? Or am I wildly misinterpreting you?

comment by Dentin · 2014-01-20T06:21:05.632Z · LW(p) · GW(p)

For the record, (this instance of this class) has no problems with the destruction of (other instances of this class or itself), so long as at least one instance remains and is viable, and a handful of similarity conditions are met.

Seriously. We can talk about it on lw chat some time if you're bored.

comment by TheOtherDave · 2014-01-18T17:01:57.503Z · LW(p) · GW(p)

Wait, what?
Are you regularly having the experience "dammit, I'm still the meat one" now?

Replies from: Leonhart
comment by Leonhart · 2014-01-19T13:37:26.289Z · LW(p) · GW(p)

Well, no, because I don't remember recently non-destructively uploading myself.

Am I missing your point?

Replies from: TheOtherDave
comment by TheOtherDave · 2014-01-19T16:45:52.606Z · LW(p) · GW(p)

Perhaps I'm missing yours.
Say P1 = my continued existence in a meat body, and P2 = my continued existence in an uploaded body.
It seems clear to me that I prefer P1 to NOT P1... that's why I don't kill myself, for example.
So why would I prefer P2 to (P2 AND P1)?

Replies from: Leonhart
comment by Leonhart · 2014-01-19T21:29:08.551Z · LW(p) · GW(p)

Ah. But I'm not making an argument, just reporting current preferences.

If P2 can happen, that changes my preference for P1 over NOT P1, except in the case where P1 can also extend indefinitely, due to e.g. advances in anti-aging science or alicornication science.

I strongly disvalue any me dying without being able to legitimately (by the lights of my model) anticipate then waking up as an upload. The best way to avoid this scenario is with a destructive upload. Obviously, my enthusiasm for this in any real situation involves a tradeoff between my confidence in the upload process, what I expect life as an upload to be like, and my remaining expected QALYs.

I can imagine creating multiple P2s via non-destructive uploads before that point, but there will always be one meat-me left over. What I want is for him to have no further experiences after whatever turns out to be his last save point, no stretch of time in which to have the possibly silly, but inevitable and emotionally compelling, thought: "I've missed the boat."

There's no reason that this should be compelling to you, but do you think it's actually inconsistent?

Replies from: Error, TheOtherDave
comment by Error · 2014-01-20T18:08:49.489Z · LW(p) · GW(p)

I found this enlightening, in that I'd never really understood the point of deliberate destructive uploading until now.

A preference report in return: I would strongly prefer P1 over P2, mildly prefer P2 & P1 over just P1, and moderately prefer P2 over nothing (call nothing P0). I think of destructive-upload as death, but I think I value the algorithm I'm running somewhat, even in the absence of the hardware currently running it.

Given the opportunity to do a non-destructive upload, I would certainly take it. Given the opportunity to do a destructive upload, I would... well, I might take it anyway. Not because it wouldn't be death, but because not taking it would eventually result in P0. I would prefer such an upload to take place as late in life as possible, assuming the possibility of early death by accident is ignored. (I am not certain which way I would go if it were not ignored.)

comment by TheOtherDave · 2014-01-20T00:23:04.592Z · LW(p) · GW(p)

Fair enough. Sure, if you happen to have both the desire to not die and the independent desire to stop living in your meat body once you've been uploaded, then a destructive upload gives you more of what you want than a nondestructive one.

comment by ShardPhoenix · 2014-01-18T07:29:32.105Z · LW(p) · GW(p)

I think we have to give up on uniqueness of identity in order to remain consistent in these kinds of sci-fi scenarios.

edit: And I guess "identity" has to have a continuous value too - similar to the anthropic principle - being x% certain you are in a particular world is like being x% certain you are a particular person.

Replies from: None
comment by [deleted] · 2014-01-18T09:42:15.549Z · LW(p) · GW(p)

What do you mean by "uniqueness of identity"?

Replies from: DanielLC
comment by DanielLC · 2014-01-19T23:13:04.604Z · LW(p) · GW(p)

The idea that, after a non-destructive upload, only one of the people can be you.

comment by Ander · 2014-01-18T00:18:53.907Z · LW(p) · GW(p)

I think that your position on destructive uploads doesn't make sense, and you did a great job of showing why with your thought experiment.

The fact that you can transition yourself over time to the machine, and you still consider it 'you', and you can't actually tell at what specific line you crossed in order to become a 'machine', means that your original state (human brain) and final state (upload) are essentially the same.

Replies from: solipsist, MrCogmor
comment by solipsist · 2014-01-18T01:21:44.980Z · LW(p) · GW(p)

I don't like the structure of this argument. If I morph into a coffee table, I can't mark a specific line at which I become a piece of furniture. This doesn't imply that I'm essentially a coffee table. No hard boundary does not imply no transition.

comment by MrCogmor · 2014-01-18T02:37:06.338Z · LW(p) · GW(p)

Error isn't implying that the final state is different. Just that the destructive copy process is a form of death and the wired brain process isn't.

I get where he is coming from: a copy is distinct from the original and can have different experiences. In the destructive copy scenario a person is killed and a person is born; in the wired brain scenario the person is not copied, they merely change over time, and nobody dies.

My view is that if I die to make an upload (which is identical to me except for greater intelligence & other benefits), then the gain outweighs the loss.

comment by Richard_Kennaway · 2014-02-14T13:51:53.814Z · LW(p) · GW(p)

Suppose that rather than copying my brain, I adjoined it to some external computer in a kind of reverse-Ebborian act; electrically connecting my synapses to a big block of computrons that I can consciously perform I/O to. Over the course of life and improved tech, that block expands until, as a percentage, most of my thought processes are going on in the machine-part of me. Eventually my meat brain dies -- but the silicon part of me lives on.

This is very similar to the premise of Greg Egan's short story, "Learning to Be Me".

comment by polarix · 2014-01-25T19:53:57.198Z · LW(p) · GW(p)

I find this an immensely valuable insight: continuity, or "haecceity", is the critical element of self which naive uploading scenarios dismiss. Our current rational picture of the self as concept-in-brain has no need for continuity, which is counterintuitive.

We know a good deal about the universe, but we do not yet know it in its entirety. If there were an observer outside of physics, we might suspect they care a great deal about continuity, or their laws might. Depending on your priors, and your willingness to accept that current observational techniques cannot access all-that-there-is, it might be worth attaching some value to haecceity, near your value of self.

Contrast grow-and-prune uploading with slice-and-scan uploading: the latter will be anathema to the vast majority of humanity; they may "get over it", but it'll be a long battle. And slice-and-scan will probably be much slower to market. Start with Glass and EEGs: we'll get there in our lifetime using grow-and-prune, and our AIs will grow up with mentors they can respect.

comment by Lalartu · 2014-01-20T15:27:35.660Z · LW(p) · GW(p)

Speaking of uploading procedures, I think the most brute-force approach, simple in concept and hard in implementation, is the one described in Transhuman by Yuri Nikitin: replace neurons one by one with nanorobots that have identical functionality, then, once the whole brain is transformed, increase its working speed.

comment by kilobug · 2014-01-19T11:01:26.796Z · LW(p) · GW(p)

But what's the difference between a "non-destructive upload" and "making a copy of the upload" or "making a copy of your biological body"?

The intuition behind "Copy my mind to a machine non-destructively, and I still identify with meat-me" is flawed and incoherent IMHO. What if you can't even tell apart "meat you" and the other one, say the other one is put in a robotic body that looks, feels, ... exactly like the flesh body? You fall asleep, you awake, there are two of "you", one flesh and the other robotic; how can you even know which is which? Both will feel they are the "real you".

There are countless similar thought experiments in which this view leads to contradictions/impossible answers. IMHO the only way to resolve them is to accept that continuity of personal identity is at the software level, not at the hardware level.

Replies from: None
comment by [deleted] · 2014-01-21T08:47:09.041Z · LW(p) · GW(p)

You are misunderstanding the argument.

Replies from: kilobug
comment by kilobug · 2014-01-21T15:15:46.875Z · LW(p) · GW(p)

More exactly, I don't really understand it, because it relies on presumptions/intuitions that I don't have. My point was mostly to try to get those made more explicit so I can better understand (and then accept or refute) the argument. Sorry if that wasn't clear enough.

comment by solipsist · 2014-01-18T00:06:36.713Z · LW(p) · GW(p)

Why do you have your position on destructive uploads? It could be that when you go to sleep, you die, and a new person who thinks they're you wakes up. The world is inhabited by day-old people who are deluded by their memories and believe they've lived decades-old lives. Everyone will cease to exist as a person the next time they go to sleep.

If you believe that, I can't prove you wrong. But it's not a productive worldview.

In a world where everyone is uploaded or Star Trek transported each day, you could believe that the world is inhabited by day-old people who will cease to exist on their next transport. I couldn't prove you wrong. But it wouldn't be a productive worldview.

Replies from: Error, None
comment by Error · 2014-01-18T00:30:20.985Z · LW(p) · GW(p)

Why do you have your position on destructive uploads

Mostly by comparison to non-destructive uploads. Copy my mind to a machine non-destructively, and I still identify with meat-me. You could let machine-me run for a day, and only then kill off meat-me. I don't like that option and would be confused by someone who did. Destructive uploads feel like the limit of that case where the time interval approaches zero. As with the case outlined in the post, I don't see a crossed line where it stops being death and starts being transition.

Now that I've written that, I wish I'd thought of it before you asked; the two are really mirror images. Approach destructive uploads from the copy-then-kill side, and it feels like death. Approach them from the expand-then-contract side, and it feels like continuous identity. Yet at the midpoint between them they turn into the same operation.

comment by [deleted] · 2014-01-18T09:37:03.342Z · LW(p) · GW(p)

In a world where everyone is uploaded or Star Trek transported each day, you could believe that the world is inhabited by day-old people who will cease to exist on their next transport. I couldn't prove you wrong. But it wouldn't be a productive worldview.

Who cares? Don't appeal to social norms. For the person about to step into the teleporter, there's a true difference, even if it is not observable from the outside.

Replies from: TheOtherDave, solipsist
comment by TheOtherDave · 2014-01-19T06:54:02.045Z · LW(p) · GW(p)

For the person about to step into the teleporter, there's a true difference, even if it is not observable from the outside.

Sure. For every person about to go to bed, there's also a true difference between the way they are as they go to bed, and the way they are as they wake up.

That there is a true difference doesn't really matter much; a more useful question is whether we value the difference... which is a psychological and social question. When you insist on ignoring social norms, the effect is simply to insist that a particular set of social norms (the ones having to do with continuous existence in a single body being important) be given unexamined primacy.

Which is fine for you, since you embrace that set. For those of us who reject it, it just seems like a goofy thing to insist on.

comment by solipsist · 2014-01-18T19:30:52.725Z · LW(p) · GW(p)

Don't appeal to social norms.

I can't justify believing that I will continue to exist 5 seconds from now -- that I am more than this thought, right now -- without appealing to social norms and practicality.