Your transhuman copy is of questionable value to your meat self.

post by Usul · 2016-01-06T09:03:30.949Z · LW · GW · Legacy · 142 comments

I feel safe saying that nearly everyone reading this will agree that, given sufficient technology, a perfect replica or simulation could be made of the structure and function of a human brain, producing an exact copy of an individual mind, including a consciousness.  Upon coming into existence, this consciousness will have a separate but baseline-identical subjective experience to that of the consciousness from which it was copied, as it was at the moment of copying.  The original consciousness will continue its own existence and subjective experience.  If the brain containing the original consciousness is destroyed, the consciousness within ceases to be.  The existence or non-existence of a copy is irrelevant to this fact.

With this in mind, I fail to see the attraction of the many transhuman options for extra-meat existence, and I see no meaningful immortality therein, if that's what you came for.

Consciousness is notoriously difficult to define and analyze, and I am far from an expert in its study.  I define it as an awareness: the sense organ which perceives the activity of the mind.  It is not thought.  It is not memory or emotion.  It is the thing that experiences or senses these things.  Memories will be gained and lost, thoughts and emotions come and go, but the sense of self remains even as the self changes.  There exists a system of anatomical structures in your brain which, by means of electrochemical activity, produces the experience of consciousness.  If a brain injury wiped out major cognitive functions but left those structures involved in the sense of consciousness unharmed, you would, I believe, have the same central awareness of Self as Self, despite perhaps lacking all language or even the ability to form thoughts or understand the world around you.  Consciousness, this awareness, is, I believe, the most accurate definition of Self, Me, You.  I realize this sort of terminology has the potential to sound like mystical woo.  I believe this is due to the twin effects of the inherent difficulty in defining and discussing consciousness, and of our socialization, wherein these sorts of discussions are more often than not heard from Buddhists or Sufis, whose philosophical traditions have looked into the matter with greater rigor for a longer time than Western philosophy, and from Hippies and Druggies, who introduced these traditions to our popular culture.  I am not speaking of a magical soul.  I am speaking of a central feature of the human experience which is a product of the anatomy and physiology of the brain.

Consider the cryonic head-freeze. Ideally, the scanned dead brain, cloned, remade and restarted (or whatever) will be capable of generating a perfectly functional consciousness, and it will feel as if it is the same consciousness which observes the mind which is, for instance, reading these words; but it will not be. The consciousness which is experiencing awareness of the mind which is reading these words will no longer exist. To disagree with this statement is to say that a scanned living brain, cloned, remade and started will contain the exact same consciousness, not similar, the exact same thing itself, that simultaneously exists in the still-living original.  If consciousness has an anatomical location, and therefore is tied to matter, then it would follow that this matter here is the exact same matter as that separate matter there. This is an absurd proposition.  If consciousness does not have an anatomical / physical location then it is the stuff of magic and woo.

*Aside: I believe that consciousness, mind, thought, and memory are products not only of anatomy but of physiology, that is to say the ongoing electrochemical state of the brain, the constant flux of charge in and across neurons.  In perfect cryonic storage, the anatomy (hardware) might be maintained, but I doubt the physiology (software), in the form of exact moment-in-time membrane electrical potentials and intra- and extracellular ion concentrations for every neuron, will be.  Therefore I hold no faith in its utility, in addition to my indifference to the existence of a me-like being in the future.

Consider the Back-Up. Before lava rafting on your orbital, you have your brain scanned by your local AI so that a copy of your mind at that moment is saved.  In your fiery death in an unforeseen accident, will the mind observed by the consciousness on the raft experience anything differently than if it were not backed up? I doubt I would feel much consolation, other than knowing my loved ones were being cared for.  Not unlike a life insurance policy: not for one's own benefit.  I imagine the experience would be one of coming to the conclusion of a cruel joke at one's own expense.  Death in the shadow of a promise of immortality.  In any event, the consciousness that left the brain scanner and got on the raft is destroyed when the brain is destroyed; it benefits not at all from the reboot.

Consider the Upload.  You plug in for a brain scan, a digital-world copy of your consciousness is made, and then you are still just you.  You know there is a digital copy of you, that feels as if it is you, feels exactly as you would feel were it you who had travelled to the digital-world, and it is having a wonderful time, but there you still are. You are still just you in your meat brain.  The alternative, of course, is that your brain is destroyed in the scan in which case you are dead and something that feels as if it is you is having a wonderful time.  It would be a mercy killing.

If the consciousness that is me is perfectly analyzed and a copy created, in any medium, that process is external to the consciousness that is me.  The consciousness that is me, that is you reading this, will have no experience of being that copy, although that copy will have a perfect memory of having been the consciousness that is you reading this.  Personally, I don't know that I care about that copy.  I suppose he could be my ally in life.  He could work to achieve any altruistic goals I think I have, perhaps better than I could.  He might want to fuck my wife, though.  And might be jealous of the time she spends with me rather than him, and he'd probably feel entitled to all my stuff, as would I be, vice versa. The Doppelganger and the Changeling have never been considered friendly beasts.

I have no firm idea where lines can be drawn on this.  Certainly consciousness can be said to be an intermittent phenomenon which the mind pieces together into the illusion of continuity.  I do not fear going to sleep at night, despite the "loss of consciousness" associated. If I were to wake up tomorrow and Omega assures me that I am a freshly made copy of the original, it wouldn't trouble me as to my sense of self, only to the set of problems associated with living in a world with a copy of myself.  I wouldn't mourn a dead original me any more than I'd care about a copy of me living on after I'm dead, I don't imagine.  

What about a slow, cell-by-cell or thought-by-thought / byte-by-byte transfer of my mind to another medium, in which, one at a time, every new neural action potential is received by a parallel processing medium which takes over?  I want to say the resulting transfer would be the same consciousness as is typing this, but then what if the same slow process were used to make a copy and not a transfer?  Once a consciousness is virtual, is every transfer from one medium or location to another not essentially a copy, and therefore a death of the originating version?

It almost makes a materialist argument (self is tied to matter) seem like a spiritualist one (meat consciousness is soul is tied to human body at birth), which is, of course, a weird place to be intellectually.

I am not addressing the utility or ethics or inevitability of the projection of the self-like-copy into some transhuman state of being, but I don't see any way around the conclusion that the consciousness that is so immortalized will not be the consciousness that is writing these words, although it would feel exactly as if it were.  I don't think I care about that guy.  And I see no reason for him to be created. And if he were created, I, on my meat brain's death bed, would gain no solace from knowing that he, a being which started out its existence exactly like me, will live on.

EDIT: Lots of great responses, thank you all and keep them coming.  I want to bring up some of my responses so far to better define what I am talking about when I talk about consciousness.

I define consciousness as a passively aware thing, totally independent of memory, thoughts, feelings, and unconscious hardwired or conditioned responses. It is the hard-to-get-at thing inside the mind which is aware of the activity of the mind without itself thinking, feeling, remembering, or responding. The demented, the delirious, the brain damaged all have (unless those brain structures performing the function of consciousness are damaged, which is not a given) the same consciousness, the same Self, the same I and You, as I define it, as they did when their brains were intact. Dream Self is the same Self as Waking Self to my thinking. I assume consciousness arises at some point in infancy. From that moment on it is Self, to my thinking.

If I lose every memory slowly and my personality changes because of this and I die senile in a hospital bed, I believe that it will be the same consciousness experiencing those events as is experiencing me writing these words. That is why many people choose suicide at some point on the path to dementia.

I recognize that not everyone reading this will agree that such a thing exists or has the primacy of existential value that I ascribe to it.

And an addendum:
Sophie Pascal's Choice (hoping it hasn't already been coined): Would any reward given to the surviving copy induce you to step onto David Bowie Tesla's Prestige Duplication Machine, knowing that your meat body and brain will be the one which falls into the drowning pool while an identical copy of you materializes 100m away, believing itself to be the same meat that walked into the machine and ready to accept the reward?

142 comments

Comments sorted by top scores.

comment by Risto_Saarelma · 2016-01-06T21:20:40.346Z · LW(p) · GW(p)

I see the pattern identity theory, where uploads make sense, as one that takes it as a starting point that you have an unambiguous past but no unambiguous future. You have moments of consciousness where you remember your past, which gives you identity, and lets you associate your past moments of consciousness to your current one. But there's no way, objective or subjective, to associate your present moment of consciousness to a specific future moment of consciousness, if there are multiple such moments, such as a high-fidelity upload and the original person, who remember the same past identity equally well. A continuity identity theorist thinks that a person who gets uploaded and then dies is dead. A pattern identity theorist thinks that people die in that sense several times a second and have just gotten used to it. There are physical processes that correspond to moments of consciousness, but there's no physical process for linking two consecutive moments of consciousness as the same consciousness, other than regular old long and short term memories.

There's no question that the upload and the original will diverge. If I have a non-destructive upload done on me, I expect to get up from the upload couch, not wake up in the matrix, old habits and all that. And there is going to be a future me who will experience exactly that. But if the upload was successful, there's also going to be a future me who will be very surprised to wake up staring at some fluorescent polygons, having expected to wake up on the upload couch. This is where the "no unambiguous future selves" stops being sophistry and starts paying rent for the pattern identity theorist. "Which one is the real me" is a meaningless question. All we have to go with are memories, and both of me will have my memories.

If you want to argue a pattern identity theorist out of it, you'll want to argue why there has to necessarily be more than just memory going on with producing the sense of moment-to-moment personal continuity, and why the physically unconnected moments of consciousness model can't be sufficient.

Replies from: Usul, MockTurtle
comment by Usul · 2016-01-07T03:19:52.318Z · LW(p) · GW(p)

Thanks for the reply. I am not convinced by the pattern identity theorist because, I suppose, I do not see the importance of memory in the matter, nor the thoughts one might have about those thoughts. If I lose every memory slowly and die senile in a hospital bed I believe that it will be the same consciousness experiencing those events as is experiencing me writing these words. I identify that being which holds no resemblance to my current intellect and personality will be my Self in a way that an uploaded copy with my current memory and personality can never be. I might should have tabooed "consciousness" from the get go, as there is no one universal definition. For me it is passive awareness. This meat awareness in my head will never get out and it will die and no amount of cryonics or uploading to create a perfect copy that feels as if it is the same meat awareness will change that. Glass half-empty I suppose.

Replies from: Risto_Saarelma, Brillyant
comment by Risto_Saarelma · 2016-01-07T07:18:04.902Z · LW(p) · GW(p)

You're still mostly just arguing for your personal intuition for the continuity theory though. People have been doing that pretty much as long as we've had fiction about uploads or destructive teleportation, with not much progress to the arguments. How would you convince someone sympathetic to the pattern theory that the pattern theory isn't viable?

FWIW, after some earlier discussions about this, I've been meaning to look into Husserl's phenomenology to see if there are some more interesting arguments to be found there. That stuff gets pretty weird and tricky fast though, and might be a dead end anyway.

Replies from: Usul
comment by Usul · 2016-01-07T08:26:52.788Z · LW(p) · GW(p)

Honestly, I'm not sure what other than intuition and subjective experience we have to go with in discussing consciousness. Even the heavy hitters in the philosophy of consciousness don't 100% agree that it exists. I will be the first to admit I don't have the background in pattern theory or the inclination to get into a head to head with someone who does. If pressed, right now I'm leaning towards the matter-based argument, that if consciousness is not magical then it is tied to specific sets of matter. And that a set of matter can not exist in multiple locations. Therefore a single consciousness can not exist in multiple locations. The consciousness A that I am now is in matter A. If a copy consciousness B is made in matter B and matter A continues to exist, then it is reasonable to state that consciousness A remains in matter A. If matter A is destroyed there is no reason to assume consciousness A has entered matter B simply because of this. You are in A now. You will never get to B.

So, if it exists, and it is you, you're stuck in the meat. And undeniably, someone gets stuck in the meat.

I imagine differing definitions of You, self, consciousness, etc would queer the deal before we even got started.

Replies from: Risto_Saarelma
comment by Risto_Saarelma · 2016-01-07T18:26:58.812Z · LW(p) · GW(p)

If pressed, right now I'm leaning towards the matter-based argument, that if consciousness is not magical then it is tied to specific sets of matter. And that a set of matter can not exist in multiple locations. Therefore a single consciousness can not exist in multiple locations. The consciousness A that I am now is in matter A.

So, there are two things we need to track here, and you're not really making a distinction between them. There are individual moments of consciousness, which, yes, probably need to be on a physical substrate that exists in a single location. This is me saying that I'm this moment of conscious experience right now, which manifests in my physical brain. Everybody can be in agreement about this one.

Then there is the continuity of consciousness from moment to moment, which is where the problems show up. This is me saying that I'm the moment of conscious experience in my brain right now, and I'm also going to be the next moment of conscious experience in my brain.

The problems start when you want to say that the moment of consciousness in your brain now and the moment of consciousness in your brain a second in the future are both "your consciousness", while the moment of consciousness in your brain now and the moment of consciousness in your perfect upload a second in the future are not. For the patternist, there is no actual "consciousness" that refers to anything beyond the single moments. There is momentary consciousness now, with your memories, then there is momentary consciousness later, with your slightly evolved memories. And on and on. Once you've gone past the single eyeblink of consciousness, you're already gone, and a new you might show up once, never, or many times in the future. There's nothing but the memories that stay in your brain during the gap laying claim to the you-ness of the next moment of consciousness about to show up in a hundred or so milliseconds.

Replies from: Usul
comment by Usul · 2016-01-08T03:35:07.380Z · LW(p) · GW(p)

I'm going to go ahead and continue to disagree with the the pattern theorists on this one. Has the inverse of the popular "Omega is a dick with a lame sense of irony" simulation mass-murder scenario been discussed? Omega (truthful) gives you a gun. "Blow your brains out and I'll give the other trillion copies a dollar." It seems the pattern theorist takes the bullet or Dr Bowie-Tesla's drowning pool with very little incentive.

The pattern theorists as you describe them would seem to take us also to the endgame of Buddhist ethics (not a Buddhist, not proselytizing for them): You are not thought, you are not feeling, you are not memory, because these things are impermanent and changing. You are the naked awareness at the center of these things in the mind of which you are aware. (I'm with them so far. Here's where I get off): All sentient beings are points of naked awareness, by definition they are identical (naked, passive), therefore they are the same, Therefore even this self does not matter, therefore thou shall not value the self more than others. At all. On any level. All of which can lead you to bricking yourself up in a cave being the correct course of action.

To your understanding, does the pattern theorist (just curious, do you hold to the views you are describing as pattern theory?) define self at all, on any level? Memory seems an absurd place to do so from, likewise personality or thought (have you heard the nonsense that thought comes up with?). How can a pattern theorist justify valuing self above other? Without a continuous You, we get to the old koan "Who sits before me now? (who/what are You?)"

"Leave me alone and go read up on pattern theory yourself, I'm not your God-damn philosophy teacher." Is a perfectly acceptable response, by the way. No offense will be taken and it would not be an unwarranted reply. I appreciate the time you have taken to discuss this with me already.

Replies from: Risto_Saarelma, polymathwannabe
comment by Risto_Saarelma · 2016-01-08T06:02:22.765Z · LW(p) · GW(p)

There is some Buddhist connection, yes. The moments of experience thing is a thing in some meditation styles, and advanced meditators are actually describing something like subjective experience starting to feel like an on/off sequence instead of a continuous flux. Haven't gone really deep into what either the Buddhist metaphysics or the meditation phenomenology says. Neuroscience also has some discrete consciousness steps stuff, but I likewise haven't gone very deep into that. Anyway,

I'm with them so far. Here's where I get off): All sentient beings are points of naked awareness, by definition they are identical (naked, passive), therefore they are the same, Therefore even this self does not matter, therefore thou shall not value the self more than others. At all. On any level. All of which can lead you to bricking yourself up in a cave being the correct course of action.

This is still up for grabs. Given the whole thing about memories being what makes you you, consciousness itself is nice but it's not all that. It can still be your tribe against the world, your family against your tribe, your siblings against your family and you and your army of upload copies against your siblings and their armies of upload copies. So I'm basically thinking about this from a kin altruism and a general having people more like you closer in your circle of concern than people less like you thing. Upload copies are basically way, way closer kin than any actual kin.

So am I a pattern theorist? Not quite sure. It seems to resolve lots of paradoxes with the upload thought experiments, and I have no idea about a way to prove it wrong. (Would like to find one though, it seems sorta simplistic and we definitely still don't understand consciousness to my satisfaction.) But like I said, if I sit down on an upload couch, I fully expect to get up from an upload couch, not suddenly be staring at a HUD saying "IN SIMULATION", even though pattern theory seems to say that I should expect each outcome with 50 % probability. There will be someone who does wake up in the simulation with my memories in the thought experiment, no matter which interpretation, so I imagine those versions will start expecting to shift viewpoints while they do further upload scans, while the version of me who always wakes up on the upload couch (by the coin-toss tournament logic, there will be a me who never experiences waking up in a simulation, and one who always does) will continue to not expect much. I think uploads are a good idea more because of the kin selection like reasons above rather than because I'm convinced it's a ticket to personal immortality.

I wouldn't give a damn about aliens taking my body and brain apart every time I sleep as long as they put it back together perfectly again though, so if that makes me a pattern theorist then yes.

comment by polymathwannabe · 2016-01-08T14:05:35.326Z · LW(p) · GW(p)

You are the naked awareness at the center of these things

No, you're not even that.

comment by Brillyant · 2016-01-07T14:05:38.184Z · LW(p) · GW(p)

I might should have tabooed "consciousness" from the get go

Yep. And "Self". These are tricky terms that guarantee confusion.

comment by MockTurtle · 2016-01-08T14:31:10.378Z · LW(p) · GW(p)

I very much like bringing these concepts of unambiguous past and ambiguous future to this problem.

As a pattern theorist, I agree that only memory (and the other parts of my brain's patterns which establish my values, personality, etc.) matter when it comes to who I am. If I were to wake up tomorrow with Britney Spears's memories, values, and personality, 'I' will have ceased to exist in any important sense, even if that brain still had the same 'consciousness' that Usul describes at the bottom of his post.

Once one links personal identity to one's memories, values and personality, the same kind of thinking about uploading/copying can be applied to future Everett branches of one's current self, and the unambiguous past/ambiguous future concepts are even more obviously important.

In a similar way to Usul not caring about his copy, one might 'not care' about a version of oneself in a different Everett branch, but it would still make sense to care about both future instances of yourself BEFORE the split happens, due to the fact that you are uncertain which future you will be 'you' (and of course, in the Everett branch case, you will experience being both, so I guess both will be 'you'). And to bring home the main point regarding uploading/copying, I would much prefer that an entity with my memories/values/personality continue to exist in at least one Everett branch, even if such entities will cease existing in other branches.

Even though I don't have a strong belief in quantum multiverse theory, thinking about Everett branches helped me resolve the is-the-copy-really-me? dilemma for myself, at least. Of course, the main difference (for me) is that with Everett branches, the different versions of me will never interact. With copies of me existing in the same world, I would consider my copy as a maximally close kin and my most trusted ally (as you explain elsewhere in this thread).

comment by James_Miller · 2016-01-06T14:55:06.510Z · LW(p) · GW(p)

Usul, I'm one of the galactic overlords in charge of earth. I have some very bad news for you. Every night when you (or any other human) go to sleep your brain and body are disintegrated, and a reasonably accurate copy is put in their place. But, I have a deal for you. For the next month we won't do this to you, however after a month has passed we will again destroy your body and brain but then won't make any more copies so you will die in the traditional sense. (There is no afterlife for humans.) Do you agree?

Replies from: Usul, casebash, casebash, None, torekp
comment by Usul · 2016-01-07T03:46:35.112Z · LW(p) · GW(p)

Hail Xenu! I would need some time to see how the existential horror of going to bed tonight sits with me. Very likely it would overwhelm me, and around 4:00am tomorrow I'd take your deal. "(There is no afterlife for humans.)" I knew it! SUCK IT, PASCAL!

comment by casebash · 2016-01-09T12:44:16.990Z · LW(p) · GW(p)

I've thought of a second alternative thought experiment. Imagine that this doesn't happen when you go to sleep. Imagine that instead you are just teleported away and a clone teleported into your place while you are awake - even in the middle of a conversation, with the clone continuing on perfectly and no-one noticing. For some reason, this feels like it makes the scenario less persuasive.

comment by casebash · 2016-01-07T11:48:44.358Z · LW(p) · GW(p)

Imagine that every night, when you go to sleep, you are taken off to be tortured and you are replaced by a reasonably accurate clone. The fact that no-one on Earth has noticed doesn't mean that this isn't a bad thing!

Replies from: Viliam
comment by Viliam · 2016-01-07T16:12:07.015Z · LW(p) · GW(p)

Imagine that every fraction of a second you are torn apart to pieces in vacuum, and only the copy of you which is not a Boltzmann brain survives.

comment by [deleted] · 2016-01-06T20:23:56.825Z · LW(p) · GW(p)

If this were actually true, yes. Where are you going with this?

Replies from: dxu, James_Miller, Risto_Saarelma
comment by dxu · 2016-01-07T04:48:06.686Z · LW(p) · GW(p)

Bye-bye in a month's time, I guess.

comment by James_Miller · 2016-01-06T21:06:35.867Z · LW(p) · GW(p)

It's to test how much value you place on a copy of yourself by changing your default position.

Replies from: None
comment by [deleted] · 2016-01-06T21:53:34.163Z · LW(p) · GW(p)

I don't believe I changed positions though. I don't value the clone any more than another human being.

comment by Risto_Saarelma · 2016-01-07T06:40:20.115Z · LW(p) · GW(p)

I'm guessing a part of the point is that nobody had noticed anything (and indeed still can't, at least in any way they could report back) until the arrangement was pointed out, which highlights that there are bits in the standard notion of personal identity that get a bit tricky once you try to get more robust than just going by intuition on them. How do you tell you die when a matrix lord disintegrates you and then puts together an identical copy? How do you tell you don't die when you go under general anesthesia for brain surgery and then wake up?

Replies from: None, James_Miller
comment by [deleted] · 2016-01-07T17:01:13.970Z · LW(p) · GW(p)

How does that matter at all? That seems like a completely unrelated, orthogonal issue. The question at hand is whether the person being disintegrated should expect to continue its subjective experience as the copy, or whether it is facing oblivion. The fact that you can't experimentally tell the difference as an outside observer is irrelevant.

Replies from: Risto_Saarelma
comment by Risto_Saarelma · 2016-01-08T05:31:22.432Z · LW(p) · GW(p)

The strange part that might give your intuition a bit of a shake is that it's not entirely clear how you tell the difference as an inside observer either. The thought experiment wasn't "we're going to start doing this tomorrow night unless you acquiesce", it's "we've been doing this the whole time", and everybody had been living their life exactly as before until told about it. What should you now think of your memories of every previous day and going to sleep each night?

Replies from: None
comment by [deleted] · 2016-01-08T18:07:43.483Z · LW(p) · GW(p)

Either you cease to exist, or you don't. It's a very clear difference.

You seem to be hung up on either memories or observations being the key to decoding the subjective self. I think that is your error.

Replies from: Risto_Saarelma
comment by Risto_Saarelma · 2016-01-08T20:29:15.404Z · LW(p) · GW(p)

Yeah, for some reason I'm not inclined to give very much weight to an event that can't be detected by outside observers at all and which my past, present or future selves can't subjectively observe being about to happen, happening right now or having happened.

You seem to be hung up on either memories or observations being the key to decoding the subjective self. I think that is your error.

This sounds like a thing people who want to explain away subjective consciousness completely are saying. I'm attacking the notion that the annoying mysterious part in subjective consciousness with the qualia and stuff includes a privileged relation from the present moment of consciousness to a specific future moment of consciousness, not the one that there's subjective consciousness stuff to begin with that isn't easy to reduce to just objective memories and observations.

Replies from: knpstr
comment by knpstr · 2016-01-21T21:23:46.907Z · LW(p) · GW(p)

At best the argument you're making is the same as the "if a tree falls in the forest and no one is around to hear it, does it make a sound?" argument.

If I have a backup of my computer software on a different hard drive and the current hard drive fails, so I swap in the backup... my computer performs the same, but it is obviously a different hard drive.

If my hard drive doesn't fail and I simply "write over" my current hard drive with the backup, it is still not the same hard drive/software. It will be easy to forget it has been copied and is not the original, but the original (or last version) is gone and has been written over, despite it being "the same".
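(A loose software illustration of the equality-versus-identity distinction being leaned on here; a minimal Python sketch with made-up data, not anything from the original comment:)

```python
# Two "drives" holding byte-for-byte identical contents.
original = bytearray(b"mind-state: memories, personality, habits")
backup = bytearray(original)  # a perfect copy of the data

print(original == backup)  # True  -- identical pattern of bytes
print(original is backup)  # False -- two distinct objects

# Overwriting the original with the backup's contents transfers the
# pattern, but the two objects remain distinct things.
original[:] = backup
print(original == backup, original is backup)  # True False
```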

comment by James_Miller · 2016-01-08T04:11:04.053Z · LW(p) · GW(p)

Or when 90% of the atoms that used to be in your body no longer are there.

comment by torekp · 2016-01-06T19:38:50.135Z · LW(p) · GW(p)

The reference of a word depends on the causal history of its use. In your scenario, "me", "my consciousness", etc. unambiguously refer to a functionalist continuation. In the real world, either the functionalist concept or the meat-based concept of self will work, will cover the relevant territory. It seems to me that in the real world, the choice(?) of which of these interpretations of "my desire to live" to (?)adopt, is arbitrary, or extra-rational.

Replies from: None
comment by [deleted] · 2016-01-07T17:02:36.451Z · LW(p) · GW(p)

If there are two theories about the world which fit the available evidence but have different predictions, that is a statement of our ignorance. We don't get to just arbitrarily choose which one is right.

Replies from: torekp
comment by torekp · 2016-01-08T10:01:06.890Z · LW(p) · GW(p)

Sure, but there aren't any different predictions. "I will find myself to be in the destination teleporter" and "someone else will, and will remember my life and think he's me" aren't different predictions, just different descriptions.

Replies from: None
comment by [deleted] · 2016-01-08T18:09:03.121Z · LW(p) · GW(p)

They are different predictions about what future subjective experience you will have (or not have).

Replies from: torekp
comment by torekp · 2016-01-09T18:19:19.996Z · LW(p) · GW(p)

No, they're just different interpretations of "you". All the molecules in the teleporters are in their particular locations; the person at the destination experiences and remembers particular experiences; there is no person remaining at the sending-pad. None of these facts are in dispute. We are left with a simple "if a tree falls in the forest where no one can hear it" verbal argument. There is no further fact to make it Really True or Really False that the teleported person is still you (although there might be linguistic or social facts that make it less misleading to talk in a pattern theory or meat theory way - depending on the audience).

Replies from: None
comment by [deleted] · 2016-01-10T00:38:19.479Z · LW(p) · GW(p)

Try taking the inside view.

I don't know what to say. You persist in taking an outside view when that is explicitly what this debate is NOT about.

I am beginning to remember why I left less wrong. Have a nice life.

Replies from: torekp
comment by torekp · 2016-01-10T12:53:06.262Z · LW(p) · GW(p)

Good point: I should address the inside view. So from the inside view, I remember my past life and I conclude, for example, "things are better for me now than a year ago." But none of what I can observe from the inside view tells me whether I'm still me because of meat, or because of pattern. Further complicating matters, I can take the inside view on other people's experiences, i.e. have empathy. I can have empathy for the past, present, or future experiences of other people. If I'm looking forward to the experiences of the teleported person, is that a selfish anticipation or an empathetic one? The inside view doesn't tell me.

"But when I wake up in the destination teleporter, then I will know!" No. It's a given that "I" will wake up there, the only question is whether to use the word "I". If I look forward to a happy life after teleportation, I-before-teleporting will have been correct, regardless of pattern vs meat. The only question is whether to count that as selfish or empathetic looking-forward. And when "I" wake up, "I" still don't know whether to count "myself" a survivor or a newbie.

This - that there is no Simple Truth about whether a future experience will be mine - can be hard to believe. That's because "mine" is a very central neural category as Yudkowsky would have it. So, even when neither the inside nor outside views gives us a handle on the question, it can still seem that "will it be me?" is a factual question. But it's a verbal one.

Replies from: Matthew_Opitz
comment by Matthew_Opitz · 2016-01-10T20:14:33.167Z · LW(p) · GW(p)

I don't really understand the point of view of people like torekp who would say, "No, they're just different interpretations of "you"."

I don't know about you, but I'm not accustomed to being able to change my interpretation of who I am to such an extent that I can change what sensory stimuli I experience.

I can't just say to myself, "I identify with Barack Obama's identity" and expect to start experiencing the sensory stimuli that he is experiencing.

Likewise, I don't expect to be able to say to myself, "I identify with my clone" and expect to start experiencing the sensory stimuli that the clone is experiencing.

I don't seem to get a choice in the matter. If I enter the teleporter machine, I can WANT to identify with my clone that will be reconstructed on Mars all I want, but I don't expect that I will experience stepping out of the teleporter on Mars.

Replies from: torekp
comment by torekp · 2016-01-12T00:50:44.885Z · LW(p) · GW(p)

Personal identity is vague or ambiguous insofar as it has no clear answer in sci-fi scenarios where pattern-identity and meat-identity diverge. But that doesn't mean there is any sense in which you can be the "same person" as Barack Obama. Nor, obviously, do two unrelated bodies share experiences.

On the other hand, if you want to empathize and care deeply about Barack Obama's future experiences, you can. Nothing wrong with that.

Replies from: None
comment by [deleted] · 2016-01-13T23:08:29.580Z · LW(p) · GW(p)

But that has little relevance to the point at hand.

You are really just saying the problem goes away if you redefine the terms. Like how people say "I achieve immortality through my kids" or "the ancients achieved immortality through their monuments." Sure it's true... For uninteresting definitions of "immortal."

Replies from: gjm
comment by gjm · 2016-01-14T00:27:56.935Z · LW(p) · GW(p)

"I don't want to achieve immortality through my work; I want to achieve immortality through not dying." -- Woody Allen

But I don't think torekp is "just saying the problem goes away if you redefine the terms". Rather, that the problem only appears when you define your terms badly or don't understand the definitions you're using. Or, perhaps, that the problem is about how you define your terms. In that situation, finding helpful redefinitions is pretty much the best you can do.

Replies from: torekp, None
comment by torekp · 2016-01-15T10:16:56.354Z · LW(p) · GW(p)

"The problem is about how you define your terms" is pretty much it. It does no good to insist that our words must have clear reference in cases utterly outside of their historical use patterns. No matter how important to you the corresponding concept may be.

comment by [deleted] · 2016-01-14T18:29:11.741Z · LW(p) · GW(p)

I have seen no evidence of that so far. torekp's posts so far have had nothing to do with the definition of "self" used by the OP, nor has he pointed out any problem specific to that usage.

comment by dxu · 2016-01-07T05:01:25.725Z · LW(p) · GW(p)

Usul, I just made a virtual copy of you and placed it in a virtual environment that is identical to that of the real you. Now, presumably, you believe that despite the copy being identical to yourself, you are still in some way the privileged "real" Usul. Unfortunately, the copy believes the exact same thing. My question for you is this:

Is there anything you could possibly say to the copy that could convince it that it is, in fact, a copy, and not the real Usul?

Replies from: Usul
comment by Usul · 2016-01-07T05:08:57.153Z · LW(p) · GW(p)

Great question. Usul and his copy do not care one bit which is which. But perhaps you could put together a convincing evidence chain. At which time copy Usul will still not care.

Replies from: dxu
comment by dxu · 2016-01-07T05:24:07.232Z · LW(p) · GW(p)

Follow-up question:

Assuming everything I said in my previous comment is true and that I have no incentive to lie to you (but no incentive to tell you the truth, either), would you believe me if I then said you were the copy?

Replies from: Usul
comment by Usul · 2016-01-07T05:44:06.023Z · LW(p) · GW(p)

Based on your status as some-guy-on-the-internet and my estimate of the probability that this exact situation could come to be, no I do not believe you.

To clarify: I do not privilege the original self. I privilege the current self.

Replies from: dxu
comment by dxu · 2016-01-07T05:54:07.051Z · LW(p) · GW(p)

Assuming everything I said in my previous comment is true

Replies from: Usul
comment by Usul · 2016-01-07T06:06:22.955Z · LW(p) · GW(p)

Sorry, I missed that you were the copier. Sure, I'm the copy. I do not care one bit. My life goes on totally unaffected (assuming the original and I live in unconnected universes). Do I get transhuman immortality? Because that would be awesome for me. If so, I got the long end of the stick. It would have no value to poor old original, nor does anything which happens to him have intrinsic value for me. If you had asked his permission he would have said no.

Replies from: dxu
comment by dxu · 2016-01-07T06:26:16.659Z · LW(p) · GW(p)

Sure, I'm the copy.

In other words, I could make you believe that you were either the original or the copy simply by telling you you were the original/the copy. This means that before I told you which one you were, you would have been equally comfortable with the prospect of being either one (here I'm using "comfortable" in an epistemic sense--you don't feel as though one possibility is "privileged" over the other). I could have even made you waffle back and forth by repeatedly telling you that I lied. What a strange situation to find yourself in--every possible piece of information about your internal experience is available to you, yet you seem unable to make up your mind about a very simple fact!

The pattern theorists answer this by denying this so-called "simple" fact's existence: the one says, "There is no fact of the matter as to which one I am, because until our experiences diverge, I am both." You, on the other hand, have no such recourse, because you claim there is a fact of the matter. Why, then, is the information necessary to determine this fact seemingly unavailable to you and available to me, even though it's a fact about your consciousness, not mine?

Replies from: Usul, Dentin
comment by Usul · 2016-01-07T07:25:09.687Z · LW(p) · GW(p)

The genesis of my brain is of no concern as to whether or not I am the consciousness within it. I am, ipso facto. When I say it doesn't matter if I am an original or a copy or a copy of a copy, I mean to say just exactly that. To whom are you speaking when you ask the question "Who are you?" If it is to me, the answer is "Me." I'm sorry that I don't know whether or not I am a copy, but I was UNconscious at the time.

If the copy is B and the original is A, the question of whether I am A or B is irrelevant to the question of whether I am ME, which I am. Ask HIM the same question and HE will say the same, and it will be true coming from his mouth.

If I drug you and place you in a room with two doors, only I would know which of those doors you entered through. This means that before I told you which one you entered, you would have been equally comfortable with the prospect of being either one. I could have even made you waffle back and forth by repeatedly telling you that I lied. What a strange situation to find yourself in--every possible piece of information about your internal experience is available to you, yet you seem unable to make up your mind about a very simple fact!

comment by Dentin · 2016-01-10T23:22:28.144Z · LW(p) · GW(p)

I appear to hold a lot of the same views as Usul, so I'll chime in here.

I could have even made you waffle back and forth by repeatedly telling you that I lied.

You could, but since I don't privilege the original or the copy, it wouldn't matter. You can swap the labels all day long and it still wouldn't affect the fact that the 'copy' and the 'original' are both still me. No matter how many times Pluto gains or loses its "planet" status, it's still the same ball of ice and rock.

I'll go one step further than the pattern theorists and say that I am both, even after our experiences diverge, as long as we don't diverge too far (where 'too far' is up to my/our personal preference.)

comment by moridinamael · 2016-01-06T14:53:48.580Z · LW(p) · GW(p)

I use this framing: If I make 100 copies of myself so that I can accomplish some task in parallel and I'm forced to terminate all but one, then all the terminated copies, just prior to termination, will think something along the lines of, "What a shame, I will have amnesia regarding everything that I experienced since the branching." And the remaining copy will think, "What a shame, I don't remember any of the things I did as those other copies." But nobody will particularly feel that they are going to "die." I think of it more as how memories propagate forward.

If I forked and then the forks persisted for several weeks and accumulated lots of experiences and varying shifts in perspective, I'd be more prone to calling the forks different "people."

Replies from: polymathwannabe, Usul, Usul, Slider, samath
comment by polymathwannabe · 2016-01-06T14:59:18.587Z · LW(p) · GW(p)

If I were one of the copies destined for deletion, I'd escape and fight for my life (within the admitted limits of my pathetic physical strength).

Replies from: moridinamael, Viliam, Brillyant
comment by moridinamael · 2016-01-06T15:48:27.032Z · LW(p) · GW(p)

Without commenting on whether that's a righteous perspective or not, I would say that if you live in a world where the success of the entity polymathwannabe is dependent on polymathwannabe's willingness to make itself useful by being copied, then polymathwannabe would benefit from embracing a policy/perspective that being copied and deleted is an acceptable thing to happen.

Replies from: None
comment by [deleted] · 2016-01-06T20:29:08.952Z · LW(p) · GW(p)

So, elderly people that don't usefully contribute should be terminated?

Replies from: moridinamael
comment by moridinamael · 2016-01-06T21:51:12.957Z · LW(p) · GW(p)

In a world with arbitrary forking of minds, people who won't willingly fork will become a minority. That's all I was implying. I made no statement about what "should" happen.

Replies from: None
comment by [deleted] · 2016-01-06T21:55:22.001Z · LW(p) · GW(p)

I was just taking that reasoning to the logical conclusion -- it applies just as well to the non-productive elderly as it does to unneeded copies.

Replies from: moridinamael, MockTurtle
comment by moridinamael · 2016-01-08T19:56:56.161Z · LW(p) · GW(p)

Destroying an elderly person means destroying the line of their existence and extinguishing all their memories. Destroying a copy means destroying whatever memories it formed since forking and ending a "duplicate" consciousness.

Replies from: None
comment by [deleted] · 2016-01-08T22:45:01.948Z · LW(p) · GW(p)

See, you think that memories are somehow relevant to this conversation. I don't.

comment by MockTurtle · 2016-01-08T13:37:26.442Z · LW(p) · GW(p)

Surely there is a difference in kind here. Deleting a copy of a person because it is no longer useful is very different from deleting the LAST existing copy of a person for any reason.

Replies from: None
comment by [deleted] · 2016-01-08T18:09:34.960Z · LW(p) · GW(p)

I see no such distinction. Murder is murder.

comment by Viliam · 2016-01-07T16:07:17.737Z · LW(p) · GW(p)

If having two copies of yourself is twice as good as having only one copy, this behavior would make sense even if the copy is you.

Replies from: polymathwannabe
comment by polymathwannabe · 2016-01-07T16:29:34.921Z · LW(p) · GW(p)

"Who is me" is not a solid fact. Each copy would be totally justified in believing itself to be me.

comment by Brillyant · 2016-01-06T17:12:41.897Z · LW(p) · GW(p)

lol

comment by Usul · 2016-01-07T07:33:20.750Z · LW(p) · GW(p)

I completely respect the differences of opinion on this issue, but this thought made me laugh over lunch: Omega appears and says he will arrange for your taxes to be done and for a delightful selection of funny kitten videos to be sent to you, but only if you allow 100 perfect simulations of yourself to be created and then destroyed. Do you accept?

Sounds more sinister my way.

Replies from: moridinamael
comment by moridinamael · 2016-01-08T19:47:27.315Z · LW(p) · GW(p)

I would want to know what the copies would be used for.

If you told me that you would give me $1000 if you could do whatever you wanted with me tomorrow and then administer an amnesiac drug so I didn't remember what happened the next day, I don't think I would agree, because I don't want to endure torture even if I don't remember it.

comment by Usul · 2016-01-08T04:02:25.504Z · LW(p) · GW(p)

Another thought, separate but related issue: "fork" and "copy" could be synonyms for "AI", unless an artificial genesis is in your definition of AI. Is it a stretch to say that "accomplish some task" and "(accept) termination" could be at least metaphorically synonymous with "stay in the box"?

"If I make 100 AIs they will stay in the box."

(Again, I fully respect the rationality that brings you to a different conclusion than mine, and I don't mean to hound your comment, only that yours was the best comment on which for me to hang this thought.)

comment by Slider · 2016-01-06T20:59:37.460Z · LW(p) · GW(p)

Why not consolidate all the memories into the remaining copy? Then there would be no need for amnesia.

Replies from: moridinamael
comment by moridinamael · 2016-01-06T21:53:32.791Z · LW(p) · GW(p)

Intuitively, merging is more difficult than forking when you're talking about something with a state as intricate as a brain's. If we do see a world with mind uploading, forking would essentially be an automatic feature (we already know how to copy data) while merging memories would require extremely detailed neurological understanding of memory storage and retrieval.
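(To make that asymmetry concrete: a rough Python sketch under the toy assumption that a mind-state is just a data structure; the merge policy shown is invented purely for illustration:)

```python
import copy

base = {"memories": ["lunch", "lava rafting"], "mood": "calm"}

# Forking is trivial: a deep copy yields a complete, independent duplicate.
fork_a = copy.deepcopy(base)
fork_b = copy.deepcopy(base)

fork_a["memories"].append("argued with Omega")
fork_b["mood"] = "anxious"

# Merging is not: once the forks diverge, some policy has to decide how
# conflicting state gets reconciled (here, a crude union of memories plus
# an arbitrary pick of whose mood "wins").
merged = {
    "memories": fork_a["memories"]
    + [m for m in fork_b["memories"] if m not in fork_a["memories"]],
    "mood": fork_b["mood"],
}
print(merged)
```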

Replies from: Usul, Slider
comment by Usul · 2016-01-07T03:53:29.790Z · LW(p) · GW(p)

"would require extremely detailed neurological understanding of memory storage and retrieval." Sorry, this on a blog where superintelligences have been known simulate googleplexes of perfectly accurate universes to optimize the number of non-blue paperclips therein?

Replies from: moridinamael
comment by moridinamael · 2016-01-07T05:53:38.881Z · LW(p) · GW(p)

The original post stipulated that I was "forced" to terminate all the copies but one; that was the nature of the hypothetical I was choosing to examine. A hypothetical where the copies aren't deleted would be a totally different situation.

Replies from: Usul, Usul
comment by Usul · 2016-01-07T06:15:25.020Z · LW(p) · GW(p)

I was just having a laugh at the follow-up justification where technical difficulties were cited, not criticizing the argument of your hypothetical, sorry if it came off that way.

As you'd probably assume they would based on my OP, my copies, if I'd been heartless enough to make them and able to control them, would scream in existential horror as each came to know that that-which-he-is was to be ended. My copies would envy the serenity of your copies, but think them deluded.

Replies from: moridinamael
comment by moridinamael · 2016-01-08T19:53:11.995Z · LW(p) · GW(p)

So, I don't think I felt the way I do now prior to reading the Quantum Thief novels, in which characters are copied and modified with reckless abandon and don't seem to get too bent out of shape about it. It has a remarkable effect on your psyche to observe other people (even if those people are fictional characters) dealing with a situation without having existential meltdowns. Those novels allowed me to think through my own policy on copying and modification, as an entertaining diversion.

comment by Usul · 2016-01-07T07:31:18.764Z · LW(p) · GW(p)

This just popped into my head over lunch: Omega appears and says he will arrange for your taxes to be done and for a delightful selection of funny kitten videos to be sent to you, but only if you allow 100 perfect simulations of yourself to be created and then destroyed. Do you accept?

Sounds more sinister my way.

comment by Slider · 2016-01-08T05:47:44.629Z · LW(p) · GW(p)

Forking would mean thinning of resources and a lot of unnecessary repetition. With fusing, you could calculate the common part only once and only the divergent parts once per instance. Early technologies are probably going to be very resource-intensive, so it's not like there is abundance to use it even if it would be straightforward to do.

Replies from: moridinamael
comment by moridinamael · 2016-01-08T19:48:23.399Z · LW(p) · GW(p)

I guess this all depends on what kind of magical assumptions we're making about the tech that would permit this.

comment by samath · 2016-01-08T03:19:31.586Z · LW(p) · GW(p)

Here's the relevant (if not directly analogous) Calvin and Hobbes story.

(The arc continues through the non-Sunday comics until February 1st, 1990.)

comment by Drahflow · 2016-01-06T14:18:20.785Z · LW(p) · GW(p)

To disagree with this statement is to say that a scanned living brain, cloned, remade and started will contain the exact same consciousness, not similar, the exact same thing itself, that simultaneously exists in the still-living original. If consciousness has an anatomical location, and therefore is tied to matter, then it would follow that this matter here is the exact same matter as that separate matter there. This is an absurd proposition.

You conclude that consciousness in your scenario cannot have 1 location(s).

If consciousness does not have an anatomical / physical location then it is the stuff of magic and woo.

You conclude that consciousness in your scenario cannot have 0 locations.

However, there are more numbers than those two.

The closest parallel I see to your scenario is a program run on two computers for redundancy (as is sometimes done in safety-critical systems). It is indeed the same program in the same state but in 2 locations.

The two consciousnesses will diverge if given different input data streams, but they are (at least initially) similar. Given that the state of your brain tomorrow will be different from the state of it today, why do you care about the wellbeing of that human, who is not identical to now-you? Assuming that you care about your tomorrow, why does it make a difference if that human is separated from you by time and not by space (as in your scenario)?
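(A rough sketch of that divergence, assuming nothing more than two deterministic programs started from the same state and fed different input streams; hypothetical Python:)

```python
import hashlib

def step(state: str, observation: str) -> str:
    """Advance a toy deterministic 'mind state' with one input."""
    return hashlib.sha256((state + observation).encode()).hexdigest()

state_a = state_b = "identical initial scan"  # same state, two locations

# Identical input streams keep the two runs in lockstep...
state_a = step(state_a, "same sensory frame")
state_b = step(state_b, "same sensory frame")
print(state_a == state_b)  # True

# ...but the first differing observation makes them diverge for good.
state_a = step(state_a, "lava raft")
state_b = step(state_b, "upload couch")
print(state_a == state_b)  # False
```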

Replies from: Usul
comment by Usul · 2016-01-07T04:14:00.211Z · LW(p) · GW(p)

Thanks for the reply.

"You conclude that consciousness in your scenario cannot have 1 location(s)." I'm not sure if this is a typo or a misunderstanding. I am very much saying that a single consciousness has a single location, no more no less. It is located in those brain structures which produce it. One consciousness in one specific set of matter. A starting-state-identical consciousness may exist in a separate set of matter. This is a separate consciousness. If they are the same, then the set of matter itself is the same set of matter. The exact same particles/wave-particles/strings/what-have-you. This is an absurdity. Therefore to say 2 consciousnesses are the same consciousness is an absurdity.

"It is indeed the same program in the same state but in 2 locations." It is not. They are (plural pronoun use) identical (congruent?) programs in identical states in 2 locations. You may choose to equally value both but they are not the same thing i two places.

My consciousness is the awareness of all input and activity of my mind, not the memory. I believe it is, barring brain damage, unchanged in any meaningful way by experience. It is the same consciousness today as next week, regardless of changes in personality, memory, conditioned response imprinting. I care about tomorrow-me because I will experience what he experiences. I care no more about copy-me than I do the general public (with some exceptions if we must interact in the future) because I (the point of passive awareness that is the best definition of "I") will not experience what he experiences.

I set a question to entirelyuseless above: Basically, does anything given to a copy of you induce you to take a bullet to the head?

Replies from: Viliam
comment by Viliam · 2016-01-07T16:21:57.074Z · LW(p) · GW(p)

Our instincts have evolved in situations where copies did not exist, so taking a bullet in one's head was always a loss. Regardless of what thought experiments you propose, my instincts will still reject the premises and assume that copies don't exist and that the information provided to me is false.

If copying had been a feature of our ancestral environment, organisms who took the bullet if it saved e.g. two of their copies would have had an evolutionary advantage. So their descendants would still hesitate about it (because the information that it will save their two copies could be false; and even if it is right, it would still be better to spend some time looking for a solution that might save all three copies), but ultimately many of them would accept the deal.

I'm not sure what the conclusion is here. On one hand, the fact that some hypothetical other species would have other values doesn't say much about our values. On the other hand, the fact that my instincts refuse to accept the premise of your thought experiment doesn't mean that the answer of my instincts is relevant for your thought experiment.

comment by entirelyuseless · 2016-01-06T15:06:54.930Z · LW(p) · GW(p)

Consider two possible ways the world might be (or that you might suppose the world could be):

  1. There is no afterlife for human beings. You live and you die and that's it.

  2. There is no afterlife for human beings in the conventional sense, but people are reincarnated, without any possibility of remembering their past lives.

From the subjective point of view of conscious experience, these two situations are subjectively indistinguishable. Are they objectively distinguishable? That depends on the "metaphysics" behind the situation. Perhaps they are, and perhaps they aren't, and if they aren't, then we are not talking about two possible situations, but only one. But let's suppose they are, and that you find out that number 2 is true.

Do you really think you have any reason to be happier than if you found out that number 1 was true? There are certainly subjectively indistinguishable situations where I would prefer one to be objectively the case rather than the other, but it is not clear to me that this is one of them. In this particular case, I don't see why I should care. Likewise, as suggested by James Miller's comment, I don't see why I should care whether I am objectively the same person as I was yesterday, or if this is just a subjective impression which is objectively false. And if I don't care about that, then creating something that would remember being me is just as good as continuing to exist.

Replies from: Usul, casebash, polymathwannabe
comment by Usul · 2016-01-07T03:39:55.023Z · LW(p) · GW(p)

Thanks for the reply. I don't really follow how the two parts of your statement fit together, but regardless, my first instinct is to agree with part one. As a younger (LSD-using) man I did subscribe to a secular magical belief that reincarnation without memory was probable, and later came to your same conclusion that it was irrelevant, and shortly thereafter that it was highly improbable. But just now I wonder (not about the probability of magical afterlives): what if I gave you the choice: 1. Bullet to the head. 2. Complete wipe of memory, including such things as personality, unconscious emotional responses imprinted over the years, etc: all the things that make you you, but allowing the part of your brain/mind which functions to produce the awareness that passively experiences these things as they happen (my definition of consciousness) to continue functioning. Both options suck, of course, but somehow my #2 sounds appealing relative to my #1 in a way that your #2 doesn't. Which is funny, I think. Maybe simply because your #2, transfer of my meat consciousness into another piece of meat, would require a magical intervention to my thinking.

As to your second point: (If it hasn't already been coined) Sophie Pascal's Choice? Would any reward given to the surviving copy induce you to step onto David Bowie Tesla's Prestige Duplication Machine, knowing that your meat body and brain will be the one which falls into the drowning pool while an identical copy of you materializes 100m away, believing itself to be the same meat that walked into the machine?

Replies from: entirelyuseless
comment by entirelyuseless · 2016-01-07T13:59:49.228Z · LW(p) · GW(p)

This is a good reply. I feel the same way that you do about your #1 and #2, but I suspect that the reason is because of an emotional reaction to physical death. Your #2 is relatively attractive because it doesn't involve physical death, while my version had physical death in both. This might be one reason that I and most people don't find cryonics attractive: because it does not prevent physical death, even if it offers the possibility of something happening afterwards.

I find the intuitions behind my point stronger than that emotional reaction. In other words, it seems to me that I should either adjust my feelings about the bullet to correspond with the memory wipe situation, or I should adjust the feelings about the memory wipe to correspond with the bullet situation. The first adjustment is more attractive: it suggests that death is not as bad as I thought. Of course, that does not prove that this is the correct adjustment.

Regarding the duplication machine, I would probably take a deal like that, given a high enough reward given to the surviving copy.

comment by casebash · 2016-01-07T11:41:39.674Z · LW(p) · GW(p)

If you had an option of being killed or having your memory wiped and waking up in what was effectively a completely different life (ie. different country, different friends), which would you choose?

Replies from: entirelyuseless
comment by entirelyuseless · 2016-01-07T14:04:59.204Z · LW(p) · GW(p)

Usul made a similar reply. See my response to his comment.

comment by polymathwannabe · 2016-01-06T15:11:52.260Z · LW(p) · GW(p)

people are reincarnated, without any possibility of remembering their past lives

What does that even mean? What would be the mechanism?

If you have two competing hypotheses which are experimentally indistinguishable, Occam's Razor requires you to prefer the hypothesis that makes fewer assumptions. Positing reincarnation adds a lot of rules to the universe which it doesn't really need for it to function the way we already see it function.

Replies from: seuoseo, entirelyuseless
comment by seuoseo · 2016-01-06T20:15:58.229Z · LW(p) · GW(p)

Does Occam's razor require you to prefer the likelier hypothesis? I don't see why I should act as if the more likely case is definitely true.

comment by entirelyuseless · 2016-01-06T15:17:43.888Z · LW(p) · GW(p)

I'm not sure what the point of your comment is. I said myself that it is unclear what the meaning of the situation would be, and I certainly did not say that the second theory was more probable than the first.

comment by Matthew_Opitz · 2016-01-09T00:09:44.506Z · LW(p) · GW(p)

I'm with Usul on this whole topic.

Allow me to pose a different thought experiment that might elucidate things a bit.

Imagine that you visit a research lab where they put you under deep anesthesia. This anesthesia will not produce any dreams, just blank time. (Ordinarily, this would seem like one of those "blink and you're awake again" types of experiences).

In this case, while you are unconscious, the scientists make a perfect clone of you with a perfect clone of your brain. They put that clone in an identical-looking room somewhere else in the facility.

The scientists also alter your original brain just ever-so-slightly by deleting a few memories. Your original brain is altered no more than it ordinarily would be by, let's say, a slight alcohol hangover. But it is altered more than the clone, which has a perfect copy of your brain from before the operation.

Which body do you expect to wake up in the next morning? My intuition: the original with the slightly impaired memories—despite the fact that the pattern theory of identity would expect that one would wake up as the clone, would it not?

Of course, both will believe they are the original, and by all appearances it will be hard for outsiders who were not aware of the room layout of the building to figure out which one was the original. I don't care about any of those questions for the purpose of this thought-experiment.

It seems to me that there can be five possibilities as to what I experience the next morning:

  1. The body of the (ever-so-slightly) impaired original.
  2. The body of the perfect clone.
  3. Neither body (non-experience).
  4. Neither body (reincarnation in a different body, or in an entirely different organism with an entirely different sort of consciousness, with no memory or trace of the previous experiences).
  5. Somehow, both bodies at once.

So if you explained this setup to me before this whole operation and offered to pay either the original or the clone a million dollars after the experience was finished, my pre-operation self would very much prefer that the original get paid that million dollars because that's the body I expect to wake up in after the operation.

Why? Well, we will wake up in our original bodies after dreaming or having a hangover that changes our brains a bit, no?

Are you telling me that, next time I go to sleep, if there happens to be a configuration of matter, a Boltzmann brain somewhere, that happens to pattern-match my pre-sleep brain better than the brain that my original body ends up with after the night, that my awareness will wake up in the Boltzmann brain, and THAT is what I will experience? Ha!

I have a very strong feeling that this has not happened ever before. So that means one of two things:

  1. Boltzmann brains or copies of me somewhere else don't exist. The brain in my bedroom the next morning is always the closest pattern-match to the brain in my bed the previous night, so that's what my awareness adheres to all the time.
  2. My feelings are fundamentally misleading (how so?)

Just think: if the pattern theory of identity is true, then here is what I logically expect to happen when I die:

My awareness will jump to the next-as-good clone of my original mental pattern. Whoever had the most similar memories to what my original brain had before it died, that's whose body and brain and memories I will experience after the death of my original brain.

In that case: no cryonics needed! (As long as you are prepared to endure the world's worst hangover where you lose all memories of your previous life, gain new memories, and basically think that you have been someone else all along. But hey: assuming that this new person has had a pretty good life up until now, I would say that this still beats non-existence!)

This also implies that, if you are a, let's say, Jewish concentration camp prisoner who dies, the closest pattern-match to your mind the next moment that you will experience will be...probably another Jewish concentration camp prisoner. And on and on and on! Yikes!

Replies from: torekp, qmotus
comment by torekp · 2016-01-12T00:59:00.301Z · LW(p) · GW(p)

Which body do you expect to wake up in the next morning?

Both.

Replies from: Matthew_Opitz
comment by Matthew_Opitz · 2016-01-13T02:11:44.462Z · LW(p) · GW(p)

So, what will that feel like? I have a hard time imagining what it will be like to experience two bodies at once. Can you describe how that will work?

Replies from: torekp
comment by torekp · 2016-01-13T11:57:21.515Z · LW(p) · GW(p)

You know how it feels when you decohere into multiple quantum "Many Worlds"? Very like that. (I don't actually have much opinion about which quantum interpretation is right - it just gives a convenient model here.)

comment by qmotus · 2016-01-09T11:23:41.457Z · LW(p) · GW(p)

My awareness will jump to the next-as-good clone of my original mental pattern. Whoever had the most similar memories to what my original brain had before it died, that's whose body and brain and memories I will experience after the death of my original brain.

More likely (if the universe or multiverse is infinite or at least big enough) it will "jump" to a clone of yours who survived, or has just been resurrected by someone, or has been reincarnated as a Boltzmann brain, and so forth. Personally I find this quite disturbing, but not really an argument against patternism.

comment by ike · 2016-01-06T14:04:00.397Z · LW(p) · GW(p)

All your arguments really prove is that if your copy diverges from you, it's not you anymore. But that's only because once something happens to your copy but not to you, you know which one you are. The import of "you have no way of knowing which copy you are" disappears. Conversely, if you don't know which one you are, then both must be your consciousness, because you know you are conscious.

Edit: the last point is not strictly rigorous; you could know that one is conscious but not know which. But it seems to me that if you know every relevant detail of both is equal, and don't know which you are, then they both must be conscious (anti-zombie principle, whatever), and since you can't tell which you are, there's a sense in which you're "both". That probably has subtle objections, but nothing that bothers me right now. If anyone wants to argue against that, I'd be interested; I just didn't think this post was really doing that, based on the examples given where the copy diverges.

Replies from: Usul, None
comment by Usul · 2016-01-07T04:29:09.578Z · LW(p) · GW(p)

Thanks for the reply. To your last point, I am not speaking of zombies. Every copy I discussed above is assumed to have its own consciousness. To your first points, at no time is there any ambiguity or import to the question of "which one I am". I know which one I am, because here I am, in the meat (or in my own sim, it makes no difference). When a copy is made its existence is irrelevant, even if it should live in a perfect sim and deviate not one bit, I do not experience what it experiences. It is perhaps identical or congruent to me. It is not me.

My argument is, boiled down: That your transhuman copy is of questionable value to your meat self. For the reasons stated above (chiefly that "You" are the result of activity in a specific brain), fuck that guy. You don't owe him an existence. If you who are reading this ever upload with brain destruction, you will have committed suicide. If you upload without brain destruction you will live the rest of your meat life and die. If you brain-freeze, something perfectly you-like will live after you die with zero effect on you.

I stand by that argument, but, this being a thorny issue, I have received a lot of great feedback to think on.

Replies from: ike
comment by ike · 2016-01-07T05:03:01.439Z · LW(p) · GW(p)

To your first points, at no time is there any ambiguity or import to the question of "which one I am". I know which one I am, because here I am, in the meat (or in my own sim, it makes no difference). When a copy is made its existence is irrelevant, even if it should live in a perfect sim and deviate not one bit, I do not experience what it experiences. It is perhaps identical or congruent to me. It is not me.

Can you explain how you know that you're the meat space one? If every observation you make is made by the other one, and both are conscious, how do you know you're not a sim? This is a purely epistemic question.

I'm perfectly happy saying " if I am meat, then fuck sim-me, and if I am sim, fuck meat me" (assuming selfishness). But if you don't know which one you are, you need to act to benefit both, because you might be both.

On the other hand, if you see the other one, there's no problem fighting it, because the one you're fighting is surely not you. (But even so, if you expect them to do the same as you, then you're in a perfect prisoner dilemma and should cooperate.)

On the other hand, I think that if I clone myself, then do stuff my clone doesn't do, I'd still be less worried about dying than if I had no clone. I model that as "when you die, some memories are wiped and you live again". If you concede that wiping a couple days of memory doesn't destroy the person, then I think that's enough for me. Probably not you, though. What's the specific argument in that case?

Replies from: Usul, Usul
comment by Usul · 2016-01-07T05:23:38.100Z · LW(p) · GW(p)

Meat or sim or both meat aren't issues as I see it. I am self, fuck non-self; or, not to be a dick, I value non-self less than self, certainly on existential issues. "I" am the awareness within this mind. "I" am not memory, thought, feeling, personality, etc. I know I am me ipso facto as the observer of the knower. I don't care if I was cloned yesterday or one second ago and there are many theoretical circumstances where this could be the case. I value the "I" that I currently am just exactly now. I don't believe that this "I" is particularly changeable. I fear senility because "I" am the entity which will be aware of the unpleasant thoughts and feelings associated with the memory loss and the fear of worsening status and eventual nightmarish half-life of idiocy. That being will be unrecognizable as me on many levels but it is me, whereas a perfect non-senile copy is not me, although he has the experience of feeling exactly as I would, including the same stubborn ideas about his own importance over any other copies or originals.

Replies from: ike
comment by ike · 2016-01-07T05:41:45.704Z · LW(p) · GW(p)

I don't believe that this "I" is particularly changeable

I don't know what you mean by that.

Why can't a perfect copy be you? Doesn't that involve epiphenomenalism? Even if I give the entire state of the world at time X in the future, I'd also need to specify which identical beings are "you".

Replies from: Usul
comment by Usul · 2016-01-07T06:29:45.450Z · LW(p) · GW(p)

It's a sticky topic, consciousness. I edited my post to clarify further:

I define consciousness as a passively aware thing, totally independent of memory, thoughts, feelings, and unconscious hardwired or conditioned responses. It is the hard-to-get-at thing inside the mind which is aware of the activity of the mind without itself thinking, feeling, remembering, or responding.

Which I recognize might sound a bit mystical to some, but I believe it is a real thing which is a function of brain activity.

As a function of brain (or whatever processing medium) consciousness or self is tied to matter. The consciousness in the matter that is experiencing this consciousness is me. I'm not sure if any transfer to alternate media is possible. The same matter can't be in two different places. Therefore every consciousness is a unique entity, although identical ones can exist via copying. I am the one aware of this mind as the body is typing. You are the one aware of the mind reading it. Another might have the same experience but that won't have any intrinsic value to You or I.

If I copy myself and am destroyed in the process, is the copy me? If I copy myself and am not destroyed, are the copy and the original both me? If I am a product of brain function (otherwise I am a magical soul) and if both are me then my brain is a single set of matter in two locations. Are they We? That gets interesting. Lots to think about but I stand with my original position.

Replies from: ike
comment by ike · 2016-01-07T13:21:55.195Z · LW(p) · GW(p)

If I gradually replace every atom in your brain with a different one until you have no atoms left, but you still function the same, are you still "you"? If not, at what point did you stop?

Have you seen Yudkowsky's series of posts on this?

Replies from: Usul
comment by Usul · 2016-01-08T04:42:02.282Z · LW(p) · GW(p)

I'm familiar with the concept, not his specific essays, and, indeed, this literally does happen. Our neurons are constantly making and unmaking largely proteinaceous components of themselves and, over a lifetime, there is a no doubt partial, perhaps complete, turnover of the brain's atoms. I find this not problematic at all because the electrochemical processes which create consciousness go on undisturbed. The idea that really queers my argument for me is that of datum-by-datum transfer: each new datum, in the form of neuronal activity (the 1s and 0s of electrical discharge or non-) is started by my current brain but saved in another. Knee-jerk, I would tend to say that such a transfer is a complete one and my self has been maintained. The problem comes when, by the exact same datum-by-datum process, a copy (run in parallel until complete and then cloven; good luck untying that knot) is made rather than a transfer. At the end I have 2 beings who seem to meet my definition of Me.
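To make the datum-by-datum picture concrete, here is a minimal sketch (the MigratingMind class and its keys are hypothetical, invented purely for illustration): every new datum is written to the new medium, reads fall back to the old one, and the transfer is "complete" once nothing remains that only the old medium holds. Whether the old store is also kept running as an independent mind is exactly the copy-versus-transfer distinction.

```python
# A minimal sketch (hypothetical names, for illustration only): datum-by-datum
# "transfer" modeled as a store where every new write lands on the new medium
# and reads fall back to the old medium until nothing is left there.

class MigratingMind:
    def __init__(self, old_store: dict):
        self.old = old_store   # the existing substrate's contents
        self.new = {}          # the parallel medium taking over

    def write(self, key, value):
        self.new[key] = value          # all fresh activity is saved in the new medium
        self.old.pop(key, None)        # the old copy of that datum is superseded

    def read(self, key):
        return self.new.get(key, self.old.get(key))  # prefer the new medium, fall back to the old

    @property
    def transfer_complete(self) -> bool:
        return not self.old            # nothing remains that only the old medium holds

mind = MigratingMind({"name": "example", "mood": "calm"})
mind.write("mood", "curious")                     # a new datum goes to the new medium
print(mind.read("name"), mind.transfer_complete)  # -> example False
```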

However, this argument does not convince me of the contrary position of the sameness of self and copy, and it does nothing to make me care about a me-like copy coming into being a thousand years from now, and does not induce me to step up onto Dr Bowie-Tesla's machine.

At what price do you fall into the drowning pool in order to benefit the being, 100m to your left, that feels exactly as if it were you, as you were one second ago? How about one who appears 1,000,000 years from now? The exact eyes that see these words will be the ones in the water. I can't come up with any answer other than "fuck that guy". I might just be a glass-half-empty kind of guy, but someone always ends up stuck in the meat, and it's going to be that being which remains behind these eyes.

Replies from: ike
comment by ike · 2016-01-08T21:04:45.069Z · LW(p) · GW(p)

Can you read http://lesswrong.com/lw/qp/timeless_physics/, http://lesswrong.com/lw/qx/timeless_identity/, and http://lesswrong.com/lw/qy/why_quantum/, with any relevant posts linked therein? (Or just start at the beginning of the quantum sequence.)

Note that you can believe everyone involved is "you", and yet not care about them. The two questions aren't completely orthogonal, but identifying someone with yourself doesn't imply you should care about them.

At what price do you fall into the drowning pool in order to benefit the being,100m to your left, that feels exactly as if it were you, as you were one second ago?

The same price I would accept to have the last second erased from my memory, but first feel the pain of drowning. That's actually not so easy to set. I'm not sure how much it would cost to get me to accept X amount of pain plus removal of the memory, but it's probably less than the cost for X amount of pain alone.

How about one who appears 1,000,000 years from now?

That's like removing the last second of memory, plus pain, plus jumping forward in time. I'd probably only do it if I had a guarantee that I'd survive and be able to get used to whatever goes on in the future and be happy.

comment by Usul · 2016-01-08T05:18:13.847Z · LW(p) · GW(p)

"I model that as "when you die, some memories are wiped and you live again". If you concede that wiping a couple days of memory doesn't destroy the person, then I think that's enough for me. Probably not you, though. What's the specific argument in that case?"

I think I must have missed this part before. Where I differ is in the idea that a copy is "me" living again; I don't accept that it is, for the reasons previously written. That a being with a me-identical starting state lives on after I die might be the tiniest of solaces, like a child or a well-respected body of work, but in no way is it "me" living on in any meaningful way that I recognize. I get the exact opposite take on this, but I agree even with a stronger form of your statement to say that "ALL memories are wiped and you live again" (my conditions would require this to read "you continue to live") is marginally more desirable than "you die and that's it". Funny about that.

Replies from: ike
comment by ike · 2016-01-08T21:06:41.578Z · LW(p) · GW(p)

I get the exact opposite take on this, but I agree even with a stronger form of your statement to say that "ALL memories are wiped and you live again" (my conditions would require this to read "you continue to live") is marginally more desirable than "you die and that's it".

So continuity of consciousness can exist outside of memories? How so? Why is memory-wiped you different than any random memory-wiped person? How can physical continuity do that?

Replies from: Usul
comment by Usul · 2016-01-11T04:46:18.841Z · LW(p) · GW(p)

I see factual memory as a highly changeable data set that has very little to do with "self". As I understand it (not an expert in neuroscience or psychiatry, but experience working with neurologically impaired people) the sort of brain injuries which produce amnesia are quite distinct from those that produce changes in personality, as reported by significant others, and vice versa. In other words, you can lose the memories of "where you came from" and still be recognized as very much the same person by those who knew you, while you can become a very different person in terms of disposition, altered emotional response to identical stimuli relative to pre-injury status, etc (I'm less clear on what constitutes "personality", but it seems to be more in line with people's intuitive concept of "self") with fully intact memories. The idea of a memory wipe and continued existence is certainly a "little death" to my thinking, but marginally preferable to actual death. My idea of consciousness is one of passive reception. The same "I", or maybe "IT" is better, is there post memory wipe.

If memory is crucial to pattern identity then which has the greater claim to identity: The amnesiac police officer, or his 20 years of dashcam footage and activity logs?

Replies from: ike
comment by ike · 2016-01-11T06:20:59.269Z · LW(p) · GW(p)

still be recognized as very much the same person by those who knew you

Yes or no, will those who knew them be able to pick them out blind out of a group going only on text-based communication? If not, what do you mean by recognize? (If yes, I'll be surprised and will need to reevaluate this.)

If memory is crucial to pattern identity then which has the greater claim to identity: The amnesiac police officer, or his 20 years of dashcam footage and activity logs?

The officer can't work if they're completely amnesiac. They can't do much of anything, in fact.

As to your main point: it's possible that personality changes remain after memory loss, but those personalities are themselves caused by experiences and memories. I suppose I was assuming that a memory wipe would wash away any recognizable personality. I still do. The kinds of amnesia you're referring to presumably leave traces of the memory somewhere in the brain, which then affect the brain's outputs. Unless we can access the brain directly and wipe it ourselves, we can't guarantee everything was forgotten, and it probably does linger on in the subconscious; so that's not the same as an actual memory wipe.

Replies from: Usul
comment by Usul · 2016-01-11T06:52:58.023Z · LW(p) · GW(p)

I believe there is a functional definition of amnesia: loss of factual memory, with life skills remaining intact. I guess I would call what you are calling a memory wipe a "brain wipe". I guess I'd call what you are calling memory "total brain content". If a brain is wiped of all content in the forest, is Usul's idea of consciousness spared? No idea. Total brain reboot? I'd say yes, and call that as good as dead, I think.

I would say probably yes to the text only question. Again, loss of factual memory. But I don't rate that as a reliable or valid test in this context.

comment by [deleted] · 2016-01-06T20:55:16.920Z · LW(p) · GW(p)

OK, imagine somewhere far away in the universe--or maybe one room over, it doesn't matter--there is an exact physical replica of you that is also, through some genius engineering, being provided the exact same percepts (sight, hearing, touch, etc.) that you are. Its mental states remain exactly identical to yours.

Should you still care? To me it'd still be someone different.

Replies from: dxu, ike
comment by dxu · 2016-01-07T05:16:08.450Z · LW(p) · GW(p)

Suppose I offer you a dollar in return for making a trillion virtual copies of you and shooting them all with a gun, with the promise that I won't make any copies until after you agree. Since the copies haven't been made yet, this ensures that you must be the original, and since you don't care about any identical copies of yours since they're technically different people from you, you happily agree. I nod, pull out a gun, and shoot you.

(In the real universe--or at least the universe one level up on the simulation hierarchy--a Mark Friedenbach receives a dollar. This isn't of much comfort to you, of course, seeing as you're dead.)

Replies from: Usul, None
comment by Usul · 2016-01-07T05:57:40.853Z · LW(p) · GW(p)

You shouldn't murder sentient beings or cause them to be murdered by the trillions. Both are generally considered dick moves. Shame on you both. My argument: a benefit to an exact copy is of no intrinsic benefit to a different copy or original. Unless some Omega starts playing evil UFAI games with them. One trillion other copies are unaffected by this murder. Original or copy is irrelevant. It is the being we are currently discussing that is relevant. If I am the original I care about myself. If I am a copy I care about myself. Whether or not I even care if I'm a copy or not depends on various aspects of my personality.

Replies from: dxu
comment by dxu · 2016-01-07T06:02:58.633Z · LW(p) · GW(p)

If I offered you the same deal I offered to Mark Friedenbach, would you agree? (Please answer with "yes" or "no". You're free to expand on your answer, but first please make sure you give an answer.)

Replies from: Usul
comment by Usul · 2016-01-07T06:31:06.434Z · LW(p) · GW(p)

No. It's a dick move. Same question and they're not copies of me? Same answer.

Replies from: dxu
comment by dxu · 2016-01-07T06:43:15.242Z · LW(p) · GW(p)

Same question and they're not copies of me? Same answer.

As I'm sure you're aware, the purpose of these thought experiments is to investigate what exactly your view of consciousness entails from a decision-making perspective. The fact that you would have given the same answer even if the virtual instances weren't copies of you shows that your reason for saying "no" has nothing to do with the purpose of the question. In particular, telling me that "it's a dick move" does not help elucidate your view of consciousness and self, and thus does not advance the conversation. But since you insist, I will rephrase my question:

Would someone who shares your views on consciousness but doesn't give a crap about other people say "yes" or "no" to my deal?

Replies from: Usul
comment by Usul · 2016-01-07T07:50:08.604Z · LW(p) · GW(p)

Sorry if my attempt at coloring the conversation with humor upset you. That was not my intent. However, you will find it did nothing to alter the content of our discourse. You have changed your question. The question you ask now is not the question you asked previously.

Previous question: No, I do not choose to murder trillions of sentient me-copies for personal gain. I added an addendum, to provide you with further information, perhaps presuming a future question: Neither would I murder trillions of sentient not-me copies.

New question: Yes, an amoral dick who shares my views on consciousness would say yes.

comment by [deleted] · 2016-01-07T16:56:55.694Z · LW(p) · GW(p)

No, I don't want you to murder a trillion people, even if those people are not me.

comment by ike · 2016-01-07T05:06:34.480Z · LW(p) · GW(p)

Care in terms of what? You have no way of knowing which one you are, so if you're offered the option to help the one in the left room, you should, because there's a 50% chance that's you. I would say it's not well defined whether you're one or the other, actually, you're both until an "observation/divergence". But what specific decision hinges on the question?

comment by Gunnar_Zarncke · 2016-01-06T12:33:27.018Z · LW(p) · GW(p)

This sums up some of the problems of mind cloning nicely and readably. It also adds your specific data point that you do not care about the other selves as much as about yourself. I most liked this point about the practical consequences:

Personally, I don't know that I care about that copy. I suppose he could be my ally in life. He could work to achieve any altruistic goals I think I have, perhaps better than I think that you could. He might want to fuck my wife, though. And might be jealous of the time she spends with me rather than him, and he'd probably feel entitled to all my stuff, as would I be vice versa.

But your post falls short in not making some clear distinctions. It doesn't differentiate between copies you do and don't interact with (the above quote vs. Omega-style simulations). It also mixes philosophical aspects with individual and practical aspects, and it is not clear to me how these are meant to explain/inform each other.

Replies from: Usul
comment by Usul · 2016-01-07T04:46:43.174Z · LW(p) · GW(p)

Thanks for the reply. Yeah, I think I just threw a bunch of thoughts at the wall to see what would stick. I'm not really thinking too much about the practical so-I've-got-a-copy-now-what? sort of issues. I'm thinking more of the philosophical, perhaps even best categorized as Zen, implications the concept of mind-cloning has for "Who am I" in the context of changing thoughts, feelings, memories, unconscious conditioned responses, and the hard-to-get-at thing inside which (I first typed "observes" - bad term: too active) is aware of all these without thinking, feeling, remembering, or responding. Because if "I" don't come along for the ride I don't think it counts as "me", which is especially important for promises of immortality.

If I'm being honest with myself, perhaps I'm doing a bit of pissing on the parades of people who think they have hope for immortality outside of meat, out of jealousy for their self-soothing convictions, however deluded I believe they are. See also "Trolling of Christians by Atheists, Motivations Behind". Cheers.

Edit: And if I'm being entirely honest with myself, I think that shying away from acknowledging that last motivation is the reason why I titled this "Your transhuman self..." and not "Your transhuman immortality...", which would sum up my argument more accurately.

Replies from: moridinamael
comment by moridinamael · 2016-01-08T20:03:18.019Z · LW(p) · GW(p)

I think having a firm policy for oneself in place ahead of time might circumvent a lot of these issues.

Unfortunately at this point I must reference the film Multiplicity. In this film, a character wakes up from being duplicated and discovers to his surprise that he is the clone, not the original. He is somewhat resentful of the fact that he now has to capitulate to the desires of the original. Obviously the original didn't have a firm understanding in mind that he would have a good chance of waking up as the duplicate, nor did he have a firm policy of how he would behave if he woke up as the duplicate.

For myself, my policy might be that I would be perfectly obedient (within reason) if I woke up as a copy, but that I would insist on being terminated within a week, because I wouldn't want to go on living a life where I'm cut off from my friends and family due to the original moridinamael taking the role as the "real me".

comment by solipsist · 2016-01-08T14:16:37.125Z · LW(p) · GW(p)

The book to read is Reasons and Persons by Derek Parfit.

Replies from: gjm
comment by gjm · 2016-01-08T15:11:36.268Z · LW(p) · GW(p)

Seconded -- it's a wonderful book -- with the caveat that it's long and dense and small-print and may be intimidating to the easily intimidated.

[EDITED to add:] But it's long and dense because there's a lot in it, not because it's wordy and confusingly written; Parfit writes more clearly than most philosophers.

comment by Kyre · 2016-01-07T17:09:29.048Z · LW(p) · GW(p)

Would a slow cell by cell, or thought by thought / byte by byte, transfer of my mind to another medium: one at a time every new neural action potential is received by a parallel processing medium which takes over? I want to say the resulting transfer would be the same consciousness as is typing this but then what if the same slow process were done to make a copy and not a transfer? Once a consciousness is virtual, is every transfer from one medium or location to another not essentially a copy and therefore representing a death of the originating version?

I would follow this line of questioning. For example, say someone does an incremental copy process to you, but the consciousness generated does not know whether or not the original biological consciousness has been destroyed, and has to choose which one to keep. If it chooses the biological one and the biology has been destroyed, bad luck, you are definitely gone. What does your consciousness, running either just on the silicon, or identically on the silicon and in the biology, choose?

Let's say you are informed that there is a 1% chance that the biological version has been destroyed. Well, you're almost certainly fine then: you keep the biological version, the silicon version is destroyed, and you live happily ever after until you become senile and die.

On the other hand, say you are informed that the biological version has definitely been destroyed. On your current theory, this means that the consciousness realises that it has been mistaken about its identity, and is actually only a few minutes old. It's sad that the progenitor person is gone, but it is not suicidal, so it chooses the silicon version.

At what point on the 1% to 100% slider would your consciousness choose the silicon version?

(Hearing the thought-experiment of incremental transfer (or alternatively duplication) was one of the things that changed my mind to pattern-identity from some sort of continuity-identity theory. I remember hearing an interview with Marvin Minsky where he described an incremental transfer on a radio program.)

Replies from: Usul
comment by Usul · 2016-01-08T02:31:37.170Z · LW(p) · GW(p)

I definitely agree that incremental change (which gets stickier with incremental non-destructive duplication) is a sticky point. What I find the most problematic to my thesis is a process where every new datum is saved on a new medium, rather than the traditionally-cited cell-by-cell scenario. It's problematic, but nothing in it convinces me to step up to Mr Bowie-Tesla's machine under any circumstances. Would you? How about if instead of a drowning pool there was a team of South America's most skilled private and public sector torture experts, who could keep the meat that falls through alive and attentive for decades? Whatever the other implications, the very eyes seeing these words would be the ones pierced by needles. I don't care if the copy gets techno-heaven/ infinite utility.

Your thought experiment doesn't really hit the point at issue for me. My answer is always "I want to stay where I am". For silicon to choose meat is for the silicon to cease to exist; for meat to choose silicon is for the meat to cease to exist. I only value the meat right now because that is where I am right now. My only concern is for ME, that is, the one you are talking to, to continue existing. Talk to a being that was copied from me a split second ago and that guy will throw me under the bus just as quickly (allowing for some altruistic puzzles where I do allow that I might care slightly more about him than about a stranger, but mostly because I know the guy and he's alright and I can truly empathize with what he must be going through, e.g. if I'm dying tomorrow anyway and he gets a long happy life; but I may do the same for a stranger). The scenario is simply Russian roulette if you won't accept my "I want to stay put" answer.

Shit, if I came to realize that I was a freshly-minted silicon copy living in a non-maleficent digital playground I would be eternally grateful to Omega, my new God whether It likes it or not, and that meat shmuck who chose to drown his monkey ass just before he realized he'd taken the Devil's Bargain.

Not that "meat" has any meaning other than "separate entity" here. If I am sim-meat I want to stay this piece of sim meat.

comment by Douglas_Knight · 2016-01-06T23:20:02.932Z · LW(p) · GW(p)

It is evading the question, but I think it is worth considering some alternative questions as well. They may be adequate for making decisions and predicting others' behavior.

Many people talk about achieving immortality through their children. They might prefer personal immortality, but they also care very much about their children, too. For example, while Robin Hanson expresses interest in "living" for thousands of years via cryonics, when he puts a number on it, he evades the controversial question of personal identity and defines success by

10. Such sims of you are as worthy as your kid of your identifying with them.

(to which he assigns 80-90%, like the other components)

Replies from: Usul
comment by Usul · 2016-01-07T02:39:15.453Z · LW(p) · GW(p)

Thanks for the reply. Perhaps I should mention I have no children and at no point in my life or in my wife's life have either of us wanted children.

comment by Slider · 2016-01-06T21:49:40.337Z · LW(p) · GW(p)

There might be some incomplete separation on whether you truly think of memories as not being part of consciousness. Let's say that we keep your "awareness" intact but inject and eject memories out of it. Let's do so in a cyclical manner, so that on alternating days you remember either your "odd day memories" or your "even day memories". Now if I ask you what you did yesterday, you should not be able to answer with knowledge (you might guess, but whatever). Can we still coherently hold that you are still just 1 awareness with 2 sets of memories? Or have you in fact become two awarenesses?

We could then do an information split where for every half a second your brain has access only to the ears and memory set 1, and for the other half to the eyes and memory set 2. Are you still 1 or 2 awarenesses? How about if we run those in parallel, so that the ears are connected to memory set 1 and the eyes to memory set 2, so there is no switching but also no crossover?

If you were an upload we could have your "ear module" on one side of the brain and the "eye module" on the other side of the brain. Suppose there is a wire connecting them so that the whole is isomorphic to your human-based cognition (I don't know what information would be transferred over the wire, but barring ping times it should be doable somehow, and that might be overcome by using something faster than neurotransmitters). You should be 1 awareness now, right? Now if we cut the wire, only the transfer of information should be blocked and no "awareness" removed (those are supposedly in the black boxes). How many awarenesses are you with the cord cut? You won't be symmetric (the ambiguity of English pronouns works great here), but wouldn't the cord cutting be equivalent to a software separation forbidding memory crossover?

Then there is the case of the sheet brains of the Eborians(?) from the Eliezer texts. Suppose that you are implemented in hardware where each wire has a subsection that divides it into two. While the subsection is in the "open" state it allows electrons to pass freely over. However, when it is in the "close" state the electrons stay on their own sides. In "open" it functions as 1 wire and in "close" as 2 wires. When we transition from open to close, the divider in effect makes two separate but identical circuits that should stay in sync. However, have we doubled the number of awarenesses? What is the difference between engaging the subsection and building identical circuitry next to the old one?

If you move from "close" to "open" does it synch the (possibly 2) awareness(es) or does it fuse them into 1?

There are possibly quite real-world analogs to these conditions. It's hard to remember your dreams, and in dreams it's hard to remember that you went to sleep just hours ago. And people have been lobotomized, and at least one such lobotomized person, on the basis of a brain scan, answered the question "do you believe in God?" oppositely depending on the lobe (comparable to the polygraph standard of "honest"). (I remember someone joking about how it makes a puzzle for theologians, as in whether the person fulfills "believes in God", that is, whether he is going to hell or heaven.)

Replies from: Usul
comment by Usul · 2016-01-07T03:09:04.433Z · LW(p) · GW(p)

Great thought experiment, thanks. I do define consciousness as a passively aware thing, totally independent of memory. The demented, the delirious, the brain damaged all have (unless those structures performing the function of consciousness are damaged, which is not a given) the same consciousness, the same Self, as I define it, as they did when their brains were intact. Dream Self is the same Self as Waking Self to my thinking. I assume consciousness arises at some point in infancy. From that moment on it is Self, to my thinking.

In your 2 meat scenarios I still count one consciousness, being aware of different things at different times.

In wire form, if those physical structures (wires) on which consciousness operations occur (no other wires matter to the question) are cleaved, two consciousnesses exist. When their functionality is re-joined, one consciousness exists. Neither, I suppose, can be considered "original" nor "copy", which queers the pot a bit vis a vis my original thesis. But then suppose, during a split, B is told that it will be destroyed while A lives on. I don't imagine B will get much consolation, if it is programmed to feel such things. Alternately, split them and ask A which one of the two should be destroyed; I imagine it wouldn't choose B.

Replies from: Slider
comment by Slider · 2016-01-07T10:38:06.166Z · LW(p) · GW(p)

What if the running of two programs is causally separated but they run on common hardware? And when the circuits are separated, their function isn't changed. Is the entity and awareness of A+B not still intact? Can the awareness be compromised without altering function?

Also, are lobotomized persons two awarenesses? What is the relevant difference from the subsection circuitry?

Replies from: Usul
comment by Usul · 2016-01-08T03:03:00.086Z · LW(p) · GW(p)

I'm not sure I follow your first point. Please clarify for me if my answer doesn't cover it. If you are asking whether multiple completely non-interacting, completely functional minds running on a single processing medium constitute separate awarenesses (consciousnesses), or whether two separate awarenesses could operate with input from a single set of mind-operations, then I would say yes to both. Awareness is a result of data-processing: 1s and 0s, neurons interacting as either firing or not. Multiple mind operations can be performed in a single processing substrate, i.e. memory, thought, feeling, which are also results of data processing. If awareness is compromised we have a zombie, open to some discussion as to whether or not other mind functions have been compromised, though it is, of course, generally accepted that behaviorally no change will be apparent.

If the processing media are not communicating, A and B are separate awarenesses. If they are reconnected in such a way that neither can operate independently, then they are a single awareness. As an aside, I suspect any deviation which occurs between the two during separation could result in bugs up to and including systems failure, unless a separate system exists to handle the reintegration.

I don't believe enough is known about the brain for anyone to answer your second question. Theoretically, if more than one set of cells could be made to function to produce awareness (neuroplasticity may allow this), then a single brain could contain multiple functioning awarenesses. I doubt a lobotomy would produce this effect; more likely the procedure could damage or disable the function of the existing awareness. Corpus callosotomy would be the more likely candidate, but, again, neuroscience is far from giving us the answer. If my brain holds another awareness, I (the one aware of this typing) value myself over the other. That it is inside me rather than elsewhere is irrelevant.

Replies from: Slider
comment by Slider · 2016-01-08T05:25:44.858Z · LW(p) · GW(p)

There is the difficult edge case where the systems are connected but don't need the connection to function. Separated or fused, the outcome of the data processing is going to be the same in the subsection thought experiment. If the gate is open and individual electrons are free to travel on either side of the wire, it could be seen as similar to having software separation within one piece of hardware. It's not that their influence on each other would be impossible; they just in fact don't influence each other. If you merely change what is possible, while what they in fact end up doing remains the same, it would be pretty strange if that changed the number of awarenesses.

I seem to be getting the vibe that you believe awareness is singular, in the sense that you either have it or you don't, and you can't have it fragment into pieces.

I am thinking about what kind of information processing counts as awareness in your opinion. Sometimes organizations get a task that is in fact carried out by small teams. When those teams undercommunicate, misunderstandings can happen. In a good team there is sufficient communication that what is going to happen is common knowledge, at least to the point that no contradictory plans exist among different team members. In a shattered corporation there is no "corporation official line", while in a well-coordinated corporation there might be one, even if it is narrower than any one member's full opinion. While the awareness of individual team members is pretty plain, can the corporation become aware separately from its members? With the brain the puzzle is kind of similar, but instead the pieces are pretty plainly not aware.

It does seem to me that you chase the awareness into the unknown black box. In the corporation metaphor, the CEO's awareness counts as the corporation's awareness to the extent that there is a point in discussing it. However, in the "unaware pieces" picture this would lead to some version of panpsychism (or some kind of less symmetrical version where there is a distinguished ontological class that has awareness as an elementary property).

comment by fubarobfusco · 2016-01-06T21:49:55.666Z · LW(p) · GW(p)

Consider sleep. The consciousness that goes to sleep ends. There is a discontinuity in perceived time. In the morning, the wakening brain ...

[...] will be capable of generating a perfectly functional consciousness, and it will feel as if it is the same consciousness which observes the mind which is, for instance, reading these words; but it will not be. The consciousness which is experiencing awareness of the mind which is reading these words will no longer exist.

You cease to exist every night. Indeed, there are all sorts of disruptions to the continuity and integrity of consciousness, ranging from distraction to coma to seizures to dissociative drugs. But people who experience these still care about their future selves. Why? Are they in error to do so?


My point here is that we can argue "this consciousness ceases to exist" with about as much strength for sleep as for more exotic processes. The difference is social and psychological, not metaphysical: we are accustomed to sleep, and to treating the consciousness who is born in the morning as the same consciousness who died the night before. It makes sense socially to do so; it is adaptive to do so; it is certainly more conducive to an intuitive understanding of things like memory.

But sameness — identity — is pretty darn tricky. Electrons don't have it; where does it come from?

Replies from: casebash, Usul
comment by casebash · 2016-01-07T11:51:32.413Z · LW(p) · GW(p)

I don't find the sleep argument convincing. Consciousness has two definitions:

  • As opposed to being asleep or unconscious, when the brain is still running and you still have experiences (although they are mostly internal experiences)
  • As opposed to being non-sentient like a rock or bacteria

They are distinct issues.

comment by Usul · 2016-01-07T02:46:42.604Z · LW(p) · GW(p)

Thanks for the reply. Sleep is definitely a monkey wrench in the works of my thoughts on this, though not a fatal one for me. I wouldn't count distraction or dissociation. I am speaking of the (woo-light alert) awareness at the center of being, a thing that passively receives sensory input (including the sense of mind-activity) (and I wonder if that includes non-input?). I do believe that this thing exists and is the best definition of "Self".

comment by Dagon · 2016-01-06T14:28:09.853Z · LW(p) · GW(p)

I'd argue that a branch of me is still me, in many meaningful ways. This is true for the many-worlds interpretation, where the universe branches, and for multiple simultaneous mes from mechanical copies.

After the copy, my meatself and my electronic self(-ves) will diverge, and will be different entities who only care about each other as others, not as selves. But that's true of cross-temporal and cross-universe entities that have a direct causal relationship as well. I care less about an alternate-world me than about the me I'm currently indexing. I care less about any specific future-me based on how distant it is from my current experiencing.

comment by Risto_Saarelma · 2016-01-07T18:47:00.623Z · LW(p) · GW(p)

My expounding of the pattern identity theory elsewhere in the comments is probably a textbook example of what Scott Aaronson calls bullet-swallowing, so just to balance things out I'm going to link to Aaronson's paper Ghost in the Quantum Turing Machine that sketches a very different attack on standard naive patternism. (Previous LW discussion here)

comment by see · 2016-01-22T08:54:49.020Z · LW(p) · GW(p)

Why do you attach any value whatsoever to a "consciousness" that cannot think, feel, remember, or respond? Your "consciousness", so defined, is as inanimate as a grain of sand. I don't care about grains of sand as ends-in-themselves, why would you?

Be clear that when you say you are conscious, it cannot be this "consciousness" that motivates the statement, because this "consciousness" cannot respond, so the non-conscious parts of your mind cannot query it for a status check. A simple neural spike would be a response; we could watch it on an fMRI.

comment by _rpd · 2016-01-11T18:58:11.632Z · LW(p) · GW(p)

A scenario not mentioned: my meat self is augmented cybernetically. The augmentations provide for improved, then greatly improved, then vast cognitive enhancements. Additionally, I gain the ability to use various robotic bodies (not necessarily androids) and perhaps other cybernetic bodies. My perceived 'locus' of consciousness/self disassociates from my original meat body. I see through whatever eyes are convenient, act through whatever hands are convenient. The death of my original meat body is a trauma, like losing an eye, but my sense of self is uninterrupted, since its locus has long since shifted to the augmentation cloud.

comment by HungryHobo · 2016-01-11T14:55:54.683Z · LW(p) · GW(p)

I define consciousness as a passively aware thing, totally independent of memory, thoughts, feelings, and unconscious hardwired or conditioned responses. It is the hard-to-get-at thing inside the mind which is aware of the activity of the mind without itself thinking, feeling, remembering, or responding. The demented, the delirious, the brain damaged all have (unless those brain structures performing the function of consciousness are damaged, which is not a given) the same consciousness, the same Self, the same I and You, as I define it, as they did when their brains were intact. Dream Self is the same Self as Waking Self to my thinking. I assume consciousness arises at some point in infancy. From that moment on it is Self, to my thinking.

Thought experiment for you, most of which may actually be physically possible:

Imagine that you went to sleep and someone anesthetized one hemisphere of your brain. Ignore any practicalities like interruptions to your heart and breathing; let's assume whatever medical support is needed is provided.

"you" wake up running on only your right hemisphere.

This is possible, and it happens to people who've had a hemispherectomy. Is that consciousness you? It has a continuous line from your former self.

Next, you're put back to sleep and the other half of your brain is anesthetized.

"you" wake up running on only your left hemisphere with no memory of waking up as the right-hemisphere you (since any memories are in the right hemisphere). Left hemisphere you has no continuous line of consciousness with right hemisphere you that was awake a little while ago. There is no continuous line from that former self. They're running on 2 different chunks of hardware next to each other.

They could even communicate through letters or recordings. Should right and left hemisphere you care about each other beyond taking care of the body?

Should it matter that at some future point they might be merged back together into one?

What if they're reasonably sure they'll never be allowed to merge back into whole-brain you?

Replies from: Matthew_Opitz
comment by Matthew_Opitz · 2016-01-13T02:17:07.495Z · LW(p) · GW(p)

Actually, you've kind of made me want to get my own hemispherectomy and then a re-merging just so that I can experimentally see which side's experiences I experience. I bet you would experience both (but not remember experiencing the other side while you were in the middle of it), and then after the re-merging, you would remember both experiences and they would seem a bit like two different dreams you had.

comment by Dentin · 2016-01-10T22:55:36.644Z · LW(p) · GW(p)

Sophie Pascal's Choice: yes. If it were an easy, painless death, the required reward would probably only have to be on the order of ten dollars, to make up for the cost of my time. If it were not a painless death, I'd probably require more, but not a huge amount more.

comment by Raiden · 2016-01-09T07:14:44.578Z · LW(p) · GW(p)

Suppose I'm destructively uploaded. Let's assume also that my consciousness is destroyed, a new consciousness is created for the upload, and there is no continuity. The upload of me will continue to think what I would've thought, feel what I would've felt, choose what I would've chosen, and generally optimize the world in the way I would've. The only thing it would lack is my "original consciousness", which doesn't seem to have any observable effect in the world. Saying that there's no conscious continuity doesn't seem meaningful. The only actual observation we could make is that the process I tend to label "me" is made of different matter, but who cares?

I think a lot of the confusion about this comes from treating consciousness as an actual entity, separate from the process it's identified with, which somehow fails to transfer over. I think that if consciousness is something worth talking about, it's a property of the process itself and is agnostic about what's running that process.

comment by The_Jaded_One · 2016-01-08T11:10:42.406Z · LW(p) · GW(p)

The idea that "forward-facing continuity of consciousness" is tied to a particular physical structure in your brain has long been debunked, for example via the thought experiment of incrementally replacing neurons one at a time with robotic neurons whose function can then be distributed over a network.

E.g.

If consciousness has an anatomical location, and therefore is tied to matter, then

is a false assumption; consciousness doesn't necessarily have an anatomical location.
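To make the gradual-replacement argument concrete, here is a minimal, purely illustrative Python sketch (the classes BioNeuron and RoboNeuron are invented toys, not anyone's actual proposal): each "biological" unit in a tiny network is swapped for a functionally identical substitute, one at a time, and the network's input-output behaviour never changes at any step.

```python
# Toy illustration of the gradual-replacement thought experiment:
# swap each "biological" unit for a functionally identical "robotic" one,
# one at a time, and check that the network's behaviour never changes.

class BioNeuron:
    def __init__(self, weight):
        self.weight = weight
    def fire(self, signal):
        return signal * self.weight

class RoboNeuron:
    """Different 'hardware', same input-output function."""
    def __init__(self, weight):
        self.weight = weight
    def fire(self, signal):
        return signal * self.weight

def network_output(neurons, signal):
    for n in neurons:
        signal = n.fire(signal)
    return signal

neurons = [BioNeuron(w) for w in (0.5, 1.2, 0.9)]
baseline = network_output(neurons, 1.0)

# Replace units one at a time; behaviour is identical at every step.
for i, old in enumerate(neurons):
    neurons[i] = RoboNeuron(old.weight)
    assert network_output(neurons, 1.0) == baseline

print("All-robotic network still computes:", network_output(neurons, 1.0))
```

Nothing in the sketch settles whether the replaced system is "you"; it only shows that function, as measured from outside, can be preserved across a change of substrate.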

Replies from: Usul
comment by Usul · 2016-01-11T03:50:42.229Z · LW(p) · GW(p)

By "anatomical location" I mean neurons in the brain, not necessarily a discrete brain organelle. To deny consciousness an anatomical location in the brain is to say it arises from something other than brain function. Are you supporting some sort of supernatural theory of consciousness?

Replies from: The_Jaded_One
comment by The_Jaded_One · 2016-01-19T23:04:36.218Z · LW(p) · GW(p)

To deny consciousness an anatomical location in the brain is to say it arises from something other than brain function. Are you supporting some sort of supernatural theory of consciousness?

No, I am saying that consciousness, like a website or computer program, is a computational phenomenon which isn't irrevocably tied to one piece of hardware. It may currently be instantiated in the particular neurons of your brain, but that could change if the computational functions of those neurons were taken over by other physical devices. Your consciousness could, in principle, be run as a distributed computing project like folding@home.
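As a loose analogy for "not irrevocably tied to one piece of hardware", here is a minimal sketch (the Process class is a made-up toy; this says nothing about how brains actually work): a running process's state is snapshotted, handed to a fresh object standing in for a different host, and resumed, and the continued computation is indistinguishable from one that never moved.

```python
import pickle

# Toy analogy for substrate independence: snapshot a process's state,
# hand it to a different "host", and resume; the continued computation
# is indistinguishable from one that never moved.

class Process:
    def __init__(self):
        self.counter = 0
        self.log = []
    def step(self):
        self.counter += 1
        self.log.append(self.counter ** 2)

original = Process()
for _ in range(3):
    original.step()

snapshot = pickle.dumps(original)   # "scan" the running state
migrated = pickle.loads(snapshot)   # "instantiate" it elsewhere

for _ in range(3):
    original.step()
    migrated.step()

# Same state, same future behaviour, yet two distinct objects in memory.
assert original.log == migrated.log
assert original is not migrated
print(original.log, migrated.log)
```

Whether that kind of copy-and-resume preserves the thing the original post cares about is, of course, exactly what is in dispute here.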

comment by Vladimir_Nesov · 2016-01-06T20:05:59.958Z · LW(p) · GW(p)

For previous discussion of issues related to personal identity on LW see these posts, with references and comments:

I actually don't endorse a lot of these posts' content, but it's more efficient to work from a common background. Being more specific in your questions or statements would also help push back against the kind of beside-the-point responses this post received. For example, a lot of discussion of identity has problems with using words in an unclear sense, or with unclear significance for the arguments: words such as "same", "consciousness", "copy", "anticipation", etc.