Brain Upload Comic

post by falenas108 · 2011-03-17T09:32:33.560Z · 24 comments

http://www.smbc-comics.com/index.php?db=comics&id=2186

Convincing argument, or faulty metaphor?

I would go with the latter, but I don't trust my brain's abilities at 5:30 in the morning.

24 comments

Comments sorted by top scores.

comment by cousin_it · 2011-03-17T11:47:14.561Z

Also see this blogpost by Mark Dominus.

Replies from: TheOtherDave, DanielVarga, Psy-Kosh, MartinB, None
comment by TheOtherDave · 2011-03-17T15:13:39.609Z

John Varley plays with this a lot in his fiction, where the technology to take snapshots of human brains and reload them into force-aged cloned bodies is ubiquitous, although hardware adequate to actually run the snapshots is not available. On the one hand, people behave as though they are effectively immortal... if they die, all they lose is the experiences since the last snapshot.

On the other hand, when they think about it, his characters agree that if they die, they die, and the existence of a snapshot-clone-whatever doesn't change that fact in the least bit. (Though many of them don't care.)

I find that a pretty convincing version of how humans will react to that technology. Consistency is not our great psychological strength.

Replies from: Dreaded_Anomaly
comment by Dreaded_Anomaly · 2011-03-17T23:58:34.271Z

Peter F. Hamilton has similar technology in his Commonwealth novels. It's used in the worst-case scenario when someone's body is destroyed, because the technology to rejuvenate human bodies to a youthful state is the main form of immortality.

Replies from: TheOtherDave
comment by TheOtherDave · 2011-03-18T13:51:10.111Z

(nods)

Actually, Varley's world has that too, though it's a near-future enough setting that it hasn't really sunk in.

That is, nobody has died of old age except voluntarily in the last few decades, but everyone is still used to thinking of themselves as living a few score years.

The narrator in Steel Beach has just turned 100, and this is a recurring theme... she/he/it doesn't feel old, and in fact isn't old by any normal understanding of the word, but "turning 100" nevertheless has cultural associations with being old.

comment by DanielVarga · 2011-03-17T12:12:03.447Z

There are choices other than obliterating or not obliterating the original. We could, say, build an artificial continuous path in the space of consciousnesses between the physical and the uploaded mind. Of course, by the time we are technically capable of establishing continuity this way, we will all realize that continuity is way overrated, and we will have much better things to do with our minds than simply uploading them.

Replies from: Mitchell_Porter
comment by Mitchell_Porter · 2011-03-17T12:48:16.619Z

continuity is way overrated

So can I kill everyone on Earth now? Since you all exist elsewhere in the multiverse.

Replies from: None, DanielVarga, AlephNeil, None, AlephNeil
comment by [deleted] · 2011-03-17T14:49:04.159Z

So can I kill everyone on Earth now? Since you all exist elsewhere in the multiverse.

It would be awkward if we agreed to that and our copies in the multiverse agreed to your copies killing them as well. We'd feel quite foolish.

comment by DanielVarga · 2011-03-17T14:12:45.258Z

Mitchell, how did you get from my short comment on continuity to the assumption that I believe in the most extreme version of Dust Theory? I do not. And I am not a platonist, either.

To humans, continuity is very important. I am a human, so it is very important to me. Please don't kill me. To substrate-independent, self-modifying minds it is not important, unless we manually make them value it when we build those minds.

Replies from: AlephNeil
comment by AlephNeil · 2011-03-17T14:26:24.624Z

To humans, continuity is very important. I am a human, so it is very important to me.

Really? You think continuity between t and t + 1 is important 'in itself', even holding the endpoints fixed, and assuming that you are anaesthetised during that interval?

To me it seems completely obvious and trivial that continuity is irrelevant.

Replies from: DanielVarga
comment by DanielVarga · 2011-03-17T17:21:08.517Z

I agree with you, but I used the word continuity in a different sense. I have just looked up the Stanford page on Personal Identity, and I think I can clear up terminology. I think you talk about physical continuity, and I agree with you about its irrelevance to everyone, including humans. I talk about psychological continuity. I think it is similarly uncontroversial that it is important to humans. The more interesting part of my statement is that psychological continuity is not important (per se) to substrate-independent self-modifying agents.

Replies from: AlephNeil, Nornagest
comment by AlephNeil · 2011-03-17T19:17:54.825Z

I don't think the important thing here is continuity. After all, a person can 'continuously die' from dementia or 'discontinuously survive' after brain surgery to remove a tumour. Surely what matters is the persistence of the information that 'makes you who you are' in some conscious mind somewhere.

The view which many people seem to hold, but which I regard as 'obviously wrong', is the one which believes in a thread of subjective identity, irreducible to the functional activity and information content of your mind, which might be 'cut' if you do something drastic like change substrate. That even if you (somehow) knew for sure that your copy would be structurally isomorphic to and "Turing-test indistinguishable" from you, there would still be an epiphenomenal 'extra fact' about whether the copy is 'really you'.

The more interesting part of my statement is that psychological continuity is not important (per se) to substrate-independent self-modifying agents.

That's a good point.

Replies from: TheOtherDave
comment by TheOtherDave · 2011-03-17T20:21:25.689Z

I think what underlies that 'obviously wrong' view in many cases is a recognition of the fact that in practice, we rely on continuity to establish identity.

A great many optical illusions and magic tricks depend on this: if entity A is here at T1 and entity B at T2, and I don't notice any T1/T2 discontinuity, I'm likely to behave as though the same entity had been here throughout.

Of course, generalizing from those intuitions to a more fundamental notion of some kind of epiphenomenal identity is unjustified, as you say.

Then again, claiming that what makes me who I am is functional activity or information content is problematic, also. It isn't clear, for example, that amnesia or brain damage makes me somebody else. Nor is it clear that if someone else is able to emulate me well enough to pass the equivalent of a Turing test, that developing that skill makes them me.

Mostly, I think identity is a composite notion, like 'furniture'. That is, we judge that identity is preserved by evaluating a close-enough match along many different axes, and there's no single property or set of properties that is both necessary and sufficient to establish identity.

I also don't think it matters very much.

Replies from: AlephNeil
comment by AlephNeil · 2011-03-18T03:37:52.456Z

Then again, claiming that what makes me who I am is functional activity or information content is problematic, also. It isn't clear, for example, that amnesia or brain damage makes me somebody else.

In case you thought I was implying that, let me clarify that the whole point is to deprecate binary oppositions such as "being someone else" vs "being the same person" and "still being oneself" vs "no longer existing".

So of course it's "not clear" that, say, frontal lobe damage leaves you the same person and it's "not clear" that it leaves you as a different person.

Nor is it clear that if someone else is able to emulate me well enough to pass the equivalent of a Turing test, that developing that skill makes them me.

Only if you're talking about identity in the loose, everyday sense, which like 'furniture' is a mishmash of many concepts. On the other hand, if you're talking about whether the mental state of the copy is qualitatively identical to your own (or 'as similar as makes no difference'), then I don't think it's remotely problematic to say that structural and functional isomorphism (or 'as near as makes no difference') guarantees this. Do you?

(This just boils down to "aren't you a functionalist?")

Replies from: TheOtherDave
comment by TheOtherDave · 2011-03-18T13:45:39.199Z

I think "as near as makes no difference" is not sufficiently well defined for the Turing-test-equivalent scenario I'm quoting. The question of "makes no difference to whom?" becomes important.

This is a problem for the traditional Turing test, as well... a great deal depends on who the auditor is; some people turn out to be surprisingly undiscriminating. (Or, well, it surprises me.)

But yes, if I don't take the Turing test bit that I quoted literally, and instead think more abstractly about a sufficiently precise and reliable functional test, then I agree with you.

Actually, I don't consider structural isomorphism necessary in and of itself; functional isomorphism is adequate for my purposes. (Though that said, I do think that an adequately functionally isomorphic system will tend to demonstrate a high level of structural isomorphism as well, although that's not a well-thought-through assertion and my confidence in it is low).

I'm just not sure what such a test might comprise in practice. That is, if I'm in charge of QA for Upload, Inc. and it's my job to make sure that the uploads we produce are adequately functionally isomorphic to the minds of the organic originals to avoid later lawsuits, it's really not clear to me what tests I ought to be performing to ensure that.

comment by Nornagest · 2011-03-17T18:58:46.436Z

I can't think of any arguments objecting to the psychological discontinuity around uploading that don't also apply to, say, the discontinuity of sleep. It's trivially true that people find thinking hard about this sort of discontinuity deeply uncomfortable, but it seems less likely that uploading has unique continuity problems and more likely that it's weird enough to expose issues that exist in everyday life but have been glossed over by familiarity.

comment by AlephNeil · 2011-03-17T13:58:43.869Z

The question of how we should value our "copies in other worlds" is independent of the question of whether we ought to value "continuity in this world (and its 'descendants', if it has descendants)".

Moreover, valuing 'copies in other worlds' doesn't entail being indifferent to 'this world', especially as 'this world' is apparently the only one we can control (temporarily ignoring the subtleties of 'ambient control').

comment by [deleted] · 2011-03-17T13:24:25.054Z

Overrated is not equal to no value. There's still a bit of a burden of proof on the multiverse as well.

comment by AlephNeil · 2011-03-17T13:53:51.223Z

The question of how, if at all, we should value our "MWI brethren" (or "Tegmark brethren" for that matter) is independent of the question of whether we ought to value 'continuity'.

In particular, one can deny the value of MWI brethren and of "continuity", while still valuing the information content of one's own mind.

comment by Psy-Kosh · 2011-03-17T15:26:03.695Z

Use a destructive + incremental uploading technique. Something like a Moravec Transfer.

There, all philosophical troubles re uploading have now vanished. :)

Replies from: nazgulnarsil
comment by nazgulnarsil · 2011-03-17T16:23:23.547Z

One thing I wonder: who will choose Moravec transfers if they are far more expensive than other forms of transfer and seem to have the same result?

I claim to be a materialist but I know I would if I could afford it. Though a destructive scan would obviously be better than nothing.

Replies from: Psy-Kosh
comment by Psy-Kosh · 2011-03-18T00:06:56.546Z

Well, if not specifically a Moravec transfer in the sense of "I'm actively conscious during the process, and yet at any specific point there is a single version of me, with no version/extension of me being 'lost'", then I'd settle for a destructive upload in which I was somehow "paused" (say, cryo-suspended) beforehand.

EDIT: Or really deep anesthetic applied, or some such. But either way, preferably a destructive upload. I.e., preferably there should be no point in time, subjective or external, at which there's any ambiguity about who "me" is.
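
A toy sketch of that invariant (purely illustrative, not anything proposed in the thread; every name in it is hypothetical): parts are swapped one at a time, and each original is destroyed only once its replacement passes a functional check, so at no moment do two complete versions of "me" coexist.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Component:
    state: Dict[str, float]  # stand-in for whatever information the part carries
    substrate: str           # "biological" or "artificial"

def emulate(part: Component) -> Component:
    # Assumes we can build an artificial part with identical state.
    return Component(state=dict(part.state), substrate="artificial")

def functionally_identical(old: Component, new: Component) -> bool:
    # Stand-in for whatever behavioural test the real procedure would need.
    return old.state == new.state

def moravec_transfer(brain: List[Component]) -> List[Component]:
    # Replace parts one at a time. Each original is gone the moment its
    # replacement is installed, so at every step there is exactly one
    # complete mind and never a second full copy.
    for i, part in enumerate(brain):
        replacement = emulate(part)
        if not functionally_identical(part, replacement):
            raise RuntimeError("abort before destroying anything")
        brain[i] = replacement  # original part destroyed here
    return brain
```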

comment by MartinB · 2011-03-17T14:21:21.265Z

That one fried my brain for a while. It would still be nicer than just dying off. Imagine what it feels like for the other version.

For a well-known fictional example, check Star Trek: TNG.

comment by [deleted] · 2011-03-17T12:28:43.082Z

I'm not sure that that's a terrific argument against uploading, though. After all, if we could provide a significant other or family member with immortality+happiness, then we'd do it, so we'd surely be psyched for an exact copy of ourselves. Obviously, ideally we'd like to be the one that's in there, but that doesn't necessarily make any sense...

comment by ata · 2011-03-19T05:25:35.748Z

Convincing argument, or faulty metaphor?

Neither, just an entertaining comic.