Would a halfway copied brain emulation be at risk of having different values/identity?

post by Ghatanathoah · 2020-07-30T05:43:30.772Z · LW · GW · 3 comments

This is a question post.


When I was thinking about the concept of human brain emulation recently, a disturbing idea occurred to me. I have never seen anyone address it, so I suspect it is probably caused by my being deeply confused about either human neurobiology or computer science. I thought I'd ask about it in the hopes someone more informed would be able to explain it and the idea could stop bothering me:

Imagine that a brain emulation is in the process of being encoded into a storage medium. I don't think it is relevant whether the copy is being made from an organic brain or from an existing emulation. Presumably it takes some amount of time to finish copying all the information onto the storage medium. If the information is about a person's values or personality, and it is only halfway copied, does that mean that, for a brief moment before the copying process is complete, the partial copy has a very different personality or values from the original? Is the partially copied personality/value set a different, simpler set of personality/values?

Presumably the copy is not conscious during the copying process, but I don't think that affects the question. When people are unconscious they still have a personality and values stored in their brain somewhere; they are just not active at the moment.

I find this idea disturbing because it implies that emulating any brain (and possibly copying de novo AI as well) would inevitably result in creating and destroying multiple different personality/value sets that might count as separate people in some way. No one has ever brought this up as an ethical issue about uploads as far as I know (although I have never read "Age of Em" by Robin Hanson), and my background is not tech or neuroscience, so there is probably something I am missing.

Some of my theories about what I might be missing include:

I'd appreciate it if someone with more knowledge about this issue, or about programming/neuroscience, would be willing to explain where my thinking is going wrong. I am interested in explanations that are conditional on brain emulation working; obviously, if brain emulation doesn't work at all, this issue won't arise. Thank you in advance; it is an issue that I continue to find disturbing.

Answers

answer by Donald Hobson · 2020-07-30T10:48:54.685Z · LW(p) · GW(p)

Suppose that you are slowly walking into a literal physical tunnel. Almost all of your head is in the tunnel. If the part of your head that is not yet in the tunnel were destroyed, you would survive, but your personality would be different because of the brain damage.

Now consider an uploaded mind being copied. The simulation process is paused, the data is copied byte for byte, and then two separate simulation processes start on separate computers.

If you cut the cable halfway through and only look at what is on the second hard drive, then you get a partial, brain-damaged mind. But at no point is that mind actually run. You are saying that if you ignore part of a mind, you see a brain-damaged mind. In the case of an em being copied, that part might be on a different hard drive.
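To put the distinction in code form, here is a toy sketch (purely illustrative; `simulate_step`, `copy_snapshot`, and the byte-string snapshot are made-up placeholders, not a real emulator):

```python
def simulate_step(mind_state: bytes) -> bytes:
    """Placeholder for advancing the emulation by one tick."""
    return mind_state  # a real emulator would do actual work here


def copy_snapshot(mind_state: bytes, chunk_size: int = 1024) -> bytes:
    """Copy a paused snapshot chunk by chunk, like data moving over a cable."""
    destination = bytearray()
    for i in range(0, len(mind_state), chunk_size):
        destination += mind_state[i:i + chunk_size]
        # Midway through this loop, `destination` holds a partial snapshot,
        # but nothing ever calls simulate_step() on it, so the partial data
        # is never a running (and hence never a brain-damaged) mind.
    return bytes(destination)


# Pause, copy, then resume both copies on their own machines.
original = b"...paused emulation snapshot..."
duplicate = copy_snapshot(original)
original = simulate_step(original)    # original resumes
duplicate = simulate_step(duplicate)  # completed copy resumes elsewhere
```

Only the calls to `simulate_step` correspond to a mind actually running; the loop in the middle is just bytes moving between storage media.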

Of course, there are good moral reasons to make sure that the data cable isn't unplugged and the half-formed mind run.

I would say that I care about the simulation, not the data as such. In other words, you can encrypt the data and decrypt it again all you want. You can duplicate the data and then delete one copy, so long as you don't simulate the copy before deletion. You might disagree with this point of view, but it is a consistent position.

comment by Ghatanathoah · 2020-07-30T16:15:06.116Z · LW(p) · GW(p)

Thanks for the reply. It sounds like maybe my mistake was assuming that unsimulated brain data was functionally and morally equivalent to an unconscious brain. From what you are saying, it sounds like the data would need to be simulated even to produce an unconscious state.

comment by Donald Hobson (donald-hobson) · 2020-07-30T22:44:59.663Z · LW(p) · GW(p)

Yes, to get a state equivalent to sleeping, you are still simulating the neurons. You can get mind states that are ambiguous mixes of awake and asleep.

comment by Ghatanathoah · 2020-07-30T23:06:24.454Z · LW(p) · GW(p)

"You can get mind states that are ambiguous mixes of awake and asleep."

I am having trouble parsing this statement. Does it mean that, when simulating a mind, you could also simulate ambiguous awake/asleep states in addition to simulating sleep and wakefulness? Or does it mean that a stored, unsimulated mind is ambiguously neither awake nor asleep?

comment by Donald Hobson (donald-hobson) · 2020-07-31T10:25:08.585Z · LW(p) · GW(p)

There are states that existing humans sometimes experience, like sleepwalking and microsleeps, that are ambiguous.

Whether or not a digital mind is being simulated is much more crisply defined.

3 comments


comment by shminux · 2020-07-30T06:05:10.633Z · LW(p) · GW(p)

I'd think that the brain is more like a hologram: copying a small part would result in a dimmer, less well-resolved, but still complete image. That said, I also don't see an ethical issue in copying an inactive brain "trait-by-trait".

comment by Ghatanathoah · 2020-07-30T16:15:26.976Z · LW(p) · GW(p)

That makes a lot of sense, thank you.

comment by Pattern · 2020-07-30T17:30:43.225Z · LW(p) · GW(p)

Trait by trait doesn't seem like a likely way to copy a brain.

One hemisphere, then the other, almost does, though.

 

"I find this idea disturbing because it implies that emulating any brain (and possibly copying de novo AI as well) would inevitably result in creating and destroying multiple different personality/value sets that might count as separate people in some way. No one has ever brought this up as an ethical issue about uploads as far as I know (although I have never read "Age of Em" by Robin Hanson), and my background is not tech or neuroscience, so there is probably something I am missing."

Suppose that, as you were waking up, different parts of the brain 'came online' one at a time. In theory, it could be the same thing. (Even with the 'incomplete parts' running.)