Thoughts on teletransportation with copies?

post by titotal (lombertini) · 2023-11-29T12:56:51.193Z · LW · GW · 3 comments

This is a question post.


I’m interested in how people here feel about the teletransportation paradox with multiple copies. I have several scenarios: 

Scenario 1: You are placed in a machine on earth that puts you to sleep and then instantaneously disintegrates your body, recording its exact atomic configuration in the process.

This information is then beamed to another machine on planet A, and in that machine new matter is used to construct a body with the same configuration as yours. 

At the same time, the same information is sent to a machine on planet B, and a second body is also constructed with the same configuration as yours, at the exact same time as the one on planet A. 

At this point, both copies of your atomic configuration wake up and walk out into their respective planets. 

Question 1: What is the probability that after going to sleep on earth, you wake up on planet A?

Now we modify the scenario:

Scenario 2: 1 millisecond after the initial split, and before either copy wakes up, the machine on planet B is activated again, the copy on planet B is instantaneously disintegrated, and that atomic configuration is transmitted to two more machines on planet C and planet D, which both construct identical copies of the original copy on planet B. 

At this point, the copies on planets A, C, and D are all woken up and walk out into their respective planets.

Question 2: What is the probability that, in this new scenario, after going to sleep on earth, you wake up on planet A?

Scenarios 3 & 4: The teletransporters are capable of copying not just you, but also your clothing and the items you are holding, such as the cash and other contents of your wallet. On the way to the first machine, someone offers to sell you a winning lottery ticket that pays out a hundred dollars, but only if cashed in on planet A. They only take cash, and if you pay the cash, it obviously will not be copied in the teletransportation process. Assume that you are purely selfish about your own experiences.

Question 3: In scenario 1, how much cash would you pay for the 100 buck planet A lottery ticket? 

Question 4: In scenario 2, how much cash would you pay for the 100 buck planet A lottery ticket?

I have my own thoughts and confusions, but I'd be interested in what other people think first. Note that answers like "not enough information" or "ill-formed question" are valid as well.

Answers

answer by AlanCrowe · 2023-11-29T17:57:41.948Z · LW(p) · GW(p)

Consider the case of a reclusive mad scientist who uplifts his dog in the hope of getting a decent game of chess. He is likely to be disappointed, as his pet uses his new intelligence to build a still and drink himself to death on homemade vodka. If you just graft intelligence on top of a short-term reward system, the intelligence will game it, leading to wireheading and death.

There is no easy solution to this problem. The original cognitive architecture implements self-preservation as a list of instinctive aversions. Can one augment that list with additional aversions preventing the various slow-burn disasters that intelligence is likely to create? That seems an unpromising approach, because intelligence is open-ended: the list would grow and grow. To phrase it differently, an unintelligent process will ultimately be outwitted by an intelligent process. What is needed is to recruit intelligence, to make it part of the solution as well as part of the problem.

The intelligence of the creature can extrapolate forward in time, keeping track of which body is which by historical continuity and anticipating the pleasures and pains of future creatures. The key to making the uplift functional is to add an instinct that gives current emotional weight to the anticipated pleasures and pains of a particular future body, defined by historical continuity with the current one.

Soon our reclusive mad scientist is able to chat to his uplifted dog, getting answers to questions such as "why have you cut back on your drinking?" and "why did you decide to have puppies?". The answers are along the lines of "I need to look after my liver." or "I'm looking forward to taking my puppies to the park and throwing sticks for them." What is most interesting here probably slips by unnoticed. Somehow the dog has acquired a self.

Once you have instincts that lead the mind to extrapolate down the world line of the physical body, and which activate the reward system now according to those anticipated future consequences, it becomes natural to talk in terms of a 4-dimensional, temporally extended self, leaving behind the 3-dimensional, permanent now of organisms with less advanced cognitive architectures. The self is the verbal behaviour that results from certain instincts necessary to the functioning of a cognitive architecture with intelligence layered on top of a short-term reward system. The self is nature's bridle for the mind, and our words merely expressions of instinct. We can notice how slightly different instincts give rise to slightly different senses of self, and we can ask engineers' questions about which instincts, and hence which sense-of-self, give the better-functioning cognitive architecture. But these are questions of better or worse, not true or false.

To see how this plays out in the case of teletransportation, picture two scenarios. In both worlds the technology involves making a copy at the destination, then destroying the original. In both worlds there are copy-people who use the teletransportation machines freely, and ur-people who refuse to do so.

In scenario one, there is something wrong with the technology. The copy-people accumulate genetic defects and go extinct. (Other stories are available: the copy-people are in such a social whirl, travelling and adventuring, that few find the time to settle down and start a family.) The ur-people inherit the Earth. Nobody uses teletransportation any more, because everyone agrees that it kills you.

In scenario two, teletransportation becomes embedded in the human social fabric. Ur-people are left behind, left out of the dating game and marriage, and go extinct. (Other stories are available: World War Three was brutal, and only copy-people, hopping from bunker to bunker by teletransportation, survived.) It never occurs to anyone to doubt that the copy at the destination is really them.

There is no actual answer to the basic question, because the self is an evolved instinct, and the future will hold whichever beliefs about the self are reproductively successful. In the two- and three-planet scenarios, the situation is complicated by the introduction of a second kind of reproduction, copy-cloning, in addition to the usual biological process. I find it hard to imagine the Darwinian selective pressures at work in a future with two kinds of reproduction.

I think the questions probe whether the person deciding to buy the lottery ticket is loyal to a particular copy or to all of them. One copy gets to win the lottery; the other copies are down by the price of the ticket. If one is loyal to only one copy, one will choose to buy if and only if one is loyal to the winner.

But I conjecture that a balanced regard for all copies will be most reproductively successful. The eventual future will be populated by people who take note of the size of the lottery prize, and calculate the expected value, summing the probabilities over all of their copies.
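
A minimal sketch of that rule, assuming copies are weighted evenly (the even weighting, the function name, and the sample prices are illustrative assumptions, not anything the scenarios fix):

```python
# Per-copy expected gain from buying the ticket, summing evenly over copies.
# Every copy's wallet is lighter by the ticket price (the cash is handed
# over before teleportation), but only the planet-A copy collects the prize.
def per_copy_expected_gain(price, prize=100, n_copies=2):
    return prize / n_copies - price

print(per_copy_expected_gain(40, n_copies=2))  # 10.0: worth it in scenario 1
print(per_copy_expected_gain(40, n_copies=3))  # about -6.67: not in scenario 2
```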

answer by JBlack · 2023-11-30T00:31:56.508Z · LW(p) · GW(p)

(1) There is no real relevance to probability here. A future-me will wake up on planet A. A future-me will also wake up on planet B.

(2) Again, a future-me will wake up on planets A, C, and D.

(3) This depends upon my expectation of how useful money is to future-mes on the various planets. If I pay X then A-me nets 100-X dollars more, and B-me loses X dollars. $50 is neutral across the sum of future-mes, but B-me is likely to regret that slightly more than A-me will benefit from it. Though really, why is someone selling me a ticket that will pay out $100 for less than $100? I'd have a lot of credence that this is a scam. How does the economy work if you can legitimately copy money freely anyway?

(4) As with (3), only with 3 future-mes instead of 2.
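
The remark in (3), that B-me's regret slightly outweighs A-me's benefit at the dollar-neutral $50 price, can be checked with a toy diminishing-marginal-utility model; the log utility and the $200 starting wealth are illustrative assumptions:

```python
import math

W = 200  # assumed starting wealth of each future-me (illustrative)
L = 50   # the ticket price that is neutral in summed dollars

no_buy = 2 * math.log(W)                       # A-me and B-me each keep W
buy = math.log(W - L + 100) + math.log(W - L)  # A-me wins $100; B-me only pays
print(buy - no_buy)  # about -0.065: log(250) + log(150) < 2 * log(200)
```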

answer by Charlie Steiner · 2023-11-29T19:35:00.721Z · LW(p) · GW(p)
  1. 1
  2. 1
  3. $50
  4. $33.33

I think there's certainly a question people want to ask when they talk about things like Q1 and Q2, but the standard way of asking it isn't right. If there is no magic essence of "you" zipping around from place to place in the universe, then the probability of "you" waking up in your body can only be 1.

My advice: rather than trying to hold the universe fixed and asking where "you" goes, hold the subjective information you have fixed and ask what the outside universe is like. When you walk out of the door, do you expect to see planet A or planet B? Etc.

comment by titotal (lombertini) · 2023-11-29T20:00:44.969Z · LW(p) · GW(p)

Right, and when you do wake up, before the machine is opened and the planet you are on is revealed, you would expect to find yourself on planet A 50% of the time in scenario 1, and 33% of the time in scenario 2?

What's confusing me is scenario 2: say you are actually on planet A, but you don't know it yet. Before the split, it's the same as scenario 1, so you should expect to be on planet A with 50% probability. But after the split, which happens to a different copy far away, you should expect to be on planet A with 33% probability. When does the probability change? Or am I confusing something here?

Replies from: Charlie Steiner
comment by Charlie Steiner · 2023-11-30T00:12:19.072Z · LW(p) · GW(p)

Yes, since you don't expect the copy of you on planet A to go anywhere, it would be paradoxical to decrease your probability that you're on planet A.

Which is why you have a 100% chance of being on planet A. At least in the third-person, we-live-in-a-causal-universe, things-go-places sense. Sure, in the subjective, internal sense, the copy of you that's on planet A can have a probability distribution over what's outside their door. But in the sense physics cares about you have a 100% probability of being on planet A both before and after the split, so nothing went anywhere.

Subjectively, you always expected your estimate of what's outside the door to change at the time of the split. It doesn't require causal interaction at the time of the split because you're just using information about timing. A lot like how if I know the bus schedule, my probability of the bus being near my house "acausally" changes over time - except weirder because an extra copy of you is added to the universe.
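
A toy version of that schedule-based update, under the "count the copies that exist right now and weight them evenly" reading (one interpretation among several; the timing comes from the scenarios above):

```python
SPLIT_TIME = 0.001  # seconds: when planet B's copy is re-split to C and D

def credence_planet_a(t):
    # Credence that "I'm on planet A" for a not-yet-awake copy who
    # knows the machine schedule but has seen nothing.
    if t < SPLIT_TIME:
        return 1 / 2  # copies exist on A and B
    return 1 / 3      # copies exist on A, C, and D

print(credence_planet_a(0.0))    # 0.5
print(credence_planet_a(0.002))  # 0.333..., with no causal signal required
```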

answer by Ansel · 2023-11-29T18:42:52.755Z · LW(p) · GW(p)

I think the answers to 1 and 2 are as close to 0 as calculated probabilities can reasonably be. That may be independent of the question of how reasonable it is to step into the teleporters, however.

It looks like confused thinking to me when people associate their own conscious existence with the clone that comes out of the teleporter. Sure, you could believe that your consciousness gets teleported along with the information needed to construct a copy of your body, but that is just an assumption, not something needed to explain the physical process. From that assumption also stem the problems of needing to explain which copy your consciousness would "prefer" if there are multiples, or whether consciousness would somehow be split or combined.

The troubling issues that follow from the teleporter problem are the questions it raises about the actual makeup of the phenomenon we identify as our own consciousness. It seems to me that our conception of a persistent personal consciousness may well be fully illusory, in which case the original questions are all ambiguous about the referent of "you". In this conception, an instance of "you" may have qualia, but those qualia are not connected to any actually persistent identity.

If the idea that we have an actually persistent conscious experience is an illusion, then the question of whether we should use the teleporter, cloning or otherwise, is mostly about how comfortable we are with it, and how desirable the outcome of using it is likely to be as a practical matter. If you bite that bullet, then using the teleporter should have no more impact than going under anesthesia, or even a deep dreamless sleep. If the illusion model is true, then selfishness with regard to experience is simply a mark of not being able to personally accept the falseness of your own identity, in which case you are likely not to use the teleporter for that reason.

For the record, I feel very uncomfortable with the idea of using the teleporter. Currently, the idea "feels" like suicide. But I don't know that there's any rational basis for that.

answer by Dagon · 2023-11-29T16:27:08.899Z · LW(p) · GW(p)

As others have pointed out, there's an ambiguity in the word "you". We don't have intuitions about branching or discontinuous memory paths, so you'll get different answers if you mean "a person with the memories, personality, and capabilities that are the same as the one who went into the copier" vs "a singular identity experiencing something right now".  

Q1: 100%.  A person who feels like me experiences planet A and a different person who is me experiences planet B. 

Q2: Still 100%.  One of me experiences A, one C and one D. 

Q3: Copied money is probably illegal, and my prior for scams is high, so I'd probably reject the offer. If I suspend my disbelief, I'd pay just under $50 for the ticket. It turns into $100 (on planet A), whereas the $50 kept would turn into $100 as well ($50 on A and $50 on B). The amount is small, so declining marginal utility doesn't play much into it.

Q4: $33, since a $33 ticket price is equivalent to $33 copied three times.

answer by Shamash · 2023-11-29T15:46:33.255Z · LW(p) · GW(p)

I think the simplest way to answer this is to introduce a new scenario. Let's call it Scenario 0. Scenario 0 is similar to Scenario 1, but in this case your body is not disintegrated. The result seems pretty clear: you are unaffected and continue living life on earth. Other yous may be living their own lives in space but it isn't as if there is some kind of metaphysical consciousness link that connects you to them.

And so, in scenarios 1 and 2, where the earth-you is disintegrated, well, you're dead. But not to worry! The normal downsides of death (pain, inability to experience new things, sadness of those left behind) do not apply! As far as the physical universe is concerned (i.e. as far as reality and logic are concerned) there are now two living beings that both perceive themselves as having once been you. Their connection to the original you is no less significant than the connection between the you that goes to sleep and the you that wakes up in the morning.

EDIT: I realize this does not actually answer Questions 3 and 4. I don't have time to respond to those right now but I will in a future edit.

EDIT 2: The approach I'd take with Q3 and Q4 would be to maximize total wealth of all clones that don't get disintegrated.

Let X be how much money is in my wallet and L is the ticket price. In scenario 1, the total wealth is 2X without the lottery or 2(X-L)+100 with the lottery. We buy the lottery ticket if 2(X-L)+100 > 2X, the inequality can be simplified to -2L + 100 > 0, which is further simplified to L < 50. The ticket is only worth buying if it costs less than $50.

For Q4 we should have a similar formula but we have three clones in the end rather than two, so I would only buy the ticket if it cost less than $33.34.
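
The same algebra as a quick sketch, generalized to any number of surviving clones (variable names follow the edit above):

```python
def total_wealth(X, L, n, buy):
    # n clones each start with X in cash; paying L before teleportation
    # leaves every clone's wallet down by L, and exactly one clone
    # (on planet A) collects the $100 prize.
    return n * (X - L) + 100 if buy else n * X

# Buying wins iff n*(X - L) + 100 > n*X, i.e. iff L < 100 / n.
for n in (2, 3):
    print(f"{n} clones: buy only if the ticket costs under ${100 / n:.2f}")
# 2 clones: buy only if the ticket costs under $50.00
# 3 clones: buy only if the ticket costs under $33.33
```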

answer by Seth Herd · 2023-11-29T20:23:17.624Z · LW(p) · GW(p)

The confusion on this question is the assumption that there can be only one "you". There is no logical contradiction in having a you arrive at both locations. There are now two yous. They're not the same you, but they are both yous - they have your memories, beliefs, and habits. They'll diverge in experiences from there, but they are equally you.

There's an intuition that physical continuity is important, but on examination it's just not true. What we mean by "me" is a set of memories, beliefs, and habits. The continuity of those is what matters. That's what defines you.

answer by Throwaway2367 · 2023-11-29T14:42:53.805Z · LW(p) · GW(p)

My personal answer to these types of questions in general is that the naive conception of personhood/self/identity is incomplete and (probably because of our history of no cloning or teleportation) is not suitable for thinking about these topics.

The problem is that it implicitly assumes that the "same person" relation on the set of all observer-moments is an equivalence relation.

Instead, I think that when we actually deal with these problems in practice, we should (and surely will) update our language with ways to express the branching nature of personal identity. The relation we want to capture is better modeled by the transitive closure of a directed tree composed with its converse.
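
Under one reading of that proposal (a sketch only: the "one moment lies on the other's ancestor chain" interpretation is an assumption, with the planets of the post as nodes), the failure of transitivity is easy to exhibit:

```python
# Observer-moments form a directed tree via "was continued/copied from".
parent = {"A": "Earth", "B": "Earth", "C": "B", "D": "B"}

def ancestors(x):
    chain = {x}
    while x in parent:
        x = parent[x]
        chain.add(x)
    return chain

def same_person(x, y):
    # x ~ y when one moment is an ancestor of the other (or they coincide).
    return x in ancestors(y) or y in ancestors(x)

print(same_person("Earth", "A"))  # True: A continues Earth-you
print(same_person("Earth", "C"))  # True: so does C, via B
print(same_person("A", "C"))      # False: neither continues the other
# A ~ Earth and Earth ~ C, yet not A ~ C: not transitive, so not an
# equivalence relation.
```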

So my answer to your questions is that they don't contain enough information, as "you" is ambiguous in this context.

answer by Zane · 2023-12-06T21:42:41.979Z · LW(p) · GW(p)

Q3: $50, Q4: $33.33

The answers that immediately come to mind for me for Q1 and Q2 are 50% and 33.33%, though it depends how exactly we're defining "probability" and "you"; the answer may very well be "~1" or "ill formed question".

The entities that I selfishly care about are those who have the patterns of consciousness that make up "me," regardless of what points in time said "me"s happen to exist at. $33.33 maximizes utility across all the "me"s if they're being weighted evenly, and I don't see any particular reason to weight them differently (I think they exist equally as much, if that's even a coherent statement).

What confusions do you have here?

<obligatory pointless nitpicking>Does this society seriously still use cash despite the existence of physical object duplicators?</obligatory pointless nitpicking>

3 comments


comment by Gunnar_Zarncke · 2023-11-29T14:34:41.382Z · LW(p) · GW(p)

"Assume that you are purely selfish about your own experiences."

The problem is how "you" is defined here.

comment by Gesild Muka (gesild-muka) · 2023-11-29T14:08:52.265Z · LW(p) · GW(p)

Maybe I'm not understanding it correctly, but if I'm selfish about my own experiences, I wouldn't get into the machine in the first place. If I have no choice about whether to get in the machine, I'd refuse the lottery ticket even if it were free, just to spite my future copied self who gets to exist because I was destroyed.