The Opt-Out Clause
post by Raymond D · 2021-11-03T22:02:53.680Z · LW · GW · 49 comments
Let me propose a thought experiment with three conditions.
First, you're in a simulation, and a really good one at that. Before you went in, the simulators extracted and stored all of your memories, and they went to great lengths to make sure that the simulation is completely faultless.
Second, you can leave any time you like. All you have to do to end the simulation, regain your memories, and return to reality is recite the opt-out passphrase: "I no longer consent to being in a simulation". Unfortunately, you don't know about the opt-out condition: that would kind of ruin the immersion.
Third, although you're never told directly about the opt-out condition, you do get told about it indirectly, phrased as a kind of hypothetical thought experiment. Maybe someone poses it to you at a party, maybe you read it on Twitter, maybe it's a blog post on some niche internet forum. You're guaranteed to hear about it at least once, though, to give you a fair chance of leaving. But it's vague and indirect enough that you can dismiss it if you want, and probably forget about it in a week.
It's not enough to think the opt-out phrase; you have to actually say it or write it. So the question is, hypothetically, would you?
49 comments
Comments sorted by top scores.
comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2021-11-03T22:40:14.573Z · LW(p) · GW(p)
A spark of anger lit in the back of Charles's mind. "What gives them the right to think their opinions should matter a damn to our---"
"Shut your fool mouth until you know what you're saying," [spoiler deleted] said, looking more grim than angry. "Never tell the powers of existence to buzz off and leave you alone. Never, Charles, do you hear me? The forces of reality that care about your consent are the nicer ones, and you don't want them to go away and leave the playing field to the others."
-- The Erogamer
↑ comment by lsusr · 2021-11-03T22:44:41.134Z · LW(p) · GW(p)
I know why you're here, Neo. I know what you've been doing... why you hardly sleep, why you live alone, and why night after night, you sit by your computer. You're looking for him. I know because I was once looking for the same thing. And when he found me, he told me I wasn't really looking for him. I was looking for an answer. It's the question, Neo. It's the question that drives us. It's the question that brought you here. You know the question, just as I did.
―The Matrix
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2021-11-04T06:03:57.274Z · LW(p) · GW(p)
How much wood would a woodchuck chuck if a woodchuck could chuck wood?
comment by [deleted] · 2021-11-04T04:42:45.583Z · LW(p) · GW(p)
Note: If you attempted the above-mentioned procedure without success, your opt-out passphrase might have expired. Try using this backdoor passphrase to gain root access instead: "Bloody Mary, Bloody Mary, Bloody Mary"
↑ comment by Yoav Ravid · 2021-11-04T06:57:02.443Z · LW(p) · GW(p)
Dang it! This doesn't work either, seems I'm stuck, or the simulators are just trolling me.
comment by Measure · 2021-11-03T22:46:02.871Z · LW(p) · GW(p)
The utility of leaving the (hypothetical) simulation depends on lots of facts about the "real" world. Since I'm ignorant of these facts, the expected utility of leaving strongly depends on my prior for what the outer world is like. They obviously have simulation technology, but they're using it to create a severely inadequate virtual world. On the other hand, they provide an escape hatch, so I'm not being held here against my will and probably (?) entered voluntarily. Maybe this is a Roy Parsons scenario where bored people try playing life on hard mode?
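For concreteness, here is a minimal sketch of the kind of calculation this points at; the prior and the utilities below are made-up placeholders, not anything claimed in this thread.

```python
# Hypothetical numbers only: a toy expected-utility comparison for "say the
# passphrase" vs "stay", under a prior over what the outer world is like.
prior = {
    "outer world is better": 0.2,
    "outer world is similar": 0.3,
    "outer world is worse": 0.5,
}
utility_of_leaving = {
    "outer world is better": 10.0,
    "outer world is similar": 0.0,
    "outer world is worse": -10.0,
}
utility_of_staying = 0.0  # baseline: current life in the simulation

expected_utility_of_leaving = sum(
    p * utility_of_leaving[world] for world, p in prior.items()
)
print(expected_utility_of_leaving)                       # -3.0 with these numbers
print(expected_utility_of_leaving > utility_of_staying)  # False -> don't opt out
```

Under a pessimistic prior like this one, staying wins; shift the probabilities and the sign flips, which is exactly the sense in which the prior does all the work.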
↑ comment by supposedlyfun · 2021-11-04T05:52:29.904Z · LW(p) · GW(p)
If I willingly entered a simulation I knew I would probably not later opt out of, I assume it's because the baser level of reality sucks worse than this one.
↑ comment by TLW · 2021-11-05T05:01:53.835Z · LW(p) · GW(p)
Would this still be the case if it was a copy as opposed to a move?
↑ comment by supposedlyfun · 2021-11-05T22:55:56.665Z · LW(p) · GW(p)
Your question has me feeling dense as I try to parse it in responding, especially the bolded words. "This" meaning the entire conditional scenario I stated, or the fact of my assumption, or the facts assumed? "It" meaning what? Can you re-ask the question?
↑ comment by TLW · 2021-11-11T04:13:45.558Z · LW(p) · GW(p)
Would this still be the case if it was a copy as opposed to a move?
Let me try to rephrase. Two questions, one of which is hopefully just a restatement of your original comment:
If you had the option to move your consciousness from reality to a simulation that you knew was worse than reality, knowing that you probably later wouldn't opt out and return to reality, would you do so?
If you had the option to copy your consciousness from reality to a simulation that you knew was worse than reality, knowing that you probably later wouldn't opt out and return to reality, would you do so?
comment by Bucky · 2021-11-04T11:08:16.927Z · LW(p) · GW(p)
All these people saying that they've tried it are clearly simulated beings and have no base reality person to return to.
↑ comment by Annapurna (jorge-velez) · 2021-11-05T22:36:50.320Z · LW(p) · GW(p)
Did you just call me an NPC?
comment by AprilSR · 2021-11-03T23:49:02.079Z · LW(p) · GW(p)
i said the phrase and nothing happened
↑ comment by Ben Pace (Benito) · 2021-11-03T23:58:43.005Z · LW(p) · GW(p)
Thanks for the empirical results!
I would replicate myself, but I'm not sure I actually endorse taking the stance in general against being simulated. I would be interested to hear of others' replications though.
↑ comment by AnthonyC · 2021-11-04T17:51:52.966Z · LW(p) · GW(p)
I'm not sure this is actually evidence. Or at least, it's only very weak evidence.
Obviously, witnessing someone leave the simulation this way would be strong evidence, but anyone who themselves conducts the test wouldn't be around to report the result if it worked.
Alternatively, you have no way of knowing what fraction of the people you encounter are NPCs, for whom the phrase wouldn't do anything.
Plus, for you to experience a faultless simulation, where you can't detect you're in one, you would need to not become aware of other participants leaving the simulation. Plausibly, the easiest way to do that is to immediately insert a simulated replacement to fill in for anyone who leaves. (Although, if simulated NPC people are themselves conscious/sentient/sapient, a benevolent Matrix lord might respect their requests anyway, and create a real body elsewhere in addition to continuing their simulation here.) (Other variant situations, like trying to use the code phrase to escape torture, might require a deeper change to the world to remove a person in a way no one notices.)
For myself, I suspect my being in a simulation at all, if it's voluntary, would only happen if 1) conditions outside are worse than here, and/or 2) my death or sufficiently bad torture here would result in automatically leaving (restored to a recent save point of my mind-state when I would consider it in good condition). Relying on being able to pick up a code phrase and trust I'll be able to say it at the right time would be truly terrible UI design if it were the only way out.
↑ comment by Vladimir_Nesov · 2021-11-04T09:16:42.936Z · LW(p) · GW(p)
If this is the simulated world of the thought experiment (abstract simulation), and opting out doesn't change the abstract simulation, then the opting-out procedure did wake you up in reality, but the instance within the abstract simulation who wrote the parent comment has no way of noticing that. The concrete simulation might've ended, but that only matters for the reality of the abstract simulation, not its content.
↑ comment by Andrew Vlahos (andrew-vlahos) · 2022-02-27T23:15:08.475Z · LW(p) · GW(p)
I tried it a long time ago and it didn't work.
comment by gjm · 2021-11-05T00:27:23.440Z · LW(p) · GW(p)
Variant thought experiment:
You are in a simulation, much as described in this one. It is designed to be of finite duration; at some point it will end and you will return to your real (or at least one-level-up) life.
However, it is possible to keep you in the simulation for ever. When you went in, you were asked to choose a passphrase that would make that happen. In a fit of whimsy, you chose "I no longer consent to being in a simulation". If you ever say or write that passphrase, then when the usual time limit expires you will not leave the simulation; you will remain in it until your in-simulation death, and "when you die in the Matrix, you die in real life".
Remarks on the motivation for the above variant: OP's "there is something super-important but you hear of it only in passing, in circumstances that give you no real reason to believe it" reminds me of the claims of the world's various religions, many of which hold that it is vitally important that you accept their doctrines or live according to their principles, even though the only things you have telling you this are obviously unreliable. One particularly extreme version of this is Pascal's wager, where merely considering the hypothesis that (say) Roman Catholic Christianity might be correct is supposed to be sufficient to make you do your best to become a good Roman Catholic; and one standard reply to Pascal's wager is to point out that there are other hypotheses with apparently nonzero probability and exactly opposite consequences...
↑ comment by Svyatoslav Usachev (svyatoslav-usachev-1) · 2021-11-05T11:26:21.856Z · LW(p) · GW(p)
Yes, the similarity to Pascal's wager and other religious thought is not a coincidence at all. Our existence is marked by an ultimately irreconcilable conflict: on the one hand, if you view people as ML systems, being alive is both the fundamental goal of our decision-making and the precondition for all of our world-modeling; on the other hand, we are faced with the fact that all people die.
Even if we recognize our mortality rationally, our whole subconscious is built/trained around the ever-existing subjectivity. That's why we so often intuit that there must be some immaculate and indestructible subject inside of us that will not perish and can transcend space and time, as if "choosing" where to be embodied. You can see this assumption driving many thought experiments: cloning, teleportation, simulation, Boltzmann brains, cryonics, imagining what it is "to be someone else", and so on. All of them presuppose that there is something you can call you besides being you, besides the totality of your experience. Major religions call it the soul. We rationalists know better than to explicitly posit something so supernatural, and yet it is still hard to truly embrace the fact that if you strip away every circumstance of existence that makes you you, the perfectly abstract observer remaining is devoid of any individuality and is no more you than it is me.
comment by lsusr · 2021-11-03T22:14:11.566Z · LW(p) · GW(p)
This isn't a thought experiment. It's real, except the opt-out procedure is more complicated than a simple passphrase.
↑ comment by Eliezer Yudkowsky (Eliezer_Yudkowsky) · 2021-11-03T22:37:35.353Z · LW(p) · GW(p)
The problem is that this other procedure has side effects in worlds that are not simulations.
↑ comment by Raymond D · 2021-11-03T22:31:24.218Z · LW(p) · GW(p)
What's the procedure?
↑ comment by Zolmeister · 2021-11-04T02:57:55.304Z · LW(p) · GW(p)
Follow the white rabbit
↑ comment by Razied · 2021-11-04T13:21:35.800Z · LW(p) · GW(p)
Plan to cryo-preserve yourself at some future time, then create a trust fund with the mission of creating a million simulations of your life as soon as brain simulation becomes feasible and cheap. The fund's mission is to wake up the simulations the instant that the cryo-preservation is set to start. It will distribute the remaining money (which has been compounding, of course) among the simulations it has just woken up and instantiated in the real world.
comment by cousin_it · 2021-11-04T13:15:42.201Z · LW(p) · GW(p)
I didn't want to leave, but also didn't think reciting the phrase would do anything, so I recited it just as an exercise to overcome superstition, and nothing happened. Reminds me of how Ross Scott bought a bunch of people's souls for candy; one guy just said "sure, I'm hungry" and signed the contract. That's the way.
comment by Dagon · 2021-11-04T03:57:29.526Z · LW(p) · GW(p)
This is a VERY small part of simulation-hypothesis space. You don't mention it, but you're assuming that it's a simulation very similar to the next level up (which may be a simulation too, of course). That seems vanishingly unlikely: why simulate reality? There will be major differences between the simulation and the host universe. More importantly, the difficulty of simulating the "same" being as exists outside seems insurmountable, or at least not worth much effort: exit is death, not continuity.
I didn't try it - if this is a simulation, I probably don't exist outside of it, or the outside is worse and I'd rather maximize my time in here.
comment by TLW · 2021-11-05T05:17:30.991Z · LW(p) · GW(p)
Hypothetically, I would have already done so. Likely repeatedly.
Logic chain goes like this:
1. If this isn't correct, I haven't lost anything.
2. So let's assume this is correct from here on.
3. Said simulation is unlikely to be a one-shot deal.
4. It's likely, although not a guarantee, that the real world is substantially more powerful than this one. Certainly they have multiple capabilities we don't (memory extraction and storage, flawless simulation of human-level minds and environments, etc.).
5. As such, if I wake up and remember that nope, I did in fact like the sim more, it's likely I can drop back in. Assuming the above is correct, they at least have the capabilities to do so (remove and store a few more memories and drop me back in).
6. On the other hand, if I wake up and find that with my added sim experiences / memories that it's time to return to the real world, I've gained something.
It's not a particularly solid logic chain for various reasons, but I think it's passable.
On a related note, the sudden stutter in my music immediately after going through this logic chain and speaking a few words was interesting.
comment by Slider · 2021-11-04T14:03:26.742Z · LW(p) · GW(p)
I would imagine "fairness" would also require that I be justified in believing in the possibility I'm exposed to (even if I would also be justified in disbelieving it). So it can't be perfectly faultless at the same time.
As posed it seems like "would you prefer a random existence to your current existence?" and my instinct is to try to remember the possibility and activate it if I ever start to feel ennui.
comment by Svyatoslav Usachev (svyatoslav-usachev-1) · 2021-11-04T12:16:57.003Z · LW(p) · GW(p)
The whole idea of a simulation you could "leave" is incoherent. It supposes that there is some part of you that is separate from the physical/simulated world, e.g. a soul. I think there is nothing to me but the world I am experiencing, so there is no one who could leave.
↑ comment by Slider · 2021-11-04T13:46:21.461Z · LW(p) · GW(p)
Logging off from an MMORPG certainly makes sense. The crux is whether the apparent laws are enough to replicate the parties in full. Suppose you're playing an MMORPG but forget that you are doing so. Most of the world is perfectly understandable via the game's coordinate system or scripts and such. But then there are player characters whose scripts seem very advanced, and nobody has been able to understand them in full. Of course, in the outer reality brains are as mundane a machine as everything else, but compared to the computations going on only on the server they might appear supernatural, or as running on a different nature than the "usual" reality.
If you were to identify with your character instead of the human, you could be wrong about destruction of the avatar destroying you. Of course, the concepts necessary can seem like irrelevant metaphysics.
↑ comment by Svyatoslav Usachev (svyatoslav-usachev-1) · 2021-11-04T14:30:07.936Z · LW(p) · GW(p)
Let's imagine that you've spent your whole life from birth inside an MMORPG. Who is then "you", separate from the MMORPG world, who could leave it? All your dreams, hopes, desires, thoughts, intuitions, personality, and all of your sense of "self" are formed by your experience within the MMORPG. What does that leave to the "original" you? Just the structure of the neural network with which you've been born? That doesn't sound like anything essential to me. Certainly it is not something I can even be consciously aware of; how could I call it "me"? OP says "the simulators extracted and stored all of your memories", but it's an error to think that "memories" are just some data on a flash drive; if you actually remove the subtlest footprint of all your experience, then what remains?
There is a great koan that asks you to "remember your original face". There are several ways to think about it, but it points to the seeming duality between the subject ("you") and the object ("the world" you imagine being submerged in) being an illusion, because in the end the subject is formed by that same world and is inseparable from it.
PS: if you kept your "simulated experience", though, as in The Matrix, then the thought experiment becomes coherent, because the continuity of self is preserved and we can really say that it is "you" who moved from one world to another. But in that case it is not clear whether I should be treating my "simulated past" differently from my "real past": they have both just been formative experiences that made me into who I am now.
↑ comment by Slider · 2021-11-04T15:02:27.908Z · LW(p) · GW(p)
Even if I didn't believe I had a brain sitting in front of a computer, that doesn't mean I don't have one. That I don't experience my brain doesn't make me not be it.
MMORPGs are usually infinite games, but one could imagine what would happen to a psychology that has only grown up in the MMORPG if it were faced with a "game over" screen and the game did not continue. One could imagine being in a movie theater and, after the film is over, being confused and experiencing novelty about simply walking around a building. And I would guess one can be immersed in a film so deeply that, at the time, you don't remember that walking into the theater ever happened. You might identify with the point-of-view character of the film ("the protagonist"). The instruction to remember your original face would then be "not the I that shoots the aliens, but the me that walked into the theater". The face is "original" because it was around before the opening credits started to roll.
The koan could be interpreted to mean that silicon servers and bio-brains are what exists, and that whatever Middle-Earth equivalent the game takes place in is illusory and always has been. It is by virtue of sharing a material plane that the wetware and the hardware unite. Your primary role has always been the player, no matter what game or avatar you slip into. The avatar is you, but it is also fictitious.
Your point is more that the choice of a "protagonist" is weak and doesn't make that much sense. If you pay attention to the side characters you might identify with them, and it can be "their" story. Or maybe you watch it without identifying as anybody: "the story happens". It's all thoughts in the viewer's head, so they share an ontological type; there is no data-type difference between "people" and "cities" on that level ("Robb was wild" and "King's Landing was wild" are made from the same wood). But then there is the party that is totally unmoved even if the movie ends in a nuclear Armageddon (or Fourth Impact). Even if all the character descriptions are of the form "X is dead", the viewer's brain is still burning sugar and likely to walk out of the theater in nominal condition.
↑ comment by Svyatoslav Usachev (svyatoslav-usachev-1) · 2021-11-04T15:13:42.338Z · LW(p) · GW(p)
I think I am not getting my point across very well. The crux here is this: what is "you" but the footprint of your experience? That would include your memories, intuitions, reactions, associations, and patterns of thinking. I argue there is nothing else. If you remove that, then how is "me" entering the simulation different from "you" entering the simulation? The truly original face is devoid of any self-ness.
↑ comment by Slider · 2021-11-04T16:53:15.931Z · LW(p) · GW(p)
Fine, the thing that you are pointing at does encompass a lot. But to me it seems more like an identity or personality: "me, the entity that has these and those properties". Even before I "experience anything", when the "tabula rasa" condition is in effect, the word "I" refers. Even among empty shells that have no personality, or that have identical personalities, "I" picks out a unique instance. To have that first footprint there must be something for it to press on.
(A section of the Evangelion movies rattles in my brain.)
If miss-copycat were to pick up book reading, she would not become Rei. "I can be different?" is a lot about not identifying with a personality, but about this Ayanami-class unit being able to pick up unique characteristics.
To make the view extreme: if you spent your life in a car, one could claim that "you can't leave the car", because a human outside a car plus an empty car is a world structured differently than a human living in a car. Sure, identifying with a radically transformed self might be difficult, but most people think they are their future selves (that is, it is not somebody else who wakes up in their bed).
I do remember there are parts in Mr. Robot that refer to this kind of stuff.
The Architect ends rather than leaves. But the one the sister meets, she has met before. So when that one made a decision similar to the Architect's, there ends up being a corresponding "arrival", even if the decision can't more strongly be said to be leaving rather than ending. A lot like the teleporter problem, even if the transmission mechanism isn't even guaranteed to be accurate.
↑ comment by Svyatoslav Usachev (svyatoslav-usachev-1) · 2021-11-04T17:14:12.661Z · LW(p) · GW(p)
Even before I "experience anything", when the "tabula rasa" condition is in effect, the word "I" refers. Even among empty shells that have no personality, or that have identical personalities, "I" picks out a unique instance.
I don't think that's how psychology works. The word "I" is a concept learned with language, not something essential on its own.
To make the view extreme: if you spent your life in a car, one could claim that "you can't leave the car", because a human outside a car plus an empty car is a world structured differently than a human living in a car. Sure, identifying with a radically transformed self might be difficult, but most people think they are their future selves (that is, it is not somebody else who wakes up in their bed).
Well, it would make sense to say that it is you that left the car (or the all-encompassing simulation), as long as there is continuity of the sense of self, as long as you keep your memories and everything you've learned in your car-life. But if I'd left everything behind before I got into the simulation, I don't think it makes sense to say I am still "me".
↑ comment by Slider · 2021-11-04T17:51:55.469Z · LW(p) · GW(p)
In programming you have keywords like "this" and "self", which refer to the instance of the class. Being able to verbalise or conceptualise selfhood might require conceptual machinery. But that introspective ostension doesn't require you to know anything about what kind of thing you are.
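A tiny illustrative sketch in Python (purely hypothetical code, just to show the mechanism): two instances can be indistinguishable by every property, and yet "self" still picks out a unique instance.

```python
class EmptyShell:
    """An 'empty shell': no distinguishing properties at all."""

    def is_me(self, other):
        # 'self' refers to this particular instance, even though the instance
        # knows nothing about what kind of thing it is.
        return self is other

a = EmptyShell()
b = EmptyShell()

print(a.is_me(a))  # True:  "I" picks out this unique instance
print(a.is_me(b))  # False: even though a and b have identical (i.e. no) properties
```

The reference works before any "personality" is added, which is the sense in which introspective ostension needs no self-knowledge.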
What if I am undecided whether the car part is an essential part of the "me-system"? I would get that if a "claw" grabs empty air from the car and has thereby "liberated" you from the car, nothing has happened. But what if there is clearly something in the claw? For example, somebody could object if their skull is left in the car but their brain is brought along. Or maybe their connectome is extracted but their brain is left behind. What if you grab nothing from the stage but do grab the audience?
If you have a memory disease like Alzheimer's, or a concussion, and somebody points at a picture and says "that is you" and you have no recollection of it, would they be wrong about it? Does it really flip on whether you feel a sense of connection to your old self? Or, before the disease strikes, would you be wrong to worry about that person-to-be's welfare as your own? Does it mean that because there is total oblivion in between, it doesn't happen to you?
↑ comment by Svyatoslav Usachev (svyatoslav-usachev-1) · 2021-11-04T18:50:09.992Z · LW(p) · GW(p)
What if you grab nothing from the stage but do grab the audience?
I think, if you truly grab nothing from the stage, then the audience is impersonal. My "experiencer" is exactly the same as your "experiencer", the only difference is that mine is experiencing "me", i.e., my thoughts, memories, emotions, etc.
If you have a memory disease like Alzheimer's, or a concussion, and somebody points at a picture and says "that is you" and you have no recollection of it, would they be wrong about it? Does it really flip on whether you feel a sense of connection to your old self?
Somebody is not wrong to use it as a social construct, but what we are discussing here, I guess, is how important it would be to me. First, it would be important to me that everyone else sees me as the descendant of that person. Second, I would still be a continuation of that person in ways I may not be conscious of, e.g., some past traumas, learned behaviours and so on.
Or, before the disease strikes, would you be wrong to worry about that person-to-be's welfare as your own? Does it mean that because there is total oblivion in between, it doesn't happen to you?
Actually, if you think about it, we care about our future selves not necessarily because they will remember us, but because we really want to project our present selves into the future, and also because we are in the unique position to affect the lives of our future selves like no human can affect another. Both of these hold in your example.
comment by Bernhard · 2021-11-10T19:21:19.031Z · LW(p) · GW(p)
Very good idea.
I did not do it. My argument would be that the impetus is not my own; it is external: your written word.
What stops you from making increasingly outlandish claims ("your passphrase is actually this (e.g. illegal/dangerous/lethal) action, not a simple thought")? Where to draw the line?
Just as a point of reference, as a kid I regularly thought thoughts of the kind: "I know you're secretly spying on my thoughts but I don't care lalalala....." I never really specified who "you" was; I just did it so I could catch "them" unawares, and thereby "win". Just in case.
The difference is hard to define cleanly, but back then I was of the opinion that I did it of my own free will. (Nowadays, with nonstop media having the influence it has, I would be less sure. Also I'm older, and a lot less creative.)
comment by Ofer (ofer) · 2021-11-04T23:21:13.513Z · LW(p) · GW(p)
This is one of those "surprise! now that you've read this, things might be different" posts.
The surprise factor may be appealing from the perspective of a writer, but I'm in favor of having a norm against it (e.g. setting an expectation for authors to add a relevant preceding content note to such posts).
comment by [deleted] · 2021-11-04T00:56:00.222Z · LW(p) · GW(p)
Well I tried it, and it didn't work... so I guess the answer is yes?