Magic by forgetting

post by avturchin · 2024-04-24T14:32:20.753Z · LW · GW · 30 comments

Contents

  Thought experiment with curing a disease by forgetting
  There are several caveats to such a line of reasoning
  Why should minds in similar states merge?
  Theoretical price

Epistemic status – this post is more suitable for LW as it was 10 years ago

 

Thought experiment with curing a disease by forgetting

Imagine I have a bad but rare disease X. I may try to escape it in the following way:

1. I enter the blank state of mind and forget that I had X.

2. Now I, in some sense, merge with a very large number of my (semi)copies in parallel worlds who do the same. I will be in the same state of mind as my other copies; some of them have disease X, but most don't.

3. Now I can use the self-sampling assumption for observer-moments (Strong SSA) and think that I am randomly selected from all these exactly identical observer-moments.

4. Based on this, the chance that my next observer-moment after the blank state of mind will have disease X is small and equal to the statistical probability of having disease X. Let's say there are 1000 times more copies of me which do not have disease X. Therefore, after I return from the meditation, there will be only a 0.001 chance that I will have disease X, as the next state will be randomly selected from all those that can logically follow from the current state. Thus, I will almost surely be cured!
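The sampling arithmetic in step 4 can be illustrated with a toy simulation (the numbers and the uniform-sampling rule are taken from the thought experiment itself; this is a sketch of the arithmetic, not a claim about how meditation works):

```python
import random

# Toy simulation of step 4 (hypothetical numbers): a pool of observers in
# the identical blank state, of whom 1 in 1000 has disease X.
random.seed(0)
pool = [i % 1000 == 0 for i in range(100_000)]  # True = has disease X

# Strong SSA: the next observer-moment is drawn uniformly from the pool.
draws = [random.choice(pool) for _ in range(200_000)]
p_disease = sum(draws) / len(draws)

# The estimate converges on the base rate of 0.001, not on certainty of
# having the disease.
print(p_disease)
```

Under these assumptions the simulated chance of waking up with the disease matches the statistical base rate, which is all that step 4 asserts.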

 

There are several caveats to such a line of reasoning

1. Obviously, I must forget not only the disease but even the fact that I was trying to forget something. I have to forget that I tried to forget about X and even that I used meditation as a magic tool. Therefore, after waking up, I will not know whether it worked. Also, it will only work if people who are not ill also often enter the blank state of mind without trying to forget anything (and accept the risk of acquiring something bad). Meditation is, in some sense, such a blank state of mind, and many people meditate just for relaxation or enlightenment.

2. A state-based, not path-based, theory of identity must be valid: not continuity of consciousness, but "I am randomly selected from the set of identical minds". Note that path-dependent identity also has its own paradoxes: two copies can have different "weights" depending on how they were created, while having the same measure. For example, if during sleep two copies of me are created and one of the copies is copied again, there will be 3 copies in the morning in the same world; but if we calculate the chances of being each of them based on paths, they will be ½, ¼ and ¼. Path-based identity also claims that a copy of me sent by a teletransporter is not me, because it has a different path. Path-based identity is also used for the identity of objects of art, under the name of provenance.

3. Also, MWI or some other form of multiverse must be true.

4. There is a 0.001 chance that someone who did not have the disease will get it. But they can repeat the procedure.

5. One can try to change other observables this way, such as age or height. Small changes will work better, as they are easier to forget.

6. The deeper the meditation (which here is understood as a blank state of mind without any further qualification like contact with atman or jhanas, and whose depth is measured only by closeness to the pure blank state without any traces), the more minds are in the same state of consciousness throughout the universe. This means that I can somehow jump into those minds as if through a wormhole.

7. This contradicts all popular theories of magic, where a person concentrates on what she wants. Here, you need to forget.

8. The bigger the problem, the more difficult it is to forget.

9. There can’t be observable evidence that magic-by-forgetting actually works.

10. A bad infohazardous consequence: the things you love can disappear forever as soon as you stop looking at them. There was a LW post about this fear in 2015: https://www.lesswrong.com/posts/is7ieoWyiyYRc7eXL/the-consequences-of-dust-theory

11. Magic by forgetting would be a necessary consequence of dust theory (but not vice versa: magic by forgetting can be valid even in no-dust-theory worlds). One way to solve this is to accept that there is nothing in the world except chains of mathematical Boltzmann-brain observer-moments, as Mueller did in his article "Law without Law". In that case, we can suggest that more stable chains gain an advantage, and such stability also implies stronger interconnections between observer-moments (more traces of past moments in the current moment) and thus less magic by forgetting. But glitches could be observable in such a model.

12. An interesting analogy is with Bostrom's hybrid model of Sleeping Beauty. In it, as I understand it, an observer who gets new evidence should update her reference class to the set of all minds who got the same evidence.

13. Yes, I tried to implement this, but I don’t know if it works.

14. Can I validate magic-by-forgetting if I precommit to using it any time I have a bad problem? Will I eventually have fewer bad problems on average (without knowing which bad problems I escaped)?

15. Small drift of reality. Even if I constantly keep all important things in my mind, there is a margin of error in the details. Within this error, two slightly different things can look the same. As time passes, such small errors may accumulate and reality will change. In a normal world, this is unobservable. In a dust world, it can be observed and will look like the Mandela effect: a strange discrepancy between memory and facts, or generally, between any two long-disconnected information channels.

16. If you are an effective altruist, magic by forgetting doesn’t matter to you. 

17. If you practice magic-by-forgetting 1000 times, it returns to thermodynamic equilibrium, and your chances of getting rid of bad things become equal to your chances of acquiring them.

18. If you have a rare but valuable property, this is dangerous for you – you may lose it.
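The path-based weights from caveat 2 can be checked with a short enumeration (a sketch; the copy labels A, B, and C are hypothetical):

```python
from fractions import Fraction

# Path-based weights: the first copying event splits my measure 1/2 : 1/2;
# the branch that is copied again splits its 1/2 into 1/4 : 1/4.
half = Fraction(1, 2)
path_weights = {"A": half, "B": half * half, "C": half * half}
assert sum(path_weights.values()) == 1

# State-based weights: three identical copies in the same world each get 1/3.
state_weights = {name: Fraction(1, 3) for name in "ABC"}

print(path_weights["A"], path_weights["B"], path_weights["C"])     # 1/2 1/4 1/4
print(state_weights["A"], state_weights["B"], state_weights["C"])  # 1/3 1/3 1/3
```

The discrepancy between the two rows is exactly the paradox the caveat points at: the same three morning copies get different probabilities depending on whether one counts paths or states.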

 

Why should minds in similar states merge?

They do not merge physically (if dust theory is false), but they merge logically: if there are three different minds with different names A, B and C, and each of them enters a blank state and forgets its name, each mind can assign a 1/3 chance that its name is A, based on the self-sampling assumption (SIA does not make a significant difference here, as there are no merely possible minds in this experiment).

To strengthen the point, imagine that the minds actually merged, perhaps as uploads written into the same memory block (or whatever means of mind-merging you can imagine). Observe that such merging does not require any actual information exchange between copies, as they all have the same information. There is no causal process which connects the copies. So, merging into one place plays only a symbolic role, and being in the same state in different locations is the same as being merged in one place.

The point here is not just indexical uncertainty, but that the three minds which are in the same state should be treated as the same mind (from an internal perspective): the same mind, located in three different places. Any argument against it assumes some path-dependent identity or external perspective.

 

Theoretical price

While it is easy to dismiss the idea of magic-by-forgetting as absurd, it has a theoretical price: either the strong self-sampling assumption is false, or path-based identity is true (or both).

30 comments

Comments sorted by top scores.

comment by justinpombrio · 2024-04-24T21:34:36.406Z · LW(p) · GW(p)

There is a 0.001 chance that someone who did not have the disease will get it. But he can repeat the procedure.

No, that doesn't work. It invalidates the implicit assumption you're making that the probability that a person chooses to "forget" is independent of whether they have the disease. Ultimately, you're "mixing" the various people who "forgot", and a "mixing" procedure can't change the proportion of people who have the disease.

When you take this into account, the conclusion becomes rather mundane. Some copies of you can gain the disease, while a proportional number of copies can lose it. (You might think you could get some respite by repeatedly trading off "who" has the disease, but the forgetting procedure ensures that no copy ever feels respite, as that would require remembering having the disease.)

Replies from: avturchin
comment by avturchin · 2024-04-25T18:29:02.156Z · LW(p) · GW(p)

The "repeating" will not be repeating from internal point of view of a person, as he has completely erased the memories of the first attempt. So he will do it as if it is first time. 

Replies from: justinpombrio
comment by justinpombrio · 2024-04-28T03:54:04.021Z · LW(p) · GW(p)

My point still stands. Try drawing out a specific finite set of worlds and computing the probabilities. (I don't think anything changes when the set of worlds becomes infinite, but the math becomes much harder to get right.)

Replies from: avturchin
comment by avturchin · 2024-04-28T10:05:40.488Z · LW(p) · GW(p)

The trick is to use an already existing practice of meditation (or sleeping) and connect to it. Most people who go to sleep do not do it to use magic by forgetting, but it is natural to forget something during sleep. Thus, the fact that I wake up from sleep does not provide any evidence about my having the disease.

But this is in a sense parasitic behavior, and if everyone used magic by forgetting every time they went to sleep, there would be almost no gain. Except that one could "exchange" one bad thing for another, but would not remember the exchange.

Replies from: justinpombrio
comment by justinpombrio · 2024-04-28T13:41:12.169Z · LW(p) · GW(p)

Not "almost no gain". My point is that it can be quantified, and it is exactly zero expected gain under all circumstances. You can verify this by drawing out any finite set of worlds containing "mediators", and computing the expected number of disease losses minus disease gains as:

num(people with disease)*P(person with disease meditates)*P(person with disease who meditates loses the disease) - num(people without disease)*P(person without disease meditates)*P(person without disease who meditates gains the disease)

My point is that this number is always exactly zero. If you doubt this, you should try to construct a counterexample with a finite number of worlds.
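This formula can be checked on a concrete finite example (the numbers below are hypothetical; the key assumption, as stated above, is that the probability of meditating is independent of disease status):

```python
from fractions import Fraction

# Finite check of the zero-expected-gain claim (hypothetical numbers).
n_sick, n_healthy = 10, 9990
p_meditate = Fraction(3, 10)  # same for both groups, by assumption

m_sick = n_sick * p_meditate        # expected meditators with the disease
m_healthy = n_healthy * p_meditate  # expected meditators without it

# Among meditators, the next observer-moment is drawn from the mixed pool:
p_lose = m_healthy / (m_sick + m_healthy)  # sick meditator wakes up healthy
p_gain = m_sick / (m_sick + m_healthy)     # healthy meditator wakes up sick

expected_losses = m_sick * p_lose
expected_gains = m_healthy * p_gain
print(expected_losses - expected_gains)  # 0
```

Both terms reduce to the same product (sick meditators times healthy meditators, divided by the pool size), so the difference is exactly zero regardless of the numbers chosen.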

Replies from: avturchin
comment by avturchin · 2024-04-29T14:00:18.283Z · LW(p) · GW(p)

I think I understand what you are saying – the expected utility of the whole procedure is zero.

For example, imagine that there are 3 copies and only one has the disease. All meditate. After the procedure, the copy with the disease has a 2/3 chance of being cured. Each of the two copies without the disease gets a 1/3 chance of having the disease, which in sum gives 2/3 of total disutility. In that case, the total utility of being cured equals the total disutility of getting the disease, and the whole procedure is neutral.
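This three-copy arithmetic can be written out directly (a sketch; the uniform 1/3 assignment is the SSA assumption from the post):

```python
from fractions import Fraction

# Three copies meditate; exactly one has the disease. After merging, each
# copy's next observer-moment is drawn uniformly from the three states.
n_copies = 3
p_next_is_diseased = Fraction(1, n_copies)  # 1/3 for every copy

# The sick copy is cured unless it lands back in the diseased state.
p_cured = 1 - p_next_is_diseased             # 2/3
# Each of the two healthy copies gains the disease with probability 1/3.
expected_new_cases = 2 * p_next_is_diseased  # 2/3

# Expected cures equal expected new cases: the procedure is neutral overall.
print(p_cured, expected_new_cases)  # 2/3 2/3
```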

However, if I already know that I have the disease, and I am not altruistic toward my copies, playing such a game is a winning move for me?

Replies from: justinpombrio, RamblinDash
comment by justinpombrio · 2024-04-29T15:33:33.220Z · LW(p) · GW(p)

However, if I already know that I have the disease, and I am not altruistic toward my copies, playing such a game is a winning move for me?

Correct. But if you don't have the disease, you're probably also not altruistic to your copies, so you would choose not to participate. Leaving the copies of you with the disease isolated and unable to "trade".

Replies from: avturchin
comment by avturchin · 2024-04-29T16:07:55.917Z · LW(p) · GW(p)

Yes, it only works if other copies are meditating for some other reason – for example, they sleep or meditate for enlightenment. And they are exploited in this situation.

Replies from: justinpombrio
comment by justinpombrio · 2024-04-30T17:02:34.987Z · LW(p) · GW(p)

Exactly.

comment by RamblinDash · 2024-04-29T14:19:39.155Z · LW(p) · GW(p)

In this scenario, why are the non-disease-having copies participating? They are not in a state of ignorance, they know they don't have the disease.

Replies from: avturchin
comment by avturchin · 2024-04-29T16:05:38.072Z · LW(p) · GW(p)

I assume that meditation happens naturally, like sleep. 

Replies from: RamblinDash
comment by RamblinDash · 2024-04-30T15:49:22.374Z · LW(p) · GW(p)

But don't the non-diseased copies need not just to generally meditate, but to do some special kind of meditation in which they forget the affirmative evidence they have that they don't have the disease?

Replies from: avturchin
comment by avturchin · 2024-04-30T20:32:52.254Z · LW(p) · GW(p)

Non-diseased copies do not need to make any changes to their meditation routine in this model, assuming that they naturally forget their disease status during meditation.

Replies from: RamblinDash
comment by RamblinDash · 2024-05-01T00:47:13.278Z · LW(p) · GW(p)

I am not a meditator, so maybe you have me beat, but it's not immediately clear why you would assume this.

comment by No77e (no77e-noi) · 2024-04-24T20:54:13.994Z · LW(p) · GW(p)

Even if you manage to truly forget about the disease, there must exist a mind "somewhere in the universe" that is exactly the same as yours except without knowledge of the disease. This seems quite unlikely to me, because having the disease has interacted causally with the rest of your mind a lot by the time you decide to erase the memory of it. What you'd really need to do is undo all the consequences of these interactions, which seems much harder. You'd really need to transform your mind into another one that you somehow know is present "somewhere in the multiverse", which also seems really hard to know.

Replies from: alen-2, avturchin
comment by Alen (alen-2) · 2024-04-24T21:19:56.534Z · LW(p) · GW(p)

The multiverse might be very big. Perhaps if you're mad enough having the disease will bring you to a state of mind that a version with no disease has. That's why wizards have to be mad to use magic.

Replies from: avturchin
comment by avturchin · 2024-04-25T17:29:29.899Z · LW(p) · GW(p)

Yes, here we can define magic as "ability to manipulate one's reference class". And special minds may be much more adapted to it.

comment by avturchin · 2024-04-25T16:17:33.710Z · LW(p) · GW(p)

Yes, it is easy to forget something if it has not become a part of your personality. So a new bad thing is easier to forget.

comment by Ape in the coat · 2024-04-28T17:01:36.883Z · LW(p) · GW(p)

Universal guide to magic via anthropics:

  1. Be not randomly sampled from a set
  2. Assume that you are randomly sampled from the set anyway
  3. Arrive at an absurd conclusion
  4. Magic!

Either a strong self-sampling assumption is false

Of course it is false. What are the reasons to even suspect that it might be true?

 and-or path-based identity is true.

 

Note that path-dependent identity also has its own paradoxes: two copies can have different "weights" depending on how they were created, while having the same measure. For example, if during sleep two copies of me are created and one of the copies is copied again, there will be 3 copies in the morning in the same world; but if we calculate the chances of being each of them based on paths, they will be ½, ¼ and ¼.

This actually sounds about right. What's paradoxical here?

comment by Dagon · 2024-04-24T21:10:50.575Z · LW(p) · GW(p)

Is your mind causally disconnected from the actual universe?  That's the only way I can understand the merging of minds that share some similarities (but are absolutely not identical across universes that aren't themselves identical).  Your forgetting may make two possible minds superficially the same, but they're simply not identical.

I don't know why you think path-based configuration of brain state would be false. That may not be "identity" for all purposes – there may be purposes for which it doesn't suffice or is too restrictive – but it's probably good for this case.

Replies from: avturchin
comment by avturchin · 2024-04-25T16:19:18.954Z · LW(p) · GW(p)

Presumably in deep meditation people become disconnected from reality.

Replies from: Dagon
comment by Dagon · 2024-04-25T16:48:49.236Z · LW(p) · GW(p)

In deep meditation people become disconnected from reality

Only metaphorically, not really disconnected.  In truth, in deep meditation, the conscious attention is not focused on physical perceptions, but that mind is still contained in and part of the same reality.

This may be the primary crux of my disagreement with the post.  People are part of reality, not just connected to it.  Dualism is false, there is no non-physical part of being.  The thing that has experiences, thoughts, and qualia is a bounded segment of the universe, not a thing separate or separable from it.

comment by Donald Hobson (donald-hobson) · 2024-04-24T18:47:56.377Z · LW(p) · GW(p)

Who knows what "meditation" is really doing under the hood.

Lets set up a clearer example. 

Suppose you are an uploaded mind, running on a damaged robot body. 

You write a script that deletes your mind, running a bunch of nul-ops before rebooting a fresh blank baby mind with no knowledge of the world. 

You run the script, and then you die. That's it. The computer running nul-ops "merges" with all the other computers running nul-ops. If the baby mind learns enough to answer the question before checking whether its hardware is broken, then it considers itself to have a small probability of the hardware being broken. And then it learns the bad news.

 

Basically, I think forgetting like that without just deleting your mind isn't something that really happens. I also feel like, when arbitrary mind modifications are on the table, "what will I experience in the future" returns Undefined. 

Toy example. Imagine creating loads of near-copies of yourself, with various changes to memories and personality. Which copy do you expect to wake up as? Equally likely to be any of them? Well just make some of the changes larger and larger until some of the changes delete your mind entirely and replace it with something else. 

Because the way you have set it up, it sounds like it would be possible to move your thread of subjective experience into any arbitrary program. 

Replies from: avturchin
comment by avturchin · 2024-04-24T19:51:11.843Z · LW(p) · GW(p)

In the case of the broken robot, we need two conditions for magic by forgetting:

  • there are 100 robots, only one is broken, and all of them are type-copies of each other.
  • each robot naturally enters a blank state of mind at some moment, like sleep or reboot.

In that case, after a robot enters the blank state of mind, it has an equal chance of being any of the robots, and this dilutes its chances of having the damaged body after awakening.

For your toy example – to a first approximation, I am any copy which can recognize itself as avturchin (the self-recognition identity criterion).

Replies from: donald-hobson
comment by Donald Hobson (donald-hobson) · 2024-04-24T22:10:46.076Z · LW(p) · GW(p)

The point is, if all the robots are in a true blank state, then none of them is you, because your entire personality has just been forgotten.

Replies from: avturchin
comment by avturchin · 2024-04-25T16:07:32.706Z · LW(p) · GW(p)

I can forget one particular thing but preserve most of my self-identification information.

Replies from: donald-hobson
comment by Donald Hobson (donald-hobson) · 2024-04-26T13:31:58.007Z · LW(p) · GW(p)

True. But for that you need there to exist another mind almost identical to yours except for that one thing. 

In the question "how much of my memories can I delete while retaining my thread of subjective experience?" I don't expect there to be an objective answer. 

comment by ABlue · 2024-04-25T01:40:24.089Z · LW(p) · GW(p)

Is this an independent reinvention of the law of attraction? There doesn't seem to be anything special about "stop having a disease by forgetting about it" compared to the general "be in a universe by adopting a mental state compatible with that universe." That said, becoming completely convinced I'm a billionaire seems more psychologically involved than forgetting I have some disease, and the ratio of universes where I'm a billionaire versus I've deluded myself into thinking I'm a billionaire seems less favorable as well.

Anyway, this doesn't seem like a good solution, since for every "me" that gets into a better universe, another just gets booted into a worse one. As far as the interests of the whole cohort go, it'd be a waste of effort.

Replies from: avturchin
comment by avturchin · 2024-04-25T16:15:47.805Z · LW(p) · GW(p)

The number of poor people is much larger than the number of billionaires. So in most cases you will fail to wake up as a billionaire. But sometimes it will work, and that is similar to the law of attraction. But the formulation via forgetting is more beautiful: you forget that you are poor.

UPDATE: Actually, the difference from the law of attraction is that after applying the law of attraction, a person still remembers that he has used the law. In magic by forgetting, the fact of its use must be completely forgotten.

Replies from: ABlue
comment by ABlue · 2024-04-26T16:47:09.054Z · LW(p) · GW(p)

The number of poor people is much larger than the number of billionaires, but the number of poor people who THINK they're billionaires probably isn't that much larger. Good point about needing to forget the technique, though.