# Anthropics: Full Non-indexical Conditioning (FNC) is inconsistent

post by Stuart_Armstrong · 2019-01-14T15:03:04.288Z · score: 22 (5 votes) · LW · GW · 4 comments

I'll be using Full non-indexical conditioning (FNC) as an example of anthropic reasoning. Given that, it's useful to know that FNC is (time) inconsistent: the expected future probability is not the same as the current probability. I can't find a full write-up of this specific fact, so I thought I'd present it here for reference.

In FNC:

> one should condition on all the evidence available, including all the details of one’s memory, but without considering “indexical” information regarding one’s place in the universe (as opposed to what the universe contains).

So, suppose that there were either $1$ or $N$ initially identical copies of you in the universe, with equal probability. Your copies never communicate or get extra evidence of each other's existence. Each copy of you has seen (and remembered) the outcome of $n$ independent random bits, for $2^n \gg N$. You yourself have seen the sequence $s_0$ (which sequence $s_0$ is doesn't actually matter, as they are all equiprobable).

Then, by FNC reasoning, the probability of a copy of you seeing $s_0$ in a universe with only $1$ of you is $2^{-n}$, while the probability of that happening in a universe with $N$ copies of you is:

- $1 - (1 - 2^{-n})^N \approx N \cdot 2^{-n}$, by the binomial approximation.

Thus FNC will update towards the large universe by a ratio that is roughly $N \cdot 2^{-n} : 2^{-n}$ - in other words, $N:1$. Translated into probabilities, this gives a probability close to $1/(N+1)$ for the small universe, and $N/(N+1)$ for the large one.
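The update above is easy to check numerically. A minimal sketch, assuming illustrative values for $N$ and $n$ (the specific numbers are my choice, not from the post):

```python
# Numerical check of the FNC update (sketch; N and n are illustrative choices).
# Small universe: 1 copy of you; large universe: N copies; each copy sees n bits.
N = 1000          # copies of you in the large universe
n = 30            # independent random bits each copy observes (2**n >> N)

p_small = 2.0 ** -n                       # P(a copy sees your exact sequence | small)
p_large = 1.0 - (1.0 - 2.0 ** -n) ** N    # P(some copy sees it | large)

# Binomial approximation: p_large is roughly N * 2**-n
assert abs(p_large - N * 2.0 ** -n) / p_large < 1e-3

# Posterior from a 1:1 prior, renormalised
post_small = p_small / (p_small + p_large)
post_large = p_large / (p_small + p_large)
print(post_small, post_large)   # close to 1/(N+1) and N/(N+1)
```

Note that the sequence you actually saw never enters the calculation; only its length does, which is what drives the inconsistency below.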

This makes FNC very close to SIA; in fact, in the limit of finitely many copies remembering infinitely many different random bits, FNC is exactly SIA.

But now consider the situation before any of your copies see any random bits (or, at least, where they have only seen non-independent random bits). Then all copies have seen the same sequence, so the update is $1:1$; i.e. there is no anthropic update at all, and the probability of either universe remains $1/2$.

But if you know that all your copies will see an independent sequence of random bits, then you can predict, **with certainty**, that your future probabilities will be (almost exactly) $1/(N+1)$ and $N/(N+1)$, rather than $1/2$ and $1/2$. How can you predict with certainty? Because you know that you will see some sequence $s_0$, and all sequences lead to the same FNC updates.

So FNC is time inconsistent.
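The argument can be made concrete with exact arithmetic: the likelihoods depend only on the length of the observed sequence, not on which sequence it is, so the post-observation posterior is fixed in advance. A sketch (the parameter values are illustrative, not from the post):

```python
from fractions import Fraction

# Sketch: before any bits are seen, the FNC posterior for the large universe
# is exactly 1/2.  After observing, the posterior is the same for EVERY
# possible sequence, so it is predictable with certainty in advance.
N, n = 4, 8   # illustrative: 4 copies in the large universe, n-bit memories

def posterior_large(n_bits, n_copies):
    """FNC posterior for the large universe after observing any fixed sequence."""
    p_small = Fraction(1, 2 ** n_bits)                   # 2**-n
    p_large = 1 - (1 - Fraction(1, 2 ** n_bits)) ** n_copies
    return p_large / (p_small + p_large)

# Neither likelihood depends on which of the 2**n equiprobable sequences was
# seen, so this single value is the guaranteed future posterior:
post = posterior_large(n, N)
print(post)   # strictly greater than the pre-observation value of 1/2
```

Exact rationals (`fractions.Fraction`) are used so the demonstration doesn't hinge on floating-point rounding.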

## Forgetful agents.

More strangely, FNC can be time inconsistent the other way, if your copies are forgetful. If they start forgetting large initial pieces of the sequence $s_0$, then all of their/your probabilities will start to move back towards the $1/2$ and $1/2$ probabilities.

For example, if your copies have forgotten all but the last bit of the sequence, then the probability of some copy seeing either $0$ or $1$ in a large universe is $1 - 2^{-N}$. For large $N$, this is approximately $1$, by the limit expression for the exponential.

Then the update ratio will be $(1 - 2^{-N}) : (1/2) \approx 2 : 1$, and the probabilities of large and small universes will be close to $2/3$ and $1/3$.
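The forgetful case admits the same kind of numerical check. A sketch, assuming an illustrative value of $N$:

```python
# Forgetful copies (sketch): each copy remembers only its last observed bit.
N = 1000   # copies of you in the large universe (illustrative)

p_small = 0.5                 # P(the lone copy's remembered bit matches yours | small)
p_large = 1.0 - 2.0 ** -N     # P(some copy's remembered bit matches | large), ~1

post_small = p_small / (p_small + p_large)
post_large = p_large / (p_small + p_large)
print(post_small, post_large)   # close to 1/3 and 2/3
```

So the posterior has drifted from nearly $N/(N+1)$ back down towards $2/3$, on its way to $1/2$ if the last bit is forgotten too.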

## 4 comments


Full non-indexical conditioning is broken in other ways too. As I argued before, the core of this idea is essentially a cute trick where, by precommitting to only guess on a certain sequence, you can manipulate the chance that at least one copy of you guesses and that the guesses of your copies are correct. Except full non-indexical conditioning doesn't precommit, so the probabilities calculated are for a completely different situation. Hopefully the demonstration of time inconsistency will make it clearer that this approach is incorrect.

Maybe it is not a bug but a feature, where by forgetting or remembering parts of the information I could manipulate the expected probability? This was already discussed on LW as the "flux universe".

This is just the standard Sleeping Beauty paradox, and I'd suggest that the issue isn't unique to FNC.

However, you are a bit quick in concluding it is time inconsistent, as it's not altogether clear that one is truly referring to the same event before and after you have the observation. The hint here is that in the standard Sleeping Beauty paradox, the supposed update involves only information you were already certain you would get.

I'd argue that what's actually going on is that you are evaluating slightly different questions in the two cases.

This is not the standard Sleeping Beauty paradox, as the information you are certain to get does not involve any amnesia or duplication before you get it.