Anthropic probabilities and cost functions
post by Stuart_Armstrong · 2018-12-21T17:54:20.921Z
I've claimed that anthropic probabilities like SIA and SSA don't actually exist - or, more properly, that you need to include some details of preferences in order to get any anthropic probabilities, and thus that anthropic issues should be approached from the perspective of decision theory.
What do I mean by this? Well, informally, what are probabilities? If I said that $X$ (a very visible event) would happen with a probability of $0.1$, then I would expect to see events like $X$ happen about a tenth of the time.
This makes a lot of sense. Why can't it be transposed into anthropic situations? Well, the big problem is the "I" in "I would expect". Who is this "I" - me, my copies, some weighted average of us all?
In non-anthropic situations, we can formalise "I would expect to see" with a cost function. Let me choose a number $q$ to be whatever I want; then, if $X$ doesn't happen I pay a cost of $q^2$, while if it does happen, I pay a cost of $(1-q)^2$ (this is exactly equal to $(I_X - q)^2$, for $I_X$ the indicator function of $X$).
Then, for this cost function, I minimise my losses by setting "$q$" to be equal to my subjective opinion of the probability of $X$ (note that there are many eliciting cost functions we could have used, not just the quadratic loss, but the results are the same for all of them).
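To spell out why the minimum lands at the subjective probability (a short derivation added here for clarity; write $p$ for my credence in $X$): the expected cost is

$$\mathbb{E}[\text{cost}] = p\,(1-q)^2 + (1-p)\,q^2, \qquad \frac{d}{dq}\,\mathbb{E}[\text{cost}] = -2p(1-q) + 2(1-p)\,q = 2(q - p),$$

which vanishes exactly at $q = p$, so reporting my true credence minimises my expected loss.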
In the informal setting, we didn't know how to deal with "I" when expecting future outcomes. In the formal setting, we don't know how to aggregate the cost when multiple copies could all have to pay the cost.
There are two natural methods of aggregation: the first is to keep $(I_X - q)^2$, as above, as the cost for every copy. Thus each copy has the average cost of all the copies (this also allows us to generalise to situations where different copies would see different things). In this case, the probability that develops from this is SSA.
Alternatively, we could add up all the costs, giving a total cost of $N(I_X - q)^2$ if there were $N$ copies (this also generalises to situations where different copies see different things). In this case, the probability that develops from this is SIA.
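As a concrete illustration of how the two aggregation rules come apart, here is a minimal numerical sketch (my addition, not part of the original argument), using a Sleeping-Beauty-style toy setup: a fair coin creates one copy if $X$ happens and two copies if it doesn't, every copy pays the quadratic cost, and we minimise the expected cost under each aggregation rule. The setup and the function names are illustrative assumptions only.

```python
# Illustrative sketch (not from the post): two possible worlds, Sleeping-Beauty style.
# World 1: X happens and 1 copy exists.  World 2: X doesn't happen and 2 copies exist.
# Each copy pays the quadratic cost (I_X - q)^2; we aggregate by averaging or summing.
import numpy as np

WORLDS = [
    (0.5, 1, 1.0),  # (prior probability, number of copies, indicator I_X)
    (0.5, 2, 0.0),
]

def expected_cost(q, aggregate):
    total = 0.0
    for prior, n_copies, i_x in WORLDS:
        per_copy = (i_x - q) ** 2
        if aggregate == "average":
            # every copy bears the average cost of all the copies (here they're equal)
            total += prior * per_copy
        else:  # "sum"
            # the costs of all the copies are added up
            total += prior * n_copies * per_copy
    return total

qs = np.linspace(0.0, 1.0, 10001)
for agg in ("average", "sum"):
    best_q = qs[np.argmin([expected_cost(q, agg) for q in qs])]
    print(f"{agg:7s} -> q = {best_q:.3f}")
# average -> q = 0.500   (the SSA answer for this setup)
# sum     -> q = 0.333   (the SIA answer for this setup)
```

Averaging the copies' costs reproduces the SSA answer for this toy setup ($1/2$), while summing them reproduces the SIA answer ($1/3$), matching the correspondence described above.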
So probability might be an estimate of what I expect to see, or a cost-minimiser for errors of prediction, but anthropic probabilities differ depending on how one extends "I" and "cost" to situations of multiple copies.
1 comment
comment by Chris_Leong · 2018-12-21T23:07:43.616Z · LW(p) · GW(p)
See also: If a tree falls on Sleeping Beauty.