Optimal predictors and conditional probability

post by Vanessa Kosoy (vanessa-kosoy) · 2015-06-30T18:01:31.000Z · LW · GW · 3 comments

This is a link post for https://github.com/antiquark/FAI/raw/a8836f811f58c9ff8f91b53ca682b47ff0f6e3d1/A%20complexity%20theoretic%20approach%20to%20logical%20uncertainty.pdf


comment by orthonormal · 2015-07-02T23:44:54.000Z · LW(p) · GW(p)

Looking again at Theorem 4.4, I don't find any assumption analogous to the independence property. If $L_1 = L_2$, won't this wrongly claim that the values of the optimal predictor should be squared along the diagonal?

(Apologies if I'm misunderstanding.)
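The squaring worry can be spelled out as follows (the notation here is illustrative, not taken from the paper): a product predictor built from independence-style reasoning multiplies the two single-language estimates, which squares the estimate along the diagonal.

```latex
% If P estimates Pr[x in L], an independence-style predictor for
% the product L x L would give
\[
  P^{L \times L}(x, y) \approx P(x)\,P(y),
\]
% so on the diagonal it outputs P(x)^2. But the event
% "(x, x) in L x L" is exactly the event "x in L", whose
% probability is P(x); hence
\[
  P^{L \times L}(x, x) \approx P(x)^2 \neq P(x)
  \quad \text{unless } P(x) \in \{0, 1\}.
\]
```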

Replies from: vanessa-kosoy
comment by Vanessa Kosoy (vanessa-kosoy) · 2015-07-03T06:17:34.000Z · LW(p) · GW(p)

There is a crucial difference between the setting of Theorem 4.4 and the setting of Theorems 4.5 and 4.6. In Theorems 4.5 and 4.6 we consider the intersection of two languages, $L_1 \cap L_2$. In Theorem 4.4 we consider the Cartesian product of two languages: $L_1 \times L_2$ is not the same thing as $L_1 \cap L_2$. Moreover, the diagonal embedding $x \mapsto (x, x)$ of $L_1 \cap L_2$ into $L_1 \times L_2$ is not a valid reduction for the purpose of Theorem 6.1, since condition (ii) is violated (diagonal elements are rare within the product ensemble).
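The rarity of the diagonal can be made concrete with a toy calculation (the uniform ensemble $U_n$ here is an illustrative assumption, not notation from the paper):

```latex
% Take the word ensemble to be U_n, the uniform distribution on
% {0,1}^n. The product ensemble U_n x U_n samples x and y
% independently, so the diagonal carries mass
\[
  \Pr_{(x, y) \sim U_n \times U_n}[x = y]
  \;=\; \sum_{x \in \{0,1\}^n} 2^{-n} \cdot 2^{-n}
  \;=\; 2^{-n}.
\]
% The image of the diagonal embedding x -> (x, x) thus has
% exponentially small probability under the product ensemble,
% so any reduction condition requiring the image to be
% non-negligible fails.
```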

comment by Vanessa Kosoy (vanessa-kosoy) · 2015-06-30T18:02:53.000Z · LW(p) · GW(p)

This is an updated version of the optimal predictors paper which contains Theorem 4.5 and Theorem 4.6. I mentioned these results without proof during the logical uncertainty workshop in May.