Is the sum of the individual informativeness of two independent variables no more than their joint informativeness?
post by Ronny Fernandez (ronny-fernandez) · 2019-07-08T02:51:28.221Z · LW · GW · 1 comment

This is a question post.
Is it true that:
If $I(X;Y) = 0$, then $I(S;X) + I(S;Y) \le I(S;X,Y)$.
Can you find a counterexample, or prove this and teach me your proof?
Someone showed me a simple analytic proof. I am still interested in seeing different ways people might prove this though.
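For anyone who wants to poke at the claim numerically before reading the answers, here is a minimal Python sketch (not part of the original post; the XOR construction and the `mutual_information` helper are illustrative choices). It uses $X, Y$ as independent fair bits and $S = X \oplus Y$, where $I(X;Y) = 0$, $I(S;X) = I(S;Y) = 0$, and $I(S;X,Y) = 1$ bit, so the inequality holds strictly:

```python
# Sanity check of the claimed inequality on the XOR example:
# X, Y independent fair bits, S = X xor Y.
import itertools
import math

def mutual_information(pab):
    """I(A;B) in bits for a joint distribution given as a dict {(a, b): prob}."""
    pa, pb = {}, {}
    for (a, b), p in pab.items():
        pa[a] = pa.get(a, 0.0) + p
        pb[b] = pb.get(b, 0.0) + p
    return sum(p * math.log2(p / (pa[a] * pb[b]))
               for (a, b), p in pab.items() if p > 0)

# Joint distribution of (S, X, Y) with S = X xor Y.
joint = {(x ^ y, x, y): 0.25 for x, y in itertools.product((0, 1), repeat=2)}

# Marginalize onto the pairs we need.
p_xy, p_sx, p_sy, p_s_xy = {}, {}, {}, {}
for (s, x, y), p in joint.items():
    p_xy[(x, y)] = p_xy.get((x, y), 0.0) + p
    p_sx[(s, x)] = p_sx.get((s, x), 0.0) + p
    p_sy[(s, y)] = p_sy.get((s, y), 0.0) + p
    p_s_xy[(s, (x, y))] = p_s_xy.get((s, (x, y)), 0.0) + p

assert abs(mutual_information(p_xy)) < 1e-12  # I(X;Y) = 0, as required
lhs = mutual_information(p_sx) + mutual_information(p_sy)
rhs = mutual_information(p_s_xy)
print(f"I(S;X) + I(S;Y) = {lhs:.3f}  <=  I(S;X,Y) = {rhs:.3f}")  # 0.000 <= 1.000
```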
Answers

answer by jessicata
For a visualization, see information diagrams, and note that the central cell $I(S;X;Y)$ must be non-positive (because $I(S;X;Y) + I(X;Y \mid S) = I(X;Y) = 0$).
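Spelling out how the diagram argument delivers the inequality (this expansion is not in the original answer; it uses the standard interaction-information identities, with the sign convention matching the statement above):

$$
\begin{aligned}
I(S;X) + I(S;Y) - I(S;X,Y) &= I(S;X) - I(S;X \mid Y) \\
&= I(S;X;Y) \\
&= I(X;Y) - I(X;Y \mid S) \\
&= -\,I(X;Y \mid S) \;\le\; 0.
\end{aligned}
$$

The first step uses the chain rule $I(S;X,Y) = I(S;Y) + I(S;X \mid Y)$; the last uses $I(X;Y) = 0$ and the nonnegativity of conditional mutual information.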
answer by Miss_Figg

We want to prove:

$$I(S;X) + I(S;Y) \le I(S;X,Y)$$

This can be rewritten as:

$$\int p(s,x)\log\frac{p(s,x)}{p(s)\,p(x)}\,ds\,dx \;+\; \int p(s,y)\log\frac{p(s,y)}{p(s)\,p(y)}\,ds\,dy \;\le\; \int p(s,x,y)\log\frac{p(s,x,y)}{p(s)\,p(x,y)}\,ds\,dx\,dy$$

After moving everything to the right hand side and simplifying (using $p(x,y) = p(x)\,p(y)$, which is exactly what $I(X;Y) = 0$ gives us), we get:

$$0 \;\le\; \int p(s,x,y)\log\frac{p(s,x,y)\,p(s)}{p(s,x)\,p(s,y)}\,ds\,dx\,dy$$

Now if we just prove that $q(s,x,y) = \frac{p(s,x)\,p(s,y)}{p(s)}$ is a probability distribution, then the right hand side is $D_{\mathrm{KL}}(p \,\|\, q)$, and Kullback–Leibler divergence is always nonnegative.

Ok, $q$ is obviously nonnegative, and its integral equals 1:

$$\int \frac{p(s,x)\,p(s,y)}{p(s)}\,ds\,dx\,dy \;=\; \int \frac{1}{p(s)}\left(\int p(s,x)\,dx\right)\!\left(\int p(s,y)\,dy\right)ds \;=\; \int \frac{p(s)\,p(s)}{p(s)}\,ds \;=\; \int p(s)\,ds \;=\; 1$$

Q.e.d.
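To check the construction concretely, here is a short Python sketch (not part of the original answer) that builds $q$ for the XOR example from earlier and confirms that it sums to 1 and that $D_{\mathrm{KL}}(p\,\|\,q)$ equals the gap $I(S;X,Y) - I(S;X) - I(S;Y) = 1 - 0 - 0 = 1$ bit:

```python
# Verify q(s,x,y) = p(s,x) p(s,y) / p(s) on the XOR example:
# X, Y independent fair bits, S = X xor Y.
import itertools
import math

p = {(x ^ y, x, y): 0.25 for x, y in itertools.product((0, 1), repeat=2)}

# Marginals needed for q. (As in the proof, we implicitly assume p(s) > 0.)
p_s, p_sx, p_sy = {}, {}, {}
for (s, x, y), pr in p.items():
    p_s[s] = p_s.get(s, 0.0) + pr
    p_sx[(s, x)] = p_sx.get((s, x), 0.0) + pr
    p_sy[(s, y)] = p_sy.get((s, y), 0.0) + pr

q = {(s, x, y): p_sx.get((s, x), 0.0) * p_sy.get((s, y), 0.0) / p_s[s]
     for s, x, y in itertools.product((0, 1), repeat=3)}

# q is nonnegative and sums to 1, so it is a genuine distribution ...
assert abs(sum(q.values()) - 1.0) < 1e-12

# ... and D_KL(p || q) in bits equals the gap, which is 1 bit here.
kl = sum(pr * math.log2(pr / q[k]) for k, pr in p.items() if pr > 0)
print(f"D_KL(p || q) = {kl:.3f} bits")  # -> 1.000
```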
Comments
comment by Jalex Stark (jalex-stark-1) · 2019-07-08T12:38:13.242Z · LW(p) · GW(p)
Just for amusement, I think this theorem can fail when s, x, y represent subsystems of an entangled quantum state. (The most natural generalization of mutual information to this domain is sometimes negative.)