# Is the sum of individual informativeness of two independent variables no more than their joint informativeness?

post by Brangus · 2019-07-08T02:51:28.221Z · score: 11 (3 votes) · LW · GW · 1 comment

This is a question post.

## Contents

  Answers
12 jessicata
3 Miss_Figg
None
1 comment


Is it true that:

If I(X;Y) = 0 then I(S;X) + I(S;Y) <= I(S;X,Y)

Can you find a counterexample, or prove this and teach me your proof?

Someone showed me a simple analytic proof. I am still interested in seeing different ways people might prove this though.
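One way to build intuition for the inequality is a quick numerical check. The sketch below (Python with NumPy; function and variable names are my own) takes X and Y to be independent fair coins and S = X XOR Y — an extreme case where neither variable alone tells you anything about S, but the pair determines it completely:

```python
import numpy as np

def mutual_info(joint):
    """I(A;B) in bits from a 2-D joint distribution p(a, b)."""
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    m = joint > 0
    return float(np.sum(joint[m] * np.log2(joint[m] / (pa * pb)[m])))

# X, Y independent fair coins; S = X XOR Y (maximal synergy).
p = np.zeros((2, 2, 2))  # indexed p[s, x, y]
for x in (0, 1):
    for y in (0, 1):
        p[x ^ y, x, y] = 0.25

lhs = mutual_info(p.sum(axis=2)) + mutual_info(p.sum(axis=1))  # I(S;X) + I(S;Y)
rhs = mutual_info(p.reshape(2, 4))  # I(S;X,Y), treating (X,Y) as one variable

print(lhs, rhs)  # 0.0 1.0 -- the inequality holds strictly here
```

Here the left side is 0 bits and the right side is 1 bit, so the inequality is as far from tight as it can be for binary variables.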

answer by jessicata · 2019-07-08T07:46:20.903Z · score: 12 (3 votes) · LW · GW

For a visualization, see information diagrams, and note that the central cell I(S; X; Y) must be non-positive: I(S; X; Y) + I(X; Y | S) = I(X; Y) = 0, and conditional mutual information I(X; Y | S) is always nonnegative.
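The identity behind this diagram argument, I(S;X,Y) − I(S;X) − I(S;Y) = I(X;Y|S) − I(X;Y), holds for any joint distribution, and can be checked numerically. A sketch in Python/NumPy (helper names are my own, not from the answer):

```python
import numpy as np

def mi(joint):
    """I(A;B) in nats from a 2-D joint distribution."""
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    m = joint > 0
    return float(np.sum(joint[m] * np.log(joint[m] / (pa * pb)[m])))

rng = np.random.default_rng(0)
p = rng.random((3, 4, 5))
p /= p.sum()  # random joint p(s, x, y); X and Y need not be independent here

# gap = I(S;X,Y) - I(S;X) - I(S;Y)
gap = mi(p.reshape(3, -1)) - mi(p.sum(axis=2)) - mi(p.sum(axis=1))

# I(X;Y|S) is the p(s)-weighted average of I(X;Y | S=s)
i_xy_given_s = sum(p[s].sum() * mi(p[s] / p[s].sum()) for s in range(3))
i_xy = mi(p.sum(axis=0))

diff = abs(gap - (i_xy_given_s - i_xy))  # should be ~0
```

When I(X;Y) = 0, the gap reduces to I(X;Y|S) ≥ 0, which is exactly the claimed inequality.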

answer by Miss_Figg · 2019-07-08T15:39:36.282Z · score: 3 (2 votes) · LW · GW

We want to prove:

$$I(S;X) + I(S;Y) \le I(S;X,Y)$$

This can be rewritten as:

$$\sum_{s,x} p(s,x)\log\frac{p(s,x)}{p(s)p(x)} + \sum_{s,y} p(s,y)\log\frac{p(s,y)}{p(s)p(y)} \le \sum_{s,x,y} p(s,x,y)\log\frac{p(s,x,y)}{p(s)p(x,y)}$$

After moving everything to the right hand side and simplifying (using $p(x,y) = p(x)p(y)$, since $I(X;Y)=0$), we get:

$$\sum_{s,x,y} p(s,x,y)\log\frac{p(s,x,y)\,p(s)}{p(s,x)\,p(s,y)} \ge 0$$

Now if we just prove that $q(s,x,y) := \frac{p(s,x)\,p(s,y)}{p(s)}$ is a probability distribution, then the left hand side is $D_{KL}\big(p(s,x,y)\,\|\,q(s,x,y)\big)$, and Kullback-Leibler divergence is always nonnegative.

Ok, q is obviously nonnegative, and it sums to 1:

$$\sum_{s,x,y} \frac{p(s,x)\,p(s,y)}{p(s)} = \sum_s \frac{1}{p(s)}\left(\sum_x p(s,x)\right)\left(\sum_y p(s,y)\right) = \sum_s \frac{p(s)\,p(s)}{p(s)} = \sum_s p(s) = 1$$

Q.e.d.
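The normalization step can also be verified numerically. The sketch below (Python/NumPy, my own construction) builds a joint p(s,x,y) with X and Y independent by sampling marginals p(x), p(y) and an arbitrary conditional p(s|x,y), then confirms that q(s,x,y) = p(s,x)p(s,y)/p(s) sums to 1 and that D_KL(p‖q) equals the gap I(S;X,Y) − I(S;X) − I(S;Y):

```python
import numpy as np

def mi(joint):
    """I(A;B) in nats from a 2-D joint distribution."""
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    m = joint > 0
    return float(np.sum(joint[m] * np.log(joint[m] / (pa * pb)[m])))

rng = np.random.default_rng(1)
px = rng.random(4); px /= px.sum()              # p(x)
py = rng.random(5); py /= py.sum()              # p(y)
ps_xy = rng.random((3, 4, 5))
ps_xy /= ps_xy.sum(axis=0, keepdims=True)       # p(s | x, y)

p = ps_xy * px[None, :, None] * py[None, None, :]  # p(s,x,y) with X and Y independent

ps = p.sum(axis=(1, 2))                         # p(s)
q = (p.sum(axis=2)[:, :, None] * p.sum(axis=1)[:, None, :]
     / ps[:, None, None])                       # q(s,x,y) = p(s,x) p(s,y) / p(s)

total = q.sum()                                 # should be 1
kl = float(np.sum(p * np.log(p / q)))           # D_KL(p || q)
gap = mi(p.reshape(3, -1)) - mi(p.sum(axis=2)) - mi(p.sum(axis=1))
```

Because all sampled probabilities are strictly positive, no zero-handling is needed in the KL computation; `total` comes out to 1 and `kl` matches `gap` to floating-point precision.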