Comments

Comment by Nina Katz-Christy (nina-katz-christy) on Deconfusing Landauer's Principle · 2025-01-31T20:51:16.499Z · LW · GW

how much information (in bits) we gain by finding out the exact value of X given some prior beliefs described by p(X).

Just to clarify, this is the expected/average information, right? 
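In symbols, the way I'm reading it: the information from a single observed outcome x is the surprisal $I(x) = -\log_2 p(x)$, and the entropy is its expectation under p:

$$H(X) = \mathbb{E}_{x \sim p}\!\left[-\log_2 p(x)\right] = -\sum_x p(x)\,\log_2 p(x).$$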

If we observe X = x for some fixed x, we get exactly -log2 p(x) bits of information, and entropy is the average of this information gain taken over the distribution p(x). For example, suppose X ~ Bern(0.5). Then we have a 50% chance of getting a 0, and thus gaining -log2(0.5) = 1 bit of information, and a 50% chance of getting a 1, and thus also gaining -log2(0.5) = 1 bit of information, meaning we will necessarily gain 1 bit of information upon finding the value of X. But if X ~ Bern(0.25) instead, then finding out X is 1 gives -log2(0.25) = 2 bits of information, and finding out X is 0 gives -log2(0.75) ≈ 0.415 bits of information. So, on average, we expect to gain (0.25)(-log2(0.25)) + (0.75)(-log2(0.75)) ≈ 0.811 bits of information.
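A quick numeric check of that arithmetic (a minimal Python sketch of my reading, not taken from the post):

```python
import math

def self_info(p):
    """Surprisal of an outcome with probability p, in bits: -log2(p)."""
    return -math.log2(p)

def bernoulli_entropy(p):
    """Entropy of X ~ Bern(p): expected surprisal over the two outcomes."""
    return p * self_info(p) + (1 - p) * self_info(1 - p)

# Fair coin: either outcome carries exactly 1 bit, so the average is 1 bit.
print(self_info(0.5))            # 1.0
print(bernoulli_entropy(0.5))    # 1.0

# Biased coin, X ~ Bern(0.25): the rarer outcome is more surprising.
print(self_info(0.25))           # 2.0 bits if X = 1
print(self_info(0.75))           # ~0.415 bits if X = 0
print(bernoulli_entropy(0.25))   # ~0.811 bits on average
```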

Is my understanding of this correct?