Comments

Comment by Simuon on What's So Bad About Ad-Hoc Mathematical Definitions? · 2021-03-18T12:47:48.856Z · LW · GW

I think Shannon's merit was not to define entropy, but to understand its operational meaning in terms of coding a message with a minimal number of letters, which led to the notions of channel capacity, error-correcting codes, and the "bit".
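
To make the operational point concrete, here is a minimal sketch (my own toy example, not from the post): for a dyadic distribution, an optimal prefix code achieves an average length exactly equal to the Shannon entropy in bits. The distribution and code below are hypothetical.

```python
import math

# Hypothetical dyadic source distribution and a matching prefix-free code
probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}
code  = {"A": "0", "B": "10", "C": "110", "D": "111"}

# Shannon entropy in bits per symbol
entropy = -sum(p * math.log2(p) for p in probs.values())
# Expected code length in bits per symbol
avg_len = sum(probs[s] * len(code[s]) for s in probs)

print(f"entropy    = {entropy:.3f} bits/symbol")  # 1.750
print(f"avg length = {avg_len:.3f} bits/symbol")  # 1.750
```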

Von Neumann's entropy was introduced before Shannon's entropy (in 1927, although the only reference I know is von Neumann's 1932 book). It was also von Neumann who suggested the name "entropy" for the quantity Shannon had found. What Shannon could have noticed was that von Neumann's entropy also has an operational meaning. But for that, he would have had to be interested in the transmission of quantum information over quantum channels, ideas that were not around at the time.
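
For what it's worth, here is a small sketch (my own illustration, with a made-up density matrix) of von Neumann's entropy S(ρ) = −Tr(ρ log ρ), computed from the eigenvalues of the density matrix; it is simply the Shannon entropy of the eigenvalue distribution.

```python
import numpy as np

# Hypothetical qubit density matrix (trace 1, positive semi-definite)
rho = np.array([[0.75, 0.25],
                [0.25, 0.25]])

# S(rho) = -Tr(rho log rho) = Shannon entropy of the eigenvalues
eigvals = np.linalg.eigvalsh(rho)
eigvals = eigvals[eigvals > 1e-12]        # drop numerical zeros
S = -np.sum(eigvals * np.log2(eigvals))   # in bits

print(f"von Neumann entropy = {S:.3f} bits")  # ≈ 0.601
```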

Comment by Simuon on What's So Bad About Ad-Hoc Mathematical Definitions? · 2021-03-18T12:31:52.546Z · LW · GW

I suspected it was more of a fable, but I hoped it was historical (there are many true cryptographic stories in this style, though I don't know any about this "proxy" problem). I think it is a bit dangerous to draw conclusions from a fictional story, though the fable made this post more engaging, and I mostly agree with its conclusion.

Why is using a fable to construct an argument dangerous? Suppose Aesop wrote a fable about a goose laying golden eggs, and people drew the conclusion that you should not experiment around positive phenomena for fear of losing what you have. Later, Aristotle understood that science is actually good. He advised Alexander to be curious; then Alexander cut the Gordian knot and became "the Great".

Well, that was meta and self-defeating. But here is a more interesting story: economists usually tell a fable about the birth of currency, emerging from barter to solve the problem of the double coincidence of wants (Jevons, 1875). This is a great thought experiment, but it is too often taken as a realistic description of how money was invented, whereas there is anthropological and historical evidence that money was at first issued by the state and considered a debt token rather than something with value in itself (Graeber). Framing the thought experiment as historical results in a public discourse where hyperinflation is waved about as the inevitable consequence of state-issued money. The conclusion I draw from this story is that thought experiments should not be framed as historical stories, because doing so prevents us from seeing other aspects of the problem.

Does this apply to the post? I'm not sure... the fable is not really framed as historical; what the rest of the argument needs from the story is mostly that Pearson's correlation is misleading while Shannon's mutual information is on point. Maybe we can open some interesting perspectives by looking at historical examples where correlation was mistakenly used in place of mutual information. The point that uniqueness is an interesting proxy for robustness still stands; I think it could be developed into a more general discussion of the advantages of uniqueness in a metric.
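
To illustrate the correlation-vs-mutual-information point with a toy example of my own (not from the post): take X uniform on {−1, 0, 1} and Y = X². Their Pearson correlation is zero, yet Y is a deterministic function of X, so their mutual information is positive.

```python
import math
import numpy as np

xs = np.array([-1, 0, 1])   # X uniform on {-1, 0, 1}
ys = xs ** 2                # Y = X^2

# Pearson correlation from the exact moments of the uniform distribution
cov = np.mean(xs * ys) - np.mean(xs) * np.mean(ys)
corr = cov / (np.std(xs) * np.std(ys))
print(f"Pearson correlation = {corr:.3f}")      # 0.000

# Mutual information I(X;Y) = H(Y) - H(Y|X), and H(Y|X) = 0 since Y = f(X)
py = {0: 1 / 3, 1: 2 / 3}
H_Y = -sum(p * math.log2(p) for p in py.values())
print(f"mutual information  = {H_Y:.3f} bits")  # ≈ 0.918
```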

Comment by Simuon on What's So Bad About Ad-Hoc Mathematical Definitions? · 2021-03-17T20:30:21.815Z · LW · GW

Is there a reference for the events at Bell Labs? I can imagine some scenarios where the military transmits some information and can, in a sense, engineer what the adversary can read (for example, Eve can read the power supply of some device, so Alice must add sufficient noise to the power supply), but none of them seems realistic in this context.

Comment by Simuon on What's So Bad About Ad-Hoc Mathematical Definitions? · 2021-03-17T16:57:03.039Z · LW · GW

I don't agree with the argument about variance:

"Any other such measure will indeed be isomorphic to variance when restricted to normal distributions."

It's true, but you should not restrict to normal distributions in this context. It is possible to find distributions X1 and X2 with different variances but the same value of E(|X - mean|^p) for some exponent p ≠ 2. Then X1 and X2 look the same to this p-variance, but their normalized sample averages will converge to different normal distributions. Hence variance is indeed the right and only measure of spread-out-ness to consider when applying the central limit theorem.
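
Here is a small simulation of that point for p = 1 (my own sketch, with made-up distributions): X1 is uniform on {−1, +1} and X2 takes the value 0 with probability 1/2 and ±2 with probability 1/4 each. Both have mean 0 and the same mean absolute deviation E|X| = 1, but Var(X1) = 1 and Var(X2) = 2, so their normalized sample averages converge to normal distributions with different spreads.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 2_000, 2_000

# Same mean (0) and same E|X - mean| (1), but Var(X1) = 1 and Var(X2) = 2
x1 = rng.choice([-1, 1], size=(trials, n))
x2 = rng.choice([0, 2, -2], p=[0.5, 0.25, 0.25], size=(trials, n))

# Normalized sample averages: sqrt(n) * (sample mean)
z1 = np.sqrt(n) * x1.mean(axis=1)   # approximately N(0, 1)
z2 = np.sqrt(n) * x2.mean(axis=1)   # approximately N(0, 2)

print(f"std of sqrt(n)*mean(X1) = {z1.std():.2f}   (expected ≈ 1.00)")
print(f"std of sqrt(n)*mean(X2) = {z2.std():.2f}   (expected ≈ {np.sqrt(2):.2f})")
```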

Comment by Simuon on What's So Bad About Ad-Hoc Mathematical Definitions? · 2021-03-17T16:51:48.789Z · LW · GW