Logarithmic Measure of Information (Entropy Primer)
It wasn’t until the 1920s that the question “how do we quantify information?” was well articulated. This video introduces a key idea of Nyquist and Hartley, who laid the groundwork for Claude Shannon’s historic equation (information entropy) two decades later. In these early papers, the idea of using a logarithmic measure appears, something that isn’t immediately obvious to most students new to the subject. If one takes this for granted, one will forever miss the deeper insights that come later. So, the goal of this video is to provide intuition for why the logarithm was the ‘natural’ choice…
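One way to see why the logarithm is natural (a sketch beyond what the video itself shows): a message of n symbols drawn from an alphabet of size s has sⁿ possible values, so possibilities *multiply* as a message grows, while our intuition says information should *add*. The logarithm is exactly the function that converts multiplication into addition, which is why Hartley’s 1928 measure takes the form H = log(sⁿ) = n·log s.

```python
import math

# A message of n symbols from an alphabet of size s has s**n possibilities.
# Hartley's measure H = log(s**n) = n * log(s): information grows linearly
# with message length even though the number of possibilities grows
# exponentially -- the log turns multiplication into addition.
s, n = 2, 10                       # binary alphabet, 10-symbol message
possibilities = s ** n             # 1024 distinct messages
H = math.log2(possibilities)       # information in bits

# Additivity check: a 10-symbol message carries 10x the information
# of a single symbol, exactly because log(s**n) = n * log(s).
assert math.isclose(H, n * math.log2(s))
print(H)  # 10.0 bits
```

Doubling the message length doubles H; squaring the number of possibilities would not have that property under any non-logarithmic measure.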
June 17, 2013 at 4:35 am
I like how the tin can used for the money has a hole in the bottom. Evidently, it was attached to a string to be used before Alice and Bob invented wireless communication!
June 17, 2013 at 5:49 am
good catch 🙂