Archive for Entropy

Logarithmic Measure of Information (Entropy Primer)

Posted in Video / Theatre with tags Entropy, hartley, logarithm, nyquist on June 1, 2013 by Brit Cruise

It wasn't until the 1920s that the question "how do we quantify information?" was well articulated. This video introduces a key idea of Nyquist and Hartley, who laid the groundwork for Claude Shannon's historic equation for information entropy two decades later. In these early papers the idea of using a logarithmic function appears, something which isn't immediately obvious to most students new to the subject. Anyone who takes it for granted will forever miss the deeper insights that come later. So the goal of this video is to provide intuition for why the logarithm was the "natural" choice…
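As a quick taste of that idea before watching: Hartley's measure assigns log N units of information to a choice among N equally likely symbols, and the logarithm is "natural" because it turns multiplying possibilities into adding information. Here is a minimal sketch of that additivity; the function name and the letters-plus-digits example are my own illustration, not taken from the video:

```python
import math

def hartley_information(num_outcomes, base=2):
    # Hartley's measure: information in one choice among equally likely outcomes.
    return math.log(num_outcomes, base)

# Two independent choices: one of 26 letters, then one of 10 digits,
# giving 26 * 10 = 260 equally likely combined messages.
combined = hartley_information(26 * 10)
separate = hartley_information(26) + hartley_information(10)

print(f"log2(260)           = {combined:.4f} bits")
print(f"log2(26) + log2(10) = {separate:.4f} bits")

# Multiplying the number of possibilities adds the information.
assert math.isclose(combined, separate)
```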
Information Theory: The Language of Coins

Posted in Video / Theatre with tags claude shannon, Entropy, information theory on September 15, 2012 by Brit Cruise

I'll never forget the first time I was introduced to Information Theory. My TA Mike Burrel began a lecture by writing a string of 0s and 1s on the board and asking us to think about what it meant. What followed was a trance-like state of excitement… how had I not heard of this before? Three years later, I'm thrilled to be launching an entire episode on the topic. It was a true joy to go back to square one and relearn the subject with childlike curiosity… My goal is to create a Myst-inspired adventure which includes various puzzles along the way.
Episode #2: The Language of Coins