It wasn’t until the 1920s that the question “how do we quantify information?” was clearly articulated. This video introduces a key idea of Nyquist and Hartley, who laid the groundwork for Claude Shannon’s historic equation for information entropy two decades later. In these early papers the idea of using a logarithmic function appears, something that isn’t immediately obvious to students new to the subject. If you take this for granted, you will miss the deeper insights that come later. So the goal of this video is to provide intuition for why the logarithm was the ‘natural’ choice…
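As a rough sketch of the intuition the video builds toward (my own illustration, not taken from the video): the number of possible messages multiplies as symbols are added, while a logarithmic measure turns that multiplication into addition, which is Hartley’s H = n log s.

```python
import math

def hartley_information(n_symbols: int, alphabet_size: int) -> float:
    """Hartley's measure H = n * log2(s): bits carried by n symbols
    drawn from an alphabet of s equally likely symbols."""
    return n_symbols * math.log2(alphabet_size)

# The message space multiplies, but the measure adds -- which is why
# the logarithm is the 'natural' choice:
print(26**1, hartley_information(1, 26))   # 26 messages,     ~4.70 bits
print(26**2, hartley_information(2, 26))   # 676 messages,    ~9.40 bits (doubled, not squared)
print(26**3, hartley_information(3, 26))   # 17576 messages, ~14.10 bits
```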
Archive for nyquist
Logarithmic Measure of Information (Entropy Primer)
Posted in Video / Theatre with tags Entropy, hartley, logarithm, nyquist on June 1, 2013 by Brit Cruise

Symbols, Signals & Noise
Posted in Uncategorized with tags baud, baudot, edison, hartley, information theory, intersymbol interference, nyquist, pulse wave, quadruplex telegraph, Shannon, signals, symbol rate on April 8, 2013 by Brit Cruise

The following video/simulation was an attempt to bridge the gap between information as what we mean to say and information as what we could say. I view this as an important stepping stone toward Hartley, Nyquist and Shannon, which I will deal with next. It covers symbols, symbol rate (baud) and message space as an introduction to channel capacity, featuring the Baudot multiplex system and Thomas Edison’s quadruplex telegraph.
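To make the symbol-rate idea concrete, here is a small sketch of my own (assuming a fixed symbol rate and s equally distinguishable symbols per pulse, not a description of the simulation itself): in t seconds a line signalling at R symbols per second can produce s^(R·t) distinct messages, so its capacity grows as R·log2(s) bits per second.

```python
import math

def distinct_messages(symbol_rate: float, alphabet_size: int, seconds: float) -> int:
    """Size of the message space after `seconds` of signalling."""
    return alphabet_size ** int(symbol_rate * seconds)

def capacity_bits_per_second(symbol_rate: float, alphabet_size: int) -> float:
    """Bits per second = symbols per second * bits per symbol."""
    return symbol_rate * math.log2(alphabet_size)

# A hypothetical telegraph at 5 symbols/second with 2 symbols (mark/space),
# versus a hypothetical line with 4 distinguishable signal levels:
print(capacity_bits_per_second(5, 2))   # 5.0 bits/s
print(capacity_bits_per_second(5, 4))   # 10.0 bits/s -- a richer alphabet, same symbol rate
print(distinct_messages(5, 2, 2))       # 1024 possible 2-second messages
```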
Play with simulations used in video on Khan Academy: