What is Computer Science? (Part 1)

Posted in Uncategorized on January 20, 2016 by britcruise

After a long period of research, I'm happy to report that Art of the Problem's third episode is in production. This episode will act as the final piece of a CS trilogy. Here is the first video, which gives an overview of the series (or watch on YouTube):

I’ve also published an essay version of this video with extra links below (or read on Medium).


 

Around 100 years ago something really exciting was happening.


The making of Pixar in a Box

Posted in Research and Projects on September 25, 2015 by britcruise

Prehistory

In early 2014, Tony DeRose (Senior Scientist and Lead of the Research Group at Pixar Animation Studios) and Elyse Klaidman (Director of Pixar University and Archives) approached Khan Academy with an idea. They wanted to answer a question everyone asks in school at some point: "Why do I need to learn this?" Previously, Tony had given talks that tried to engage children in mathematics by demonstrating how math lives at the intersection of design and technology at Pixar. It was clear that you could motivate kids to learn math and science by showing them how concepts they encounter in school are used at Pixar to make movie magic…

Markov Chains: The link between Plato, Bernoulli, Markov & Claude Shannon

Posted in Research and Projects, Video / Theatre on June 27, 2013 by britcruise

Why did Bernoulli mention Plato’s Theory of Forms in Ars Conjectandi? What does this have to do with free will?

This video is a broad introduction to the Weak Law of Large Numbers, the Central Limit Theorem and how it all led to Markov Chains…

Next, play around with this interactive, graphical Markov simulator!


Three decades later, Claude Shannon famously applied this idea to generate "English-looking" messages in his Mathematical Theory of Communication.
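Here is a rough sketch of that idea in code (my own toy example, not Shannon's original construction): record which character tends to follow which, then walk the table randomly.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Record, for each character, the characters observed to follow it."""
    chain = defaultdict(list)
    for current, nxt in zip(text, text[1:]):
        chain[current].append(nxt)
    return chain

def generate(chain, start, length=60):
    """Walk the chain: each next character is sampled from what followed the last one."""
    out = [start]
    for _ in range(length):
        followers = chain.get(out[-1])
        if not followers:
            break
        out.append(random.choice(followers))
    return "".join(out)

# Toy sample text (placeholder); a larger corpus gives more "English-looking" output.
sample = "the cat sat on the mat and the dog sat on the log in the fog"
print(generate(build_chain(sample), "t"))
```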

Logarithmic Measure of Information (Entropy Primer)

Posted in Video / Theatre on June 1, 2013 by britcruise

It wasn't until the 1920s that the question "how do we quantify information?" was well articulated. This video introduces a key idea of Nyquist and Hartley, who laid the groundwork for Claude Shannon's historic equation (Information Entropy) two decades later. In these early papers, the idea of using a logarithmic function appears, something which isn't immediately obvious to most students fresh to this subject. If one takes this for granted, they will forever miss the deeper insights which come later. So, the goal of this video is to provide intuition behind why the logarithm was the 'natural' choice…
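As a rough numerical illustration of why the logarithm is the 'natural' choice (my own toy example, not taken from the video): the number of possible messages multiplies as a message gets longer, but the log of that count simply adds, so the measure grows in step with message length.

```python
import math

def hartley_information(alphabet_size, message_length):
    """Hartley's measure: the log of the number of possible messages, in bits."""
    return message_length * math.log2(alphabet_size)

# With a 10-symbol alphabet, doubling the length doubles the information,
# even though the count of possible messages is squared (10**4 vs. 10**8).
print(hartley_information(10, 4))  # ~13.29 bits
print(hartley_information(10, 8))  # ~26.58 bits
```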

Link to Khan Academy lesson.

Symbols, Signals & Noise

Posted in Uncategorized on April 8, 2013 by britcruise

The following video/simulation was an attempt to bridge the gap between information as what we mean to say vs. information as what we could say. I view this as an important stepping stone towards Hartley, Nyquist and Shannon, which I will deal with next. It covers symbols, symbol rate (baud) and message space as an introduction to channel capacity. Featuring the Baudot multiplex system and Thomas Edison's quadruplex telegraph.
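For a back-of-the-envelope feel of "message space" (my own toy numbers, not the Khan Academy simulation): with s distinct symbols sent at r symbols per second, a t-second transmission can be any one of s^(r·t) possible messages.

```python
def message_space(symbols, symbol_rate, seconds):
    """Count of distinct messages: symbols raised to the number of symbols sent."""
    return symbols ** (symbol_rate * seconds)

# Two symbols (think dot/dash) at 2 symbols per second for 3 seconds:
print(message_space(2, 2, 3))  # 64 possible messages
```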

Play with simulations used in video on Khan Academy:

Symbol Rate


Electricity, Magnetism, Morse & The Information Age.

Posted in Video / Theatre on March 2, 2013 by britcruise

The following three-video mini-series is a bit of an engineering detour in the story of information theory. In order to easily grasp the ideas of Hartley and Shannon, I felt it would be beneficial to lay some groundwork. It began with my own selfish interest in wanting to relive some famous experiments & technologies from the 19th century. Specifically, why did the Information Age arise? When and how did electricity play a role in communication? Why was magnetism involved? Why did Morse code become so popular compared to the European designs? How was information understood before words (and concepts) such as "bit" existed? What's the difference between static electricity and current?

All of these questions are answered as we slowly uncover a more modern approach to sending differences over a distance…

The History of Electricity

The Battery and Electromagnetism

Morse Code and the Information Age

Click below to practice Morse Code!


Conditional Probability (Bayes Theorem) Visualized

Posted in Research and Projects, Video / Theatre on January 24, 2013 by britcruise

It's powerful to understand how conditional probability can be visualized using decision trees. I wanted to create an alternative to most explanations, which often start with many abstractions. I was drawn to the idea of looking at the back pages of a choose-your-own-adventure book and deciding how you could have arrived there. Here I present a visual method using a story involving coins, allowing you to decide how to formalize. Once we grow tired of growing trees, we may ask the key question: how can we speed up this process?

This is followed by a game I designed (built by Peter Collingridge) which introduces how branches can be weighted instead of counted.
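A minimal sketch of the counting-vs-weighting idea (my own coin example, not the one from the video or the game): either enumerate every leaf of the tree and count the matches, or multiply probabilities along each branch; both give the same answer.

```python
from itertools import product
from fractions import Fraction

# Enumerate the full tree: all sequences of two fair-coin flips, each equally likely.
leaves = list(product("HT", repeat=2))

# Counting leaves: P(both heads | at least one head)
at_least_one_head = [leaf for leaf in leaves if "H" in leaf]
both_heads = [leaf for leaf in at_least_one_head if leaf == ("H", "H")]
print(Fraction(len(both_heads), len(at_least_one_head)))  # 1/3

# Weighting branches: multiply probabilities along paths instead of counting leaves.
p_heads = Fraction(1, 2)
p_both = p_heads * p_heads                          # the H -> H path
p_at_least_one = 1 - (1 - p_heads) * (1 - p_heads)  # everything except the T -> T path
print(p_both / p_at_least_one)                      # 1/3
```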


Thanks to Kalid Azad for reviewing this lesson.
