Information Theory part 12: Information Entropy (Claude Shannon's formula)
Art of the Problem
Published: 19 Dec 2020
Entropy is a measure of the uncertainty in a random variable (a message source). Claude Shannon defines the "bit" as the unit of entropy: the amount of uncertainty in a fair coin flip. In this video, information entropy is introduced intuitively using bounce machines and yes/no questions, building up to Shannon's formula H = -Σ p_i log2(p_i), measured in bits.
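As a quick illustration of the formula (not from the video itself), here is a minimal Python sketch; the entropy() helper and the example distributions are our own choices:

import math

def entropy(probabilities):
    # Shannon entropy H = -sum(p * log2(p)), in bits; skip zero-probability outcomes.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))   # fair coin flip -> 1.0 bit
print(entropy([0.25] * 4))   # four equally likely symbols -> 2.0 bits (two yes/no questions)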
Note: This analogy also applies to higher-order approximations: we simply create a machine for each state and average over all machines, weighting each machine by how often its state occurs. A sketch of this follows below.
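A hedged sketch of that note, assuming a made-up two-state source (the state frequencies and per-state symbol probabilities are illustrative, not from the video): compute each state's machine entropy, then take the weighted average.

import math

def entropy(probabilities):
    # Shannon entropy in bits, as in the sketch above.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

state_probs = {"A": 0.8, "B": 0.2}             # long-run fraction of time spent in each state (assumed)
machines = {"A": [0.9, 0.1], "B": [0.5, 0.5]}  # each state's next-symbol probabilities (assumed)

# Average the per-machine entropies, weighted by how often each machine is used.
H = sum(state_probs[s] * entropy(machines[s]) for s in state_probs)
print(H)  # ~0.58 bits per symbol for these assumed numbers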