This document introduces classical information theory and the concept of entropy, defined as a quantification of the apparent disorder of particles in a room. It contrasts example binary sequences, from a highly ordered repeating sequence to a more typical random-looking one, and shows how to compute the entropy of a sequence, comparing the result of a hand-written entropy function against a built-in one.
2. Classical Information Theory
Entropy is defined to quantify the apparent disorder of a room of particles
It gives statistical information about the average behaviour of the system
One way to study such systems is to represent them with symbols, forming objects such as sequences or messages
4. Classical Information Theory
Looks highly ordered:
s = 11111111111111111111111111111111111111111111111111
Looks less ordered:
s = 00110100001100110000110000001101111101010100101000
Looks more typical because it is the kind of sequence one would expect to obtain by chance, e.g. by flipping a fair coin
13. Classical Information Theory
(1) between 1 and 50? In this case, yes
(2) between 1 and 25? Yes
(3) between 1 and 13? No
(4) between 14 and 20? Yes
(5) between 14 and 17? Yes
(6) between 16 and 17? Yes
(7) is it 16? No
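The questions above follow a bisection strategy: each yes/no answer halves the remaining range, so about log2(50) ≈ 5.6 questions suffice to pin down any number between 1 and 50. A minimal sketch of this strategy (the function name `guess` is illustrative, not from the source; the exact ranges differ slightly from the slide's walk-through):

```python
import math

def guess(secret, lo=1, hi=50):
    """Locate `secret` in [lo, hi] using yes/no range questions,
    halving the candidate range on each answer."""
    questions = 0
    while lo < hi:
        mid = (lo + hi) // 2
        questions += 1
        if secret <= mid:        # "Is it between lo and mid?" -> Yes
            hi = mid
        else:                    # -> No: it lies in [mid + 1, hi]
            lo = mid + 1
    return lo, questions

value, asked = guess(17)
print(value, asked)               # the slide's walk-through ends at 17
print(math.ceil(math.log2(50)))   # 6: questions always sufficient for 1..50
```

Each answer contributes one bit of information, which is why the number of questions needed matches the base-2 logarithm of the number of possibilities.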
N@Entropy[2, sequence]
MyEntropy[sequence] == Entropy[sequence]
True
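The slide's MyEntropy presumably computes Shannon entropy from symbol frequencies, matching Mathematica's built-in Entropy. A Python sketch of the same computation, applied to the two example sequences (the name `shannon_entropy` is mine, not from the source):

```python
from collections import Counter
import math

def shannon_entropy(sequence, base=2):
    """Shannon entropy H = -sum(p * log_b(p)) over the symbol
    frequencies of the sequence."""
    counts = Counter(sequence)
    n = len(sequence)
    return sum(-(c / n) * math.log(c / n, base) for c in counts.values())

ordered = "1" * 50  # the slide's highly ordered sequence
typical = "00110100001100110000110000001101111101010100101000"

print(shannon_entropy(ordered))   # 0.0 bits: a constant sequence carries no uncertainty
print(shannon_entropy(typical))   # close to 1 bit per symbol, as expected for a typical sequence
```

With base 2 this corresponds to `Entropy[2, sequence]`; Mathematica's one-argument `Entropy[sequence]` uses the natural logarithm instead.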