Classical Information Theory
Unit 3
Classical Information Theory
 Entropy is defined to quantify the apparent disorder of a system, such as a room full of particles
 It gives statistical information about the average behaviour of the system
 One way to study such systems is to represent the particles with symbols, forming objects such as sequences or messages (the formula below makes this precise)
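The quantity used throughout this unit is the Shannon entropy of the symbol frequencies; this is the formula that the MyEntropy code later in these slides implements:

H_b(X) = -\sum_i p_i \log_b p_i

where p_i is the relative frequency of symbol i and the base b fixes the unit (b = 2 gives bits).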
Classical Information Theory
Looks highly ordered:
s = 11111111111111111111111111111111111111111111111111
Looks less ordered:
s = 00110100001100110000110000001101111101010100101000
Looks more typical because it is the kind of sequence one would expect from a random source
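The built-in Entropy function (used again later in these slides) makes the contrast quantitative; a minimal sketch, assuming we compare the two 50-symbol strings above:

ordered = StringRepeat["1", 50];
mixed = "00110100001100110000110000001101111101010100101000";
N@Entropy[2, ordered]  (* 0. -- a constant sequence carries no information per symbol *)
N@Entropy[2, mixed]    (* close to 1 bit per symbol, since 0s and 1s are roughly balanced *)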
Classical Information Theory
1001110101
Classical Information Theory
1101101010
0011101110
1100111001
1010110101
0111010101
0101011011
1101110010
1011010101
1101111000
0101010111
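Sequences like these can be generated directly; a minimal sketch, assuming uniformly random bits:

Table[StringJoin[ToString /@ RandomInteger[1, 10]], {10}]  (* ten random 10-bit strings *)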
Classical Information Theory
sequence = "this is not a sentence"
StringLength[sequence]
22
Characters[sequence]
{"t", "h", "i", "s", " ", "i", "s", " ", "n", "o", "t", " ", "a", " ", "s",
"e", "n", "t", "e", "n", "c", "e"}
Classical Information Theory
{{"a", 1}, {"c", 1}, {"h", 1}, {"o", 1}, {"i", 2}, {"e", 3}, {"n", 3}, {"s", 3}, {"t", 3}, {" ", 4}}
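The tally above can be reproduced from the sequence; the exact input is not shown on the slide, but something like this gives the same grouping (sorted by count, then alphabetically):

SortBy[Tally[Characters[sequence]], {Last, First}]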
Classical Information Theory
{3/22, 1/22, 1/11, 3/22, 2/11, 3/22, 1/22, 1/22, 3/22, 1/22}
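These relative frequencies are the tally counts divided by the total number of characters; a short sketch:

With[{counts = Last /@ Tally[Characters[sequence]]}, counts/Total[counts]]  (* reproduces the list above *)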
MyEntropy[base_, s_] :=
N@Total[Table[-Log[base, #[[i, 2]]/Total[Last /@ #]] (#[[i, 2]]/
Total[Last /@ #]), {i, Length[#]}] &@Tally[Characters@s]]
MyEntropy[2, sequence]
3.14036
Classical Information Theory
To identify a number x with 1 <= x <= N using yes/no questions, about Log2 N questions are enough, so the cost grows as O(Log2 N). For N = 100:
N@Log[2, 100]
6.64386
Classical Information Theory
Guessing a number between 1 and 100 with yes/no questions:
(1) Is it between 1 and 50? In this case, yes
(2) Between 1 and 25? Yes
(3) Between 1 and 13? No
(4) Between 14 and 20? Yes
(5) Between 14 and 17? Yes
(6) Between 16 and 17? Yes
(7) Is it 16? No
So the number is 17, found with 7 questions, close to Log2 100 ~ 6.64.
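A minimal Mathematica sketch of this halving strategy (the function name and the exact interval splits are assumptions; the slide's questions are just one valid ordering):

guessNumber[secret_, max_] := Module[{lo = 1, hi = max, mid, questions = 0},
  While[lo < hi,
   mid = Floor[(lo + hi)/2];
   questions++;                        (* one yes/no question per halving *)
   If[secret <= mid, hi = mid, lo = mid + 1]];
  {lo, questions}]

guessNumber[17, 100]  (* {17, 7}: about Log2[100] ~ 6.64 questions *)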
N@Entropy[2, sequence]
MyEntropy[2, sequence] == N@Entropy[2, sequence]
True
