Said Susi When for All
TRANSCRIPT
Shannon's theorem also implies that no lossless compression scheme can compress all messages. If some messages come out smaller, at least one must come out larger due to the pigeonhole principle. In practical use, this is generally not a problem, because we are usually only interested in compressing certain types of messages, for example English documents as opposed to gibberish text, or digital photographs rather than noise, and it is unimportant if a compression algorithm makes some unlikely or uninteresting sequences larger. However, the problem can still arise even in everyday use when applying a compression algorithm to already compressed data: for example, making a ZIP file of music in the FLAC audio format is unlikely to achieve much extra savings in space.
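To see this effect concretely, here is a small illustrative sketch (not part of the original text) using Python's standard zlib module: highly redundant text shrinks a lot on the first pass, while compressing the already-compressed output again gains essentially nothing.

import zlib

# Highly redundant input: compresses well on the first pass.
original = b"the quick brown fox jumps over the lazy dog " * 200

once = zlib.compress(original)   # first pass removes most of the redundancy
twice = zlib.compress(once)      # almost no redundancy left to exploit

print(f"original:         {len(original)} bytes")
print(f"compressed once:  {len(once)} bytes")
print(f"compressed twice: {len(twice)} bytes")  # typically no smaller than the single pass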
Definition
Named after Boltzmann's H-theorem, Shannon denoted the entropy H (Greek letter Eta) of a discrete random variable X with possible values {x_1, ..., x_n} and probability mass function P(X), and defined it as the expected value of the information content:

H(X) = \operatorname{E}[I(X)] = \operatorname{E}[-\log_2 P(X)] = -\sum_{i=1}^{n} P(x_i) \log_2 P(x_i),

where the logarithm is taken to base 2, so the entropy is measured in bits.
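As an added illustration (not from the original text), a minimal Python sketch of this formula; the function name shannon_entropy is just an illustrative choice.

import math

def shannon_entropy(probs):
    """Entropy H, in bits, of a finite probability distribution."""
    # Outcomes with probability 0 contribute nothing (0 * log 0 is taken as 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([1/6] * 6))    # fair six-sided die: log2(6), about 2.585 bits
print(shannon_entropy([1.0]))        # a certain outcome: 0.0 bits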
-
8/13/2019 Said Susi When for All
152/158
represent the outcomes in binary form. For our n-sided die, this would be binary numbers
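An added sketch of this idea (not in the original), assuming a hypothetical fair 8-sided die: each face gets a fixed-width binary label of ceil(log2 n) bits, matching the log2 n bits of entropy per roll.

import math

n = 8                             # hypothetical 8-sided die, purely for illustration
width = math.ceil(math.log2(n))   # bits needed for a fixed-width binary label

for face in range(n):
    print(f"face {face + 1}: {face:0{width}b}")

print(f"entropy of a fair {n}-sided die: {math.log2(n)} bits per roll")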
properties relating to the occurrence frequencies of letter or word pairs, triplets etc. See Markov chain.

Data compression
Main article: Data compression
Entropy effectively bounds the performance of the strongest lossless (or nearly lossless) compression possible, which can be realized in theory by using the typical set or in practice using Huffman, Lempel-Ziv or arithmetic coding. The performance of existing data compression algorithms is often used as a rough estimate of the entropy of a block of data.[14][15] See also Kolmogorov complexity. In practice, compression algorithms deliberately include some judicious redundancy in the form of checksums to protect against errors.
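As an added, rough illustration (not from the original text) of using a compressor to estimate entropy: the compressed size in bits divided by the input length gives a loose upper bound on the entropy per byte. It assumes Python's standard zlib and os modules.

import os
import zlib

def compressed_bits_per_byte(data: bytes) -> float:
    """Loose upper bound on entropy, in bits per byte, from zlib's compressed size."""
    return 8 * len(zlib.compress(data, 9)) / len(data)

redundant_text = b"to be or not to be that is the question " * 100
random_bytes = os.urandom(4096)   # close to 8 bits of entropy per byte

print(compressed_bits_per_byte(redundant_text))  # small: the text is highly redundant
print(compressed_bits_per_byte(random_bytes))    # slightly above 8: incompressible data grows a little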
World's technological capacity to store and communicate entropic information
A recent study in Science (journal) estimates the world's technological capacity to store and communicate optimally compressed information, normalized on the most effective compression algorithms available in the year 2007, therefore estimating the entropy of the technologically available sources.[16]

All figures in entropically compressed exabytes:

Type of information     1986     2007     Increase
Storage                 2.6      295      113.5x
Broadcast               432      1900     4.398x
Telecommunications      0.281    65       231.3x

The authors estimate humankind's technological capacity to store information (fully entropically compressed) in 1986 and again in 2007. They break the information into three categories: to store information on a medium, to receive information through one-way broadcast networks, and to exchange information through two-way telecommunication networks.[16]

Limitations of entropy as information content
There are a number of entropy-related concepts that mathematically quantify information content in some way: the self-information of an individual message or symbol taken from a given probability distribution, the entropy of a given probability distribution of messages or symbols, and the entropy rate of a stochastic process. (The "rate of self-information" can also be defined for a particular sequence of messages or symbols generated by a given stochastic process: this will always be equal to the entropy rate in the case of a stationary process.) Other quantities of information are also used to compare or relate different sources of information.

It is important not to confuse the above concepts. Often it is only clear from context which one is meant. For example, when someone says that the "entropy" of the English language is about 1 bit per character, they are actually modeling the English language as a stochastic process and talking about its entropy rate. Although entropy is often used as a characterization of the information content of a data source, this information content is not absolute: it depends crucially on the probabilistic model. A source that always generates the same symbol has an entropy rate of 0, but the definition of what a symbol is depends on the alphabet. Consider a source that produces the string ABABABABAB... in which A is always followed by B and vice versa. If the probabilistic model considers individual letters as independent, the entropy rate of the sequence is 1 bit per character. But if the sequence is considered as "AB AB AB AB AB..." with symbols as two-character blocks, then the entropy rate is 0 bits per character.
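An added sketch (not in the original) of how the choice of symbol changes the measured rate for the string ABABAB...: counted as single characters the empirical entropy is 1 bit per character, counted as two-character blocks it is 0.

import math
from collections import Counter

def entropy_per_char(text, block_size):
    """Empirical entropy of block_size-character symbols, divided by block_size."""
    blocks = [text[i:i + block_size] for i in range(0, len(text), block_size)]
    counts = Counter(blocks)
    total = len(blocks)
    h = -sum(c / total * math.log2(c / total) for c in counts.values())
    return h / block_size

text = "AB" * 1000

print(entropy_per_char(text, 1))   # 1.0: A and B are equally frequent single-character symbols
print(entropy_per_char(text, 2))   # 0.0: every two-character block is "AB"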
However, if we use very large blocks, then the estimate of per-character entropy rate may become artificially low. This is because in reality, the probability distribution of the sequence is not knowable exactly; it is only an estimate. For example, suppose one considers the text of every book ever published as a sequence, with each symbol being the text of a complete book. If there are N published books, and each book is only published once, the estimate of the probability of each book is 1/N, and the entropy (in bits) is −log2(1/N) = log2(N). As a practical code, this corresponds to assigning each book a unique identifier and using it in place of the text of the book whenever one wants to refer to the book. This is enormously useful for talking about books, but it is not so useful for characterizing the information content of an individual book, or of language in general: it is not possible to reconstruct the book from its identifier without knowing the probability distribution, that is, the complete text of all the books. The key idea is that the complexity of the probabilistic model must be considered. Kolmogorov complexity is a theoretical generalization of this idea that allows the consideration of the information content of a sequence independent of any particular probability model; it considers the shortest program for a universal computer that outputs the sequence. A code that achieves the entropy rate of a sequence for a given model, plus the codebook (i.e. the probabilistic model), is one such program, but it may not be the
shortest. For example, the Fibonacci sequence is 1, 1, 2, 3, 5, 8, 13, .... Treating the sequence as a message and each number as a symbol, there are almost as many symbols as there are characters in the message, giving an entropy of approximately log2(n). So the first 128 symbols of the Fibonacci sequence have an entropy of approximately 7 bits/symbol. However, the sequence can be expressed using a formula, F(n) = F(n−1) + F(n−2) for n = 3, 4, 5, ..., with F(1) = 1 and F(2) = 1, and this formula has a much lower entropy and applies to any length of the Fibonacci sequence.
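An added sketch (not in the original) of the naive per-symbol estimate: treating each of the first 128 Fibonacci numbers as its own symbol, almost every symbol is distinct, so the empirical entropy comes out close to log2(128) = 7 bits per symbol, even though a two-term recurrence generates the entire sequence.

import math
from collections import Counter

# First 128 Fibonacci numbers, generated by F(n) = F(n-1) + F(n-2).
fib = [1, 1]
while len(fib) < 128:
    fib.append(fib[-1] + fib[-2])

counts = Counter(fib)   # only the leading 1 repeats; all other values are distinct
total = len(fib)
entropy = -sum(c / total * math.log2(c / total) for c in counts.values())

print(len(counts), "distinct symbols out of", total)
print(round(entropy, 3), "bits per symbol")   # close to log2(128) = 7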
The entropy of two simultaneous events is no more than the sum of the entropies of each individual event, and they are equal if the two events are independent. More specifically, if X and Y are two random variables on the same probability space, and (X, Y) denotes their Cartesian product, then

H[(X, Y)] \le H(X) + H(Y).

Proving this mathematically follows easily from the previous two properties of entropy.

Extending discrete entropy to the continuous case: differential entropy
Main article: Differential entropy
The Shannon entropy is restricted to random variables taking discrete values. The corresponding formula for a continuous random variable with probability density function f(x) with finite or infinite support \mathbb{X} on the real line is defined by analogy, using the above form of the entropy as an expectation:

h[f] = \operatorname{E}[-\ln f(X)] = -\int_{\mathbb{X}} f(x) \ln f(x) \, dx.
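As an added numerical check (not from the original text; it assumes numpy is available), the integral can be approximated on a grid for a standard normal density and compared with the known closed form 0.5*ln(2*pi*e).

import numpy as np

# Standard normal density on a grid wide enough to hold essentially all of the mass.
x = np.linspace(-10.0, 10.0, 200_001)
f = np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)

# h[f] = -integral of f(x) * ln f(x) dx, approximated as a Riemann sum on the grid.
h_numeric = -np.sum(f * np.log(f)) * (x[1] - x[0])

h_exact = 0.5 * np.log(2 * np.pi * np.e)   # differential entropy of N(0, 1), in nats
print(h_numeric, h_exact)                  # both approximately 1.4189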
1, then the relative entropy can be defined as

D_{\mathrm{KL}}(p \,\|\, m) = \int \log\big(f(x)\big)\, p(dx) = \int f(x) \log\big(f(x)\big)\, m(dx),

where f is the density of p with respect to the reference measure m.
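For the discrete case, an added minimal sketch (not part of the original text) of relative entropy between two distributions p and m over the same finite alphabet, in bits:

import math

def relative_entropy(p, m):
    """D_KL(p || m) in bits, for finite distributions with m[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / mi) for pi, mi in zip(p, m) if pi > 0)

p = [0.5, 0.25, 0.25]   # observed distribution
m = [1/3, 1/3, 1/3]     # uniform reference distribution

print(relative_entropy(p, m))   # about 0.085 bits
print(relative_entropy(p, p))   # 0.0: a distribution has no divergence from itself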