Information complexity and exact communication bounds
April 26, 2013
Mark Braverman, Princeton University
Based on joint work with Ankit Garg, Denis Pankratov, and Omri Weinstein
Overview: information complexity
• Information complexity is to communication complexity as Shannon's entropy is to transmission cost.
Background – information theory
• Shannon (1948) introduced information theory as a tool for studying the communication cost of transmission tasks.
[Figure: Alice and Bob connected by a communication channel]
Shannon’s entropy
• Assume a lossless binary channel.
• A message X is distributed according to some prior μ.
• The inherent number of bits it takes to transmit X is given by its entropy H(X) = Σ_x μ(x) log₂(1/μ(x)).

[Figure: Alice transmits X to Bob over the communication channel]
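As a quick sketch (our illustration, not from the slides), the entropy of a prior can be computed directly from its definition; `mu` below is an arbitrary example distribution.

```python
import math

def entropy(mu):
    """Shannon entropy H(X), in bits, of a distribution given as {outcome: probability}."""
    return sum(p * math.log2(1 / p) for p in mu.values() if p > 0)

# Example prior: a biased bit.
mu = {0: 0.25, 1: 0.75}
H = entropy(mu)  # ≈ 0.811 bits: less than 1 bit, since the bit is predictable
```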
Shannon’s noiseless coding
• The cost of communicating n copies of X scales as n·H(X).
• Shannon's source coding theorem:
  – Let C(X^n, ε) be the cost of transmitting n independent copies of X (with error ε). Then the amortized transmission cost is
    lim_{n→∞} C(X^n, ε)/n = H(X).
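A small sketch (ours, not the talk's) of why n·H(X) is the right scale: by the law of large numbers, −log₂ Pr[X₁…X_n] concentrates around n·H(X), so an ideal code spends about H(X) bits per copy. The parameters `p` and `n` are arbitrary choices.

```python
import math, random

def entropy(ps):
    return sum(p * math.log2(1 / p) for p in ps if p > 0)

random.seed(0)
p = 0.2                      # Pr[X_i = 1]; an arbitrary example
n = 100_000
xs = [1 if random.random() < p else 0 for _ in range(n)]

# Bits needed for this particular sequence under an ideal code: -log2 of its probability.
bits = sum(-math.log2(p if x else 1 - p) for x in xs)
per_copy = bits / n          # concentrates near H(X) ≈ 0.7219
```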
Shannon's entropy – cont'd

• Therefore, understanding the cost of transmitting a long sequence of X's is equivalent to understanding the Shannon entropy of X.
• What about more complicated scenarios? Suppose the receiver also holds side information Y correlated with X.

[Figure: Alice transmits X over the communication channel to Bob, who holds Y]

• Amortized transmission cost = conditional entropy H(X|Y).
A simple example

• Alice has a uniform X.
• The cost of transmitting X to Bob is H(X).
• Suppose for each Xᵢ Bob is given correlated side information Yᵢ; then the cost of transmitting the Xᵢ's to Bob is H(Xᵢ|Yᵢ) per copy.

Easy and complete!
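To make the side-information phenomenon concrete, here is a sketch with one standard instantiation (our choice of numbers, not necessarily the slide's): X uniform on {1,2,3}, and Bob receives a uniformly random Y ≠ X. Then H(X) = log₂3 ≈ 1.585, but H(X|Y) = 1, so transmitting the Xᵢ's costs only about 1 bit per copy.

```python
import math
from itertools import product

# Joint distribution: X uniform on {1,2,3}; Y uniform on the two values other than X.
joint = {(x, y): 1/6 for x, y in product((1, 2, 3), repeat=2) if x != y}

def H(dist):
    return sum(p * math.log2(1 / p) for p in dist.values() if p > 0)

pX = {x: sum(p for (a, _), p in joint.items() if a == x) for x in (1, 2, 3)}
pY = {y: sum(p for (_, b), p in joint.items() if b == y) for y in (1, 2, 3)}

HX = H(pX)                   # log2(3) ≈ 1.585
HXY = H(joint)               # H(X,Y) = log2(6)
HX_given_Y = HXY - H(pY)     # chain rule: H(X|Y) = H(X,Y) − H(Y) = 1
```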
Communication complexity [Yao]

• Focus on the two-party randomized setting.

Meanwhile, in a galaxy far far away…

[Figure: Alice (A) and Bob (B), holding inputs X and Y and shared randomness R, implement a functionality F(X,Y)]
Communication complexity

Goal: implement a functionality F(X,Y).
A protocol computing F(X,Y):

[Figure: A (holding X) and B (holding Y), with shared randomness R, exchange messages
  m₁(X, R)
  m₂(Y, m₁, R)
  m₃(X, m₁, m₂, R)
  …]

Communication cost = # of bits exchanged.
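As an illustrative sketch of the model (our example, not the talk's): with shared randomness R, Alice and Bob can check "X = Y?" by comparing random inner products, with Alice's message bits playing the role of m₁(X,R). Each round catches unequal inputs with probability ½, so the error decays as 2^(−rounds).

```python
import random

def equality_protocol(x, y, n, rounds=20, seed=0):
    """Randomized protocol for "X = Y?" on n-bit inputs using shared randomness R.
    Communication: at most `rounds` bits from Alice plus 1 bit from Bob."""
    shared = random.Random(seed)                        # R: randomness both parties see
    for _ in range(rounds):
        r = [shared.randrange(2) for _ in range(n)]
        m1 = sum(xi * ri for xi, ri in zip(x, r)) % 2   # Alice's message bit m1(X, R)
        b1 = sum(yi * ri for yi, ri in zip(y, r)) % 2   # Bob's local check
        if m1 != b1:
            return False                                # Bob announces "unequal"
    return True                                         # error prob <= 2^-rounds if x != y

x = [0, 1, 1, 0, 1]
y = [0, 1, 1, 0, 1]
z = [0, 1, 0, 0, 1]
```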
Communication complexity
• Numerous applications/potential applications (streaming, data structures, circuit lower bounds…).
• Considerably more difficult to obtain lower bounds than for transmission (still much easier than for other models of computation).
• Many lower-bound techniques exist.
• Exact bounds??
Communication complexity

• (Distributional) communication complexity with input distribution μ and error ε: D^μ(F, ε). Error ε w.r.t. μ.
• (Randomized/worst-case) communication complexity: R(F, ε). Error ε on all inputs.
• Yao's minimax: R(F, ε) = max_μ D^μ(F, ε).
Set disjointness and intersection

Alice and Bob are each given a set: X, Y ⊆ [n] (can be viewed as vectors in {0,1}^n).
• Intersection: Int_n(X,Y) = X ∩ Y.
• Disjointness: Disj_n(X,Y) = 1 if X ∩ Y = ∅, and 0 otherwise.
• Int_n is just n 1-bit-ANDs computed in parallel.
• Disj_n is an OR of n 1-bit-ANDs.
• Need to understand the amortized communication complexity of the 1-bit-AND.
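Viewed as bit vectors, these reductions are immediate; a minimal sketch (ours):

```python
def intersection(x, y):
    """Int_n: the n 1-bit ANDs, computed coordinate-wise."""
    return [a & b for a, b in zip(x, y)]

def disjointness(x, y):
    """Disj_n: 1 iff no coordinate has a & b = 1, i.e. NOT(OR of the 1-bit ANDs)."""
    return 0 if any(intersection(x, y)) else 1

x = [0, 1, 1, 0, 1]
y = [1, 1, 0, 0, 0]
inter = intersection(x, y)   # [0, 1, 0, 0, 0]
disj = disjointness(x, y)    # 0: the sets share an element
```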
Information complexity

• The smallest amount of information Alice and Bob need to exchange to solve F.
• How is information measured?
• Communication cost of a protocol?
  – Number of bits exchanged.
• Information cost of a protocol?
  – Amount of information revealed.
Basic definition 1: The information cost of a protocol

• Prior distribution: (X, Y) ~ μ.

[Figure: A (holding X) and B (holding Y) run protocol π, producing transcript Π]

IC(π, μ) = I(Π; Y|X) + I(Π; X|Y)
         = (what Alice learns about Y) + (what Bob learns about X)
Mutual information

• The mutual information of two random variables A, B is the amount of information knowing one reveals about the other:
  I(A; B) = H(A) + H(B) − H(A,B) = H(A) − H(A|B).
• If A, B are independent, I(A; B) = 0.
• I(A; A) = H(A).

[Figure: Venn diagram of H(A) and H(B), overlapping in I(A;B)]
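A quick sketch (our illustration) computing I(A;B) = H(A) + H(B) − H(A,B) from a joint table; the joint distribution is an arbitrary example.

```python
import math

def H(ps):
    """Entropy in bits of a list/iterable of probabilities."""
    return sum(p * math.log2(1 / p) for p in ps if p > 0)

# Example joint distribution of (A, B): correlated bits.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
pA = {a: sum(p for (x, _), p in joint.items() if x == a) for a in (0, 1)}
pB = {b: sum(p for (_, y), p in joint.items() if y == b) for b in (0, 1)}

I_AB = H(pA.values()) + H(pB.values()) - H(joint.values())  # ≈ 0.278 bits
```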
Example

• F(X,Y) is the equality function "X = Y?".
• μ is a distribution where w.p. ½ X = Y, and w.p. ½ (X, Y) are independent uniformly random.

[Figure: A sends MD5(X) [128 bits]; B replies with "X = Y?" [1 bit]]

(what Alice learns about Y) + (what Bob learns about X) = 1 + 64.5 = 65.5 bits
Information complexity

• Communication complexity:
  D^μ(F, ε) = min over protocols π computing F with error ≤ ε of the communication cost of π.
• Analogously:
  IC(F, μ, ε) = inf over protocols π computing F with error ≤ ε of IC(π, μ).
Prior-free information complexity

• Using minimax one can get rid of the prior.
• For communication, we had: R(F, ε) = max_μ D^μ(F, ε).
• For information: IC(F, ε) = max_μ IC(F, μ, ε).
Connection to privacy
• There is a strong connection between information complexity and (information-theoretic) privacy.
• Alice and Bob want to perform computation without revealing unnecessary information to each other (or to an eavesdropper).
• Negative results through information-complexity arguments.
Information equals amortized communication

• Recall [Shannon]: lim_{n→∞} C(X^n, ε)/n = H(X).
• [BR'11]: lim_{n→∞} D^μ(F^n, ε)/n = IC(F, μ, ε), for ε > 0.
• For ε = 0: lim_{n→∞} D^μ(F^n, 0⁺)/n = IC(F, μ, 0).
• [Whether this also holds with exactly zero error, i.e. for lim_{n→∞} D^μ(F^n, 0)/n, is an interesting open question.]
Without priors

• [BR'11] For ε > 0: lim_{n→∞} R(F^n, ε)/n = IC(F, ε).
• [B'12] lim_{n→∞} R(F^n, 0)/n = IC(F, 0⁺).
Intersection

• Therefore R(Int_n, ε) = IC(AND, ε)·n ± o(n).
• Need to find the information complexity of the two-bit AND!
The two-bit AND

• [BGPW'12] IC(AND, 0) ≈ 1.4923 bits.
• Find the value of IC(AND, μ, 0) for all priors μ.
• Find the information-theoretically optimal protocol for computing the AND of two bits.
The optimal protocol for AND

[Figure: A holds X ∈ {0,1}, B holds Y ∈ {0,1}; a continuous counter rises from 0 to 1]

If X = 1, A = 1; if X = 0, A ~ U[0,1].
If Y = 1, B = 1; if Y = 0, B ~ U[0,1].

"Raise your hand when your number is reached."
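A sketch (ours) of a discretized version of this "raise your hand" protocol. It always outputs AND(X,Y) correctly, because a hand can go up strictly below the top of the counter only when that player holds a 0.

```python
import random

def and_protocol(x, y, steps=1000, rng=random):
    """Discretized 'raise your hand' protocol for AND(x, y).
    A 1-holder never raises a hand; a 0-holder picks one of `steps` levels below the top."""
    a = steps if x == 1 else rng.randrange(steps)
    b = steps if y == 1 else rng.randrange(steps)
    for t in range(steps):               # the rising counter
        if a <= t or b <= t:
            return 0                     # a hand went up below the top: that input is 0
    return 1                             # counter reached the top: both inputs are 1

random.seed(1)
results = {(x, y): and_protocol(x, y) for x in (0, 1) for y in (0, 1)}
# results == {(0,0): 0, (0,1): 0, (1,0): 0, (1,1): 1}
```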
Analysis

• An additional small step is needed if the prior μ is not symmetric (μ(0,1) ≠ μ(1,0)).
• The protocol is clearly always correct.
• How do we prove the optimality of a protocol?
• Consider the information complexity as a function of the prior μ.
The analytical view

• A message is just a mapping from the current prior to a distribution of posteriors (new priors). Example:

Prior:
        Y=0   Y=1
  X=0   0.4   0.2
  X=1   0.3   0.1

Alice sends her bit:
  "0" (prob 0.6):            "1" (prob 0.4):
        Y=0   Y=1                  Y=0   Y=1
  X=0   2/3   1/3            X=0   0     0
  X=1   0     0              X=1   0.75  0.25
The analytical view

Prior:
        Y=0   Y=1
  X=0   0.4   0.2
  X=1   0.3   0.1

Alice sends her bit w.p. ½ and a uniformly random bit w.p. ½:
  "0" (prob 0.55):                 "1" (prob 0.45):
        Y=0     Y=1                      Y=0   Y=1
  X=0   0.545   0.273              X=0   2/9   1/9
  X=1   0.136   0.045              X=1   1/2   1/6
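The Bayes updates on these slides can be checked mechanically; this sketch (ours) reproduces the posteriors for "send the true bit w.p. ½, a uniform bit w.p. ½".

```python
# Prior on (X, Y), as in the slide.
prior = {(0, 0): 0.4, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.1}

def split(prior, p_msg_given_x):
    """Posteriors induced by a message whose distribution depends only on Alice's bit X.
    p_msg_given_x[m][x] = Pr[message = m | X = x]."""
    out = {}
    for m, px in p_msg_given_x.items():
        pm = sum(p * px[x] for (x, y), p in prior.items())          # Pr[message = m]
        out[m] = (pm, {(x, y): p * px[x] / pm for (x, y), p in prior.items()})
    return out

# Alice sends her bit w.p. 1/2 and a uniform bit w.p. 1/2:
# Pr["0" | X=0] = 3/4, Pr["0" | X=1] = 1/4.
noisy = split(prior, {"0": {0: 0.75, 1: 0.25}, "1": {0: 0.25, 1: 0.75}})
p0, post0 = noisy["0"]     # p0 = 0.55, post0[(0,0)] = 6/11 ≈ 0.545
```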
Analytical view – cont'd

• Denote IC(μ) := IC(AND, μ, 0).
• Each potential (one-bit) message by either party imposes a constraint of the form:
  IC(μ) ≤ (information the message reveals) + E_m[IC(μ_m)],
  where μ_m is the posterior prior after message m.
• In fact, IC(μ) is the point-wise largest function satisfying all such constraints (cf. the construction of harmonic functions).
IC of AND

• We show that for the protocol π described above, the function IC(π, μ) satisfies all the constraints, and therefore represents the information complexity of AND at all priors μ.
• Theorem: π is the information-theoretically optimal protocol* for computing the AND of two bits.
*Not a real protocol

• The "protocol" π is not a real protocol (this is why IC has an inf in its definition).
• The protocol above can be made into a real protocol by discretizing the counter (e.g. into r equal intervals).
• We show that the r-round information complexity of AND satisfies:
  IC_r(AND, 0) = IC(AND, 0) + Θ(1/r²).
Previous numerical evidence

• [Ma, Ishwar'09] – numerical calculation results.
Applications: communication complexity of intersection

• Corollary: R(Int_n, ε) = IC(AND, 0)·n ± o(n) ≈ 1.4923n ± o(n).
• Moreover: with r rounds of communication, 1.4923n + Θ(n/r²) bits are necessary and sufficient.
Applications 2: set disjointness

• Recall: Disj_n is an OR of n 1-bit-ANDs.
• Extremely well-studied. [Kalyanasundaram and Schnitger'87, Razborov'92, Bar-Yossef et al.'02]: R(Disj_n, ε) = Θ(n).
• What does a hard distribution for Disj_n look like?
A hard distribution?

X: 0 0 1 1 0 1 0 0 0 1 0 0 1 1 1 1 0 1 1 0 0
Y: 1 0 1 0 0 1 1 1 0 0 1 1 1 0 1 0 1 1 0 0 0

Each coordinate i.i.d. uniform:
        Y=0   Y=1
  X=0   1/4   1/4
  X=1   1/4   1/4

Very easy!
A hard distribution

X: 0 0 0 1 0 1 0 0 0 1 0 0 1 1 0 1 0 1 1 0 0
Y: 1 0 1 0 0 0 1 1 0 0 1 1 1 0 0 0 1 0 0 0 0

Each coordinate i.i.d.:
        Y=0   Y=1
  X=0   1/3   1/3
  X=1   1/3   0

At most one (1,1) location!
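A sketch (ours) of sampling from this prior: each coordinate is drawn i.i.d. uniformly from {(0,0), (0,1), (1,0)}, so X and Y are automatically disjoint; a hard instance then plants at most one (1,1) coordinate.

```python
import random

def sample_hard_instance(n, plant_intersection=False, rng=random):
    """Coordinates i.i.d. uniform on {(0,0),(0,1),(1,0)}; optionally plant one (1,1)."""
    pairs = [rng.choice([(0, 0), (0, 1), (1, 0)]) for _ in range(n)]
    if plant_intersection:
        pairs[rng.randrange(n)] = (1, 1)   # at most one (1,1) location
    x = [a for a, _ in pairs]
    y = [b for _, b in pairs]
    return x, y

random.seed(0)
x, y = sample_hard_instance(20)
disjoint = all(a * b == 0 for a, b in zip(x, y))   # True: nothing was planted
```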
Communication complexity of Disjointness

• Continuing the line of reasoning of Bar-Yossef et al.
• We now know exactly the communication complexity of Disj under any of the "hard" prior distributions. By maximizing over them, we get:
• R(Disj_n, ε) = C_DISJ·n ± o(n), where C_DISJ = max over μ with μ(1,1) = 0 of IC(AND, μ, 0) ≈ 0.4827.
• With a bit of work this bound is tight.
Small-set Disjointness

• A variant of set disjointness where we are given X, Y ⊆ [n] of size ≤ k.
• A lower bound of Ω(k) is obvious (given the hardness of Disjointness).
• A very elegant matching upper bound was known [Hastad-Wigderson'07]: O(k).
Using information complexity

• This setting corresponds to the prior distribution:

        Y=0        Y=1
  X=0   1 − 2k/n   k/n
  X=1   k/n        0

• Gives information complexity ≈ (2/ln 2)·(k/n) bits per copy of AND.
• Communication complexity: (2/ln 2)·k ± o(k) ≈ 2.885k ± o(k).
Overview: information complexity

• Information complexity is to communication complexity as Shannon's entropy is to transmission cost.
• Today: focused on exact bounds using IC.
Selected open problems 1

• The interactive compression problem.
• For Shannon's entropy we have lim_{n→∞} C(X^n, ε)/n = H(X).
• E.g., by Huffman coding we also know the one-shot bound C(X) < H(X) + 1.
• In the interactive setting, lim_{n→∞} D^μ(F^n, ε)/n = IC(F, μ, ε).
• But is it true that any protocol of information cost I can be compressed to O(I) bits of communication, i.e. D^μ(F, ε) = O(IC(F, μ, ε))??
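A sketch (ours) checking Huffman's one-shot guarantee H(X) ≤ E[length] < H(X) + 1 on an example distribution; `probs` is an arbitrary choice.

```python
import heapq, math

def huffman_lengths(probs):
    """Codeword lengths of an optimal (Huffman) prefix code for the given probabilities."""
    heap = [(p, i, [i]) for i, p in enumerate(probs)]   # (weight, tiebreak, symbols)
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, i2, s2 = heapq.heappop(heap)
        for sym in s1 + s2:
            lengths[sym] += 1            # merged symbols sit one level deeper in the tree
        heapq.heappush(heap, (p1 + p2, i2, s1 + s2))
    return lengths

probs = [0.4, 0.3, 0.2, 0.1]             # an arbitrary example prior
L = huffman_lengths(probs)               # [1, 2, 3, 3]
avg = sum(p * l for p, l in zip(probs, L))          # expected code length = 1.9
H = sum(p * math.log2(1 / p) for p in probs)        # entropy ≈ 1.846
# Check: H <= avg < H + 1
```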
Interactive compression?

• Compression to O(I) bits is equivalent to the "direct sum" problem for communication complexity.
• Currently best general compression scheme [BBCR'10]: a protocol of information cost I and communication cost C can be compressed to Õ(√(I·C)) bits of communication.
Interactive compression?

• Compression to O(I) bits is equivalent to the "direct sum" problem for communication complexity.
• A counterexample would need to separate IC from CC, which would require new lower-bound techniques [Kerenidis, Laplante, Lerays, Roland, Xiao'12].
Selected open problems 2

• Given a truth table for F, a prior μ, and an ε, can we compute IC(F, μ, ε)?
• An uncountable number of constraints; need to understand the structure better.
• Specific F's with inputs in larger domains.
• Going beyond two players.
External information cost

• Prior distribution: (X, Y) ~ μ.

[Figure: A (holding X) and B (holding Y) run protocol π with transcript Π, observed by an external Charlie (C)]

IC^ext(π, μ) = I(Π; XY) ≥ I(Π; Y|X) + I(Π; X|Y)
             = what Charlie learns about (X, Y)
External information complexity

• IC^ext(F, μ) = inf_π IC^ext(π, μ).
• Conjecture: zero-error communication scales like external information:
  lim_{n→∞} D^μ(F^n, 0)/n = IC^ext(F, μ).
• Example: for AND this value is log₂ 3 ≈ 1.585 bits.
Thank You!