Ryan O’Donnell
Carnegie Mellon University

TRANSCRIPT

Page 1: Ryan O’Donnell, Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s Theorem

Ryan O’Donnell

Carnegie Mellon University

Page 2:

Part 1:

A. Fourier expansion basics

B. Concepts: Bias, Influences, Noise Sensitivity

C. Kalai’s proof of Arrow’s Theorem

Page 3:

10 Minute Break

Page 4:

Part 2:

A. The Hypercontractive Inequality

B. Algorithmic Gaps

Page 5:

Sadly no time for:

Learning theory

Pseudorandomness

Arithmetic combinatorics

Random graphs / percolation

Communication complexity

Metric / Banach spaces

Coding theory

etc.

Page 6:

1A. Fourier expansion basics

Page 7:

f : {0,1}n → {0,1}

Page 8:

f : {−1,+1}n → {−1,+1}

Pages 9–19: [Figures: the cube {−1,+1}³ drawn in ℝ³, with vertices from (−1,−1,−1) to (+1,+1,+1); each slide labels the vertices with values ±1 to depict example functions f on the cube.]
Pages 20–22: [Equation slides: f is written as a sum of indicator polynomials, one per point of the cube, and the sum is multiplied out.]

Page 23:

Proposition:

Every f : {−1,+1}n → {−1,+1} (indeed, → ℝ) can be expressed (uniquely) as a multilinear polynomial,

f(x) = Σ_{S⊆[n]} f̂(S) ∏_{i∈S} xi.

That’s it. That’s the “Fourier expansion” of f.
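The coefficients of this multilinear expansion can be recovered by averaging: f̂(S) = avg over x of f(x)·∏_{i∈S} xi. A minimal sketch in Python (the function name `fourier_coefficients` is my own, not from the slides):

```python
from itertools import product, combinations

def fourier_coefficients(f, n):
    """Return {S: f_hat(S)} for the multilinear (Fourier) expansion of f."""
    points = list(product([-1, 1], repeat=n))
    coeffs = {}
    for k in range(n + 1):
        for S in combinations(range(n), k):
            # f_hat(S) = avg_x f(x) * prod_{i in S} x_i
            total = 0
            for x in points:
                chi = 1
                for i in S:
                    chi *= x[i]
                total += f(x) * chi
            coeffs[S] = total / len(points)
    return coeffs

maj3 = lambda x: 1 if sum(x) > 0 else -1
print(fourier_coefficients(maj3, 3))
```

For Maj(x1,x2,x3) this yields coefficient ½ on each singleton and −½ on {1,2,3}, i.e. Maj = ½(x1 + x2 + x3) − ½x1x2x3.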

Page 25: [Example: the Fourier expansion of a particular f; all remaining coefficients are 0.]

Page 26:

Why? Coefficients encode useful information.

When?

1. Uniform probability involved

2. Hamming distances relevant

Page 27:

Parseval’s Theorem:

Let f : {−1,+1}n → {−1,+1}. Then

avg { f(x)² } = Σ_{S⊆[n]} f̂(S)² = 1.

Page 28:

“Weight” of f on S ⊆ [n] = f̂(S)².
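Parseval is easy to sanity-check numerically: for any Boolean f, the squared coefficients sum to exactly 1. A quick sketch (not from the slides) using a random f on 4 bits:

```python
import math
import random
from itertools import product, combinations

random.seed(0)
n = 4
points = list(product([-1, 1], repeat=n))
table = {x: random.choice([-1, 1]) for x in points}   # an arbitrary Boolean f

total_weight = 0.0
for k in range(n + 1):
    for S in combinations(range(n), k):
        # f_hat(S) = avg_x f(x) * prod_{i in S} x_i
        c = sum(table[x] * math.prod(x[i] for i in S) for x in points) / len(points)
        total_weight += c * c
print(total_weight)  # → 1.0
```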

Pages 29–34: [Figures: the lattice of subsets ∅, {1}, {2}, {3}, {1,2}, {1,3}, {2,3}, {1,2,3}, with f’s Fourier weight distributed over the subsets.]

Page 35:

1B. Concepts: Bias, Influences, Noise Sensitivity

Page 36:

Social Choice:

Candidates ±1, n voters, votes are uniformly random.

f : {−1,+1}n → {−1,+1} is the “voting rule”.

Page 37:

Bias of f:

Bias(f) = avg f(x) = Pr[+1 wins] − Pr[−1 wins]

Fact:

Weight on ∅ = Bias(f)²; it measures “imbalance”.
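A tiny illustration of bias (my own helper names, not from the slides): Majority of 3 is perfectly balanced, while a unanimity rule is heavily biased toward −1.

```python
from itertools import product

def bias(f, n):
    pts = list(product([-1, 1], repeat=n))
    return sum(f(x) for x in pts) / len(pts)   # = Pr[+1 wins] - Pr[-1 wins]

maj3 = lambda x: 1 if sum(x) > 0 else -1
and3 = lambda x: 1 if all(v == 1 for v in x) else -1   # +1 only on unanimity
print(bias(maj3, 3), bias(and3, 3))  # → 0.0 -0.75
```

Squaring the bias gives the Fourier weight on ∅, per the Fact above.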

Page 38:

Influence of i on f:

Infi(f) = Pr[ f(x) ≠ f(x(⊕i)) ] = Pr[voter i is a swing voter]

Fact: Infi(f) = Σ_{S∋i} f̂(S)².

Page 39: [Figure: the Fourier weight of Maj(x1,x2,x3) on the subset lattice — weight ¼ on each of {1}, {2}, {3}, and ¼ on {1,2,3}.]

Pages 40–42: [Figures: the cube with f-values at the vertices; the direction-i edges across which f changes sign illustrate Infi(f) = Pr[ f(x) ≠ f(x(⊕i)) ].]
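The definition Infi(f) = Pr[ f(x) ≠ f(x(⊕i)) ] translates directly into code. A sketch (helper name `influence` is my own): for Maj of 3, voter i swings exactly when the other two disagree, so every influence is ½.

```python
from itertools import product

def influence(f, n, i):
    """Inf_i(f) = Pr over uniform x of f(x) != f(x with bit i flipped)."""
    pts = list(product([-1, 1], repeat=n))
    flips = 0
    for x in pts:
        y = list(x)
        y[i] = -y[i]                       # x^(+i): flip coordinate i
        flips += f(x) != f(tuple(y))
    return flips / len(pts)

maj3 = lambda x: 1 if sum(x) > 0 else -1
print([influence(maj3, 3, i) for i in range(3)])  # → [0.5, 0.5, 0.5]
```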

Page 43:

avg Infi(f) = fraction of edges which are cut edges.

Page 44:

LMN Theorem:

If f is in AC0, then avg Infi(f) is small (at most polylog(n)/n for poly-size constant-depth circuits).

Page 45:

avg Infi(Parityn) = 1 ⇒ Parity ∉ AC0

avg Infi(Majn) = Θ(1/√n) ⇒ Majority ∉ AC0

Page 46:

KKL Theorem:

If Bias(f) = 0, then maxi Infi(f) ≥ Ω(log n / n).

Corollary:

Assuming f monotone, −1 or +1 can bribe o(n) voters and win w.p. 1−o(1).

Page 47:

Noise Sensitivity of f at ε:

NSε(f) = Pr[wrong winner wins], when each vote is misrecorded independently with prob. ε.

Example: f( + − + + − − + − − ) vs. f( − − + + + + + − − ).
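Noise sensitivity is easy to estimate by Monte Carlo: sample a uniform vote vector, flip each coordinate with probability ε, and count disagreements. A sketch under those assumptions (function name `noise_sensitivity` is my own):

```python
import random

def noise_sensitivity(f, n, eps, trials=20000):
    """Estimate NS_eps(f) = Pr[f(x) != f(y)], y = x with each bit flipped w.p. eps."""
    rng = random.Random(1)
    disagree = 0
    for _ in range(trials):
        x = [rng.choice([-1, 1]) for _ in range(n)]
        y = [-v if rng.random() < eps else v for v in x]   # misrecord each vote
        disagree += f(x) != f(y)
    return disagree / trials

maj = lambda x: 1 if sum(x) > 0 else -1
ns = noise_sensitivity(maj, 101, 0.1)
print(ns)
```

For Majority on 101 voters at ε = 0.1 the estimate comes out around 0.2, well below the trivial ½.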

Pages 48–49:

Learning Theory principle: [LMN’93, …, KKMS’05]

If all f ∈ C have small NSε(f), then C is efficiently learnable.

Page 50: [Figure: the subset lattice from ∅ up to [3], with {1}, {2}, {3}, {1,2}, {1,3}, {2,3} marked.]

Page 51:

Proposition:

for small ԑ,

with Electoral College:

ϵ 10

1

Page 52:

1C. Kalai’s proof of Arrow’s Theorem

Pages 53–58:

Ranking 3 candidates.

Condorcet [1785] Election: voter i answers “A > B?”, “B > C?”, “C > A?” with (xi, yi, zi) ∈ {−1,+1}³. A transitive ranking ⇒ (xi, yi, zi) are Not All Equal (no +1+1+1, no −1−1−1). Society’s outcome is (f(x), f(y), f(z)).

Condorcet: Try f = Maj. Outcome can be “irrational”: A > B > C > A. [easy e.g.]

Maybe some other f?

[Figures: three voters with rankings “A > B > C”, “B > C > A”, “C > A > B”. With f(x) = +, f(y) = +, f(z) = −, Society: “A > B > C” — rational. With f(x) = f(y) = f(z) = +, Society: “A > B > C > A”? — irrational.]
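The “easy e.g.” for Maj can be checked exhaustively for 3 voters: enumerate all 6³ preference profiles and count those where coordinate-wise Majority yields a cycle. A sketch (all names are my own):

```python
from fractions import Fraction
from itertools import permutations, product

def prefs(ranking):
    # ranking is a tuple like ('A','B','C'); return +/-1 answers to
    # "A > B?", "B > C?", "C > A?"
    pos = {c: ranking.index(c) for c in 'ABC'}
    gt = lambda p, q: 1 if pos[p] < pos[q] else -1
    return gt('A', 'B'), gt('B', 'C'), gt('C', 'A')

maj = lambda votes: 1 if sum(votes) > 0 else -1
rankings = list(permutations('ABC'))
irrational = 0
for profile in product(rankings, repeat=3):        # all 6^3 voter profiles
    answers = [prefs(r) for r in profile]
    outcome = tuple(maj([a[k] for a in answers]) for k in range(3))
    if outcome in {(1, 1, 1), (-1, -1, -1)}:       # cyclic: A>B>C>A or reverse
        irrational += 1
print(Fraction(irrational, 6 ** 3))  # → 1/18
```

Exactly 12 of the 216 profiles (the two “rotational” triples of rankings, assigned to the 3 voters in any order) produce a cycle.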

Page 59:

Arrow’s Impossibility Theorem [1950]:

If f : {−1,+1}n → {−1,+1} never gives an irrational outcome in Condorcet elections, then f is a Dictator or a negated-Dictator.

Page 60:

Gil Kalai’s Proof [2002]:

Pages 61–62: [Figures: the three-voter example again — f(x) = +, f(y) = +, f(z) = −, Society: “A > B > C”.]

Pages 63–64:

Gil Kalai’s Proof: Pr[rational outcome] = E[ NAE(f(x), f(y), f(z)) ], where NAE(a,b,c) = ¾ − ¼(ab + bc + ca). In a uniformly random Condorcet election the pairs (x,y), (y,z), (z,x) are each (−1/3)-correlated, so

Pr[rational outcome] = ¾ − ¾ Σ_{S} (−1/3)^|S| f̂(S)² ≤ ¾ + ¼ Σ_{S} f̂(S)² = 1.

Page 65:

Gil Kalai’s Proof, concluded:

f never gives irrational outcomes ⇒ equality

⇒ all Fourier weight “at level 1”

⇒ f(x) = ±xj for some j (exercise).
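The stability identity underlying the proof, Pr[rational] = ¾ − ¾ Σ_S (−1/3)^|S| f̂(S)², can be evaluated directly from the Fourier coefficients. A sketch (helper names are my own): a dictator achieves probability exactly 1, while Maj of 3 achieves 17/18, matching the 1/18 paradox probability from the Condorcet enumeration.

```python
from itertools import product, combinations

def fhat(f, n, S):
    pts = list(product([-1, 1], repeat=n))
    s = 0
    for x in pts:
        chi = 1
        for i in S:
            chi *= x[i]
        s += f(x) * chi
    return s / len(pts)

def pr_rational(f, n):
    # Kalai: Pr[rational] = 3/4 - (3/4) * sum_S (-1/3)^|S| * fhat(S)^2
    total = 0.0
    for k in range(n + 1):
        for S in combinations(range(n), k):
            total += (-1 / 3) ** k * fhat(f, n, S) ** 2
    return 0.75 - 0.75 * total

dictator = lambda x: x[0]
maj3 = lambda x: 1 if sum(x) > 0 else -1
print(pr_rational(dictator, 3), pr_rational(maj3, 3))  # → 1.0 and 17/18
```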

Page 66:

Guilbaud’s Theorem [1952]: with f = Majn, Pr[rational outcome] → (3/(2π))·arccos(−1/3) as n → ∞.

Guilbaud’s Number ≈ .912
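The commonly stated closed form for Guilbaud’s Number is (3/(2π))·arccos(−1/3), which a one-liner confirms is ≈ .912:

```python
import math

# Limiting probability of a rational (non-cyclic) outcome under Majority
guilbaud = (3 / (2 * math.pi)) * math.acos(-1 / 3)
print(guilbaud)  # → 0.9123...
```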

Page 67:

Corollary of “Majority Is Stablest” [MOO05]:

If Infi(f) ≤ o(1) for all i, then

Pr[rational outcome with f] ≤ Guilbaud’s Number + o(1) ≈ .912.

Pages 68–69:

Part 2:

A. The Hypercontractive Inequality

B. Algorithmic Gaps

Page 70:

2A. The Hypercontractive Inequality

AKA Bonami-Beckner Inequality

Page 71:

[Applications slide:] all use the “Hypercontractive Inequality”.

Page 72:

Hoeffding Inequality:

Let F = c0 + c1 x1 + c2 x2 + ··· + cn xn, where the xi’s are indep., unif. random ±1.

Page 73:

Mean: μ = c0. Variance: σ² = c1² + ··· + cn².

Hoeffding Inequality: Pr[ |F − μ| ≥ t·σ ] ≤ 2e^(−t²/2).
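A quick simulation (my own, not from the slides) comparing the empirical tail of such a Rademacher sum with the Hoeffding bound at t = 2:

```python
import math
import random

random.seed(0)
n, trials, t = 100, 20000, 2.0
c = [1.0] * n                                   # F = x1 + ... + xn, so mu = 0
sigma = math.sqrt(sum(ci * ci for ci in c))     # here sigma = 10
hits = 0
for _ in range(trials):
    F = sum(ci * random.choice([-1, 1]) for ci in c)
    hits += abs(F) >= t * sigma
frac = hits / trials
print(frac, 2 * math.exp(-t * t / 2))           # empirical tail vs. bound ≈ 0.27
```

The empirical tail (near the Gaussian value ≈ 0.05) sits comfortably under the bound 2e^(−2) ≈ 0.27.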

Pages 74–77:

Hypercontractive Inequality:

Let F = Σ_{|S|≤d} c_S ∏_{i∈S} xi be multilinear of degree ≤ d, the xi’s indep., unif. random ±1.

Mean: μ = c_∅. Variance: σ² = Σ_{S≠∅} c_S².

Then for all q ≥ 2, ‖F‖_q ≤ (q−1)^(d/2) · ‖F‖₂.

Then F is a “reasonable” random variable.

Pages 78–79:

“q = 4” Hypercontractive Inequality:

Let F be multilinear of degree ≤ d. Then E[F⁴] ≤ 9^d · E[F²]².

Pages 80–81:

all use Hypercontractive Inequality — in fact, just use the “q = 4” Hypercontractive Inequality.

Page 82:

“q = 4” Hypercontractive Inequality:

Let F be degree d over n i.i.d. ±1 r.v.’s. Then E[F⁴] ≤ 9^d · E[F²]².

Proof [MOO’05]: Induction on n. Obvious step. Use induction hypothesis. Use Cauchy-Schwarz on the obvious thing. Use induction hypothesis. Obvious step.
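The “q = 4” inequality can be checked numerically for a random low-degree polynomial by exact averaging over the cube. A sketch under those assumptions (names are my own):

```python
import math
import random
from itertools import product, combinations

random.seed(0)
n, d = 4, 2
# a random multilinear polynomial of degree <= d over {-1,+1}^n
coeffs = {S: random.uniform(-1, 1)
          for k in range(d + 1) for S in combinations(range(n), k)}

def F(x):
    return sum(c * math.prod(x[i] for i in S) for S, c in coeffs.items())

pts = list(product([-1, 1], repeat=n))
m2 = sum(F(x) ** 2 for x in pts) / len(pts)     # E[F^2]
m4 = sum(F(x) ** 4 for x in pts) / len(pts)     # E[F^4]
print(m4 <= 9 ** d * m2 ** 2)                   # → True, per the inequality
```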

Page 83:

2B. Algorithmic Gaps

Page 84: [Diagram: Opt vs. best poly-time guarantee, ratio ln(N).]

“Set-Cover is NP-hard to approximate to factor ln(N)”

Page 85: [Diagram: Opt vs. LP-Rand-Rounding guarantee, ratio ln(N).]

“Factor ln(N) Algorithmic Gap for LP-Rand-Rounding”

Page 86: [Diagram: Opt(S) vs. LP-Rand-Rounding(S), ratio ln(N).]

“Algorithmic Gap Instance S for LP-Rand-Rounding”

Page 87:

Algorithmic Gap instances are often “based on” {−1,+1}n.

Page 88:

Sparsest-Cut:

Algorithm: Arora-Rao-Vazirani SDP.

Guarantee: Factor O(√(log n)).

Pages 89–93: [Figures: the gap instance is the cube {−1,+1}n; Opt = 1/n, achieved by dictator cuts.]

Page 94:

Opt = 1/n

f(x) = sgn(r1x1 + ··· + rnxn)

ARV gets

Page 95:

Opt = 1/n

ARV gets

gap:

Pages 96–97:

Algorithmic Gaps → Hardness-of-Approx

LP / SDP-rounding Alg. Gap instance

• n optimal “Dictator” solutions

• “generic mixture of Dictators” much worse

+ PCP technology

= same-gap hardness-of-approximation

Page 98:

KKL / Talagrand Theorem:

If f is balanced, Infi(f) ≤ 1/n^.01 for all i, then avg Infi(f) ≥ Ω(log n / n).

Gap: Θ(log n) = Θ(log log N).

Page 99:

[CKKRS05]: KKL + Unique Games Conjecture

⇒ Ω(log log log N) hardness-of-approx.

Page 100:

2-Colorable 3-Uniform hypergraphs:

Input: 2-colorable, 3-unif. hypergraph

Output: 2-coloring

Obj: Max. fraction of legally colored hyperedges

Page 101:

2-Colorable 3-Uniform hypergraphs:

Algorithm: SDP [KLP96].

Guarantee:

[Zwick99]

Page 102:

Algorithmic Gap Instance

Vertices: {−1,+1}n

6^n hyperedges: { (x,y,z) : poss. prefs in a Condorcet election } (i.e., triples s.t. (xi,yi,zi) NAE for all i)

Page 103:

Elts: {−1,+1}n. Edges: Condorcet votes (x,y,z).

2-coloring = f : {−1,+1}n → {−1,+1}

frac. legally colored hyperedges = Pr[“rational” outcome with f]

Instance 2-colorable? ✔ (2n optimal solutions: ±Dictators)

Page 104:

Elts: {−1,+1}n. Edges: Condorcet votes (x,y,z).

SDP rounding alg. may output f(x) = sgn(r1x1 + ··· + rnxn).

Random weighted majority is also rational-with-prob.-.912! [same CLT arg.]

Page 105:

Algorithmic Gaps → Hardness-of-Approx

LP / SDP-rounding Alg. Gap instance

• n optimal “Dictator” solutions

• “generic mixture of Dictators” much worse

+ PCP technology

= same-gap hardness-of-approximation

Page 106:

Corollary of Majority Is Stablest:

If Infi(f) ≤ o(1) for all i, then Pr[rational outcome with f] ≤ .912 + o(1).

Cor: this + Unique Games Conjecture ⇒ .912 hardness-of-approx*

Page 107: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s

2C. Future Directions

Page 108: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s

Develop the “structure vs. pseudorandomness”

theory for Boolean functions.

Page 109: Ryan O’Donnell Carnegie Mellon University. Part 1: A. Fourier expansion basics B. Concepts: Bias, Influences, Noise Sensitivity C. Kalai’s proof of Arrow’s