Simulating independence: new constructions of Condensers, Ramsey Graphs, Dispersers and Extractors

Boaz Barak, Guy Kindler, Ronen Shaltiel, Benny Sudakov, Avi Wigderson



Plan for this talk

Background:
• Bipartite Ramsey graphs.
• Randomness extractors.
• 2-source extractors and their relation to bipartite Ramsey graphs.

Constructions:
• A construction of a 3-source extractor. Component: somewhere random condenser.
• A sketch of our construction of bipartite Ramsey graphs.

Ramsey Graphs

K-Ramsey graphs are graphs which contain no cliques or anti-cliques of size K.

[Erdos 1947]: There exists a graph on N vertices with no cliques (or anti-cliques) of size (2+o(1))log N.

One of the first applications of the probabilistic method!

Erdos also asked whether such graphs can be explicitly constructed.

Best explicit construction [Frankl and Wilson]: no cliques or anti-cliques of size 2^√(log N · log log N).

Bipartite Ramsey Graphs (viewed as adjacency matrices)

Ramsey graph: no large monochromatic rectangles of the form X x X.
Bipartite Ramsey graph: no large monochromatic rectangles of the form X x Y.
Every matrix of a bipartite Ramsey graph is a matrix of a Ramsey graph.

Nonexplicit result: O(log N). Known explicit [CG85]: √N. Recently [PR04]: o(√N). Our result: N^δ for every δ>0.

[Figure: an N x N 0/1 adjacency matrix with a highlighted X x Y rectangle.]

A new construction of Ramsey Graphs

Convention: N=2^n. Identify {1..N} ≈ {0,1}^n.

Theorem: For every constant δ>0 and large enough n there is a polynomial-time computable function*

R: {0,1}^n x {0,1}^n -> {0,1}

such that for every X,Y ⊆ {0,1}^n of size N^δ = 2^{δn}, R(X,Y) = {0,1}.

*The polynomial depends on δ.

Randomness Extractors: refining randomness from nature

We have access to distributions in nature: electric noise, particle reactions, key strokes of a user, timing of past events. These distributions are “somewhat random” but not “truly random”.

Solution: Randomness Extractors

[Figure: a randomness extractor turns a somewhat-random input into the random coins of a probabilistic procedure.]

Notion of entropy

Somewhat random distributions must “contain randomness” (entropy). The right notion (min-entropy): the min-entropy of a distribution X is the largest k such that Pr[X=x] ≤ 2^{-k} for every x.

Unjustified assumption for this talk (with loss of generality): all distributions are uniform over some subset. Entropy = log(set size).
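To make the definition concrete, a small Python helper (ours, not the talk's) computes min-entropy from an explicit probability table:

```python
import math

def min_entropy(dist):
    """Min-entropy of a distribution given as {outcome: probability}.

    H_min(X) is the largest k with Pr[X = x] <= 2^-k for every x,
    i.e. -log2 of the heaviest outcome's probability.
    """
    return -math.log2(max(dist.values()))

# A distribution uniform over a set of size 8 has min-entropy log2(8) = 3,
# matching the talk's convention Entropy = log(set size).
uniform8 = {x: 1 / 8 for x in range(8)}
print(min_entropy(uniform8))  # 3.0

# A biased distribution: min-entropy is governed by the heaviest outcome.
biased = {0: 1 / 2, 1: 1 / 4, 2: 1 / 4}
print(min_entropy(biased))  # 1.0
```

Note that min-entropy ignores everything but the single most likely outcome, which is exactly why it is the right notion for extraction.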

The dream extractor: a 1-source extractor

[Figure: ext maps an n-bit sample x from X to an m-bit string that replaces the random coins I=(011…010) of an algorithm alg.]

We want of ext: whenever |X| > 2^m, ext(x) ~ U_m.

Problem: No such thing!

Seeded extractor

[Figure: ext takes an n-bit sample x from X together with a short truly random seed and outputs an m-bit string that replaces the coins I=(011…010) of alg.]

Good for: simulating BPP using weak sources.
Problem: doesn’t work for cryptography.

2-source extractor

[Figure: ext takes samples x1, x2 from two independent n-bit sources X1, X2 and outputs an m-bit string.]

Whenever |X1|, |X2| are “large”: ext(x1,x2) ~ U_m. Such things exist!

Definition: 2-source extractor

Entropy rate = entropy/length.

A 2-source extractor for rate δ is a function Ext(x,y) such that for any two independent distributions X,Y with entropy rate > δ, the output distribution Ext(X,Y) is close to uniform.
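“Close to uniform” is normally measured in statistical (variation) distance; the helper below is our illustration, not part of the construction:

```python
def statistical_distance_from_uniform(counts):
    """Half the L1 distance between the empirical output distribution
    (given as a list of counts per outcome) and the uniform distribution
    on the same number of outcomes."""
    total = sum(counts)
    m = len(counts)
    return sum(abs(c / total - 1 / m) for c in counts) / 2

# A perfectly balanced 1-bit output has distance 0; a constant bit has 1/2.
print(statistical_distance_from_uniform([50, 50]))  # 0.0
print(statistical_distance_from_uniform([100, 0]))  # 0.5
```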

2-source extractors and bipartite Ramsey graphs

Consider 2-source extractors for distributions of type X x Y where X and Y have entropy rate δ; namely, a function Ext(x,y) (say into one bit).

Requirement: no large unbalanced X x Y rectangles.

2-source extractor => bipartite Ramsey graph.

[Figure: the N x N 0/1 matrix of ext; the bit ext(x,y) sits in row x, column y, and a large X x Y rectangle is highlighted.]

Previous work on 2-source extractors

Emerged from [SV86,V86,V87,VV88,CG88]. Best known result: a 2-source extractor for entropy rate δ=½ based on the Hadamard matrix [CG88,V87]. We denote it by Had(x,y); it is a component in our construction.

This is, in fact, the Ramsey graph with clique size N^{1/2} = 2^{n/2} mentioned earlier.

For one output bit, Had(x,y) = <x,y>. Improvements by [DEOR04,Raz05]. Goal: achieve entropy rate δ < ½.
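For concreteness, a minimal Python rendering of the one-bit Hadamard extractor Had(x,y) = <x,y> (inner product mod 2), together with a toy non-monochromaticity check; the choice of n = 4 and of the rectangle below is ours, not the talk's:

```python
from itertools import product

def had(x, y):
    """Hadamard / inner-product 2-source extractor, one output bit:
    Had(x, y) = <x, y> mod 2 for n-bit vectors x, y."""
    return sum(a * b for a, b in zip(x, y)) % 2

# Toy check for n = 4: on the rectangle X x X where X = all 4-bit vectors
# starting with 1 (size 8 > 2^{n/2} = 4), the output bit is not constant,
# i.e. the rectangle is not monochromatic.
n = 4
X = [v for v in product((0, 1), repeat=n) if v[0] == 1]
ones = sum(had(x, y) for x in X for y in X)
print(len(X), ones)  # 8 vectors; 36 of the 64 pairs output 1
```

The count 36 vs 28 also illustrates why rate ½ is the natural barrier for this construction: the bias shrinks only when the rectangle is large relative to 2^{n/2}.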

Multiple-source extractors

Why just 2 sources? Idea [BIW04]: improve the rate to δ < ½ at the cost of having more sources.

Theorem [BIW04]: For every constant δ>0 there is a number t=t(δ) and an explicit t-source extractor for entropy rate δ.

Relies on “additive number theory” and recent results by [BKT,K].

We denote it by BIW(x1,..,xt) and use it as a component in our construction.

Goal: reduce the number of sources.

Our results on multiple-source extractors

Theorem: For every constant δ>0 there is a polynomial-time computable* function that is a 3-source extractor** for entropy rate δ.

*The polynomial depends on δ.
**The distance from uniform cannot be very small: we can get any constant ε>0, but not much smaller than that.

Summary and plan

Main results, for any rate δ>0:
• Bipartite Ramsey graph.
• 3-source extractor.

Plan:
• Construction of the 3-source extractor.
• Component: somewhere condenser.
• High-level description of the construction of the Ramsey graph.

A dream 2-source extractor

[Figure: a condenser con maps each n-bit source X1, X2 to an s-bit block Y1, Y2 with |Yi| ≥ 2^{0.9s}, i.e. entropy rate 0.9; applying Had to the two condensed blocks would then extract the output (say 100 bits).]

Problem: No such thing! (A condenser with a single output cannot exist, for the same reason a 1-source extractor cannot.)

Breaking the ½ barrier: a somewhere condenser

New! Key component!

[Figure: con maps each n-bit source to 3 blocks of s bits each; Had is applied to all 9 pairs of blocks, one from each source, giving 9 candidate outputs of 100 bits each.]

Constant entropy rate. Constant number of bits!

A 2-source somewhere extractor

[Figure: condensing each source into 3 blocks and applying Had to all 9 cross pairs (9 outputs of 100 bits each) yields a somewhere extractor.]

Somewhere condenser and somewhere 2-source extractor

Somewhere condenser: Con(x) = y1,..,yt such that for every X with rate δ, one of the outputs is (close to) having entropy rate > 0.9.

Somewhere 2-source extractor: Ext(x1,x2) = y1,..,yt such that for every pair of independent X1, X2 with rate δ, one of the outputs is close to uniform.

Important note: the concatenation of the outputs can have constant size and constant entropy.

The 4-source extractor

[Figure: one somewhere extractor is applied to sources 1,2 and another to sources 3,4, each producing 900 bits; since these strings have constant size, an optimal 2-source extractor Opt combines them into the final output bit 0/1.]

The 3-source extractor

[Figure: as in the 4-source construction, but one source is shared by the two somewhere extractors; each outputs 900 bits, combined by Opt into the final bit 0/1.]

Each somewhere extractor is strong: it works for most fixings of one input.

Next: How to build a somewhere condenser

Reminder: Somewhere condenser: Con(x) = y1,..,yt such that for every X with rate δ, one of the outputs is (close to) having entropy rate larger than δ (say δ+δ²).

Note: We want rate > 0.9. This can be achieved by repeated condensing.

The BIW lemma

[Figure: three independent m-bit sources X1, X2, X3, each of size ≥ 2^{δm}; BIW maps the 3 x m input bits (x1,x2,x3) to an m-bit output y.]

[BIW]: If X1, X2, X3 are independent with rate δ, then rate(Y) > min{ δ+δ², 1 }.

Somewhere condenser: construction

Given input x of length n:
• Split it into three equal parts y1, y2, y3 of length n/3.
• Let y4 = BIW(y1,y2,y3) (the parts may be dependent).
• Output y1, y2, y3, y4.

[Figure: x (n bits) split into y1, y2, y3 (n/3 bits each); BIW applied to them gives y4.]
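The construction above can be sketched in Python. The BIW map below (x1·x2 + x3 over a prime field, following the additive-combinatorics flavor of [BIW04]) and the modulus are illustrative assumptions, not the paper's exact parameters:

```python
P = (1 << 89) - 1  # a Mersenne prime; illustrative field size for small blocks

def biw(y1, y2, y3):
    """Illustrative BIW-style map: y1*y2 + y3 over the field Z_P.
    (Placeholder for BIW(y1,y2,y3); chosen here only to be concrete.)"""
    return (y1 * y2 + y3) % P

def somewhere_condense(x, n):
    """Split the n-bit input (given as an int) into three equal blocks and
    append BIW of them. The talk's analysis guarantees that one of the four
    outputs has higher entropy rate than the input source."""
    assert n % 3 == 0
    m = n // 3
    y1 = x >> (2 * m)
    y2 = (x >> m) & ((1 << m) - 1)
    y3 = x & ((1 << m) - 1)
    return [y1, y2, y3, biw(y1, y2, y3)]

print(somewhere_condense(0b101110001, 9))  # blocks 5, 6, 1 plus BIW = 31
```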

Somewhere condenser: analysis

[Figure: X of size ≥ 2^{δn}; its blocks Y1, Y2, Y3 (n/3 bits each) and Y4 = BIW(Y1,Y2,Y3).]

[Thm]: max( r(Y1), r(Y2), r(Y3), r(Y4) ) ≥ min{ δ + δ²/10, 1 }.

Somewhere condenser: analysis

X is contained in Y1 x Y2 x Y3, so |Y1|·|Y2|·|Y3| ≥ |X|, i.e. H(Y1)+H(Y2)+H(Y3) ≥ H(X) ≥ δn.

• If a strong enough inequality holds here, then there exists i with rate(Yi) noticeably above δ, and we’re done!
• If (near-)equality holds, then Y1, Y2, Y3 are (close to) independent, and by [BIW] we’re done!
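The counting step in this dichotomy, written out in the talk's uniform-over-a-set convention (a sketch; r denotes entropy rate):

```latex
X \subseteq Y_1 \times Y_2 \times Y_3
  \;\Longrightarrow\; |Y_1|\,|Y_2|\,|Y_3| \ge |X|
  \;\Longrightarrow\; H(Y_1)+H(Y_2)+H(Y_3) \ge H(X) \ge \delta n,
% each block has length n/3, so dividing by n/3:
r(Y_1)+r(Y_2)+r(Y_3) \;=\; \frac{3}{n}\bigl(H(Y_1)+H(Y_2)+H(Y_3)\bigr) \;\ge\; 3\delta .
```

Hence max_i r(Yi) ≥ δ always; either some Yi beats δ with slack, or all three are essentially tight, which forces X to fill Y1 x Y2 x Y3 and makes the blocks (close to) independent.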

Outline of the construction of bipartite Ramsey graphs

We try to construct a 2-source extractor, and settle for a Ramsey graph.

Bipartite Ramsey graph: overview of construction

[Figure: x and y, n bits each, drawn as rows; vertical lines split each into blocks X1, X2 and Y1, Y2.]

A partition P of X and Y is a choice of two vertical lines that split X (and Y) into two blocks.

For a fixed partition P we define: Output_P(x,y) = 4-source-ext(x1,y2,x2,y1).

Theorem: For every 2 independent sources X and Y there exists a partition P such that Output_P(X,Y) is uniform.

Intuition: there exist splitting points where the blocks are sufficiently independent.

Bipartite Ramsey graph: overview of construction

High-level idea, on sources X,Y: find the “correct” partition P and output Output_P(x,y) = 4-source-ext(x1,y2,x2,y1).

Problem: We know nothing about the distributions X,Y. We only get to see one sample (x,y).

Bipartite Ramsey graph: selecting a partition

We design a function Partition(x,y) which outputs a partition P. On input x,y we compute:

Ramsey(x,y) = Output_{Partition(x,y)}(x,y)

The design of the partition function is fairly complicated and makes heavy use of the building blocks we saw previously.
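The dispatch step can be sketched directly; partition and output below are hypothetical stand-ins for the paper's (much more involved) functions:

```python
def ramsey(x, y, partition, output):
    """Compute Ramsey(x, y) = Output_{Partition(x, y)}(x, y):
    first choose a partition from the sample itself, then apply the
    extractor induced by that partition to the same sample."""
    p = partition(x, y)
    return output(p, x, y)

# Toy stand-ins (illustration only, not the paper's functions):
# pick a split index from the first bits, then combine block parities.
toy_partition = lambda x, y: 1 + (x[0] ^ y[0])
toy_output = lambda p, x, y: (sum(x[:p]) + sum(y[p:])) % 2
print(ramsey([1, 0, 1, 1], [0, 1, 1, 0], toy_partition, toy_output))  # 0
```

The point the sketch makes is structural: the same sample (x,y) is used twice, once to pick P and once inside Output_P, which is exactly what the proof must handle.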

Bipartite Ramsey graph: high-level overview of proof

For every pair of sources X,Y there exist (large) subsets X’ ⊆ X and Y’ ⊆ Y such that:
• For every x,y in X’,Y’: Partition(x,y) is some fixed partition P.
• Output_P(X’,Y’) is uniformly distributed.

Thus: Output_P(X,Y) is not monochromatic.

[Figure: the N x N 0/1 matrix with an X x Y rectangle containing a smaller X’ x Y’ sub-rectangle.]

Existence of a good partition

Theorem: For every 2 independent sources X and Y there exists a partition P such that Output_P(X,Y) is uniform.

[SSZ95,TS96,TS98,…]: Consider partitioning one source X. There exists a partition P such that (X1,X2) is a block-wise source:
• H(X2) is large.
• H(X1 | X2=x2) is large (for every x2).

Extractor for 2 independent block-wise sources

[Figure: one somewhere extractor is applied to (X1,Y2) and another to (X2,Y1), each producing 900 bits; Opt combines them into the final bit 0/1.]

• H(X2) is large.
• H(X1 | X2=x2) is large (for every x2).
• H(Y2) is large.
• H(Y1 | Y2=y2) is large (for every y2).

Each somewhere extractor is strong: it works for most fixings of one input. For every choice of x2, most choices of y2 are good for X1 | X2=x2. Hence for most choices of x2,y2, the pair is a good choice for both X1 | X2=x2 and Y1 | Y2=y2.

Bipartite Ramsey graph: selecting the partition

Goal: Design Partition(x,y) and show that for every X,Y there are X’ ⊆ X and Y’ ⊆ Y such that Partition(x,y) is the “correct” partition P over X’ x Y’:
• H(X’2) is large.
• H(X’1 | X’2=x2) is large (for every x2).
• H(X’3) = 0.

It turns out that the correct partition has more structure; this will help in finding it. To find the correct partition we will find the leftmost line such that to the right of it there is no entropy.

Bipartite Ramsey graph: selecting the partition

For every choice of vertical line P we conduct a test: test_P(x,y) = pass/fail.

Intuition: we want the test to pass only on partitions P such that:
• H(X2) is large.
• H(X3) = 0.

We define Partition(x,y) to be the leftmost partition on which test_P(x,y) = pass.
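The leftmost-passing-test rule can be sketched in a few lines; the predicates and the fallback below are hypothetical placeholders, not the talk's actual test_P:

```python
def select_partition(x, y, tests):
    """Partition(x, y) = the leftmost partition P with test_P(x, y) = pass.
    `tests` lists the per-line predicates test_P, ordered left to right."""
    for p, test_p in enumerate(tests):
        if test_p(x, y):
            return p
    # No test passed: default to the rightmost line (a fallback we assume;
    # the slide does not specify this case).
    return len(tests) - 1

# Toy predicates (illustration only): the first passing test is at index 1.
tests = [lambda x, y: False, lambda x, y: x == y, lambda x, y: True]
print(select_partition(0, 0, tests))  # 1
```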

Open problems

• Get a 2-source extractor.
• Get subconstant δ. We can get δ = 1/log log n. Related work by [Raz05].
• Improve the error parameter in the 3-source extractor.
• Use the technique for other constructions.
• Simplify the technique.

That’s it