CS537 Randomness in Computing

9/14/2021

Randomness in Computing

LECTURE 4
Sofya Raskhodnikova

Last time
• Randomized min-cut algorithm
• Amplification: Bayesian approach

Today
• Finish Karger's algorithm
• Random variables
• Expectation
• Linearity of expectation
• Jensen's inequality

§1.5 (MU) Karger's Min Cut Algorithm

Algorithm Basic-Karger (input: undirected graph G = (V, E)):
1. While |V| > 2:
2.   Choose e ∈ E uniformly at random.
3.   G ← graph obtained by contracting e in G.
4. Return the only cut in G.
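
A direct Python sketch of one run of Basic-Karger (illustrative only; the function name, the graph representation, and returning the cut size rather than the cut itself are choices made here, not taken from the slides):

```python
import random

def basic_karger(n, edges):
    """One run of Basic-Karger on the multigraph with vertices 0..n-1 and
    edge list `edges`; returns the size of the cut it finds."""
    label = list(range(n))        # label[v] = super-vertex currently containing v
    alive = list(edges)           # surviving (non-self-loop) edges
    vertices_left = n
    while vertices_left > 2:
        u, v = alive[random.randrange(len(alive))]   # e chosen uniformly at random
        lu, lv = label[u], label[v]
        for w in range(n):        # contract e: merge super-vertex lv into lu
            if label[w] == lv:
                label[w] = lu
        vertices_left -= 1
        alive = [(a, b) for (a, b) in alive if label[a] != label[b]]  # drop self-loops
    return len(alive)             # edges crossing the only remaining cut
```

For example, basic_karger(4, [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]) returns the size of some cut of that graph; this sketch matches the "easy" O(m)-per-contraction implementation discussed below.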

Theorem
Basic-Karger returns a min cut with probability ≥ 2/(n(n-1)).

Probability amplification for Karger's algorithm

Probability Amplification: Repeat r = n(n-1) ln n times and return the smallest cut found.
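
The amplification step, sketched on top of the basic_karger function above (illustrative; it reuses the hypothetical helper from the previous sketch and assumes n ≥ 2):

```python
import math

def karger_min_cut(n, edges):
    """Repeat Basic-Karger r = n(n-1) ln n times; return the smallest cut found."""
    r = math.ceil(n * (n - 1) * math.log(n))
    return min(basic_karger(n, edges) for _ in range(r))
```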

Running time of Basic-Karger
• Easy: O(m) per contraction, so O(mn).
• View as Kruskal's MST algorithm on G with edge weights w(e_i) = π(i) for a random permutation π, run until two components are left: O(m log n).
• Best known implementation: O(m).
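
A sketch of the Kruskal-style view in Python: shuffling the edge list plays the role of the random permutation π, and union-find replaces explicit contraction (illustrative names; this reproduces the logic, not the exact bookkeeping behind the stated running-time bounds):

```python
import random

def karger_kruskal_view(n, edges):
    """One run of Basic-Karger in the Kruskal view: union endpoints of edges
    in a uniformly random order until only two components remain."""
    parent = list(range(n))

    def find(x):                              # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    order = list(edges)
    random.shuffle(order)                     # random order = random permutation of edge weights
    components = n
    for u, v in order:
        if components == 2:
            break
        ru, rv = find(u), find(v)
        if ru != rv:                          # an edge inside a component is a self-loop; skip it
            parent[rv] = ru
            components -= 1
    # the two remaining components define the cut; count the crossing edges
    return sum(1 for u, v in edges if find(u) != find(v))
```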

Measurements in random experiments

• Example 1: coin flips
  – Measurement X: number of heads.
  – E.g., if the outcome is HHTH, then X = 3.
• Example 2: permutations
  – n students exchange their hats, so that everybody gets a random hat.
  – Measurement X: number of students that got their own hats.
  – E.g., if students 1, 2, 3 got hats 2, 1, 3, then X = 1.

Recall: random variables

• A random variable X on a sample space Ω is a function X: Ω → ℝ that assigns to each sample point ω ∈ Ω a real number X(ω).
• For each random variable, we should understand:
  – The set of values it can take.
  – The probabilities with which it takes on these values.
• The distribution of a discrete random variable X is the collection of pairs (a, Pr[X = a]).

Question

You roll two dice. Let X be the random variable that represents the sum of the numbers you roll.

What is the probability of the event X = 6?

A. 1/36

B. 1/9

C. 5/36

D. 1/6

E. None of the above.

Question

You roll two dice. Let X be the random variable that represents the sum of the numbers you roll.

How many different values can X take on?

A. 6

B. 11

C. 12

D. 36

E. None of the above.

Question

You roll two dice. Let X be the random variable that represents the sum of the numbers you roll.

What is the distribution of X?

A. Uniform distribution on the set of possible values.

B. It satisfies Pr[X = 2] < Pr[X = 3] < ... < Pr[X = 12].

C. It satisfies Pr[X = 2] > Pr[X = 3] > ... > Pr[X = 12].

D. It satisfies Pr[X = 2] < Pr[X = 3] < ... < Pr[X = 7] and Pr[X = 7] > Pr[X = 8] > ... > Pr[X = 12].

E. None of the above is true.
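
A quick enumeration (a Python sketch, not from the slides) that tabulates the distribution of X and can be used to check the three questions above:

```python
from collections import Counter
from fractions import Fraction

# All 36 outcomes of rolling two dice are equally likely; tabulate the sum X.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
distribution = {a: Fraction(c, 36) for a, c in sorted(counts.items())}

print(distribution)          # the pairs (a, Pr[X = a]) for a = 2, ..., 12
print(distribution[6])       # Pr[X = 6]
print(len(distribution))     # number of different values X takes on
```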

Independent RVs: definition

• Random variables X and Y are independent if
  Pr[(X = x) ∩ (Y = y)] = Pr[X = x] · Pr[Y = y]
  for all values x and y.
• Random variables X_1, X_2, ..., X_n are mutually independent if for all subsets I ⊆ [n] and all values x_i, where i ∈ I,
  Pr[∩_{i∈I} (X_i = x_i)] = ∏_{i∈I} Pr[X_i = x_i].

Question

You roll one die. Let X be the random variable that represents the result.

What value does X take, on average?

A. 1/6

B. 3

C. 3.5

D. 6

E. None of the above.

Random variables: expectation

• The expectation of a discrete random variable X over a sample space Ω is
  𝔼[X] = Σ_{ω∈Ω} X(ω) · Pr[ω].
• We can group together outcomes ω for which X(ω) = a:
  𝔼[X] = Σ_a a · Pr[X = a],
  where the sum is over all possible values a taken by X.
• The second equality is more useful for calculations.
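
A worked instance added for reference (not on the slide): the grouped form applied to the earlier question about a single fair die gives

```latex
\[
\mathbb{E}[X] = \sum_{a=1}^{6} a \cdot \Pr[X=a]
            = \frac{1+2+3+4+5+6}{6} = 3.5 .
\]
```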

Example: random hats

• Example: permutations
  – n students exchange their hats, so that everybody gets a random hat.
  – R.V. X: the number of students that got their own hats.
  – E.g., if students 1, 2, 3 got hats 2, 1, 3, then X = 1.
• Distribution of X (here n = 3):
  Pr[X = 0] = 1/3, Pr[X = 1] = 1/2, Pr[X = 3] = 1/6.
• What's the expectation of X?
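
A short enumeration (a Python sketch, not from the slides) that recovers this distribution and its expectation for n = 3:

```python
from collections import Counter
from fractions import Fraction
from itertools import permutations

n = 3
# Each of the n! hat assignments is equally likely; X counts fixed points.
counts = Counter(sum(1 for student, hat in enumerate(p) if student == hat)
                 for p in permutations(range(n)))
total = sum(counts.values())                              # n! = 6
distribution = {a: Fraction(c, total) for a, c in sorted(counts.items())}

print(distribution)                                  # Pr[X=0]=1/3, Pr[X=1]=1/2, Pr[X=3]=1/6
print(sum(a * p for a, p in distribution.items()))   # expectation = 1
```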

Linearity of expectation

• Theorem. For any two random variables X and Y on the same probability space,
  𝔼[X + Y] = 𝔼[X] + 𝔼[Y].
  Also, for all c ∈ ℝ,
  𝔼[cX] = c · 𝔼[X].
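
For reference, a one-line proof of the first identity directly from the outcome-level definition of expectation; note that X and Y need not be independent:

```latex
\[
\mathbb{E}[X+Y]
 = \sum_{\omega\in\Omega}\bigl(X(\omega)+Y(\omega)\bigr)\Pr[\omega]
 = \sum_{\omega\in\Omega}X(\omega)\Pr[\omega] + \sum_{\omega\in\Omega}Y(\omega)\Pr[\omega]
 = \mathbb{E}[X]+\mathbb{E}[Y].
\]
```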

Indicator random variables

• An indicator random variable takes on two values: 0 and 1.
• Lemma. For an indicator random variable X,
  𝔼[X] = Pr[X = 1].

Question

You have a coin with bias 3/4 (the bias is the probability of HEADS).
Let X be the number of HEADS in 1000 tosses of your coin.
You represent X as the sum X = X_1 + X_2 + ... + X_1000.

What is X_1?

A. 3/4.

B. The number of HEADS.

C. The probability of HEADS in toss 1.

D. The number of heads in toss 1.

E. None of the above.

Question

You have a coin with bias 3/4 (the bias is the probability of HEADS).
Let X be the number of HEADS in 1000 tosses of your coin.

What is the expectation of X?

A. 3/4.

B. 4/3.

C. 500.

D. 750.

E. None of the above.
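
For reference (the slide leaves this as a question): writing X = X_1 + ... + X_1000 with X_i the indicator of HEADS in toss i, the indicator lemma and linearity give

```latex
\[
\mathbb{E}[X] = \sum_{i=1}^{1000}\mathbb{E}[X_i]
             = \sum_{i=1}^{1000}\Pr[X_i = 1]
             = 1000\cdot\tfrac{3}{4} = 750 .
\]
```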

Example: random hats

• Example: permutations
  – n students exchange their hats, so that everybody gets a random hat.
  – R.V. X: the number of students that got their own hats.
  – E.g., if students 1, 2, 3 got hats 2, 1, 3, then X = 1.
• What's the expectation of X for general n?
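
For reference, the standard computation (not spelled out on the slide): let X_i indicate the event that student i gets their own hat, so 𝔼[X_i] = Pr[X_i = 1] = 1/n, and by linearity

```latex
\[
\mathbb{E}[X] = \sum_{i=1}^{n}\mathbb{E}[X_i]
             = \sum_{i=1}^{n}\Pr[X_i = 1]
             = n\cdot\frac{1}{n} = 1 .
\]
```

So the expected number of students with their own hats is 1, regardless of n.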

Jensen's inequality: example

• Exercise: Let X be the length of a side of a square chosen from [99] uniformly at random. What is the expected value of the area?
  Solution: Find 𝔼[X²].  𝔼[X²] =
• Comparison. (𝔼[X])² =
• In general, 𝔼[X²] ≥ (𝔼[X])².
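
One way to fill in the blanks, assuming [99] denotes the set {1, 2, ..., 99}:

```latex
\[
\mathbb{E}[X^2] = \frac{1}{99}\sum_{k=1}^{99} k^2
               = \frac{1}{99}\cdot\frac{99\cdot 100\cdot 199}{6}
               = \frac{9950}{3} \approx 3316.7,
\qquad
\bigl(\mathbb{E}[X]\bigr)^2 = 50^2 = 2500 .
\]
```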

Jensen's inequality

In general, 𝔼[X²] ≥ (𝔼[X])².

Proof: Let μ = 𝔼[X]. Consider Y = (X - μ)².
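
For reference, the argument the slide sets up can be completed as follows: since Y ≥ 0, we have 𝔼[Y] ≥ 0, and by linearity of expectation

```latex
\[
0 \le \mathbb{E}[Y]
    = \mathbb{E}\!\left[X^2 - 2\mu X + \mu^2\right]
    = \mathbb{E}[X^2] - 2\mu\,\mathbb{E}[X] + \mu^2
    = \mathbb{E}[X^2] - \mu^2 ,
\]
```

hence 𝔼[X²] ≥ μ² = (𝔼[X])².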

Jensen's inequality

• A function f: ℝ → ℝ is convex if, for all x, y and all λ ∈ [0, 1],
  f(λx + (1 - λ)y) ≤ λ f(x) + (1 - λ) f(y).
• Jensen's inequality. If f is a convex function and X is a random variable, then
  𝔼[f(X)] ≥ f(𝔼[X]).

Bernoulli random variables

• A Bernoulli random variable with parameter p takes the value
  1 with probability p;
  0 otherwise.
• The expectation of a Bernoulli R.V. X is
  𝔼[X] = p · 1 + (1 - p) · 0 = p.

Binomial random variables

• A binomial random variable with parameters n and p, denoted Bin(n, p), is the number of HEADS in n tosses of a coin with bias p.
• Lemma. The probability distribution of X = Bin(n, p) is
  Pr[X = j] = (n choose j) · p^j · (1 - p)^(n-j)
  for all j = 0, 1, ..., n.
• Lemma. The expectation of X = Bin(n, p) is
  𝔼[X] = np.
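
For reference, the expectation lemma follows from indicators and linearity: write X = X_1 + ... + X_n, where X_i is a Bernoulli random variable with parameter p for toss i; then

```latex
\[
\mathbb{E}[X] = \sum_{i=1}^{n}\mathbb{E}[X_i]
             = \sum_{i=1}^{n} p = np .
\]
```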

Question

Throw π‘š balls into 𝑛 bins.

Let X be the number of balls that land into bin 1.

(Recall that Bin(𝑛,𝑝) is the binomial distribution, i.e., the distribution

of the number of HEADS in 𝑛 tosses of a coin with bias 𝑝. )

Then the distribution of X is

A. Bin (𝑛,π‘š)

B. Bin (π‘š, 1/𝑛)

C. Bin π‘š,π‘›βˆ’1

𝑛

D. a binomial distribution, but none of the above.

E. not a binomial distribution.

Question

Throw π‘š balls into 𝑛 bins. Let Y be the number of empty bins.

Compute 𝔼[Y].

A. 1

B. 𝑛/2

C. (1 - 1/n)^m

D. n · (1 - 1/n)^m

E. None of the above.
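
A quick Monte Carlo check (a Python sketch, not from the slides) for estimating 𝔼[Y] and comparing against the candidate answers:

```python
import random

def avg_empty_bins(m, n, trials=100_000):
    """Estimate E[Y], the expected number of empty bins after throwing
    m balls into n bins uniformly and independently at random."""
    total_empty = 0
    for _ in range(trials):
        occupied = {random.randrange(n) for _ in range(m)}   # bins that got >= 1 ball
        total_empty += n - len(occupied)
    return total_empty / trials

print(avg_empty_bins(m=10, n=5))    # compare with the expressions in the options
```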

Product of independent RVs

• Theorem. For any two independent random variables X and Y on the same probability space,
  𝔼[X · Y] = 𝔼[X] · 𝔼[Y].
• Note. The equality does not hold, in general, for dependent random variables.

Example. We toss two coins.
Let X = number of HEADS, Y = number of TAILS.
Calculate 𝔼[X], 𝔼[Y] and 𝔼[XY].
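
For reference, assuming both coins are fair (the slide leaves the calculation as an exercise): here Y = 2 - X, so X and Y are dependent, and

```latex
\[
\mathbb{E}[X] = \mathbb{E}[Y] = 1,
\qquad
\mathbb{E}[XY] = \Pr[\text{one HEAD and one TAIL}]\cdot 1 = \tfrac{1}{2}
\;\ne\; 1 = \mathbb{E}[X]\cdot\mathbb{E}[Y],
\]
```

which shows that the product rule can fail without independence.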
