Hierarchical Well-Separated Trees (HST)


Page 1: Hierarchical Well-Separated Trees (HST)

Hierarchical Well-Separated Trees (HST)

• Edges’ lengths are uniform across a level of the tree
• Stretch α = factor by which edge lengths decrease from one level to the next, going from root to leaf
• Distortion = factor by which the distance between 2 points increases when it is measured through the HST instead of directly
– Upper bound (in expectation, for embedding into α-HSTs) is O(α log n / log α) (see the sketch below)

Diagram from Fakcharoenphol, Rao & Talwar 2003
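To make the definitions concrete, here is a minimal Python sketch (not from the slides; the 4-leaf tree, stretch α = 4 and diameter 16 are made-up values) of an α-HST with level-uniform edge lengths, together with leaf-to-leaf tree distances and the resulting distortion:

```python
# Minimal sketch (not from the slides): a toy alpha-HST in which every edge
# from level d-1 to level d has length DIAM / ALPHA**(d-1), so edge lengths are
# uniform within a level and shrink by the stretch factor ALPHA at each level.
# Leaf-to-leaf distance is the sum of edge lengths on the path through the
# lowest common ancestor; distortion is that distance divided by the original
# metric distance.

ALPHA, DIAM = 4.0, 16.0

# a tiny hand-built tree: root r -> {x, y}, x -> {a, b}, y -> {c, d}
parent = {"a": "x", "b": "x", "c": "y", "d": "y", "x": "r", "y": "r", "r": None}
depth = {"r": 0, "x": 1, "y": 1, "a": 2, "b": 2, "c": 2, "d": 2}

def edge_len(node):
    """Length of the edge from `node` up to its parent."""
    return DIAM / ALPHA ** (depth[node] - 1)

def tree_dist(u, v):
    """Sum edge lengths while walking both endpoints up to their LCA."""
    total = 0.0
    while u != v:
        if depth[u] < depth[v]:       # always move the deeper endpoint up
            u, v = v, u
        total += edge_len(u)
        u = parent[u]
    return total

print(tree_dist("a", "b"))  # siblings: 2 * (16/4)              = 8.0
print(tree_dist("a", "c"))  # through the root: 2 * (16 + 16/4) = 40.0

# If the true distance d(a, c) were, say, 10, the HST stretches it to 40,
# i.e. distortion 4.  A random FRT-style embedding keeps the *expected*
# distortion O(alpha * log n / log alpha), the bound used later in the slides.
```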

Page 2: Hierarchical Well-Separated Trees (HST)

Pure Randomized vs. Fractional Algorithms

• “Fractional view” = keep track only of marginal distributions of some quantities

• Lossy compared to a pure randomized algorithm (see the sketch below)
• Which marginals to track?
• Claim: for some algorithms, the fractional view can be converted back to a randomized algorithm with little loss
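To make the "lossy" point concrete, here is a minimal Python sketch (not from the slides; the points and distributions are invented): two different distributions over 2-server configurations that induce exactly the same marginals, so the marginals alone cannot distinguish them.

```python
# Minimal sketch (not from the slides): the "fractional view" keeps only the
# marginal probability p_i of a server sitting at point i.  The two
# distributions below over 2-server configurations on points {0, 1, 2, 3} are
# different, yet they induce exactly the same marginals, so the marginals are
# a lossy summary of the randomized state.
from collections import defaultdict

def marginals(dist):
    """dist maps a frozenset of server positions -> probability."""
    p = defaultdict(float)
    for config, prob in dist.items():
        for i in config:
            p[i] += prob
    return dict(p)

dist_a = {frozenset({0, 1}): 0.5, frozenset({2, 3}): 0.5}
dist_b = {frozenset({0, 2}): 0.5, frozenset({1, 3}): 0.5}

print(marginals(dist_a))  # every point has marginal 0.5
print(marginals(dist_b))  # same marginals, different underlying distribution
```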

Page 3: Hierarchical Well-Separated Trees (HST)

Fractional View of K-Server Problem

• For node j, let T(j) = the set of leaves of the subtree of T rooted at j
• At time step t, for each leaf i, p_i^t = probability of having a server at i
• If there is a request at i at time t, p_i^t should be 1
• Expected number of servers across T(j): k_t(j) = Σ_{i∈T(j)} p_i^t
• Movement cost at time t = Σ_{j∈T} W(j) · |k_t(j) − k_{t−1}(j)|, where W(j) is the length of the edge from j to its parent (see the sketch below)

Parts of diagram from Bansal 2011 (the figure marks a node j, its leaf set T(j), and a leaf i)
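A minimal Python sketch (not from the slides; the tree, the weights W(j) and the marginals are made up) of the two quantities just defined: k_t(j) as the sum of leaf marginals under j, and the fractional movement cost Σ_j W(j) · |k_t(j) − k_{t−1}(j)|:

```python
# Minimal sketch (not from the slides; tree, weights and marginals are made up):
# k_t(j) = sum of the leaf marginals p_i^t under node j, and the fractional
# movement cost is sum_j W(j) * |k_t(j) - k_{t-1}(j)|, where W(j) is the length
# of the edge from j to its parent.

children = {"r": ["x", "y"], "x": ["a", "b"], "y": ["c", "d"]}
W = {"x": 16.0, "y": 16.0, "a": 4.0, "b": 4.0, "c": 4.0, "d": 4.0}  # edge to parent

p_prev = {"a": 1.0, "b": 0.0, "c": 1.0, "d": 0.0}   # marginals at time t-1 (k = 2)
p_curr = {"a": 0.5, "b": 0.5, "c": 0.0, "d": 1.0}   # marginals at time t

def k_of(node, p):
    """Expected number of servers in the subtree rooted at `node`."""
    if node not in children:                        # leaf
        return p[node]
    return sum(k_of(c, p) for c in children[node])

movement = sum(W[j] * abs(k_of(j, p_curr) - k_of(j, p_prev)) for j in W)
print(movement)   # 0.5*4 + 0.5*4 + 1.0*4 + 1.0*4 = 12.0 (internal nodes contribute 0)
```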

Page 4: Hierarchical Well-Separated Trees (HST)

The Allocation Problem

• Decide how to (re-)distribute κ available servers among d locations (each location is at uniform distance from a common center and may request an arbitrary number of servers)
• Each location i has a request at time t denoted {h_t(0), h_t(1), …, h_t(κ)}
• h_t(j) = cost of serving the request using j servers
• Ex.: a request at i = 0 is {∞, 2, 1, 0, 0} (monotonically non-increasing: more servers never cost more)
• Total cost = hit cost + movement cost (see the sketch below)

Parts of diagram from Bansal 2011 (a speech bubble at one location reads: “I can work with 1, but I’d like 3!”)
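A minimal Python sketch (not from the slides) of one time step’s cost in the integral allocation problem, using the {∞, 2, 1, 0, 0} request from the slide; the convention that a reallocated server travels 2·D through the center is an illustrative assumption of mine, since the slide only says the locations are at uniform distance from a center:

```python
# Minimal sketch (not from the slides): cost of one integral allocation step =
# hit cost h_t(j) at the requested location + movement cost for reallocating
# servers.  The 2*D-per-moved-server convention is an illustrative assumption;
# the slide only says every location is at uniform distance D from a center.
import math

D = 1.0
INF = math.inf

def movement_cost(prev_alloc, new_alloc):
    """Each server moved between locations travels through the center: 2*D."""
    moved = sum(max(n - p, 0) for p, n in zip(prev_alloc, new_alloc))
    return 2 * D * moved

def hit_cost(h, alloc, requested_location):
    """Cost of serving the request with the servers currently at that location."""
    return h[alloc[requested_location]]

prev_alloc = [1, 2, 1]                 # kappa = 4 servers over d = 3 locations
new_alloc  = [3, 1, 0]                 # respond to the request at location 0
h = [INF, 2, 1, 0, 0]                  # the example request from the slide

print(hit_cost(h, new_alloc, 0))             # 0: three servers fully serve it
print(movement_cost(prev_alloc, new_alloc))  # 2 servers moved -> 4.0
```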

Page 5: Hierarchical Well-Separated Trees (HST)

Fractional View of Allocation Problem

• Let x_{i,j}^t = (non-negative) probability of having j servers at location i at time t
• Probabilities sum to 1: Σ_j x_{i,j}^t = 1 for every location i
• No. of servers used must not exceed the no. available: Σ_i Σ_j j · x_{i,j}^t ≤ κ
• Hit cost incurred = Σ_j h_t(j) · x_{i,j}^t (at the requested location i)
• Movement cost incurred = Σ_i Σ_j |Σ_{j'<j} x_{i,j'}^t − Σ_{j'<j} x_{i,j'}^{t−1}| (see the sketch below)

• Note: the fractional A.P. view is too weak to recover a randomized A.P. algorithm
– But we don’t really care about the A.P., we care about the k-server problem!
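A minimal Python sketch (not from the slides; function names and the numbers are mine) that checks the two constraints and evaluates the fractional hit and movement costs as defined above (treating ∞ · 0 as 0):

```python
# Minimal sketch (not from the slides; names and numbers are mine): fractional
# allocation state is, for each location i, a distribution x[i][j] over the
# number of servers j placed there.  The helpers check the two constraints and
# evaluate the fractional hit and movement costs from the slide.
import math

def check(x, kappa, tol=1e-9):
    """Each x[i] sums to 1, and the expected number of servers is <= kappa."""
    assert all(abs(sum(xi) - 1.0) < tol for xi in x)
    assert sum(j * xij for xi in x for j, xij in enumerate(xi)) <= kappa + tol

def hit_cost(h, x, i):
    """Expected serving cost at the requested location i (treat inf * 0 as 0)."""
    return sum(h[j] * x[i][j] for j in range(len(h)) if x[i][j] > 0)

def movement_cost(x_curr, x_prev):
    """Sum over locations i and thresholds j of |change in cumulative mass below j|."""
    total = 0.0
    for xi_c, xi_p in zip(x_curr, x_prev):
        for j in range(len(xi_c)):
            cum_c = sum(xi_c[jp] for jp in range(j))
            cum_p = sum(xi_p[jp] for jp in range(j))
            total += abs(cum_c - cum_p)
    return total

# two locations, kappa = 2 servers, distributions over j in {0, 1, 2}
x_prev = [[0.0, 1.0, 0.0], [0.0, 1.0, 0.0]]    # one server at each location
x_curr = [[0.0, 0.5, 0.5], [0.5, 0.5, 0.0]]    # shift half a server toward location 0
check(x_curr, kappa=2)

h = [math.inf, 2.0, 1.0]                        # request at location 0
print(hit_cost(h, x_curr, 0))                   # 0.5*2 + 0.5*1 = 1.5
print(movement_cost(x_curr, x_prev))            # total cumulative-mass change = 1.0
```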

Page 6: Hierarchical Well-Separated Trees (HST)

From Allocation to K-Server

• Theorem 2: It suffices to have a (1+ε, β(ε))-competitive fractional AP algorithm on a uniform metric to get a k-server algorithm on a depth-ℓ HST that is O(βℓ)-competitive (Coté et al. 2008)

• Theorem 1: Bansal et al.’s k-server algorithm has a competitive ratio of Õ(log² k log³ n)

Page 7: Hierarchical Well-Separated Trees (HST)

The Main Algorithm

1. Embed the n points into a distribution over α-HSTs with stretch α = Θ(log n log(k log n))
– (No time to discuss; this step is essentially from the paper of Fakcharoenphol, Rao & Talwar 2003)

2. Pick a random HST T according to this distribution
– Extra step: transform the HST into a weighted HST (we’ll briefly touch on this)

Diagram from Bansal 2011

Page 8: Hierarchical Well-Separated Trees (HST)

The Main Algorithm

3. Solve the (fractional) allocation problem on T’s root node + its immediate children, then recursively solve the same problem at each child (see the sketch below)
– Intuitive application of Theorem 2
– d = number of immediate children of the given node
– At the root node: κ = k (all servers available)
– At an internal node i: κ = the (re)allocation of servers that i receives from its parent

Diagram from Bansal 2011 (the boxes mark the nested allocation instances)
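A minimal Python sketch (not from the slides; all names are mine) of the recursive structure of step 3. The real fractional A.P. algorithm of Theorem 3 is replaced by a dummy even split, purely to show how the number of servers granted to a child becomes the κ of that child’s own sub-instance:

```python
# Minimal sketch (not from the slides; all names are mine): the recursion of
# step 3.  Each HST node runs an allocation-problem instance over its immediate
# children; the number of servers granted to a child becomes kappa for that
# child's own instance.  `toy_split` is a dummy stand-in for the real
# fractional A.P. algorithm of Theorem 3, NOT the actual allocation rule.

def toy_split(kappa, children):
    """Dummy A.P. solver: split kappa servers as evenly as possible."""
    share, extra = divmod(kappa, len(children))
    return {c: share + (1 if i < extra else 0) for i, c in enumerate(children)}

def allocate_recursively(tree, node, kappa, solve_ap=toy_split):
    """Solve the A.P. at `node`, then recurse on each child with its grant."""
    children = tree.get(node, [])
    if not children:                      # leaf: the granted servers sit here
        return {node: kappa}
    placement = {}
    for child, granted in solve_ap(kappa, children).items():
        placement.update(allocate_recursively(tree, child, granted, solve_ap))
    return placement

tree = {"r": ["x", "y"], "x": ["a", "b"], "y": ["c", "d"]}
print(allocate_recursively(tree, "r", kappa=3))   # {'a': 1, 'b': 1, 'c': 1, 'd': 0}
```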

Page 9: Hierarchical Well-Separated Trees (HST)

Detour: Weighted HST

• Degenerate case of a normal HST:

• Depth ℓ can be as large as Θ(n) (this happens, e.g., if the n points lie on a line with geometrically increasing distances)

Page 10: Hierarchical Well-Separated Trees (HST)

Detour: Weighted HST

• Solution: allow lengths of edges to be non-uniform

• Allow leaf-to-leaf distances to be distorted by at most a factor of 2α/(α−1) (a quick geometric-series check follows below)
• Depth ℓ = O(log n)
• Consequence: the uniform-metric A.P. becomes a weighted-star A.P.
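As a quick sanity check (not on the slides) of why a bounded distortion factor is plausible: in an α-HST the edges below the top level of a leaf-to-leaf path form a geometric series, so they contribute only a small fraction of the distance.

```latex
% Leaves u, v whose lowest common ancestor has child-edges of length L:
\[
2L \;\le\; d_T(u,v) \;\le\; 2L\Bigl(1 + \tfrac{1}{\alpha} + \tfrac{1}{\alpha^{2}} + \cdots\Bigr) \;=\; \frac{2L\,\alpha}{\alpha-1},
\]
% so collapsing everything below the two top edges changes d_T(u,v) by at most
% a factor of alpha/(alpha - 1), which is below 1.25 once alpha > 5.
```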

Page 11: Hierarchical Well-Separated Trees (HST)

Proving the Main Algorithm

• Theorem 1: Bansal et al.’s k-server algorithm has a competitive ratio of Õ(log² k log³ n)

• Idea of proof: How do the competitive ratio and the distortion evolve as we transform:

Fractional allocation algorithm
↓
Fractional k-server algorithm on HST
↓
Randomized k-server algorithm on HST

Page 12: Hierarchical Well-Separated Trees (HST)

Supplemental Theorems

• Theorem 3: For every ε > 0, there exists a fractional A.P. algorithm on a weighted-star metric that is (1+ε, O(log(k/ε)))-competitive (refinement of Theorem 2, to be discussed by Tanvirul)

• Theorem 4: If T is a weighted α-HST with depth ℓ and Theorem 3 holds, then there is a fractional k-server algorithm on T that is O(ℓ log(kℓ))-competitive, as long as α = Ω(ℓ log(kℓ))

• Theorem 5: If T is an α-HST with α > 5, then any fractional k-server algorithm on T converts to a randomized k-server algorithm on T that is about as competitive (only an O(1)-factor loss)

• Theorem 6: If T is an α-HST with n leaves and arbitrary depth, it can be transformed into a weighted α-HST with the same leaves but depth O(log n), with leaf-to-leaf distances distorted by at most a factor of 2α/(α−1)

Page 13: Hierarchical Well-Separated Trees (HST)

Proof of Theorem 1

1. Embed the n points into a distribution over α-HSTs with stretch α = Θ(log n log(k log n))
– Distortion is O(α log n / log α)
– Resulting HSTs may have depth ℓ up to O(n)

2. Pick a random HST T according to this distribution and transform it into a weighted HST
– From Theorem 6, depth ℓ is reduced to O(log n)
– Stretch α is still Ω(ℓ log(kℓ)) for the new depth ℓ

Page 14: Hierarchical Well-Separated Trees (HST)

Proof of Theorem 1

3. Solve the (fractional) allocation problem on T’s root node + its immediate children, then recursively solve the allocation problem on the children
– This is exactly Theorem 2 refined by Theorem 3
– Stretch α = Ω(ℓ log(kℓ)), so Theorem 4 is applicable! (the substitution is worked out below)
– This yields a fractional k-server algorithm with competitiveness O(ℓ log(kℓ)) = O(log n log(k log n))

– Applying Theorem 5, we get similar competitiveness for the randomized k-server algorithm
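Spelling out the substitution (not on the slides) that makes Theorem 4 applicable, with depth ℓ = O(log n) from step 2 and stretch α = Θ(log n log(k log n)) from step 1:

```latex
\[
\ell\,\log(k\ell) \;=\; O\bigl(\log n \cdot \log(k\log n)\bigr) \;=\; O(\alpha)
\quad\Longrightarrow\quad
\alpha \;=\; \Omega\bigl(\ell\,\log(k\ell)\bigr),
\]
% so Theorem 4 yields a fractional k-server algorithm on T that is
% O(l log(kl)) = O(log n log(k log n))-competitive.
```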

Page 15: Hierarchical Well-Separated Trees (HST)

Proof of Theorem 1

• Relating costs on the original metric M and on the sampled tree T: let Opt*_M be the optimal cost on M and c_T the cost of the optimal solution on T. The expected distortion of the embedding gives:

E[c_T] = O(α log n / log α) · Opt*_M

Alg_M ≤ Alg_T ≤ O(log n log(k log n)) · c_T

E[Alg_M] ≤ O(log n log(k log n)) · E[c_T] = O(log n log(k log n)) · O(α log n / log α) · Opt*_M

Page 16: Hierarchical Well-Separated Trees (HST)

Proof of Theorem 1

E[Alg_M] ≤ O(log n log(k log n)) · O(α log n / log α) · Opt*_M

• This implies a competitive ratio of:

O(log n log(k log n)) · O(α log n / log α)
= O([log³ n · (log(k log n))²] / log log n)    [substituting α = Θ(log n log(k log n)) and using log α = Ω(log log n)]
= O(log² k log³ n log log n)    [since log(k log n) = log k + log log n ≤ 2 log k · log log n; see the bound below]
= Õ(log² k log³ n)
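One way to justify the second-to-last step (not on the slides), assuming k ≥ 2 and n large enough that log k ≥ 1 and log log n ≥ 1, so that log k + log log n ≤ 2 log k · log log n:

```latex
\[
\bigl(\log(k\log n)\bigr)^{2} \;=\; (\log k + \log\log n)^{2} \;\le\; 4\,\log^{2}k\,(\log\log n)^{2}
\;\;\Longrightarrow\;\;
\frac{\log^{3}n\,\bigl(\log(k\log n)\bigr)^{2}}{\log\log n} \;=\; O\bigl(\log^{2}k\,\log^{3}n\,\log\log n\bigr).
\]
```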