The Level Ancestor Problem simplified



The Level Ancestor Problem simplified

Leif Walsh

leif.walsh@gmail.com, @leifwalsh

November 13, 2014


The Level Ancestor Problem

Preprocess a rooted tree T to answer level ancestor queries:

Figure: A rooted tree T

Definition (Depth). The depth of a node ν in a rooted tree T is the number of edges along the shortest path from the root to ν. The root has depth 0.

Definition (Level Ancestor Problem). LA_T(ν, d): return the ancestor of ν in T of depth d. Note that LA_T(ν, Depth(ν)) = ν and LA_T(·, 0) = Root(T).


Analysis

If an algorithm has preprocessing time f(n) and query time g(n), we say it has complexity

⟨f(n), g(n)⟩

(Today, at least, space usage will be equal to preprocessing time.)


A naïve solution

Do nothing for preprocessing, walk up the tree for each query.

⟨O(1),O(n)⟩
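For concreteness, here is a minimal sketch of this naive query in Python. It assumes an illustrative array representation that is ours, not the talk's: nodes are numbered 0..n−1, parent[v] and depth[v] are plain lists, and the root is its own parent. The later sketches reuse these conventions.

def la_naive(parent, depth, v, d):
    # Walk parent pointers until we reach depth d: O(1) preprocessing, O(n) query.
    while depth[v] > d:
        v = parent[v]
    return v

# Example: a path 0 - 1 - 2 - 3 rooted at node 0.
parent = [0, 0, 1, 2]
depth = [0, 1, 2, 3]
assert la_naive(parent, depth, 3, 1) == 1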


The Plan

We’ll proceed by presenting three simple algorithms with different characteristics:

Table Algorithm ⟨O(n²), O(1)⟩

Jump-Pointers Algorithm ⟨O(n log n), O(log n)⟩

Ladder Algorithm ⟨O(n), O(log n)⟩

At the end, we'll combine the last two to get a better solution: ⟨O(n log n), O(1)⟩.


Table Algorithm

Preprocessing: Precompute the answers to all possible queries LA_T(ν, i) and store them in a lookup table.

Query: Look up the answer in the lookup table.


Table Algorithm ⟨O(n²), O(1)⟩

Preprocessing: There are n nodes in the tree, and each node ν can be queried for up to Depth(ν) different depths. Worst case: O(n²) (a stick). Dynamic programming allows us to compute the whole table in O(n²).

Query: This is just array access, so O(1).

Figure: A stick
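A sketch of the table construction in Python, using an illustrative children[v] list of each node's children (node 0 is the root); this is one plausible reading of the slide, not code from the talk. Each node's row is its parent's row plus itself, so the total work and space are O(n²) in the worst case (the stick), and a query is a single index operation.

def preprocess_table(children, root=0):
    # table[v][d] holds the depth-d ancestor of v.
    table = {root: [root]}
    stack = [root]
    while stack:
        v = stack.pop()
        for c in children[v]:
            table[c] = table[v] + [c]   # copy the parent's ancestor row, append c
            stack.append(c)
    return table

def la_table(table, v, d):
    return table[v][d]                  # O(1): plain array access

# Example: the stick 0 - 1 - 2 - 3.
table = preprocess_table([[1], [2], [3], []])
assert la_table(table, 3, 1) == 1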


Jump-Pointers Algorithm

Intuition: We don't need to store every possible query from each node. We can associate less than O(n) information with each node and still answer queries quickly.

To a node ν, we associate O(log n) "jump pointers" that let us jump up the tree from ν by powers of 2.


Figure: Jump pointers


Preprocessing: Compute a table Jump where Jump[ν, i] = LA_T(ν, Depth(ν) − 2^i), the 2^i jump from ν.

Query: To answer a query LA_T(ν, d), follow the jump pointer of maximal distance that won't take us past depth d. Let δ = Depth(ν) − d, then follow the jump pointer up to ν′ = Jump[ν, ⌊log δ⌋]. Recursively solve LA_T(ν′, d).
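A compact sketch of this jump-pointer table (usually called binary lifting) in Python, under the same illustrative parent/depth arrays; because the root is its own parent, jumps past the root are clamped at the root.

def build_jump(parent, depth):
    # jump[i][v] = the ancestor of v that is 2**i levels up, clamped at the root.
    # O(n log n) entries, each filled in O(1) from the previous level.
    n = len(parent)
    levels = max(1, max(depth).bit_length())
    jump = [list(parent)]
    for i in range(1, levels):
        prev = jump[i - 1]
        jump.append([prev[prev[v]] for v in range(n)])
    return jump

def la_jump(jump, depth, v, d):
    # Follow the largest jump that does not overshoot depth d: O(log n) jumps.
    delta = depth[v] - d
    while delta > 0:
        i = delta.bit_length() - 1      # floor(log2(delta))
        v = jump[i][v]
        delta -= 1 << i
    return v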


Figure: LA_T(x, 2) = y, found by following jump pointers from x


Jump-Pointers Algorithm ⟨O(n log n), O(log n)⟩

Preprocessing: There are n nodes, and O(log n) jump pointers per node, so O(n log n) total pointers to compute. Dynamic programming works.

Query: Each jump pointer we follow reduces δ by at least half, so we'll reach the target in O(log n) jumps.


Ladder Algorithm

Consider a single path P.

We can preprocess this (degenerate) tree by just putting it in an array A, where A[i] corresponds to the depth-i node in the path. To answer LA_P(·, d), just return A[d].

Figure: A single path


Ladder Algorithm: Long-path Decomposition

The long-path decomposition of a tree T is constructed as follows: find a longest root-to-leaf path in T and remove it from the tree; continue recursively until all remaining pieces are paths. The resulting forest of paths is the long-path decomposition.


We can put each component of the long-path decomposition in a ⟨O(n), O(1)⟩ array as before, and this lets us find ancestors in our component in O(1).

However, if we need to go higher, we need to step up to the next higher component and continue.

In the worst case, querying this structure can take O(√n).

Figure: Worst case for long-path decomposition


Ladder Algorithm: Ladder Decomposition

We will augment the long-path decomposition to essentially let us climb higher in a single array, and avoid checking O(√n) separate arrays.

For each component in the long-path decomposition (of height h), we previously created an array of size h to represent it.

Now, we allocate an array of size 2h, and in addition to storing the component, we also store the h ancestors directly above the component.

This doubles our storage and preprocessing work, but gives us some nice properties. We'll call these augmented arrays ladders.


Definition (Height). The height of a node ν in a tree T is the number of nodes on the longest path from ν to any descendant. Leaves have height 1.

Property. Consider a node ν of height h. The top of ν's ladder is at least distance h above ν. That is, we can jump up ν's ladder by at least h in O(1) time.

Corollary. If a node ν has height h, then ν's ladder either includes a node of height at least 2h, or it includes the entire path from the root to ν, and therefore all of ν's ancestors.


Ladder Algorithm

Preprocessing: Construct the long-path decomposition of T by precomputing heights in one DFS pass, and in a second pass, greedily following the maximal-height child at each node. Extend each component to a ladder by following parent pointers.

Query: To answer LA_T(ν, d), consider ν's ladder. If it is tall enough to contain the depth-d ancestor, we are done. Otherwise, jump to the node at the top of ν's ladder, and try again from that node's ladder.
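A sketch of both preprocessing passes and the ladder query in Python, with the same illustrative array conventions (node 0 is the root and is its own parent). This is a plausible rendering of the slide, not reference code.

def build_ladders(parent, children, depth):
    # Returns (ladders, ladder_of): ladders[k] is a top-down list of nodes at
    # consecutive depths, and ladder_of[v] is the ladder containing v.
    n = len(parent)
    order, stack = [], [0]
    while stack:                          # top-down order: parents before children
        v = stack.pop()
        order.append(v)
        stack.extend(children[v])
    height = [1] * n                      # pass 1: heights, children before parents
    for v in reversed(order):
        for c in children[v]:
            height[v] = max(height[v], height[c] + 1)
    ladders, ladder_of = [], [None] * n
    for v in order:                       # pass 2: greedy long paths
        if ladder_of[v] is not None:
            continue
        path, u = [], v
        while True:
            path.append(u)
            ladder_of[u] = len(ladders)
            if not children[u]:
                break
            u = max(children[u], key=lambda c: height[c])
        top, w, steps = [], v, len(path)  # extend upward by the path's own length
        while steps > 0 and parent[w] != w:
            w = parent[w]
            top.append(w)
            steps -= 1
        ladders.append(top[::-1] + path)
    return ladders, ladder_of

def la_ladder(ladders, ladder_of, depth, v, d):
    # Climb ladder by ladder; each hop at least doubles our height, so O(log n).
    while True:
        lad = ladders[ladder_of[v]]
        top_depth = depth[lad[0]]
        if top_depth <= d:
            return lad[d - top_depth]     # a ladder stores consecutive depths
        v = lad[0]                        # jump to the top of this ladder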


Ladder Algorithm ⟨O(n), O(log n)⟩

Preprocessing: The long-path decomposition just takes a couple of O(n) traversals of the tree. Extending the components into ladders costs another O(n), because each ancestor we touch to augment a ladder can be charged to a node in the component.

Query: Jumping up to a higher ladder costs O(1); it's just array access. Each time we step up from a ladder of height h, we find ourselves in a ladder of height ≥ 2h. Since all ladders are of height at least 1, after i ladders we reach a node of height at least 2^i. The ancestor we want has height at most n, so we reach it after visiting O(log n) ladders.


Attacking from both sides


Why do these algorithms (Jump Pointers and Ladders) work?

Jump pointers allow us to cover a lot of ground quickly: at least half the distance to our destination in the first step.

Ladders start out slowly, but once you're halfway to the destination, you get there in one more jump.


Preprocessing: Construct both the jump pointers and the ladder decomposition of T.

Query: Consider LA_T(ν, d). We need to travel up δ = Depth(ν) − d. One step of the jump pointer algorithm takes us to a node ν′ at least δ/2 higher than ν. This node ν′ has height h′ ≥ δ/2. The ladder for ν′ must extend higher by at least another h′, which is enough to take us the rest of the way.
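Putting the two structures together, a hypothetical constant-time query that reuses build_jump and build_ladders from the sketches above:

def la_combined(jump, ladders, ladder_of, depth, v, d):
    # One jump pointer, then one ladder lookup: O(1) per query.
    delta = depth[v] - d
    if delta > 0:
        i = delta.bit_length() - 1      # largest power of two not exceeding delta
        v = jump[i][v]                  # covers at least half the distance
    lad = ladders[ladder_of[v]]
    return lad[d - depth[lad[0]]]       # the rest fits inside v's ladder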


Attacking from both sides ⟨O(n log n), O(1)⟩

Preprocessing: Jump pointers cost O(n log n) preprocessing, and the ladder decomposition costs O(n).

Query: We need to follow one jump pointer (O(1)) and use one ladder (O(1)).


Recap

We have a data structure that lets us travel up the tree to arbitrary depth in constant time, and is built in O(n log n) time.

Now let’s build it in linear time.


Mathematicians HATE me for this one weird trick...


One weird math trick...

Consider a problem of size n.

We will divide this into O(n/log n) subproblems each of size O(log n), and one superproblem of size O(n/log n). We'll use an O(n log n) algorithm to solve the superproblem, and then we just have small problems left to solve, which we'll also solve quickly.


Let m = O(n/log n) be the size of the superproblem. The preprocessing complexity for the superproblem is:

O(m log m) = O((n/log n) · log(n/log n))
           = O((n/log n) · (log n − log log n))
           = O((n/log n) · log n)
           = O(n)
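As a quick, purely illustrative sanity check of that arithmetic in Python:

import math

n = 1_000_000
m = n / math.log2(n)                     # superproblem size, about n / log n
print(m * math.log2(m) / n)              # roughly 0.78: O(m log m) work is about n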


Macro-Micro-Tree Algorithm

We choose Jump Nodes as the maximally deep nodes with at least log(n)/4 descendants.

This gives us many "micro trees" of size less than log(n)/4, and one "macro tree" of size O(n/log n). The macro tree has the jump nodes as its leaves.

We compute jump pointers only for these jump nodes, and for all other nodes ν in the macro tree, we assign JumpDesc(ν) to be one of its jump node descendants.
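One way to pick the jump nodes, sketched in Python with the same illustrative children lists; here "descendants" counts a node's whole subtree including itself, which is one reasonable reading of the slide.

import math

def find_jump_nodes(children, root=0):
    # A node is a jump node if its subtree has at least log(n)/4 nodes
    # but every child's subtree is smaller than that (maximally deep).
    n = len(children)
    threshold = max(1.0, math.log2(n) / 4)
    order, stack = [], [root]
    while stack:
        v = stack.pop()
        order.append(v)
        stack.extend(children[v])
    size = [1] * n
    for v in reversed(order):
        for c in children[v]:
            size[v] += size[c]
    return [size[v] >= threshold and all(size[c] < threshold for c in children[v])
            for v in range(n)]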


To solve a query LA_T(ν, d) where ν is in the macro tree, we first jump down to JumpDesc(ν), then use one of its jump pointers and then one ladder to find LA_T(JumpDesc(ν), d) = LA_T(ν, d).


If the query is in one of the micro trees, we need a strategy to solve it.

Consider a DFS on a micro tree. We visit each edge twice: first going down, then later going up.

We can identify a tree shape with m nodes with a bit vector, representing the DFS, of length 2(m − 1).

Each micro tree has less than log(n)/4 nodes, so there are few possible shapes of micro tree:

2^(2(m−1)) ≤ 2^(log(n)/2) = (2^(log n))^(1/2) = √n

Figure: an example micro tree and its DFS bit vector, 0 0 1 0 1 1 0 1
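The encoding itself is easy to sketch in Python (same illustrative children lists); the resulting tuple can serve as the lookup key for a table shared by all micro trees of that shape.

def encode_shape(children, root=0):
    # DFS over the tree: emit 0 walking down an edge, 1 walking back up.
    # Trees of the same shape produce the same bit vector of length 2(m - 1).
    bits = []
    def dfs(v):
        for c in children[v]:
            bits.append(0)
            dfs(c)
            bits.append(1)
    dfs(root)
    return tuple(bits)

# Example: root 0 with children 1 and 2, where node 1 has one child, 3.
assert encode_shape([[1, 2], [3], [], []]) == (0, 0, 1, 1, 0, 1)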


We'll use the simple ⟨O(n²), O(1)⟩ Table Algorithm to preprocess every possible micro tree shape.

To answer a query LA_T(ν, d) when ν is in a micro tree, either:

Use the Table Algorithm if the target is in the micro tree.

Jump to the root of the micro tree and use the macro tree algorithm from its parent.

Leif Walsh Level Ancestor November 13, 2014 32 / 36
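
As an illustration, here is a minimal Python sketch of the per-shape table (my own construction, not the slides' code): for every node of a micro tree shape we record all of its ancestors once, so an in-tree query is a single lookup.

# Sketch of the Table Algorithm on one micro tree shape: O(m^2) preprocessing,
# O(1) per query, where m is the number of nodes in the shape.

def build_table(parent, depth):
    """parent[v] is v's parent within the micro tree (None at its root),
    depth[v] is v's depth within the micro tree."""
    table = {}
    for v in parent:
        row = [None] * (depth[v] + 1)
        u = v
        for d in range(depth[v], -1, -1):   # walk up once, recording every ancestor
            row[d] = u
            u = parent[u]
        table[v] = row
    return table

def micro_la(table, v, d):
    """Ancestor of v at (local) depth d inside the micro tree, in O(1)."""
    return table[v][d]

# The shape from the previous figure: 0 is the root, 1 and 4 its children,
# 2 and 3 children of 1.
parent = {0: None, 1: 0, 2: 1, 3: 1, 4: 0}
depth  = {0: 0, 1: 1, 2: 2, 3: 2, 4: 1}
tab = build_table(parent, depth)
print(micro_la(tab, 3, 1))   # -> 1

In the full structure this table would be built once per shape (keyed by the bit vector above) and shared by every micro tree with that shape.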

Macro-Micro-Tree Algorithm

Preprocessing

As before, we can build the Ladder Algorithm's data structure in O(n) time.

We can identify the jump nodes and the micro trees with DFS. We can compute jump pointers for the jump nodes, using the ladders, in O(log n) time per jump node. There are O(n/log n) jump nodes, so computing all jump pointers takes O(n) time.

Preprocessing one micro tree shape costs O(log² n), and there are at most √n shapes, so all shapes together cost O(√n log² n) ≤ O(n).

Leif Walsh Level Ancestor November 13, 2014 33 / 36
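
The exact definitions of jump nodes and micro trees come from earlier slides, but as a rough sketch (my own, assuming the usual convention that a node is "macro" when its subtree has at least log(n)/4 nodes and that a jump node is a macro node with no macro child), the classifying DFS might look like this:

# Rough sketch of the classifying DFS (hypothetical; the precise macro/micro
# split is defined on earlier slides). Assumption: a node is macro when its
# subtree has at least log2(n)/4 nodes; a jump node is a macro node with no
# macro child; everything else hangs below the jump nodes in micro trees.

import math

def classify(children, root=0):
    nodes = {root} | set(children) | {c for cs in children.values() for c in cs}
    n = len(nodes)
    threshold = math.log2(n) / 4 if n > 1 else 0

    size = {}
    def subtree_size(v):                       # post-order DFS for subtree sizes
        size[v] = 1 + sum(subtree_size(c) for c in children.get(v, []))
        return size[v]
    subtree_size(root)

    macro = {v for v in nodes if size[v] >= threshold}
    jump_nodes = {v for v in macro
                  if not any(c in macro for c in children.get(v, []))}
    micro_nodes = nodes - macro                # each connected piece is one micro tree
    return macro, jump_nodes, micro_nodes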

Macro-Micro-Tree Algorithm ⟨O(n), O(1)⟩

Query

If the query is in the macro tree, we jump down to a jump node, use one jump pointer, and one ladder, which are all O(1).

If the query is in a micro tree, we solve it there with the Table Algorithm in O(1) time, or use the macro tree, which is also O(1) as above.

Leif Walsh Level Ancestor November 13, 2014 34 / 36
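
Putting the pieces together, the query might dispatch roughly as below. This is a hypothetical Python sketch, not the slides' code: the names micro_of, jump_node_below, jump_pointer, and ladder_of stand in for the structures built during preprocessing, and the ladders are assumed to be indexable by depth.

# Hypothetical sketch of the O(1) query dispatch; every structure referenced
# here is assumed to come from the O(n) preprocessing described above.

def level_ancestor(T, v, d):
    mt = T.micro_of.get(v)                        # micro tree containing v, if any
    if mt is not None:
        if d >= T.depth[mt.root]:                 # target lies inside the micro tree:
            return mt.answer(v, d)                #   one table lookup, O(1)
        v = T.parent[mt.root]                     # else restart from the macro tree
    if d == T.depth[v]:
        return v                                  # LA(v, depth(v)) = v
    u = T.jump_node_below[v]                      # hop down to a jump node, O(1)
    delta = T.depth[u] - d                        # distance to climb from u (>= 1)
    k = delta.bit_length() - 1                    # largest power of two <= delta
    w = T.jump_pointer[u][k]                      # one precomputed jump pointer, O(1)
    return T.ladder_of[w][d]                      # one ladder lookup finishes, O(1)

Each branch does a constant amount of work, which together with the linear preprocessing gives the ⟨O(n), O(1)⟩ bound claimed on this slide.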

Lessons

Look for paired algorithms that complement each other by covering each other's weaknesses (Ladders and Jump Pointers).

Turn an O(n log n) algorithm into an O(n) algorithm:

Divide into subproblems of size O(log n) which are easier to solve together.

Usually, you want to find duplicates.

Solve the O(n/log n) problem instance with the fancy algorithm.

PWL NYC #7: The LCA Problem Revisited (bit.ly/pwl-lca)

Leif Walsh Level Ancestor November 13, 2014 35 / 36

Thanks!

Leif Walsh Level Ancestor November 13, 2014 36 / 36
