Lower Bounds for NNS and Metric Expansion
Rina Panigrahy, Kunal Talwar, Udi Wieder
Microsoft Research SVC
Nearest Neighbor Search
• Given n points in a metric space, preprocess them into a small data structure
• Given a query point q, quickly retrieve the data point closest to q
Many Applications
Decision Version. Given search radius r
• Find a point within distance r of the query point
• Relation to approximate NNS:
– If the second neighbor is at distance more than cr
– Then the point found is also a c-approximate NN
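The reduction on this slide can be sketched in a few lines (a minimal illustration with hypothetical helper names, not the paper's construction): if the decision structure returns some point within distance r of the query, and every other data point lies beyond cr, the returned point is automatically a c-approximate nearest neighbor.

```python
# Decision version of NNS: return any point within distance r of the
# query, or None if no such point exists.

def decision_nns(points, dist, q, r):
    return next((p for p in points if dist(p, q) <= r), None)

# Example on the line: the second neighbor (10.0) is far beyond c*r
# for c = 2, so the point found is a 2-approximate nearest neighbor.
points = [0.5, 10.0]
dist = lambda a, b: abs(a - b)
p = decision_nns(points, dist, q=0.0, r=1.0)
```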
Cell Probe Model
• Preprocess into a data structure with
– m words
– w bits per word
• The query algorithm is charged t if it probes t words of the data structure
– All computation is free
• Study the tradeoff between space and query time; in this talk s denotes the total space mw
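The accounting in the cell probe model can be made concrete with a toy simulation (hypothetical class and parameter names; only the charging rule comes from the slide): preprocessing may write cells for free, and a query pays 1 per cell it reads, with all other computation free.

```python
# Toy cell-probe table: m cells of w bits each; only reads are charged.

class CellProbeTable:
    def __init__(self, m, w):
        self.m, self.w = m, w
        self.cells = [0] * m          # m cells, each a w-bit word
        self.probes = 0               # charged cost: number of cells read

    def write(self, i, word):
        assert 0 <= word < 2 ** self.w
        self.cells[i] = word          # preprocessing writes are free

    def read(self, i):
        self.probes += 1              # each probe costs 1; computation is free
        return self.cells[i]

# Example: store a few point ids hashed into cells, answer with one probe (t = 1).
table = CellProbeTable(m=8, w=16)
for point_id in [3, 5, 7]:
    table.write(point_id % 8, point_id)
answer = table.read(5 % 8)
```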
Many different lower bounds

| Metric space | Approximation | Randomized? | Ref |
| --- | --- | --- | --- |
| | Exact | yes | PT[06], BR[02] |
| | | no | PT[06], Liu[04] |
| | | yes | AIP[06] |
| | | yes | PTW[08] |
| | | no | ACP[08] |

(One of the space bounds in the table: n·exp(ε³d).)
Lower bounds from Expansion
Show a unified approach for proving cell probe lower bounds for near neighbor and similar problems.
Show that all these lower bounds stem from the same combinatorial property of the metric space:
Expansion: |set of points near A| / |A|
(and show some new lower bounds)
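The expansion ratio above is easy to compute directly on a small graph (a minimal sketch, not from the paper; here N(A) is the set of vertices adjacent to A, matching the |N(A)|/|A| definition used later):

```python
# Vertex expansion of a set A in an undirected graph given as adjacency sets.

def neighborhood(adj, A):
    # N(A): union of the neighbor sets of the vertices of A.
    return set().union(*(adj[v] for v in A))

def vertex_expansion(adj, A):
    return len(neighborhood(adj, A)) / len(A)

# 4-cycle 0-1-2-3-0: for A = {0}, N(A) = {1, 3}, so expansion is 2.
adj = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
```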
Graphical Nearest Neighbor
• Convert the metric space to a graph
• Place an edge if two nodes are within distance r
• Return a neighbor of the query. Now r = 1
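The conversion above can be sketched as follows (assumed function and parameter names; the edge rule is the one on the slide): after building the graph, "find a point within distance r" becomes "return any graph neighbor of the query".

```python
# Build the graph for Graphical Nearest Neighbor: vertices are the data
# points, with an edge between any two points at distance <= r.

def build_gns_graph(points, dist, r):
    adj = {p: set() for p in points}
    for i, p in enumerate(points):
        for q in points[i + 1:]:
            if dist(p, q) <= r:       # edge iff within search radius r
                adj[p].add(q)
                adj[q].add(p)
    return adj

# Example on the line, rescaled so that the search radius is r = 1.
points = [0, 1, 3, 4]
adj = build_gns_graph(points, lambda a, b: abs(a - b), r=1)
```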
Graphical Nearest Neighbor
• Assume uniform degree
• Use a random data set
• Assume that w.h.p. the n balls around the data points are disjoint
Deterministic Bounds via Expansion
Deterministic Bound
• Any deterministic algorithm using space s and t probes must satisfy (st/n)^t ≥ Φ(G)
Example Application: (st/n)^t ≥ Φ(G)
• Plugging in the expansion gives a space lower bound of n·exp(ε²d)
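Rearranging the deterministic bound makes the space-time tradeoff explicit (a sketch using only the symbols already defined, assuming st > n so the logarithm is positive):

```latex
\left(\frac{st}{n}\right)^{t} \ge \Phi(G)
\quad\Longrightarrow\quad
s \ge \frac{n}{t}\,\Phi(G)^{1/t}
\quad\Longrightarrow\quad
t \ge \frac{\log \Phi(G)}{\log(st/n)} .
```

So a graph with large expansion forces either large space s or many probes t.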
Proof Idea when t = 1: Shattering — (st/n)^t ≥ Φ(G)
• F : V → [m] partitions V into m regions
• Split large regions
• A random ball is shattered into many parts: about Φ(G)
• ⇒ Φ(G) replication in space
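The shattering phenomenon can be observed in a toy experiment (an assumed setup for illustration, not the paper's construction): partition the vertices of the d-cube into m regions with a random map F, then count how many distinct regions a Hamming ball touches.

```python
import itertools
import random

def shatter_count(d, m, seed=0):
    # Random partition F of the d-cube into m regions.
    rng = random.Random(seed)
    F = {v: rng.randrange(m) for v in itertools.product([0, 1], repeat=d)}
    # Hamming ball of radius 1 around the all-zeros vertex: d + 1 points.
    center = (0,) * d
    ball = [center] + [center[:i] + (1,) + center[i + 1:] for i in range(d)]
    # Number of distinct regions the ball is split into.
    return len({F[v] for v in ball})

# With m much larger than the ball, almost every ball point lands in its
# own region, so the ball shatters into about |ball| parts.
parts = shatter_count(d=8, m=10 ** 6)
```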
Proof Idea when t = 1
• The query point determines which cell of the table is read
• Select a small fraction of the cells such that
• it is likely that the selected cells contain a quarter of the data set points
• Counting the replicated points then gives (st/n)^t ≥ Φ(G)
Generalizing for larger t
• Select a fraction of each table so that the same argument applies
• Continue as before
– This handles non-adaptive algorithms
• Adaptive algorithms depend on the content of the selected cells
– Subexponential number of induced algorithms
– Union bound over them
Randomized Bounds
• So far we assumed the algorithm is always correct
– What if only a fraction of the points are good query points?
• Need to relax the definition of vertex expansion
Randomized Bounds
• Robust Expansion
(figure: a set A and its neighborhood N(A))
• N(A) captures all edges from A
• Expansion = |N(A)| / |A|
• Robust version: capture only ¾ of the edges from A
Robust Expansion
• Small set vertex expansion:
• In other words: we can cover all the edges incident on A with a set of size |N(A)|
• Robust version: we can cover ¾ of the edges incident on A with a possibly much smaller set
– Robust expansion is at least the edge expansion
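The covering view of robust expansion can be computed by brute force on a small graph (a sketch under an assumed formalization of the slide's definition: the smallest set covering a given fraction of the edges leaving A, divided by |A|). Since each outside vertex covers a disjoint set of those edges, picking vertices in decreasing order of coverage is exactly optimal here.

```python
from collections import Counter

def robust_expansion(adj, A, delta):
    # For each vertex outside A, count how many edges from A it covers.
    cover = Counter(u for v in A for u in adj[v])
    need = delta * sum(len(adj[v]) for v in A)   # edges from A to cover
    covered, B = 0, []
    for u, c in cover.most_common():             # largest covers first
        if covered >= need:
            break
        B.append(u)
        covered += c
    return len(B) / len(A)

# Star graph: center 0 joined to leaves 1..4. For A = {1, 2, 3, 4},
# vertex 0 alone covers every edge from A, so the ratio is 1/4 —
# much smaller than the plain vertex expansion |N(A)|/|A| = 1/4 here,
# but in general the robust set can be far smaller than N(A).
adj = {0: {1, 2, 3, 4}, 1: {0}, 2: {0}, 3: {0}, 4: {0}}
```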
Bound for Randomized Data Structure
• Theorem: if the input distribution is weakly independent, then a randomized data structure that answers GNS queries with space s and t probes must satisfy an analogue of the deterministic bound, with Φ(G) replaced by the robust expansion Φr
Proof Idea when t = 1: Shattering
• Most of a random ball is shattered into many parts: about Φr
• ⇒ Φr replication in space
Generalizing for larger t
• Sample a 1/Φr^(1/t) fraction from each table
• For a random ball, the good part survives in all tables
• The union bound for adaptive algorithms is trickier
Applications
• We know how to calculate the robust expansion of graphs derived from several metrics:
– some recovering known bounds
– some giving new bounds
– some under natural input distributions
• The robust expansion of other metrics is not known
General Upper Bound
• Say G is a Cayley graph
• Take a set with the right robust expansion
• Use random translations of it to define the access function
• For a random input the success probability is constant
Conclusions and Open Problems
Unified approach to NNS cell probe lower bounds
– often characterized by expansion
– Average case with natural distributions
• Higher lower bounds?
– Improved dependency (very hard)
– Dynamic NNS; tight bounds for special cases shown in the paper
Approximate Near Neighbor Search
Randomized Bounds
• So far we assumed the algorithm is always correct
– What if only a fraction of the points are good query points?
• Need to relax the definitions of vertex expansion and of independence
• The distribution is weakly independent if, roughly, a random query point is unlikely to be close to data points other than its intended neighbor
Proof Idea
• Can we plug the new definitions into the old proof?
– Conceptually, yes!
– In practice... no
• Dependencies everywhere: the set of good neighbors of a data point depends on the rest of the data set
• Resolving these dependencies is the technical crux of the paper