Nearest Neighbors: The scenario

• "Find the nearest Pizza Hut." (Compare with the McDonald's problem.)
• Assume a kd-tree T is given, and C is the region associated with a node.
• The input p is a point.
• Searching for the point p in T helps:
  – In one dimension, T is very useful: the closest neighbor comes from the set of nodes visited (MANY nodes are pruned).
  – In higher dimensions, T is not as useful (the closest neighbor may lie far from the search path).
• Nevertheless, pruning is possible.
• General strategy: collect partial results, traverse judiciously, and prune.

What If We Locate the Point?

We visit (35, 90), (70, 80), . . . , and fall off (70, 30).
The closest point is nowhere near this path. We must visit both subtrees.

Nearest Neighbor: Naive version

    class Result { int distance; Point point; }

    void init() {
        result.distance = INFINITY;   // any value larger than every real distance
        result.point    = null;
    }

    Point query;

    Result process(KDNode k, Result res) {
        if (k == null) return res;
        int cost = distance(k.data, query);
        if (cost < res.distance) {
            res.point    = k.data;
            res.distance = cost;
        }
        res = process(k.left,  res);
        res = process(k.right, res);
        return res;
    }

Nearest Neighbor: Pruned version

• Maintain the rectangle r associated with a node.
• Compute a lower bound on the distance from the query q to the rectangle:
  – The distance between q and any point in r is at least lowerbound(r, q).
  – Do not compute all distances between q and every point in r.
• lowerbound() helps: if the lower bound already exceeds the best distance found so far, we never have to consider the points in r.
• lowerbound() must be cheap to compute (one way to do so is sketched after this list).
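The slides compute lowerbound() by a case analysis on where the query lies relative to the rectangle (inside, to its left, to its right, at a corner, and so on). An equivalent and compact way to get the same bound is to clamp the query point into the rectangle and measure the distance to the clamped point. The fragment below is a minimal sketch of that idea in two dimensions; the class and field names (Rect, Point, minX/maxX/minY/maxY, x/y) are assumptions chosen for illustration, not the course's actual API.

    // Minimal sketch: lower bound on the distance from query point p to any
    // point of an axis-aligned rectangle, computed by clamping p into the box.
    // Rect/Point and their fields are hypothetical names, not the course API.
    class Point {
        double x, y;
        Point(double x, double y) { this.x = x; this.y = y; }
    }

    class Rect {
        double minX, minY, maxX, maxY;
        Rect(double minX, double minY, double maxX, double maxY) {
            this.minX = minX; this.minY = minY; this.maxX = maxX; this.maxY = maxY;
        }

        // Distance from p to the nearest point of the rectangle (0 if p is inside);
        // this computes the same value as the slide's case-by-case lowerbound().
        double lowerbound(Point p) {
            double cx = Math.max(minX, Math.min(p.x, maxX));  // clamp x into [minX, maxX]
            double cy = Math.max(minY, Math.min(p.y, maxY));  // clamp y into [minY, maxY]
            double dx = p.x - cx, dy = p.y - cy;
            return Math.sqrt(dx * dx + dy * dy);
        }
    }

The clamp covers all nine regions around the rectangle (inside, four sides, four corners) in one expression, which is convenient because lowerbound() runs once at every node the search visits.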
Nearest Neighbor: Pruned version

    float lowerbound(Rectangle r, Point p) {
        if (r.inside(p)) return 0;
        if (r.left(p))   return r.minX - p.x;
        ...
    }

    Result process(KDNode k, int cd, Rectangle r, Result res) {
        if (k == null) return res;
        if (lowerbound(r, query) >= res.distance) return res;
        ...
    }

• If the lower bound is larger than the best distance computed so far, exit!
• Otherwise compute the distance between the query and the current node.
• Process the two children in order!

Nearest Neighbour: Pruned Version

[Pages 6 through 13 of the original slides step through the pruned search on a worked example; only the slide titles survive in this text preview.]

Nearest Neighbor: Pruned version

    float lowerbound(Rectangle r, Point p) {
        if (r.inside(p))          return 0;
        if (r.left(p))            return r.minX - p.x;
        if (r.right(p))           return p.x - r.maxX;
        if (r.southEastCorner(p)) ...
        ...
    }

    Result process(KDNode k, int cd, Rectangle r, Result res) {
        if (k == null) return res;
        if (lowerbound(r, query) >= res.distance) return res;
        float dist = distance(k.data, query);
        if (dist < res.distance) {
            res.point    = k.data;
            res.distance = dist;
        }
        if (query[cd] < k.data[cd]) ... else ...
        return res;
    }

Nearest Neighbor: Pruned version

    Result process(KDNode k, int cd, Rectangle r, Result res) {
        if (k == null) return res;                                // as before
        if (lowerbound(r, query) >= res.distance) return res;     // as before
        float dist = distance(k.data, query);
        if (dist < res.distance) {
            res.point    = k.data;
            res.distance = dist;
        }
        if (query[cd] < k.data[cd]) {
            // query lies on the left of the splitting plane: search that side first
            res = process(k.left,  cd + 1, r.trimLeft(cd, k.data),  res);
            res = process(k.right, cd + 1, r.trimRight(cd, k.data), res);
        }
        else {
            // mirror case: search the right side first
            res = process(k.right, cd + 1, r.trimRight(cd, k.data), res);
            res = process(k.left,  cd + 1, r.trimLeft(cd, k.data),  res);
        }
        return res;
    }
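To show how the pieces on the last two slides fit together, here is one self-contained sketch of the pruned search: prune a subtree as soon as the lower bound to its rectangle is at least the best distance found so far, update the best answer with the current node, and descend first into the child on the query's side of the splitting plane. It is a sketch under assumptions: points are plain double arrays, the node's rectangle is kept as a pair of coordinate arrays lo/hi instead of a Rectangle object with trimLeft/trimRight, and the helper names are invented for this example rather than taken from the course code.

    // Hedged sketch of the pruned nearest-neighbor search described on the slides.
    // KDNode, Result, lo/hi boxes, and the helper names are illustrative assumptions.
    class KDNode {
        double[] data;          // the point stored at this node
        KDNode left, right;
        KDNode(double[] data) { this.data = data; }
    }

    class Result {
        double distance = Double.POSITIVE_INFINITY;  // best distance found so far
        double[] point  = null;                      // best point found so far
    }

    class NearestNeighbor {
        static double distance(double[] a, double[] b) {
            double s = 0;
            for (int i = 0; i < a.length; i++) { double d = a[i] - b[i]; s += d * d; }
            return Math.sqrt(s);
        }

        // Lower bound: distance from q to the nearest point of the box [lo, hi],
        // computed by clamping q into the box coordinate by coordinate.
        static double lowerbound(double[] lo, double[] hi, double[] q) {
            double s = 0;
            for (int i = 0; i < q.length; i++) {
                double c = Math.max(lo[i], Math.min(q[i], hi[i]));
                double d = q[i] - c;
                s += d * d;
            }
            return Math.sqrt(s);
        }

        static Result process(KDNode k, int cd, double[] lo, double[] hi,
                              double[] q, Result res) {
            if (k == null) return res;
            if (lowerbound(lo, hi, q) >= res.distance) return res;   // prune whole subtree

            double dist = distance(k.data, q);
            if (dist < res.distance) { res.point = k.data; res.distance = dist; }

            int dim = q.length;
            double split = k.data[cd];
            // Children's boxes: trim the current box at the splitting value.
            double[] hiLeft  = hi.clone(); hiLeft[cd]  = split;   // left child's box
            double[] loRight = lo.clone(); loRight[cd] = split;   // right child's box

            if (q[cd] < split) {           // query is on the left: search it first
                res = process(k.left,  (cd + 1) % dim, lo, hiLeft,  q, res);
                res = process(k.right, (cd + 1) % dim, loRight, hi, q, res);
            } else {                       // otherwise search the right side first
                res = process(k.right, (cd + 1) % dim, loRight, hi, q, res);
                res = process(k.left,  (cd + 1) % dim, lo, hiLeft,  q, res);
            }
            return res;
        }
    }

Visiting the query's own side first tends to shrink res.distance early, so the lowerbound() test can prune the far side more often; this is the point of "process the two children in order" on the slides.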

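A small driver, continuing the assumed names from the sketch above, shows how such a search would be started: the best result begins at infinite distance, and process() is called on the root with the bounding box of the whole data set. The tree below reuses the example coordinates mentioned earlier on the slides plus one extra made-up point.

    // Hypothetical usage of the sketch above (all names are still assumptions).
    public class Demo {
        public static void main(String[] args) {
            KDNode root = new KDNode(new double[] {35, 90});       // splits on x
            root.left  = new KDNode(new double[] {10, 50});        // x < 35 (made-up point)
            root.right = new KDNode(new double[] {70, 80});        // x >= 35, splits on y
            root.right.left = new KDNode(new double[] {70, 30});   // y < 80

            double[] lo = {0, 0}, hi = {100, 100};                 // bounding box of the data
            double[] query = {60, 35};

            Result best = new Result();                            // distance starts at infinity
            best = NearestNeighbor.process(root, 0, lo, hi, query, best);
            System.out.println("nearest: (" + best.point[0] + ", " + best.point[1]
                               + ") at distance " + best.distance);
        }
    }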
