CMU CS 15780 - lecture

15-780: Grad AI
Lecture 16: Probability
Geoff Gordon (this lecture), Tuomas Sandholm
TAs: Erik Zawadzki, Abe Othman

Randomness in search: Rapidly-exploring Random Trees (RRTs)
Break up the configuration space (C-space) into Voronoi regions around random landmarks.
Invariant: the landmarks always form a tree
‣ known path to root
Subject to this requirement, landmarks are placed in a way that tends to split large Voronoi regions
‣ coarse-to-fine search
Goal: feasibility, not optimality.

RRT: required subroutines
RANDOM_CONFIG()
‣ samples from C-space
EXTEND(q, q')
‣ local controller: heads toward q' from q
‣ stops before hitting an obstacle (and perhaps also after a bound on time or distance)
FIND_NEAREST(q, Q)
‣ searches the current tree Q for a point near q

Path planning with RRTs [Kuffner & LaValle, ICRA '00]

    BUILD_RRT(qinit) {
        T = qinit
        for k = 1 to K {
            qrand = RANDOM_CONFIG()
            EXTEND(T, qrand)
        }
    }

    EXTEND(T, q) {
        qnear = FIND_NEAREST(q, T)
        qnew = EXTEND(qnear, q)
        T = T + (qnear, qnew)
    }

(The slides animate one iteration: sample qrand, find the nearest tree vertex qnear, run the local controller from qnear toward qrand to get qnew, and add the edge (qnear, qnew) to T.)

RRT example: planar holonomic robot.

RRTs explore coarse to fine
They tend to break up large Voronoi regions
‣ higher probability of qrand falling in them
The limiting distribution of vertices is given by RANDOM_CONFIG
‣ as the RRT grows, the probability that qrand is reachable with the local controller (and so immediately becomes a new vertex) approaches 1

RRT example: RRT for a car (3 dof).

Planning with RRTs
Build an RRT from the start until we add a node that can reach the goal using the local controller.
(Unique) path: root → last node → goal.
Optional: "rewire" the tree during growth by testing connectivity to more than just the closest node.
Optional: grow forward and backward.

Probability

Basic vocabulary: random variables, atomic events, sample space, events, combining events.

Probability
Measure:
‣ disjoint union: P(A ∪ B) = P(A) + P(B) when A and B are disjoint
Distribution: assigns a probability to each value of a random variable.
(The slide's worked examples and interpretations are not recoverable from this preview.)

Example
Joint distribution of Weather and AAPL price:

             up    same   down
    sun     0.09   0.15   0.06
    rain    0.21   0.35   0.14

Bigger example
Joint distribution over three RVs: Location (LAX or PIT), Weather, and AAPL price (the two tables together sum to 1):

    LAX      up    same   down
    sun     0.03   0.05   0.02
    rain    0.07   0.12   0.05

    PIT      up    same   down
    sun     0.14   0.23   0.09
    rain    0.06   0.10   0.04

Notation
X=x: the event that r.v. X is realized as value x.
P(X=x) means the probability of the event X=x
‣ if clear from context, we may omit "X="
‣ instead of P(Weather=rain), just P(rain)
‣ complex events too: e.g., P(X=x, Y≠y)
P(X) means a function: x → P(X=x).

Functions of RVs
Extend the definition: any deterministic function of RVs is also an RV.
E.g., "profit" as a deterministic function of Weather and AAPL price (the slide's table values are not cleanly recoverable from this preview).

Sample vs. population
Suppose we watch for 100 days and count up our observations:

    population (P)       up    same   down
    sun                 0.09   0.15   0.06
    rain                0.21   0.35   0.14

    counts (100 days)    up    same   down
    sun                   7     12      3
    rain                 22     41     15

Law of large numbers (simple version)
If we take a sample of size N from distribution P, count up the frequencies of the atomic events, and normalize (divide by N) to get a distribution P̃, then P̃ → P as N → ∞.

Working w/ distributions
‣ Marginals (eliminate an irrelevant RV)
‣ Conditionals (incorporate an observation)
‣ Joint (before marginalizing or conditioning)

Marginals
From the Weather × AAPL price table above: P(sun) = 0.30, P(rain) = 0.70; P(up) = 0.30, P(same) = 0.50, P(down) = 0.20.

Law of total probability ("sum rule")
Two RVs, X and Y, where Y has values y1, y2, …, yk:
P(X) = P(X, Y=y1) + P(X, Y=y2) + … + P(X, Y=yk)

Conditioning on an observation
Two steps:
‣ enforce consistency
‣ renormalize
Example: Weather × Coin

             H      T
    sun     0.15   0.15
    rain    0.35   0.35

Conditioning on Coin=H: keep the H column (0.15, 0.35) and renormalize by its sum 0.50, giving P(sun | H) = 0.30 and P(rain | H) = 0.70.

Conditionals
Conditioning the three-RV joint above on Location gives P(Weather, AAPL price | LAX) and P(Weather, AAPL price | PIT): renormalize each location's table by its total probability (0.34 for LAX, 0.66 for PIT).

Conditionals
Thought experiment: what happens if we condition on an event of zero probability? (It is undefined: the renormalization step would divide by zero.)
P(X | Y) is a function: x, y → P(X=x | Y=y)
So:
‣ P(X | Y) P(Y) means the function x, y → P(X=x | Y=y) P(Y=y)

Conditionals in literature
"When you have eliminated the impossible, whatever remains, however improbable, must be the truth." —Sir Arthur Conan Doyle, as Sherlock Holmes
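The BUILD_RRT/EXTEND pseudocode above can be sketched in runnable form. This is a minimal illustration only: it assumes a 2D unit-square C-space with no obstacles, a fixed step size, and a linear-scan nearest-neighbor search, none of which are specified in the slides; the helper names mirror the slides' subroutines.

```python
import math
import random

def random_config():
    # RANDOM_CONFIG: sample uniformly from a hypothetical [0, 1]^2 C-space.
    return (random.random(), random.random())

def find_nearest(q, tree):
    # FIND_NEAREST: linear scan over the current tree's vertices.
    return min(tree, key=lambda v: math.dist(v, q))

def local_extend(q_near, q_rand, step=0.05):
    # Local controller: head from q_near toward q_rand, moving at most `step`.
    # (No obstacles here, so it never has to stop early.)
    d = math.dist(q_near, q_rand)
    if d <= step:
        return q_rand
    t = step / d
    return (q_near[0] + t * (q_rand[0] - q_near[0]),
            q_near[1] + t * (q_rand[1] - q_near[1]))

def build_rrt(q_init, K=500):
    # Tree stored as vertex -> parent; the invariant gives a known path to root.
    tree = {q_init: None}
    for _ in range(K):
        q_rand = random_config()
        q_near = find_nearest(q_rand, tree)
        q_new = local_extend(q_near, q_rand)
        tree[q_new] = q_near      # add edge (q_near, q_new)
    return tree

tree = build_rrt((0.5, 0.5))
```

Following a vertex's parent pointers back to the root recovers the unique path the planner returns once some vertex can reach the goal.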
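The marginalization step ("eliminate an irrelevant RV" via the sum rule) can be shown on the Weather × AAPL price example table; the dict encoding and function name below are illustrative choices, not from the lecture.

```python
# Joint distribution P(Weather, Price) from the lecture's example table.
joint = {
    ("sun",  "up"): 0.09, ("sun",  "same"): 0.15, ("sun",  "down"): 0.06,
    ("rain", "up"): 0.21, ("rain", "same"): 0.35, ("rain", "down"): 0.14,
}

def marginal(joint, axis):
    # Sum rule: eliminate the other RV by summing it out.
    out = {}
    for outcome, pr in joint.items():
        key = outcome[axis]
        out[key] = out.get(key, 0.0) + pr
    return out

p_weather = marginal(joint, 0)   # P(Weather): sun 0.30, rain 0.70
p_price = marginal(joint, 1)     # P(Price): up 0.30, same 0.50, down 0.20
```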
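The two conditioning steps (enforce consistency, then renormalize) map directly onto code. This sketch also makes the zero-probability thought experiment concrete: the renormalizer is P(observation), so conditioning on a zero-probability event divides by zero and is left undefined. The encoding is the same illustrative dict form as above.

```python
# Joint distribution P(Weather, Price) from the lecture's example table.
joint = {
    ("sun",  "up"): 0.09, ("sun",  "same"): 0.15, ("sun",  "down"): 0.06,
    ("rain", "up"): 0.21, ("rain", "same"): 0.35, ("rain", "down"): 0.14,
}

def condition(joint, axis, value):
    # Step 1: enforce consistency -- keep only outcomes agreeing with the observation.
    consistent = {k: pr for k, pr in joint.items() if k[axis] == value}
    # Step 2: renormalize so the surviving probabilities sum to 1.
    z = sum(consistent.values())   # z = P(observation)
    if z == 0:
        raise ValueError("conditioning on a zero-probability event is undefined")
    return {k: pr / z for k, pr in consistent.items()}

# P(Price | Weather=rain): rain row (0.21, 0.35, 0.14) renormalized by 0.70.
p_given_rain = condition(joint, 0, "rain")
```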
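The simple law of large numbers statement can be checked empirically by sampling from the example joint distribution, just as in the sample-vs-population slide; the sample size and seed below are arbitrary illustrative choices.

```python
import random

# Joint distribution P(Weather, Price) from the lecture's example table.
joint = {
    ("sun",  "up"): 0.09, ("sun",  "same"): 0.15, ("sun",  "down"): 0.06,
    ("rain", "up"): 0.21, ("rain", "same"): 0.35, ("rain", "down"): 0.14,
}
outcomes = list(joint)
weights = [joint[o] for o in outcomes]

def empirical_distribution(n, seed=0):
    # Draw n atomic events from P, count frequencies, normalize by n to get P-tilde.
    rng = random.Random(seed)
    counts = {o: 0 for o in outcomes}
    for o in rng.choices(outcomes, weights=weights, k=n):
        counts[o] += 1
    return {o: c / n for o, c in counts.items()}

p_tilde = empirical_distribution(100_000)
max_error = max(abs(p_tilde[o] - joint[o]) for o in outcomes)
```

Rerunning with larger n makes max_error shrink, which is the P̃ → P convergence the slide asserts.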

