Berkeley COMPSCI 188 - CSPs II (2PP)

CS 188: Artificial Intelligence, Fall 2011
Lecture 5: CSPs II (9/8/2011)
Dan Klein – UC Berkeley
(Multiple slides over the course adapted from either Stuart Russell or Andrew Moore.)

Today
- Efficient solution of CSPs
- Local search

Reminder: CSPs
- CSPs have variables, domains, and constraints.
- Constraints can be implicit (provide code to compute) or explicit (provide a subset of the possible tuples), and unary / binary / N-ary.
- Goals: usually find any solution; sometimes find all solutions, find the best solution, etc.

Backtracking Search

Improving Backtracking
- General-purpose ideas give huge gains in speed ... but it's all still NP-hard.
- Ordering (last class): which variable should be assigned next? In what order should its values be tried?
- Filtering: can we detect inevitable failure early?
- Structure: can we exploit the problem structure?

Filtering: Forward Checking
- Idea: keep track of remaining legal values for unassigned variables (using immediate constraints).
- Idea: terminate when any variable has no legal values.
[Running example: map coloring of Australia with regions WA, NT, SA, Q, NSW, V; demo: forward checking animation]

Filtering: Constraint Propagation
- Forward checking propagates information from assigned to unassigned variables, but doesn't provide early detection for all failures: NT and SA cannot both be blue! Why didn't we detect this yet?
- Constraint propagation propagates from constraint to constraint.

Consistency of an Arc
- An arc X → Y is consistent iff for every x in the tail there is some y in the head which could be assigned without violating a constraint. Enforcing consistency deletes values from the tail!
- Forward checking = enforcing consistency of each arc pointing to the new assignment.

Arc Consistency of a CSP
- A simple form of propagation makes sure all arcs are consistent.
- If X loses a value, (incoming) neighbors of X need to be rechecked!
- Arc consistency detects failure earlier than forward checking.
- What's the downside of enforcing arc consistency?
- Can be run as a preprocessor or after each assignment.

Establishing Arc Consistency
- Runtime: O(n^2 d^3), can be reduced to O(n^2 d^2).
- ... but detecting all possible future problems is NP-hard – why?
[Demo]

Limitations of Arc Consistency
- After running arc consistency:
  - Can have one solution left
  - Can have multiple solutions left
  - Can have no solutions left (and not know it)
- What went wrong here?

K-Consistency
- Increasing degrees of consistency:
  - 1-consistency (node consistency): each single node's domain has a value which meets that node's unary constraints.
  - 2-consistency (arc consistency): for each pair of nodes, any consistent assignment to one can be extended to the other.
  - K-consistency: for each k nodes, any consistent assignment to k-1 of them can be extended to the k-th node.
- Higher k is more expensive to compute.
- (You need to know the k = 2 algorithm; a sketch follows below.)

Strong K-Consistency
- Strong k-consistency: also k-1, k-2, ..., 1 consistent.
- Claim: strong n-consistency means we can solve without backtracking! Why?
  - Choose any assignment to any variable.
  - Choose a new variable; by 2-consistency, there is a choice consistent with the first.
  - Choose a new variable; by 3-consistency, there is a choice consistent with the first 2.
  - ...
- Lots of middle ground between arc consistency and n-consistency! (e.g. path consistency)
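
The k = 2 algorithm above repeatedly deletes unsupported values from the tail of an arc and re-queues the arcs pointing into any variable that lost a value. A minimal Python sketch of that idea (essentially the classic AC-3 procedure) follows; the representation chosen here — a dict of domains, a neighbors dict, and a binary constraint test — and the names revise and ac3 are illustrative assumptions, not code from the lecture.

from collections import deque

def revise(domains, constraint, xi, xj):
    """Make the arc Xi -> Xj consistent by deleting unsupported values from the tail Xi."""
    revised = False
    for x in list(domains[xi]):
        # Keep x only if some y in the head Xj is compatible with it.
        if not any(constraint(xi, x, xj, y) for y in domains[xj]):
            domains[xi].remove(x)
            revised = True
    return revised

def ac3(domains, constraint, neighbors):
    """Enforce arc consistency on a binary CSP; return False if some domain is wiped out."""
    queue = deque((xi, xj) for xi in neighbors for xj in neighbors[xi])
    while queue:
        xi, xj = queue.popleft()
        if revise(domains, constraint, xi, xj):
            if not domains[xi]:
                return False              # early failure detection
            # Xi lost a value, so arcs pointing into Xi must be rechecked.
            for xk in neighbors[xi]:
                if xk != xj:
                    queue.append((xk, xi))
    return True

For the map-coloring example, domains would map each region to {'r', 'g', 'b'}, neighbors would encode the constraint graph's adjacency, and constraint(Xi, x, Xj, y) would simply test x != y; the worst-case cost of this version matches the O(n^2 d^3) figure quoted above.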
Problem Structure
- Tasmania and mainland are independent subproblems, identifiable as connected components of the constraint graph.
- Suppose each subproblem has c variables out of n total: worst-case solution cost is O((n/c) d^c), linear in n.
- E.g., n = 80, d = 2, c = 20: searching 2^80 states takes 4 billion years at 10 million nodes/sec, while (4)(2^20) states take 0.4 seconds at 10 million nodes/sec.

Tree-Structured CSPs
- Theorem: if the constraint graph has no loops, the CSP can be solved in O(n d^2) time. Compare to general CSPs, where worst-case time is O(d^n).
- This property also applies to probabilistic reasoning (later): an important example of the relation between syntactic restrictions and the complexity of reasoning.

Tree-Structured CSPs: the algorithm
- Choose a variable as root, and order the variables from root to leaves such that every node's parent precedes it in the ordering.
- For i = n : 2, apply RemoveInconsistent(Parent(Xi), Xi).
- For i = 1 : n, assign Xi consistently with Parent(Xi).
- Runtime: O(n d^2) (why? a sketch appears at the end of these notes).

Tree-Structured CSPs: why does this work?
- Claim: after processing the rightmost k nodes, given any satisfying assignment to the rest, the rightmost k can be assigned (left to right) without backtracking.
- Proof: induction on position.
- Why doesn't this algorithm work with loops?
- Note: we'll see this basic idea again with Bayes' nets.

Nearly Tree-Structured CSPs
- Conditioning: instantiate a variable, prune its neighbors' domains.
- Cutset conditioning: instantiate (in all ways) a set of variables such that the remaining constraint graph is a tree.
- Cutset size c gives runtime O(d^c (n - c) d^2), very fast for small c.

Tree Decompositions*
- Create a tree-structured graph of overlapping subproblems; each subproblem is a mega-variable.
- Solve each subproblem to enforce local constraints.
- Solve the CSP over subproblem mega-variables using our efficient tree-structured CSP algorithm.
[Figure: mega-variables M1–M4 cover {WA, SA, NT}, {NT, SA, Q}, {SA, Q, NSW}, {SA, NSW, V}; each mega-variable's domain is the set of locally consistent tuples, e.g. M1 ∈ {(WA=r, SA=g, NT=b), (WA=b, SA=r, NT=g), …} and M2 ∈ {(NT=r, SA=g, Q=b), (NT=b, SA=g, Q=r), …}; neighboring mega-variables must agree on shared variables, e.g. (M1, M2) ∈ {((WA=g, SA=g, NT=g), (NT=g, SA=g, Q=g)), …}]

Iterative Algorithms for CSPs
- Local search methods typically work with "complete" states, i.e., all variables assigned.
- To apply to CSPs: start with some assignment with unsatisfied constraints, and let operators reassign variable values. No fringe! Live on the edge.
- Variable selection: randomly select any conflicted variable.
- Value selection by the min-conflicts heuristic: choose a value that violates the fewest constraints, i.e., hill climb with h(n) = total number of violated constraints (a sketch follows below).

Example: 4-Queens
- States: 4 queens in 4 columns (4^4 = 256 states).
- Operators: move a queen within its column.
- Goal test: no attacks.
- Evaluation: c(n) = number of attacks.
[Demo]

Performance of Min-Conflicts
- Given a random initial state, min-conflicts can solve n-queens in almost constant time for arbitrary n with high probability (e.g., n =
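
To make the min-conflicts idea concrete on the n-queens formulation above (one queen per column, operators move a queen within its column), here is a small Python sketch; the state representation as a list of row indices and the names conflicts and min_conflicts are assumptions for illustration, not code from the lecture.

import random

def conflicts(rows, col, row):
    """Number of queens attacking a queen placed at (row, col), with one queen per column."""
    return sum(1 for c, r in enumerate(rows)
               if c != col and (r == row or abs(r - row) == abs(c - col)))

def min_conflicts(n, max_steps=100000):
    """Hill climbing with h = total number of attacks and random conflicted-variable selection."""
    rows = [random.randrange(n) for _ in range(n)]        # random complete assignment
    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(rows, c, rows[c]) > 0]
        if not conflicted:
            return rows                                    # goal test: no attacks
        col = random.choice(conflicted)                    # variable selection: random conflicted column
        # Value selection: the row that violates the fewest constraints.
        rows[col] = min(range(n), key=lambda r: conflicts(rows, col, r))
    return None                                            # step budget exhausted

Starting from a random complete assignment, min_conflicts(8) typically returns a valid 8-queens placement after only a few repair steps, which is the behavior the performance slide above is describing.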

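For the tree-structured CSP algorithm described earlier (a backward RemoveInconsistent pass from the leaves toward the root, then a forward assignment pass), a minimal Python version might look like the following; it assumes the constraint graph is already a tree, given as a root-to-leaf variable ordering plus a parent map, and the function names are illustrative rather than taken from the slides.

def remove_inconsistent(domains, constraint, parent, child):
    """Backward pass step: prune parent values with no consistent child value (delete from the tail)."""
    domains[parent] = {p for p in domains[parent]
                       if any(constraint(parent, p, child, c) for c in domains[child])}

def solve_tree_csp(order, parent, domains, constraint):
    """order lists variables from root to leaves; parent maps each non-root variable to its parent."""
    # Backward pass: for i = n..2, make the arc Parent(Xi) -> Xi consistent.
    for x in reversed(order[1:]):
        remove_inconsistent(domains, constraint, parent[x], x)
        if not domains[parent[x]]:
            return None                                    # a domain was wiped out: no solution
    # Forward pass: for i = 1..n, assign Xi consistently with Parent(Xi).
    assignment = {}
    for x in order:
        if x not in parent:                                # the root
            if not domains[x]:
                return None
            assignment[x] = next(iter(domains[x]))
        else:
            assignment[x] = next(v for v in domains[x]
                                 if constraint(parent[x], assignment[parent[x]], x, v))
    return assignment

Each arc is handled once per pass, and checking an arc compares at most d^2 value pairs, which is where the O(n d^2) bound comes from; with loops in the graph a node can be constrained by more than one earlier node, so making each parent arc consistent no longer guarantees a backtrack-free forward pass.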
