Posted on 10-Apr-2022
Soleymani
Constraint Satisfaction Problems
CE417: Introduction to Artificial Intelligence, Sharif University of Technology, Spring 2019
"Artificial Intelligence: A Modern Approach", 3rd Edition, Chapter 6. Some slides have been adopted from Klein and Abbeel, CS188, UC Berkeley.
Constraint Satisfaction Problems

Outline
- Constraint Satisfaction Problems (CSPs)
  - A representation for a wide variety of problems
  - CSP solvers can be faster than general state-space searchers
- Backtracking search for CSPs
- Inference in CSPs
- Problem structure
- Local search for CSPs
What is a CSP?
- In a CSP, the problem is to search for a set of values for the variables (features) such that the assigned values satisfy the constraints.
- CSPs yield a natural representation for a wide variety of problems.
- CSP search algorithms use general-purpose heuristics based on the structure of states.
What is a CSP?
- Components of a CSP:
  - X is a set of variables {X1, X2, …, Xn}
  - D is a set of domains {D1, D2, …, Dn}, where Di is the domain of Xi
  - C is a set of constraints {C1, C2, …, Cm}
    - Each constraint limits the values that variables can take (e.g., X1 ≠ X2)
- Solving a CSP:
  - State: an assignment of values to some or all of the variables
  - Solution (goal): a complete and consistent assignment
    - Consistent: an assignment that does not violate any constraint
    - Complete: all variables are assigned
CSP Examples

CSP: map coloring example
- Color the regions with three colors such that no two neighboring regions have the same color
- Variables corresponding to regions: X = {WA, NT, Q, NSW, V, SA, T}
- The domain of every variable is {red, green, blue}
- Constraints: C = {SA ≠ WA, SA ≠ NT, SA ≠ Q, SA ≠ NSW, SA ≠ V, WA ≠ NT, NT ≠ Q, Q ≠ NSW, NSW ≠ V}
- A solution: {WA = red, NT = green, Q = red, NSW = green, V = red, SA = blue, T = green}
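The map-coloring CSP above can be written down as plain data plus a consistency check. This is a hedged sketch (the variable names and layout are illustrative, not from the slides):

```python
# The binary "different colors" constraints of the Australia map.
neighbors = [("SA", "WA"), ("SA", "NT"), ("SA", "Q"), ("SA", "NSW"), ("SA", "V"),
             ("WA", "NT"), ("NT", "Q"), ("Q", "NSW"), ("NSW", "V")]

def is_solution(assignment):
    """A complete assignment is a solution iff no two neighbors share a color."""
    return all(assignment[a] != assignment[b] for a, b in neighbors)

solution = {"WA": "red", "NT": "green", "Q": "red",
            "NSW": "green", "V": "red", "SA": "blue", "T": "green"}
print(is_solution(solution))  # True
```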
Example: N-Queens
- Variables: {X1, X2, …, Xn} (Xi = row of the queen in column i)
- Domains: {1, 2, …, n}
- Constraints:
  - Implicit: ∀i, j: non_threatening(Xi, Xj)
  - Explicit: (Xi, Xj) ∈ {(1,3), (1,4), …, (8,6)}
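The implicit non_threatening constraint can be made concrete. A minimal sketch (function name taken from the slide; the row/column encoding is an assumption): queens in columns i and j with rows xi and xj attack each other iff they share a row or a diagonal.

```python
def non_threatening(xi, xj, i, j):
    """True iff queens at (row xi, col i) and (row xj, col j) do not attack."""
    return xi != xj and abs(xi - xj) != abs(i - j)

print(non_threatening(1, 3, 1, 2))  # True: different rows, not on a diagonal
print(non_threatening(1, 2, 1, 2))  # False: adjacent columns, adjacent rows -> diagonal
```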
Boolean satisfiability example
- Given a Boolean expression, is it satisfiable?
  - e.g., ((X1 ∧ X2) ⟹ X3) ∨ (¬X1 ∧ X3)
- Represent the Boolean expression in 3-CNF
  - 3-SAT: find a satisfying truth assignment for a 3-CNF formula
- Variables: X1, X2, …, Xn
- Domains: {true, false}
- Constraints: all clauses must be satisfied

Sudoku example
- Variables: each (open) square
- Domains: {1, 2, …, 9}
- Constraints:
  - 9-way alldiff for each row
  - 9-way alldiff for each column
  - 9-way alldiff for each region
Varieties of CSPs and Constraints

Varieties of CSPs
- Discrete variables
  - Finite domains
    - Size d means O(d^n) complete assignments
    - E.g., Boolean CSPs, including Boolean satisfiability (NP-complete)
  - Infinite domains (integers, strings, etc.)
    - E.g., job scheduling; variables are start/end times for each job
    - Linear constraints are solvable; nonlinear constraints are undecidable
- Continuous variables
  - E.g., start/end times for Hubble Telescope observations
  - Linear constraints are solvable in polynomial time by LP methods (see CS170 for a bit of this theory)
Varieties of Constraints
- Unary constraints involve a single variable (equivalent to reducing domains), e.g., SA ≠ green
- Binary constraints involve pairs of variables, e.g., SA ≠ WA
- Higher-order constraints involve 3 or more variables, e.g., cryptarithmetic column constraints
- Preferences (soft constraints):
  - E.g., red is better than green
  - Often representable by a cost for each variable assignment
  - Gives constrained optimization problems
  - (We'll ignore these until we get to Bayes' nets)
Cryptarithmetic example (TWO + TWO = FOUR)
- [figure: constraint hypergraph; alldiff is a global constraint]
- Variables: X = {F, T, U, W, R, O, C1, C2, C3}
- Domains: {0, 1, …, 9}
- Constraints:
  - alldiff(F, T, U, W, R, O)
  - O + O = R + 10 × C1
  - C1 + W + W = U + 10 × C2
  - C2 + T + T = O + 10 × C3
  - C3 = F, T ≠ 0, F ≠ 0
Example: Sudoku
- Variables: each (open) square
- Domains: {1, 2, …, 9}
- Constraints:
  - 9-way alldiff for each row
  - 9-way alldiff for each column
  - 9-way alldiff for each region
  - (or can have a bunch of pairwise inequality constraints)
Real-World CSPs
- Assignment problems: e.g., who teaches what class
- Timetabling problems: e.g., which class is offered when and where?
- Hardware configuration
- Transportation scheduling
- Factory scheduling
- Circuit layout
- Fault diagnosis
- … lots more!
- Many real-world problems involve real-valued variables…
Solving CSPs

Solving CSPs as a standard search problem (incremental formulation)
- Initial state: the empty assignment { }
- Actions: assign a value to an unassigned variable that does not conflict with the current assignment
- Goal test: the assignment is consistent and complete
- Path cost: not important

We'll start with the straightforward, naïve approach, then improve it.
Properties of CSPs as a standard search problem
- Generic problem formulation: the same formulation works for all CSPs
- Every solution appears at depth n (with n variables)
  - Which search algorithm is appropriate? Depth-limited search
- Branching factor is n·d at the top level, b = (n − ℓ)·d at depth ℓ, hence there are n!·d^n leaves
  - However, there are only d^n complete assignments
Assignment commutativity
- When assigning values to variables, we reach the same partial assignment regardless of the order of the variables
- [figure: two branches of the search tree reaching the same partial assignment over {WA, NT} in either order; the first level has n × d children, the second (n × d) × ((n − 1) × d)]
- Equal! There are n! × d^n leaves in the tree, but only d^n distinct states!
Backtracking Search

Backtracking search
- Backtracking search is the basic uninformed algorithm for solving CSPs
- Idea 1: one variable at a time
  - Variable assignments are commutative, so fix an ordering
  - I.e., [WA = red then NT = green] is the same as [NT = green then WA = red]
  - Only need to consider assignments to a single variable at each step
- Idea 2: check constraints as you go
  - I.e., consider only values that do not conflict with previous assignments
  - Might have to do some computation to check the constraints
  - "Incremental goal test"
- Depth-first search with these two improvements is called backtracking search (not the best name)
- Can solve n-queens for n ≈ 25
Backtracking search
- Depth-first search for CSPs with single-variable assignments is called backtracking search
  - Assigns one variable at each level (eventually they all have to be assigned)
- Naïve backtracking is not generally efficient for solving CSPs
  - More heuristics will be introduced later to speed it up

Backtracking search
- Nodes are partial assignments
  - Incremental completion
  - Each partial candidate is the parent of all candidates that differ from it by a single extension step
- Traverses the search tree in depth-first order
- At each node c:
  - If c cannot be completed to a valid solution, the whole subtree rooted at c is skipped (unpromising branches are pruned)
  - Otherwise, the algorithm (1) checks whether c itself is a valid solution and, if so, returns it; and (2) recursively enumerates all subtrees of c
Search tree
- Variable assignments in the order: WA, NT, Q, …
- [figure: map of Australia (WA, NT, SA, Q, NSW, V, T) and the corresponding search tree]
General backtracking search

function BACKTRACK(v) returns a solution, or failure
  if there is a solution at v then return that solution
  for each child u of v do
    if PROMISING(u) then
      result ← BACKTRACK(u)
      if result ≠ failure then return result
  return failure

Backtracking for CSPs

function BACKTRACK(assignment, csp) returns an assignment, or failure
  if assignment is complete then return assignment
  var ← select an unassigned variable
  for each value in DOMAIN(var) do
    if CONSISTENT(assignment ∪ {var = value}, csp) then
      result ← BACKTRACK(assignment ∪ {var = value}, csp)
      if result ≠ failure then return result
  return failure
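The CSP version of BACKTRACK can be sketched as runnable Python. This is a hedged, minimal sketch: the data layout (domains as a dict, binary constraints as predicates keyed by variable pairs) is an assumption for illustration, not the slides' exact code.

```python
def consistent(assignment, constraints):
    """True iff no binary constraint over two assigned variables is violated."""
    return all(pred(assignment[u], assignment[v])
               for (u, v), pred in constraints.items()
               if u in assignment and v in assignment)

def backtrack(assignment, variables, domains, constraints):
    if len(assignment) == len(variables):
        return assignment
    var = next(v for v in variables if v not in assignment)  # naive selection
    for value in domains[var]:
        assignment[var] = value
        if consistent(assignment, constraints):
            result = backtrack(assignment, variables, domains, constraints)
            if result is not None:
                return result
        del assignment[var]  # undo and try the next value
    return None  # failure

# Tiny map-coloring instance: two adjacent regions, two colors.
variables = ["A", "B"]
domains = {"A": ["red", "blue"], "B": ["red", "blue"]}
constraints = {("A", "B"): lambda a, b: a != b}
print(backtrack({}, variables, domains, constraints))  # {'A': 'red', 'B': 'blue'}
```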
4-Queens
- [figures over several slides: a backtracking search trace on the 4-queens board, ending in a solution]
Naïve backtracking (late failure)
- Map coloring with three colors
  - {WA = red, Q = blue} cannot be completed
  - However, backtracking search does not detect this before selecting the NT and SA variables
- [figure: search tree with WA = red, Q = blue, SA = green; variable NT then has no possible value, but this is not detected until trying to assign it a value]
Backtracking Search
- Backtracking = DFS + variable ordering + fail-on-violation
- What are the choice points?

Improving Backtracking
- General-purpose ideas give huge gains in speed
- Ordering:
  - Which variable should be assigned next?
  - In what order should its values be tried?
- Filtering: can we detect inevitable failure early?
- Structure: can we exploit the problem structure?
Constraint Graphs
- Binary CSP: each constraint relates (at most) two variables
- Binary constraint graph: nodes are variables, arcs show constraints
- General-purpose CSP algorithms use the graph structure to speed up search; e.g., Tasmania is an independent subproblem!

Filtering
- Filtering: keep track of the domains of unassigned variables and cross off bad options
- Filtering by inference (looking ahead) while solving CSPs
Filtering by inference (looking ahead) while solving CSPs
- Filtering as a preprocessing stage
  - AC-3 (arc consistency) algorithm
    - Removes values from the domains of variables (and propagates constraints) to make all constrained pairs of variables arc-consistent
- Filtering intertwined with search
  - Forward checking
    - When selecting a value for a variable, infers new domain reductions on neighboring unassigned variables
  - Maintaining Arc Consistency (MAC) — constraint propagation
    - Forward checking + recursively propagating constraints when changes are made to the domains
  - …
Forward Checking (FC)
- When selecting a value for a variable, infer new domain reductions on neighboring unassigned variables
- Terminate search when a variable has no legal value
- When X is assigned, FC establishes arc consistency for X
- [figures over several slides: the remaining domains after WA = red, then Q = green, then V = blue]
- Combining the MRV heuristic with FC is especially effective: FC incrementally computes exactly the information that MRV needs
- After the third assignment, SA has no legal value left ⟹ {WA = red, Q = green, V = blue} is an inconsistent partial assignment
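A single forward-checking step can be sketched as follows (hedged: the data layout and function name are illustrative). After assigning var = value, the assigned value is pruned from the domains of unassigned neighbors; an emptied domain signals a dead end early.

```python
def forward_check(var, value, domains, neighbors, assignment):
    """Prune `value` from unassigned neighbors of `var`.
    Returns the reduced domains, or None if some domain becomes empty."""
    new_domains = {v: set(d) for v, d in domains.items()}
    new_domains[var] = {value}
    for n in neighbors[var]:
        if n not in assignment:
            new_domains[n].discard(value)
            if not new_domains[n]:
                return None  # dead end detected early
    return new_domains

neighbors = {"WA": ["NT", "SA"], "NT": ["WA", "SA", "Q"],
             "SA": ["WA", "NT", "Q"], "Q": ["NT", "SA"]}
domains = {v: {"red", "green", "blue"} for v in neighbors}
d1 = forward_check("WA", "red", domains, neighbors, {"WA": "red"})
print(sorted(d1["NT"]))  # ['blue', 'green']
```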
Example: 4-Queens with forward checking
- [figures over several slides: board positions and the domains of X1..X4 after each assignment]
- Initially all domains are {1, 2, 3, 4}. After X1 = 1, FC prunes to X2 = {3, 4}, X3 = {2, 4}, X4 = {2, 3}. Trying X2 = 3 empties X3's domain (backtrack); trying X2 = 4 leaves X3 = {2}, after which X4's domain empties; so the search backtracks from X1 = 1.
- (The 4-Queens slides have been adopted from Dorr's slides for the CMSC-421 course.)
Filtering: shortcoming
- Forward checking propagates information from assigned to unassigned variables, but doesn't provide early detection for all failures:
  - NT and SA cannot both be blue!
  - Why didn't we detect this yet?
  - Constraint propagation: reason from constraint to constraint
- [figure: constraint graph over WA, NT, SA, Q, NSW, V]

Consistency of a Single Arc
- An arc X → Y is consistent iff for every x in X's domain there is some y in Y's domain that could be assigned without violating a constraint
- Delete from the tail!
- Forward checking: enforcing consistency of the arcs pointing to each new assignment
Arc consistency
- Xi is arc-consistent with respect to Xj if for every value in Di there is some consistent value in Dj
- Example:
  - Variables: X = {X1, X2}
  - Domains: {0, 1, 2, …, 9}
  - Constraint: X1 = X2²
  - Is X1 arc-consistent w.r.t. X2?
    - No; to be arc-consistent, Domain(X1) must shrink to {0, 1, 4, 9}
  - Is X2 arc-consistent w.r.t. X1?
    - No; to be arc-consistent, Domain(X2) must shrink to {0, 1, 2, 3}
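The X1 = X2² example can be checked mechanically with a REVISE-style step: keep in Di only the values that have some supporting value in Dj. A minimal sketch (the helper name mirrors the REVISE function used later for AC-3):

```python
def revise(di, dj, constraint):
    """Keep only values of di supported by some value of dj."""
    return {x for x in di if any(constraint(x, y) for y in dj)}

D = set(range(10))
print(sorted(revise(D, D, lambda x1, x2: x1 == x2 * x2)))  # X1 w.r.t. X2: [0, 1, 4, 9]
print(sorted(revise(D, D, lambda x2, x1: x1 == x2 * x2)))  # X2 w.r.t. X1: [0, 1, 2, 3]
```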
Arc consistency: cryptarithmetic example
- Consider the arc (R, O) under the constraint O + O = R + 10 × C1
- It is not arc-consistent; making it arc-consistent reduces Domain(R) to {0, 2, 4, 6, 8}
Arc Consistency of an Entire CSP
- A simple form of propagation makes sure all arcs are consistent
- Important: if X loses a value, the neighbors of X need to be rechecked!
- Arc consistency detects failure earlier than forward checking
- Can be run as a preprocessor or after each assignment
- What's the downside of enforcing arc consistency?
Arc consistency algorithm (AC-3)

function AC-3(csp) returns false if an inconsistency is found and true otherwise
  inputs: csp, a binary CSP with components (X, D, C)
  local variables: queue, a queue of arcs, initially all the arcs in csp
  while queue is not empty do
    (Xi, Xj) ← REMOVE-FIRST(queue)
    if REVISE(csp, Xi, Xj) then
      if size of Di = 0 then return false
      for each Xk in Xi.NEIGHBORS − {Xj} do
        add (Xk, Xi) to queue
  return true

function REVISE(csp, Xi, Xj) returns true iff we revise the domain of Xi
  revised ← false
  for each x in Di do
    if no value y in Dj allows (x, y) to satisfy the constraint between Xi and Xj then
      delete x from Di
      revised ← true
  return revised

REVISE makes Xi arc-consistent with respect to Xj.
AC-3 in words: for each arc (Xi, Xj) in the queue:
  1) Remove it from the queue and make Xi arc-consistent with respect to Xj
  2) If Di remains unchanged, continue with the next arc
  3) If |Di| = 0, return false
  4) Otherwise, for each neighbor Xk of Xi except Xj, add (Xk, Xi) to the queue
     (if the domain of Xi loses a value, the neighbors of Xi must be rechecked)
- Removing a value from a domain may cause further inconsistencies, so we have to repeat the procedure until everything is consistent
- When the queue is empty, the resulting CSP is equivalent to the original CSP
  - Same solutions (and the reduced domains usually speed up the search)
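The AC-3 pseudocode above translates directly into Python. A hedged, minimal sketch (binary constraints are given as predicates keyed by ordered pairs; this layout is an assumption for illustration), run on the X1 = X2² example:

```python
from collections import deque

def revise(domains, constraints, xi, xj):
    """Remove values of Di with no supporting value in Dj; True if changed."""
    pred = constraints[(xi, xj)]
    pruned = {x for x in domains[xi]
              if not any(pred(x, y) for y in domains[xj])}
    domains[xi] -= pruned
    return bool(pruned)

def ac3(domains, constraints, neighbors):
    queue = deque((xi, xj) for xi in neighbors for xj in neighbors[xi])
    while queue:
        xi, xj = queue.popleft()
        if revise(domains, constraints, xi, xj):
            if not domains[xi]:
                return False  # inconsistency found
            for xk in neighbors[xi]:
                if xk != xj:
                    queue.append((xk, xi))  # recheck neighbors of xi
    return True

domains = {"X1": set(range(10)), "X2": set(range(10))}
constraints = {("X1", "X2"): lambda a, b: a == b * b,
               ("X2", "X1"): lambda b, a: a == b * b}
neighbors = {"X1": ["X2"], "X2": ["X1"]}
ac3(domains, constraints, neighbors)
print(sorted(domains["X1"]), sorted(domains["X2"]))  # [0, 1, 4, 9] [0, 1, 2, 3]
```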
AC-3: time complexity
- Time complexity (n variables, c binary constraints, d domain size): O(c·d³)
  - Each arc (Xk, Xi) is inserted in the queue at most d times
    - At most all d values in the domain of Xi can be deleted
  - Checking consistency of an arc: O(d²)
Constraint propagation
- FC makes the current variable arc-consistent, but does not make all the other variables arc-consistent
  - NT and SA cannot both be blue!
  - FC does not look far enough ahead to find this inconsistency
- Maintaining Arc Consistency (MAC) — constraint propagation
  - Forward checking + recursively propagating constraints when domains change (similar to AC-3, but initially only the arcs related to the current variable are put in the queue)
- [figure: domains after WA = red and Q = green]
Limitations of Arc Consistency
- After enforcing arc consistency:
  - Can have one solution left
  - Can have multiple solutions left
  - Can have no solutions left (and not know it)
- Arc consistency still runs inside a backtracking search! What went wrong here?
Arc consistency: map coloring example
- For the general map-coloring problem, all pairs of variables are arc-consistent whenever |Di| ≥ 2 (i = 1, …, n)
  - Arc consistency can do nothing; it fails to make enough inferences
- We need a stronger notion of consistency to detect failure at the start
  - 3-consistency (path consistency): for any consistent assignment to each set of two variables, a consistent value can be assigned to any third variable
  - Both of the possible assignments to the set {WA, SA} are inconsistent with NT
Local consistency
- Node consistency (1-consistency): unary constraints are satisfied
- Arc consistency (2-consistency): binary constraints are satisfied
- Path consistency (3-consistency)

k-consistency
- Arc consistency does not detect all inconsistencies
  - The partial assignment {WA = red, NSW = red} is inconsistent
- A CSP is k-consistent if, for any set of k − 1 variables and for any consistent assignment to those variables, a consistent value can always be assigned to any k-th variable
  - E.g., 1-consistency = node consistency
  - E.g., 2-consistency = arc consistency
  - E.g., 3-consistency = path consistency
Strong k-consistency
- Strong k-consistency: also (k−1)-, (k−2)-, …, 1-consistent
- Claim: strong n-consistency means we can solve without backtracking!
- Why?
  - Choose any assignment to any variable
  - Choose a new variable
    - By 2-consistency, there is a choice consistent with the first
  - Choose a new variable
    - By 3-consistency, there is a choice consistent with the first two
  - …
- Lots of middle ground between arc consistency and n-consistency! (e.g., k = 3, called path consistency)

Which level of consistency?
- Trade-off between the time required to establish k-consistency and the amount of search space eliminated
  - If establishing consistency is slow, it can slow the search down to the point where no propagation at all is better
- Establishing k-consistency needs time and space exponential in k (in the worst case)
- Commonly 2-consistency is computed, and less commonly 3-consistency
Ordering

Ordering: Minimum Remaining Values
- Variable ordering — minimum remaining values (MRV): choose the variable with the fewest legal values left in its domain
- Why min rather than max?
  - Also called "most constrained variable"
  - "Fail-fast" ordering

Minimum Remaining Values (MRV)
- Chooses the variable with the fewest legal values
  - Fail first
- Also known as Most Constrained Variable (MCV)
- Most likely to cause a failure soon, thereby pruning the search tree
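The MRV selection rule is one line over the current domains. A hedged sketch (function name and data layout are illustrative):

```python
def select_mrv(domains, assignment):
    """Pick the unassigned variable with the fewest remaining legal values."""
    unassigned = [v for v in domains if v not in assignment]
    return min(unassigned, key=lambda v: len(domains[v]))

# After WA = red, NT has 2 legal values left while SA still has 3.
domains = {"WA": {"red"}, "NT": {"green", "blue"},
           "SA": {"green", "blue", "red"}}
print(select_mrv(domains, {"WA": "red"}))  # NT
```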
Degree heuristic
- Tie-breaker among MRV variables
- Degree heuristic: choose the variable involved in the most constraints with the remaining variables
  - To pick the variable that interferes with the others most!
  - Reduces the branching factor
- [figure: Australia constraint graph — SA has the highest degree]
Least constraining value
- Given a variable, choose the least constraining value:
  - the one that rules out the fewest values in the remaining variables
  - leaving maximum flexibility for subsequent variable assignments
- Fail last (try the most likely values first)
- Assumption: we only need one solution

Ordering: Least Constraining Value
- Value ordering: least constraining value
  - Note that it may take some computation to determine this! (e.g., rerunning filtering)
- Why least rather than most?
- Combining these ordering ideas makes 1000-queens feasible
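LCV ordering can be sketched by counting, for each candidate value, how many options it removes from unassigned neighbors (hedged: names and layout are illustrative):

```python
def order_lcv(var, domains, neighbors, assignment):
    """Sort var's values by how many neighbor options they rule out."""
    def ruled_out(value):
        return sum(value in domains[n]
                   for n in neighbors[var] if n not in assignment)
    return sorted(domains[var], key=ruled_out)

domains = {"Q": {"red", "blue"}, "NT": {"red"}, "NSW": {"red", "blue"}}
neighbors = {"Q": ["NT", "NSW"]}
# 'red' would remove two neighbor options, 'blue' only one -> try blue first.
print(order_lcv("Q", domains, neighbors, {}))  # ['blue', 'red']
```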
Inference during the search process
- It can be more powerful than inference in the preprocessing stage
- Interleaving search and inference

Solving CSPs efficiently
- Which variable should be assigned next?
  - SELECT-UNASSIGNED-VARIABLE
- In what order should the values of the selected variable be tried?
  - ORDER-DOMAIN-VALUES
- What inferences should be performed at each step of the search?
  - INFERENCE
CSP backtracking search

function BACKTRACKING-SEARCH(csp) returns a solution, or failure
  return BACKTRACK({}, csp)

function BACKTRACK(assignment, csp) returns a solution, or failure
  if assignment is complete then return assignment
  var ← SELECT-UNASSIGNED-VARIABLE(csp)
  for each value in ORDER-DOMAIN-VALUES(var, assignment, csp) do
    if value is consistent with assignment then
      add {var = value} to assignment
      inferences ← INFERENCE(csp, var, value)
      if inferences ≠ failure then
        add inferences to assignment
        result ← BACKTRACK(assignment, csp)
        if result ≠ failure then return result
      remove {var = value} and inferences from assignment
  return failure
Constraint graph
- Nodes are variables, arcs are constraints
- Enforcing local consistency in each part of the graph can cause inconsistent values to be eliminated
Problem Structure
- Extreme case: independent subproblems, identifiable as connected components of the constraint graph
  - Example: Tasmania and the mainland do not interact; the color of T is independent of those of the other regions
- Suppose a graph of n variables can be broken into subproblems of only c variables each:
  - Worst-case solution cost is O((n/c) · d^c), linear in n
  - E.g., n = 80, d = 2, c = 20, at 10 million nodes/sec:
    - 2^80 ≈ 4 billion years
    - (4)·(2^20) ≈ 0.4 seconds
Tree-structured CSPs
- Any two variables are connected by at most one path
- Theorem: if the constraint graph has no loops, the CSP can be solved in O(n·d²) time
  - Compare to general CSPs, where the worst-case time is O(d^n)

Tree-structured CSPs: topological ordering
- Construct a rooted tree (picking any variable to be the root, …)
- Order the variables from root to leaves such that every node's parent precedes it in the ordering (topological ordering)
Tree-Structured CSPs
- Algorithm for tree-structured CSPs:
  - Order: choose a root variable, and order the variables so that parents precede children
  - Remove backward: for i = n down to 2, apply MakeArcConsistent(Parent(Xi), Xi)
  - Assign forward: for i = 1 to n, assign Xi consistently with Parent(Xi)

Tree-structured CSP solver
- After running the backward loop, every arc from a parent to its child is arc-consistent
  - ⟹ if the constraint graph has no loops, the CSP can be solved in O(n·d²) time

  X ← topological sort
  for i = n down to 2 do
    Make-Arc-Consistent(Parent(Xi), Xi)   // removes from the domain of Parent(Xi) all values that would violate arc consistency
  for i = 1 to n do
    Xi ← any value in Di consistent with Xi's parent
function TREE-CSP-SOLVER(csp) returns a solution, or failure
  inputs: csp, a CSP with components X, D, C
  n ← number of variables in X
  assignment ← an empty assignment
  root ← any variable in X
  X ← TOPOLOGICAL-SORT(X, root)
  for j = n down to 2 do
    MAKE-ARC-CONSISTENT(PARENT(Xj), Xj)
    if it cannot be made consistent then return failure
  for i = 1 to n do
    assignment[Xi] ← any consistent value from Di
    if there is no consistent value then return failure
  return assignment

Runtime: O(n·d²) (why?)
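TREE-CSP-SOLVER can be sketched as runnable Python on a small chain-shaped instance. This is a hedged sketch: the tree is given explicitly as an order plus a parent map, and constraints are predicates keyed by (parent, child) — illustrative choices, not the slides' exact code.

```python
def tree_csp_solver(order, parent, domains, constraints):
    # Backward pass: make each parent arc-consistent w.r.t. its child.
    for child in reversed(order[1:]):
        p = parent[child]
        pred = constraints[(p, child)]
        domains[p] = [x for x in domains[p]
                      if any(pred(x, y) for y in domains[child])]
        if not domains[p]:
            return None  # failure: a domain was wiped out
    # Forward pass: assign each variable consistently with its parent.
    assignment = {}
    for v in order:
        p = parent.get(v)
        choices = [x for x in domains[v]
                   if p is None or constraints[(p, v)](assignment[p], x)]
        if not choices:
            return None
        assignment[v] = choices[0]
    return assignment

# Chain A - B - C with "different values" constraints, rooted at A.
order = ["A", "B", "C"]
parent = {"B": "A", "C": "B"}
domains = {"A": [1, 2], "B": [1], "C": [1, 2]}
constraints = {("A", "B"): lambda a, b: a != b,
               ("B", "C"): lambda b, c: b != c}
print(tree_csp_solver(order, parent, domains, constraints))
# {'A': 2, 'B': 1, 'C': 2}
```

Note how the backward pass prunes 1 from A's domain (since B can only be 1), so the forward pass never needs to backtrack.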
Tree-Structured CSPs
- Claim 1: after the backward pass, all root-to-leaf arcs are consistent
  - Proof: each X → Y was made consistent at some point, and Y's domain could not have been reduced thereafter (because Y's children were processed before Y)
- Claim 2: if all root-to-leaf arcs are consistent, the forward assignment will not backtrack
  - Proof: induction on position
- Why doesn't this algorithm work with cycles in the constraint graph?
- Note: we'll see this basic idea again with Bayes' nets

Reduction of general graphs into trees
- Removing nodes (cutset conditioning)
- Collapsing nodes together (tree decomposition)
Nearly Tree-Structured CSPs
- Conditioning: instantiate a variable, prune its neighbors' domains
- Cutset conditioning: instantiate (in all ways) a set of variables such that the remaining constraint graph is a tree

Cutset Conditioning
- Choose a cutset (e.g., {SA})
- Instantiate the cutset (in all possible ways)
- Compute the residual CSP for each assignment
- Solve the residual CSPs (tree-structured)

Cutset conditioning
- A cutset of size c gives runtime O(d^c · (n − c) · d²)
  - Very fast for small c
  - c can be as large as n − 2
- 1) Find a subset S such that the remaining graph becomes a tree
- 2) For each possible consistent assignment to S:
  - a) remove inconsistent values from the domains of the remaining variables
  - b) solve the remaining CSP, which has a tree structure
Tree Decomposition
- Create a tree-structured graph of overlapping subproblems (each subproblem becomes a mega-variable)
- Solve each subproblem (enforcing the local constraints)
- Solve the tree-structured CSP over the mega-variables

Tree Decomposition (requirements)
- Include all variables
- Each constraint must be in at least one subproblem
- If a variable is in two subproblems, it must be in all subproblems along the path between them

Tree Decomposition*
- Idea: create a tree-structured graph of mega-variables
- Each mega-variable encodes part of the original CSP
- Subproblems overlap to ensure consistent solutions
- [figure: mega-variables M1..M4 over the Australia map, e.g. M1 = (WA, SA, NT) with domain {(WA=r, SA=g, NT=b), (WA=b, SA=r, NT=g), …} and M2 = (NT, SA, Q) with domain {(NT=r, SA=g, Q=b), (NT=b, SA=g, Q=r), …}; each pair of internal ≠ constraints is kept inside its mega-variable, and neighboring mega-variables must agree on their shared variables]
Solving CSPs with local search algorithms
- In the CSP formulation as a search problem, the path is irrelevant, so we can use a complete-state formulation
  - State: an assignment of values to all variables
  - Successors(s): all states that result from s by choosing a new value for one variable
  - Cost function h(s): the number of violated constraints
  - Global minimum: h(s) = 0

Iterative Algorithms for CSPs
- Local search methods typically work with "complete" states, i.e., all variables assigned
- To apply to CSPs:
  - Take an assignment with unsatisfied constraints
  - Operators reassign variable values
- Algorithm: while not solved,
  - Variable selection: randomly select any conflicted variable
  - Value selection (min-conflicts heuristic): choose a value that violates the fewest constraints
  - I.e., hill climb with h(n) = −(total number of violated constraints)
function MIN-CONFLICTS(csp, max_steps) returns a solution, or failure
  inputs: csp, a constraint satisfaction problem
          max_steps, the number of steps allowed before giving up
  current ← an initial complete assignment for csp
  for i = 1 to max_steps do
    if current is a solution for csp then return current
    var ← a randomly chosen conflicted variable from csp.VARIABLES
    value ← the value v for var that minimizes CONFLICTS(var, v, current, csp)
    set var = value in current
  return failure

In words: if the current state is consistent, return it; otherwise choose a random conflicted variable v, and change v's assignment to a value that causes the minimum number of conflicts.
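MIN-CONFLICTS can be sketched for n-queens (queen i in column i, state[i] = its row). This is a hedged sketch: the random tie-breaking among minimal-conflict rows and the restart loop are implementation choices added here, not part of the pseudocode above.

```python
import random

def conflicts(state, col, row):
    """Number of queens attacking a queen placed at (row, col)."""
    return sum(1 for c, r in enumerate(state)
               if c != col and (r == row or abs(r - row) == abs(c - col)))

def min_conflicts(n, max_steps=10000, restarts=10):
    for seed in range(restarts):  # restart from a fresh random state if stuck
        rng = random.Random(seed)
        state = [rng.randrange(n) for _ in range(n)]
        for _ in range(max_steps):
            conflicted = [c for c in range(n) if conflicts(state, c, state[c])]
            if not conflicted:
                return state  # solution found
            col = rng.choice(conflicted)
            best = min(conflicts(state, col, r) for r in range(n))
            # Random tie-breaking helps the walk escape plateaus.
            state[col] = rng.choice([r for r in range(n)
                                     if conflicts(state, col, r) == best])
    return None

solution = min_conflicts(8)
print(solution is not None and
      all(conflicts(solution, c, solution[c]) == 0 for c in range(8)))
```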
Local search for CSPs
- Variable selection: randomly select any conflicted variable
- Value selection by the min-conflicts heuristic:
  - choose the value that violates the fewest constraints
  - i.e., hill climbing
- Given a random initial state, it can solve n-queens in almost constant time for arbitrary n with high probability
  - e.g., n = 1,000,000 in an average of 50 steps
- N-queens is easy for local search methods (while quite tricky for backtracking)
  - Solutions are very densely distributed in the space, and any initial assignment is guaranteed to have a solution nearby

8-Queens example
- [figure: two board states with their numbers of violated constraints h]

4-Queens example
- [figure: a min-conflicts trace on the 4-queens board]

Performance of Min-Conflicts
- Given a random initial state, min-conflicts can solve n-queens in almost constant time for arbitrary n with high probability (e.g., n = 10,000,000)!
- The same appears to be true for any randomly generated CSP, except in a narrow range of the ratio of constraints to variables
Summary
- CSP benefits:
  - Standard representation for many problems
  - Generic heuristics (no domain-specific expertise needed)
- CSP solvers (based on systematic search):
  - Basic solution: backtracking search
  - Speed-ups: ordering, filtering, structure
- Graph structure may be useful in solving CSPs efficiently
- Local search methods for CSPs: iterative min-conflicts is usually effective