Transcript of Course Outline 4.2: Searching with Problem-specific Knowledge

Page 1:

Course Outline 4.2

Searching with Problem-specific Knowledge

Presented by:

Wing Hang Cheung, Paul Anh Tri Huynh

Mei-Ki Maggie Yang, Lu Ye

CPSC 533 January 25, 2000

Page 2:

Chapter 4 - Informed Search Methods

4.1 Best-First Search

4.2 Heuristic Functions

4.3 Memory Bounded Search

4.4 Iterative Improvement Algorithms

Page 3:

4.1 Best-First Search

Best-first search is just GENERAL-SEARCH in which minimum-cost nodes are expanded first: we choose the node that appears to be best according to the evaluation function.

Two basic approaches:
Expand the node closest to the goal
Expand the node on the least-cost solution path

Page 4:

The Algorithm

function BEST-FIRST-SEARCH(problem, EVAL-FN) returns a solution sequence
  inputs: problem, a problem
          Eval-Fn, an evaluation function
  Queueing-Fn ← a function that orders nodes by EVAL-FN
  return GENERAL-SEARCH(problem, Queueing-Fn)
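As a concrete illustration (not from the original slides), here is a minimal Python sketch of best-first search: a priority queue keeps the frontier ordered by EVAL-FN, and the lowest-scoring node is expanded first. The successors and eval_fn callbacks are hypothetical problem-specific functions.

import heapq
from itertools import count

def best_first_search(start, goal_test, successors, eval_fn):
    # Frontier entries are (score, tiebreak, state, path); heapq pops the lowest score.
    tiebreak = count()
    frontier = [(eval_fn(start), next(tiebreak), start, [start])]
    visited = set()
    while frontier:
        _, _, state, path = heapq.heappop(frontier)
        if goal_test(state):
            return path
        if state in visited:
            continue
        visited.add(state)
        for succ in successors(state):
            if succ not in visited:
                heapq.heappush(frontier,
                               (eval_fn(succ), next(tiebreak), succ, path + [succ]))
    return None  # failure

Greedy search is then best_first_search(..., eval_fn=h), and A* uses eval_fn = g + h with g tracked along the path, as sketched later.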

Page 5:

Greedy Search

“ … minimize estimated cost to reach a goal”

a heuristic function calculates such cost estimates

h(n) = estimated cost of the cheapest path from the state at node n to a goal state

Page 6:

The Code

function GREEDY-SEARCH(problem) returns a solution or failure
  return BEST-FIRST-SEARCH(problem, h)

Required: h(n) = 0 if n is a goal state

Page 7:

Straight-line distance

The straight-line distance heuristic is a useful heuristic function for the route-finding problem.

hSLD(n) = straight-line distance between n and the goal location
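For instance, if each location has (x, y) map coordinates, hSLD can be computed directly. A small sketch (the coords table is a hypothetical input, not part of the slides):

import math

def h_sld(coords, n, goal):
    # Straight-line (Euclidean) distance between node n and the goal.
    (x1, y1), (x2, y2) = coords[n], coords[goal]
    return math.hypot(x2 - x1, y2 - y1)

# e.g. coords = {'A': (0, 0), 'I': (366, 0)}; h_sld(coords, 'A', 'I') == 366.0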

Page 8:

[Figure: a route-finding map with nodes A through I and edge costs 75, 118, 111, 140, 80, 97, 99, 101, and 211.]

State   h(n)
A       366
B       374
C       329
D       244
E       253
F       178
G       193
H        98
I         0

Greedy search expands A first, generating E (h = 253), C (h = 329), and B (h = 374); E has the lowest h, so it is expanded next.

Page 9:

[Figure: expanding E generates A (h = 366), F (h = 178), and G (h = 193); F has the lowest h and is expanded next, following the path A-E-F.]

Total distance = 253 + 178 = 431

Page 10:

[Figure: expanding F generates I (h = 0) and E (h = 253); I is the goal, so greedy search returns the path A-E-F-I.]

Total distance = 253 + 178 + 0 = 431

Page 11:

Optimality

A-E-F-I = 431 vs. A-E-G-H-I = 418

Greedy search found A-E-F-I (431), but the cheaper path A-E-G-H-I (418) exists, so greedy search is not optimal.

[Figure: the same map and h(n) table as on page 8.]

Page 12:

Completeness

Greedy Search is incomplete

Worst-case time complexity: O(b^m), where m is the maximum depth of the search space.

Straight-line distance example:

[Figure: a small graph with nodes A, B, C, D and h values h(A) = 0, h(B) = 5, h(C) = 7, h(D) = 6, with a marked starting node and target node; greedy search can oscillate between nodes and never reach the target.]

Page 13:

A* search

f(n) = g(n) + h(n)

h(n) = heuristic estimate of the cheapest cost from n to the goal
g(n) = cost of the path so far, as used in uniform-cost search

[Figure: the same map and h(n) table as on page 8.]

Page 14:

“Since g(n) gives the path cost from the start node to node n, and h(n) is the estimated cost of the cheapest path from n to the goal, we have…”

f(n) = estimated cost of the cheapest solution through n

Page 15:

f(n) = g(n) + h(n)

[Figure: the first steps of A*. The root A has f = 0 + 366 = 366. Expanding A generates B (f = 75 + 374 = 449), C (f = 118 + 329 = 447), and E (f = 140 + 253 = 393). E has the lowest f and is expanded, generating A (f = 280 + 366 = 646), F (f = 239 + 178 = 417), and G (f = 220 + 193 = 413).]

Page 16:

[Figure: A* continues. G (f = 220 + 193 = 413) is now the lowest-f node and is expanded, generating H (f = 317 + 98 = 415) and E (f = 300 + 253 = 553).]

Page 17:

[Figure: A* continues. H (f = 317 + 98 = 415) is expanded, generating the goal node I with f = 418 + 0 = 418.]

f(n) = g(n) + h(n), using the same h(n) table as before.

Page 18:

Remember earlier: greedy search found A-E-F-I = 431. A* finds the optimal path A-E-G-H-I, reaching the goal I with f = 418 + 0 = 418.

Page 19:

The Algorithm

function A*-SEARCH(problem) returns a solution or failure
  return BEST-FIRST-SEARCH(problem, g + h)
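A minimal Python sketch of A* over an explicit graph (assumptions: graph maps each node to a dict of {neighbor: edge cost}, and h is a dict of heuristic values such as the h(n) table above):

import heapq

def a_star(graph, h, start, goal):
    # Frontier entries are (f = g + h, g, node, path).
    frontier = [(h[start], 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for succ, cost in graph[node].items():
            g2 = g + cost
            if g2 < best_g.get(succ, float('inf')):
                best_g[succ] = g2
                heapq.heappush(frontier, (g2 + h[succ], g2, succ, path + [succ]))
    return None  # failure

With an admissible h (one that never overestimates), the first time the goal is popped its path is optimal.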

Page 20:

Chapter 4 - Informed Search Methods

4.1 Best-First Search

4.2 Heuristic Functions

4.3 Memory Bounded Search

4.4 Iterative Improvement Algorithms

Page 21:

HEURISTIC FUNCTIONS

Page 22:

OBJECTIVE

To calculate the cost estimates that guide a search algorithm.

Page 23:

IMPLEMENTATION

Greedy Search
A* Search
IDA*

Page 24:

EXAMPLES

straight-line distance to B
the 8-puzzle

Page 25:

EFFECTS

The quality of a given heuristic is determined by the effective branching factor b*. A b* close to 1 is ideal.

N = 1 + b* + (b*)^2 + . . . + (b*)^d

where N = total number of nodes expanded and d = solution depth.

Page 26:

EXAMPLE

If A* finds a solution at depth 5 using 52 nodes, then b* = 1.91.

The b* exhibited by a given heuristic is usually fairly constant over a large range of problem instances.
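The slide's figure can be checked numerically. A small sketch (assuming simple bisection) solves N = 1 + b* + (b*)^2 + … + (b*)^d for b*:

def effective_branching_factor(n_nodes, depth, tol=1e-6):
    # Solve N = 1 + b + b^2 + ... + b^d for b by bisection (total is increasing in b).
    def total(b):
        return sum(b ** i for i in range(depth + 1))
    lo, hi = 1.0, float(n_nodes)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if total(mid) < n_nodes:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

print(round(effective_branching_factor(52, 5), 2))  # 1.91, as on the slide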

Page 27:

A well-designed heuristic should have a b* close to 1, allowing fairly large problems to be solved.

Page 28:

NUMERICAL EXAMPLE

Fig. 4.8 Comparison of the search costs and effective branching factors for the IDA* and A* algorithms with h1 and h2. Data are averaged over 100 instances of the 8-puzzle, for various solution lengths.

Page 29:

INVENTING HEURISTIC FUNCTIONS

How ?

• Depends on the restrictions of a given problem

• A problem with fewer restrictions is known as a relaxed problem

Page 30:

INVENTING HEURISTIC FUNCTIONS

Fig. 4.7 A typical instance of the 8-puzzle.

Page 31:

INVENTING HEURISTIC FUNCTIONS

One problem: one often fails to get a single “clearly best” heuristic.

Given h1, h2, h3, …, hm, none of which dominates the others, which one should we choose?

h(n) = max(h1(n), … ,hm(n))
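In code this combination is just a pointwise maximum. A one-line sketch (misplaced_tiles and manhattan_distance are hypothetical 8-puzzle heuristics standing in for h1 and h2):

def combined_heuristic(state, heuristics):
    # The max of admissible heuristics is still admissible, and dominates each one.
    return max(h(state) for h in heuristics)

# e.g. h = lambda s: combined_heuristic(s, [misplaced_tiles, manhattan_distance])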

Page 32:

INVENTING HEURISTIC FUNCTIONS

Another way:

Perform experiments on randomly generated instances of a particular problem, gather the results, and decide on a heuristic based on the collected information.

Page 33:

HEURISTICS FOR CONSTRAINT SATISFACTION PROBLEMS

(CSPs)

most-constrained-variable

least-constraining-value

Page 34:

EXAMPLE

Fig 4.9 A map-coloring problem after the first two variables (A and B) have been selected.

Which country should we color next?

[Figure: the map with countries A through F; A and B are already colored.]

Page 35:

Chapter 4 - Informed Search Methods

4.1 Best-First Search

4.2 Heuristic Functions

4.3 Memory Bounded Search

4.4 Iterative Improvement Algorithms

Page 36:

4.3 MEMORY BOUNDED SEARCH

In this section, we investigate two algorithms that are designed to conserve memory

Depth   Nodes     Time            Memory
0       1         1 millisecond   100 bytes
2       111       .1 seconds      11 kilobytes
4       11,111    11 seconds      1 megabyte
6       10^6      18 minutes      111 megabytes
8       10^8      31 hours        11 gigabytes
10      10^10     128 days        1 terabyte
12      10^12     35 years        111 terabytes
14      10^14     3500 years      11,111 terabytes

Figure 3.12 Time and memory requirements for breadth-first search (assuming 1000 nodes expanded per second and 100 bytes per node).

Page 37:

Memory Bounded Search

1. IDA* (Iterative Deepening A*) search
   - a logical extension of ITERATIVE-DEEPENING SEARCH that uses heuristic information

2. SMA* (Simplified Memory Bounded A*) search

Page 38:

Iterative Deepening search

Iterative deepening is a kind of uninformed search strategy.

It combines the benefits of depth-first and breadth-first search:

advantages - optimal and complete, like breadth-first search
           - modest memory requirements, like depth-first search

Page 39:

IDA* (Iterative Deepening A*) search

Turning A* search into IDA* search: each iteration is a depth-first search that uses an f-cost limit rather than a depth limit.

Space requirement ~ worst case: b·f*/δ, where
  b - branching factor
  f* - optimal solution cost
  δ - smallest operator cost
  d - depth
~ in most cases, b·d is a good estimate of the storage requirement.

Time complexity -- because IDA* does not need to insert and delete nodes on a priority queue, its overhead per node can be much less than that of A*.

Page 40:

IDA* search

Each iteration expands all nodes inside the contour for the current f-cost, peeping over the contour to find the next contour line. Once the search inside a given contour has been completed, a new iteration is started using a new f-cost for the next contour.

[Figure: f-cost contours around the start node.]

Page 41:

IDA* Search Algorithm

function IDA*(problem) returns a solution sequence
  inputs: problem, a problem
  local variables: f-limit, the current f-COST limit
                   root, a node
  root ← MAKE-NODE(INITIAL-STATE[problem])
  f-limit ← f-COST(root)
  loop do
    solution, f-limit ← DFS-CONTOUR(root, f-limit)
    if solution is non-null then return solution
    if f-limit = ∞ then return failure
  end

----------------------------------------------------------------------

function DFS-CONTOUR(node, f-limit) returns a solution sequence and a new f-COST limit
  inputs: node, a node
          f-limit, the current f-COST limit
  local variables: next-f, the f-COST limit for the next contour, initially ∞
  if f-COST[node] > f-limit then return null, f-COST[node]
  if GOAL-TEST[problem](STATE[node]) then return node, f-limit
  for each node s in SUCCESSORS(node) do
    solution, new-f ← DFS-CONTOUR(s, f-limit)
    if solution is non-null then return solution, f-limit
    next-f ← MIN(next-f, new-f)
  end
  return null, next-f
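A compact Python rendering of the same contour scheme (a sketch assuming the same graph and h dictionaries as in the earlier A* example):

def ida_star(graph, h, start, goal):
    def dfs_contour(node, g, path, f_limit):
        # Returns (solution, new f-limit), mirroring DFS-CONTOUR above.
        f = g + h[node]
        if f > f_limit:
            return None, f
        if node == goal:
            return path, f_limit
        next_f = float('inf')
        for succ, cost in graph[node].items():
            if succ in path:  # avoid cycling back along the current path
                continue
            solution, new_f = dfs_contour(succ, g + cost, path + [succ], f_limit)
            if solution is not None:
                return solution, f_limit
            next_f = min(next_f, new_f)
        return None, next_f

    f_limit = h[start]
    while True:
        solution, f_limit = dfs_contour(start, 0, [start], f_limit)
        if solution is not None:
            return solution
        if f_limit == float('inf'):
            return None  # failure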

Page 42:

MEMORY BOUNDED SEARCH

1. IDA* (Iterative Deepening A*) search

2. SMA* (Simplified Memory Bounded A*) search
   - is similar to A*, but restricts the queue size to fit into the available memory

Page 43:

SMA* (Simplified Memory Bounded A*) Search

advantage of using more memory -- improved search efficiency

can make use of all available memory to carry out the search

it is cheaper to remember a node than to regenerate it when needed

Page 44:

SMA* search (cont.)

SMA* has the following properties

SMA* will utilize whatever memory is made available to it

SMA* avoids repeated states as far as its memory allows

SMA* is complete if the available memory is sufficient to store the shallowest solution path

Page 45:

SMA* search (cont.)

SMA* properties cont.

SMA* is optimal if enough memory is available to store the shallowest optimal solution path

when enough memory is available for the entire search tree, the search is optimally efficient

Page 46:

Progress of the SMA* search

Aim: find the lowest-cost goal node with enough memory.
Max Nodes = 3
A - root node (f = 12)
D, F, I, J - goal nodes

[Figure: the full search tree. A's children are B (15) and G (13); B's children are C (25) and D (20); C's children are E (35) and F (30); G's children are H (18) and I (24); H's children are J (24) and K (29). Each snapshot shows the at-most-three nodes in memory, labelled with their current f-cost; a value in parentheses is the backed-up f-cost of a forgotten subtree.]

• Memory is full: update A's f-cost to the minimum f-cost of its children.
• Expand G; drop the higher-f-cost leaf B; A memorizes B's f-cost (15).
• Memory is full again: H is not a goal node and is at the maximum reachable depth, so its f-cost is marked as ∞.
• Drop H and add I; G memorizes H; update G's f-cost from its remaining child, then update A's f-cost.
• I is a goal node, but it may not be the best solution.
• The path through G is not so great, so B is generated for the second time.
• Drop G and add C; A memorizes G.
• C is a non-goal node at the maximum reachable depth, so its f-cost is marked as ∞.
• Drop C and add D; B memorizes C.
• D is a goal node and the lowest-f-cost node, so the search terminates.
• What if J had a cost of 19 instead of 24?

Page 47:

SMA* search (cont.)

function SMA*(problem) returns a solution sequence
  inputs: problem, a problem
  local variables: Queue, a queue of nodes ordered by f-cost
  Queue ← MAKE-QUEUE({MAKE-NODE(INITIAL-STATE[problem])})
  loop do
    if Queue is empty then return failure
    n ← deepest least-f-cost node in Queue
    if GOAL-TEST(n) then return success
    s ← NEXT-SUCCESSOR(n)
    if s is not a goal and is at maximum depth then
      f(s) ← ∞
    else
      f(s) ← MAX(f(n), g(s) + h(s))
    if all of n's successors have been generated then
      update n's f-cost and those of its ancestors if necessary
    if SUCCESSORS(n) all in memory then remove n from Queue
    if memory is full then
      delete shallowest, highest-f-cost node in Queue
      remove it from its parent's successor list
      insert its parent on Queue if necessary
    insert s on Queue
  end

Page 48:

Chapter 4 - Informed Search Methods

4.1 Best-First Search

4.2 Heuristic Functions

4.3 Memory Bounded Search

4.4 Iterative Improvement Algorithms

Page 49:

ITERATIVE IMPROVEMENT ALGORITHMS

The most practical approach when:

All the information needed for a solution is contained in the state description itself

The path by which a solution is reached is not important

Advantage: saves memory by keeping track of only the current state

Two major classes: Hill-climbing (gradient descent)

Simulated annealing

Page 50:

Hill-Climbing Search

Only records the state and its evaluation instead of maintaining a search tree.

function HILL-CLIMBING(problem) returns a solution state
  inputs: problem, a problem
  local variables: current, a node
                   next, a node
  current ← MAKE-NODE(INITIAL-STATE[problem])
  loop do
    next ← a highest-valued successor of current
    if VALUE[next] < VALUE[current] then return current
    current ← next
  end
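The same loop as a Python sketch (the value and successors callbacks are hypothetical problem-specific functions; unlike the slide's version, this one also stops on a flat plateau rather than wandering):

import random

def hill_climbing(initial, successors, value):
    # Repeatedly move to a best neighbor until no neighbor is strictly better.
    current = initial
    while True:
        neighbors = successors(current)
        if not neighbors:
            return current
        best = max(value(n) for n in neighbors)
        if best <= value(current):
            return current
        # Break ties at random, as the next slide suggests.
        current = random.choice([n for n in neighbors if value(n) == best])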

Page 51:

Hill-Climbing Search

select at random when there is more than one best successor to choose from

Three well-known drawbacks:

Local maxima

Plateaux

Ridges

When no progress can be made, start from a new point.

Page 52:

Local Maxima

A peak lower than the highest peak in the state space

The algorithm halts when a local maximum is reached

Page 53:

Plateaux

Neighbors of the state are about the same height

A random walk will be generated

Page 54:

Ridges

No steeply sloping sides between the top and the peaks

The search makes little progress unless the top is directly reached

Page 55:

Random-Restart Hill-Climbing

Generates different starting points when no progress can be made from the previous point

Saves the best result

Can eventually find the optimal solution if enough iterations are allowed

The fewer local maxima, the quicker it finds a good solution

Page 56:

Simulated Annealing

Picks random moves

Keeps the move if the situation is actually improved; otherwise, makes the move with a probability less than 1

The number of cycles in the search is determined by the schedule

The search behaves like hill-climbing when approaching the end

The name comes from annealing, the process of gradually cooling a liquid

Page 57:

Simulated-Annealing Function

function SIMULATED-ANNEALING(problem, schedule) returns a solution state
  inputs: problem, a problem
          schedule, a mapping from time to “temperature”
  local variables: current, a node
                   next, a node
                   T, a “temperature” controlling the probability of downward steps
  current ← MAKE-NODE(INITIAL-STATE[problem])
  for t ← 1 to ∞ do
    T ← schedule[t]
    if T = 0 then return current
    next ← a randomly selected successor of current
    ΔE ← VALUE[next] - VALUE[current]
    if ΔE > 0 then current ← next
    else current ← next only with probability e^(ΔE/T)
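The same procedure as a Python sketch (the geometric cooling schedule at the end is an assumption, as are the successors and value callbacks):

import math
import random

def simulated_annealing(initial, successors, value, schedule):
    # Always accept uphill moves; accept downhill moves with probability e^(dE/T).
    current = initial
    t = 1
    while True:
        T = schedule(t)
        if T <= 0:
            return current
        nxt = random.choice(successors(current))
        delta_e = value(nxt) - value(current)
        if delta_e > 0 or random.random() < math.exp(delta_e / T):
            current = nxt
        t += 1

# Example schedule: temperature 100 * 0.99^t, treated as 0 after 2000 steps.
schedule = lambda t: 0 if t > 2000 else 100 * 0.99 ** t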

Page 58:

Applications in Constraint Satisfaction Problems

General algorithms for constraint satisfaction problems:

assign values to all variables

apply modifications to the current configuration by assigning different values to variables, moving toward a solution

Example problem: the 8-queens problem (the definition of the 8-queens problem is on p. 64 of the text)

Page 59:

An 8-queens Problem

Algorithm chosen:

the min-conflicts heuristic repair method

Algorithm Characteristics:

repairs inconsistencies in the current configuration

selects a new value for a variable that results in the minimum number of conflicts with other variables
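A Python sketch of this repair loop for the 8-queens problem (one queen per column, rows[c] giving the row of the queen in column c; all names are hypothetical):

import random

def conflicts(rows, col, row):
    # Number of queens that would attack a queen placed at (col, row).
    return sum(1 for c, r in enumerate(rows)
               if c != col and (r == row or abs(r - row) == abs(c - col)))

def min_conflicts_8queens(max_steps=10000):
    rows = [random.randrange(8) for _ in range(8)]  # random initial configuration
    for _ in range(max_steps):
        conflicted = [c for c in range(8) if conflicts(rows, c, rows[c]) > 0]
        if not conflicted:
            return rows  # consistent solution found
        col = random.choice(conflicted)
        # Move the queen in this column to a row with the fewest conflicts.
        counts = [conflicts(rows, col, r) for r in range(8)]
        best = min(counts)
        rows[col] = random.choice([r for r in range(8) if counts[r] == best])
    return None  # no solution within the step limit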

Page 60:

Detailed Steps

1. One by one, find out the number of conflicts between the inconsistent variable and the other variables.

[Figure: a chessboard showing, for each candidate square, the number of conflicts it would create.]

Page 61:

Detailed Steps

2. Choose the square with the smallest number of conflicts and make the move.

[Figure: the queen moves to the square with the minimum number of conflicts.]

Page 62:

Detailed Steps

3. Repeat the previous steps until all the inconsistent variables have been assigned a proper value.

[Figure: the final board, with all queens placed consistently.]