
Informed Search Methods

Read Chapter 4. Use the text for more examples: work them out yourself.


Best First

• The store is replaced by a sorted data structure

• Knowledge is added by the "sort" (evaluation) function

• No guarantees yet – depends on the quality of the evaluation function

• Roughly Uniform Cost with a user-supplied evaluation function
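
A minimal best-first sketch to make this concrete (the function and variable names are mine, not the text's): the "store" is a priority queue, and all the knowledge sits in a caller-supplied evaluation function f.

```python
# Minimal best-first sketch. The frontier is a priority queue ordered by a
# user-supplied evaluation function f(path_cost, node); the goal test is
# applied when a node is removed, not when it is inserted.
import heapq

def best_first(edges, start, goal, f):
    """edges: dict mapping node -> list of (neighbor, edge_cost) pairs."""
    frontier = [(f(0, start), 0, [start])]       # (score, path cost, path)
    while frontier:
        _, cost, path = heapq.heappop(frontier)
        node = path[-1]
        if node == goal:                         # check only the minimum
            return path, cost
        for nbr, step in edges.get(node, []):
            new_cost = cost + step
            heapq.heappush(frontier,
                           (f(new_cost, nbr), new_cost, path + [nbr]))
    return None
```

Different choices of f turn this one routine into the algorithms on the following slides.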


Uniform Cost

• Now assume edges have positive cost

• Storage = priority queue, scored by path cost
  – or a sorted list with lowest values first

• Select: choose the minimum-cost node

• Add: maintains the order

• Check: careful – only check the minimum-cost node for the goal

• Complete and optimal

• Time and space like Breadth-First


Uniform Cost Example

• Root – A, cost 1

• Root – B, cost 3

• A – C, cost 4

• B – C, cost 1

• C is the goal state

• Why is Uniform Cost optimal?
  – Expanded does not mean checked: a node is tested for the goal only when it is removed from the queue


Watch the queue

• R/0 // Path/path-cost

• R-A/1, R-B/3

• R-B/3, R-A-C/5
  – Note: you don't test a node when it is expanded – you put it in the queue

• R-B-C/4, R-A-C/5

• Pop R-B-C/4: the goal test succeeds, so the optimal cost-4 path is returned
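
Running the best_first sketch from earlier on this example reproduces the trace (uniform cost = score by path cost alone):

```python
# The example graph from the slides: R-A costs 1, R-B costs 3,
# A-C costs 4, B-C costs 1; C is the goal.
edges = {'R': [('A', 1), ('B', 3)],
         'A': [('C', 4)],
         'B': [('C', 1)]}

# Uniform cost = best-first scored by path cost alone.
path, cost = best_first(edges, 'R', 'C', f=lambda g, n: g)
print(path, cost)   # ['R', 'B', 'C'] 4 -- R-A-C/5 is still in the queue
```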


Concerns

• What knowledge is available?

• How can it be added to the search?

• What guarantees are there?

• Time

• Space


Greedy/Hill-climbing Search

• Adding heuristic h(n)

• h(n) = estimated cost of cheapest solution from state n to the goal

• Require h(goal) = 0.

• Complete? No – it can be misled.


Examples:

• Route finding: goal is to get from A to B
  – straight-line distance from the current location to B

• 8-tile puzzle:
  – number of misplaced tiles
  – number and distance of misplaced tiles
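
Both puzzle heuristics are a few lines of code. A sketch under an assumed encoding (a state is a tuple of 9 ints read row by row, 0 for the blank, goal = (0, 1, ..., 8)):

```python
# Assumed encoding: state = tuple of 9 ints, 0 is the blank,
# and tile t's goal square is index t.
GOAL = (0, 1, 2, 3, 4, 5, 6, 7, 8)

def misplaced(state):
    """Number of misplaced tiles (the blank is not counted)."""
    return sum(1 for i, t in enumerate(state) if t != 0 and t != GOAL[i])

def manhattan(state):
    """Sum over tiles of row + column distance from the goal square."""
    return sum(abs(i // 3 - t // 3) + abs(i % 3 - t % 3)
               for i, t in enumerate(state) if t != 0)
```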


A*

• Combines Greedy and Uniform Cost

• f(n) = g(n) + h(n), where
  – g(n) = path cost so far to node n
  – h(n) = estimated cost from n to the goal

• If h(n) <= true cost to the goal, then h is admissible

• Best-first using f = g + h with admissible h is A*

• Theorem: A* is optimal and complete
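
A* is the best_first sketch with f = g + h. The heuristic values below are invented for the earlier example graph, but they are admissible (each is at most the true remaining cost):

```python
# Invented but admissible heuristic for the example graph
# (true remaining costs are R:4, A:4, B:1, C:0).
h = {'R': 3, 'A': 2, 'B': 1, 'C': 0}

path, cost = best_first(edges, 'R', 'C', f=lambda g, n: g + h[n])
print(path, cost)   # ['R', 'B', 'C'] 4 -- optimal, as the theorem promises
```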


Admissibility?

• Route finding: straight-line distance from the current location to B
  – Less than the true distance?

• 8-tile puzzle:
  – number of misplaced tiles – less than the number of moves?
  – number and distance of misplaced tiles – less than the number of moves?


A* Properties

• Dechter and Pearl: A* is optimal among all algorithms using h (any such algorithm must expand at least as many nodes).

• If 0 <= h1 <= h2 and h2 is admissible, then h1 is also admissible, and h1 will search at least as many nodes as h2. So bigger is better.

• Sub-exponential if the error in the h estimate is within (approximately) the log of the true cost.


A* special cases

• Suppose h(n) = 0 => Uniform Cost

• Suppose every edge costs 1 and h(n) = 0 => Breadth-First

• With a non-admissible heuristic:
  – g(n) = 0, h(n) = 1/depth => Depth-First

• One code, many algorithms
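
With the best_first sketch this is literal: each special case is just a different evaluation function (a sign flip stands in for the slide's 1/depth trick, since smaller scores are popped first):

```python
# "One code, many algorithms": the same best_first, different f.
# With unit edge costs, g(n) is just the depth of n.
ucs     = lambda g, n: g           # h = 0              => Uniform Cost
breadth = lambda g, n: g           # unit costs, h = 0  => Breadth-First
depth   = lambda g, n: -g          # prefer deeper      => Depth-First order
astar   = lambda g, n: g + h[n]    # admissible h       => A*
```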


Heuristic Generation

• Relaxation: make the problem simpler

• Route planning:
  – don't worry about paths: go straight

• 8-tile puzzle:
  – don't worry about physical constraints: pick up a tile and move it to its correct position
  – better: allow sliding over existing tiles

• TSP:
  – MST is a lower bound on the tour (sketched below)

• A heuristic should be easy to compute
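
Why the MST bound works: delete one edge from any tour and you are left with a spanning tree, so the minimum spanning tree weighs no more than the optimal tour. A minimal Prim's sketch, assuming dist is a nested dict of pairwise distances:

```python
# MST weight as an admissible TSP heuristic (lower bound on any tour).
# dist[u][v] is assumed to hold the distance for every pair of cities.
def mst_weight(cities, dist):
    cities = list(cities)
    if not cities:
        return 0
    visited, total = {cities[0]}, 0
    while len(visited) < len(cities):
        # cheapest edge crossing from the tree to an unvisited city
        w, city = min((dist[u][v], v) for u in visited
                      for v in cities if v not in visited)
        visited.add(city)
        total += w
    return total
```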


Iterative Deepening A*

• Like iterative deepening, but:

• Replaces the depth limit with an f-cost limit

• Increases the f-cost limit by the smallest operator cost

• Complete and optimal
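
A minimal IDA* sketch, reusing the edges/h conventions from the earlier examples. One liberty: instead of raising the limit by the smallest operator cost, this common variant jumps straight to the smallest f-value that exceeded the old limit.

```python
# IDA*: depth-first search bounded by f = g + h; on failure, raise the
# bound to the smallest f-value that overflowed and try again.
def ida_star(edges, start, goal, h):
    bound = h[start]
    while True:
        next_bound = float('inf')
        stack = [(start, 0, [start])]
        while stack:
            node, g, path = stack.pop()
            f = g + h[node]
            if f > bound:
                next_bound = min(next_bound, f)   # candidate next limit
                continue
            if node == goal:
                return path, g
            for nbr, step in edges.get(node, []):
                if nbr not in path:               # avoid cycles on this path
                    stack.append((nbr, g + step, path + [nbr]))
        if next_bound == float('inf'):            # nothing left to explore
            return None
        bound = next_bound

print(ida_star(edges, 'R', 'C', h))   # (['R', 'B', 'C'], 4)
```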


SMA*

• Memory-bounded version due to the textbook's authors

• Beware authors.

• SKIP


Hill-climbing

• Goal: Optimizing an objective function.

• Does not require differentiable functions

• Can be applied to "goal predicate" types of problems
  – e.g., BSAT with the number of satisfied clauses as the objective function

• Intuition: Always move to a better state


Some Hill-Climbing Algorithms

• Start = a random state or a special state

• Until no improvement:
  – Steepest Ascent: find the best successor
  – OR (greedy): select the first improving successor
  – Go to that successor

• Repeat the above process some number of times (restarts)

• Can be done with partial solutions or full solutions (see the sketch below)
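
A minimal steepest-ascent-with-restarts sketch; random_state, neighbors, and score are assumed problem-specific hooks, not anything from the text:

```python
# Steepest-ascent hill-climbing with random restarts.
def hill_climb(random_state, neighbors, score, restarts=10):
    best = None
    for _ in range(restarts):                  # restart from a fresh state
        state = random_state()
        while True:
            nbrs = neighbors(state)
            if not nbrs:
                break
            up = max(nbrs, key=score)          # steepest ascent: best successor
            if score(up) <= score(state):      # no improvement: local maximum
                break
            state = up
        if best is None or score(state) > score(best):
            best = state
    return best
```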


Hill-climbing Algorithm

• In Best-First, replace the storage by a single node

• Works if there is a single hill

• Use restarts if there are multiple hills

• Problems:
  – finds a local maximum, not the global one
  – plateaux: large flat regions (happens in BSAT)
  – ridges: fast up the ridge, slow along the ridge

• Not complete, not optimal

• No memory problems


Beam

• A mix of hill-climbing and best-first

• Storage is a cache of the best K states

• Solves the storage problem, but…

• Not optimal, not complete
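
A minimal beam sketch with the same hooks as the hill-climbing example; the frontier is the cache of the K best states, all expanded in lockstep:

```python
# Beam search: keep only the K best states of each generation.
def beam(start_states, neighbors, score, k=10, steps=100):
    frontier = sorted(start_states, key=score, reverse=True)[:k]
    for _ in range(steps):
        pool = [n for s in frontier for n in neighbors(s)]
        if not pool:
            break
        frontier = sorted(pool, key=score, reverse=True)[:k]   # the cache
    return max(frontier, key=score)
```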


Local (Iterative) Improving

• Initial state = full candidate solution

• Greedy hill-climbing:
  – if up, do it
  – if flat, probabilistically decide whether to accept the move
  – if down, don't do it

• We are gradually expanding the possible moves.


Local Improving: Performance

• Solves the 1,000,000-queens problem quickly

• Useful for scheduling

• Useful for BSAT
  – solves (sometimes) large problems

• More time, better answer

• No memory problems

• No guarantees of anything


Simulated Annealing

• Like hill-climbing, but probabilistically allows down moves, controlled by the current temperature and by how bad the move is.

• Let t[1], t[2], … be a temperature schedule
  – usually t[1] is high and t[k] = 0.9 * t[k-1]

• Let E be the quality measure of a state

• Goal: maximize E.


Simulated Annealing Algorithm

• Current = random state, k = 1• If T[k] = 0, stop.• Next = random next state• If Next is better than start, move there.• If Next is worse:

– Let Delta = E(next)-E(current)

– Move to next with probabilty e^(Delta/T[k])

• k = k+1
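
A minimal sketch of this loop. Since the geometric schedule never actually reaches 0, a small cutoff t_min stands in for the "T[k] = 0, stop" test; random_state, random_neighbor, and E are assumed hooks:

```python
import math, random

def anneal(random_state, random_neighbor, E, t1=100.0, t_min=1e-3):
    current, t = random_state(), t1
    while t > t_min:                  # stands in for "if T[k] = 0, stop"
        nxt = random_neighbor(current)
        delta = E(nxt) - E(current)
        # always take improvements; take a bad move with prob e^(Delta/T)
        if delta > 0 or random.random() < math.exp(delta / t):
            current = nxt
        t *= 0.9                      # t[k] = 0.9 * t[k-1]
    return current
```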


Simulated Annealing Discussion

• No guarantees

• When T is large, e^(Delta/T) is close to e^0, or 1. So for large T, you go almost anywhere.

• When T is small, e^(Delta/T) is close to e^(-inf), or 0. So you avoid most bad moves.

• After T becomes 0, one often finishes with simple hill-climbing.

• Execution time depends on the schedule; memory use is trivial.


Genetic Algorithm

• Weakly analogous to "evolution"

• No theoretical guarantees

• Applies to nearly any problem

• Population = set of individuals

• Fitness function on individuals

• Mutation operator: new individual from old one.

• Cross-over: new individuals from parents


GA Algorithm (a version)

• Population = a random set of n individuals

• Probabilistically choose n pairs of individuals to mate

• Probabilistically choose n descendants for next generation (may include parents or not)

• Probability depends on fitness function as in simulated annealing.

• How well does it work? Good question
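
A minimal sketch of this version, with random_individual, crossover, mutate, and fitness as assumed problem-specific hooks; fitness is assumed nonnegative so it can serve directly as a selection weight (as on the next slide), and sampling is with replacement for simplicity:

```python
import random

def ga(random_individual, crossover, mutate, fitness, n=50, generations=100):
    pop = [random_individual() for _ in range(n)]
    for _ in range(generations):
        weights = [fitness(p) for p in pop]
        # n matings, parents chosen with probability proportional to fitness
        children = [mutate(crossover(*random.choices(pop, weights=weights, k=2)))
                    for _ in range(n)]
        # next generation drawn from parents + children, again by fitness
        pool = pop + children
        pop = random.choices(pool, weights=[fitness(p) for p in pool], k=n)
    return max(pop, key=fitness)
```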


Scores to Probabilities

• Suppose the scores of the n individuals are a[1], a[2], …, a[n].

• The probability of choosing the j-th individual is

  prob = a[j] / (a[1] + a[2] + … + a[n])
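
In code, this is one line per individual (assuming nonnegative scores):

```python
def selection_probs(scores):
    total = sum(scores)
    return [a / total for a in scores]

print(selection_probs([1, 3, 4]))   # [0.125, 0.375, 0.5]
```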


GA Example: BSAT

• Problem: Boolean satisfiability (BSAT)

• Individual = bindings for variables

• Mutation = flip a variable

• Cross-over: given two parents, randomly choose positions; the child takes the bindings at those positions from one parent and the remaining bindings from the other

• Fitness = number of clauses satisfied
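
A sketch of these operators, assuming an individual is a list of booleans and clauses are DIMACS-style lists of signed 1-indexed variable numbers (a negative number means the variable appears negated):

```python
import random

def crossover(mom, dad):
    # each position's binding comes randomly from one parent or the other
    return [m if random.random() < 0.5 else d for m, d in zip(mom, dad)]

def mutate(ind):
    i = random.randrange(len(ind))          # flip one random variable
    return ind[:i] + [not ind[i]] + ind[i+1:]

def fitness(ind, clauses):
    # count clauses with at least one satisfied literal
    return sum(any(ind[abs(l) - 1] == (l > 0) for l in c) for c in clauses)
```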


GA Example: N-Queens

• N-queens problem

• Individual: an array whose i-th entry is the column assigned to the i-th queen

• Mating: Cross-over

• Fitness (minimize): number of constraint violations


GA Function Optimization Example

• Let f(x,y) be the function to optimize.

• The domain for x and y is the real numbers between 0 and 10.

• Say the hidden function is:
  – f(x,y) = 2 if x > 9 and y > 9
  – f(x,y) = 1 if x > 9 or y > 9
  – f(x,y) = 0 otherwise


GA Works Well Here

• Individual = point = (x,y)

• Mating: something from each parent, so mate((x, y), (x', y')) yields (x, y') and (x', y)

• No mutation

• Hill-climbing does poorly, GA does well.

• This example generalizes to functions with large arity (mating rule sketched below)
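
The mating rule from this slide, as code (a coordinate swap; the name mate is mine):

```python
def mate(p, q):
    # each child takes x from one parent and y from the other
    return (p[0], q[1]), (q[0], p[1])

# one parent good in x, the other good in y: a child lands in the f = 2 region
print(mate((9.5, 0.2), (0.3, 9.8)))   # ((9.5, 9.8), (0.3, 0.2))
```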


GA Discussion

• Reported to work well on some problems.

• Typically not compared with other approaches, e.g. hill-climbing with restarts.

• Opinion: Works if the “mating” operator captures good substructures.

• Any ideas for GA on TSP?