chapter09

Transcript of chapter09

Page 1: chapter09

March 14, 2006 AI: Chapter 9: Inference in First-Order Logic

Artificial Intelligence
Chapter 9: Inference in First-Order Logic
Michael Scherger

Department of Computer Science

Kent State University

Page 2: chapter09


Contents
• Reducing FO inference to propositional inference
• Unification
• Generalized Modus Ponens
• Forward and Backward Chaining
• Logic Programming
• Resolution

Page 3: chapter09


Necessary Algorithms
• We already know enough to implement TELL (although maybe not efficiently)
• But how do we implement ASK? Recall 3 cases:
  – Direct matching
  – Finding a proof (inference)
  – Finding a set of bindings (unification)

Page 4: chapter09


Inference with Quantifiers
• Universal Instantiation:
  – Given ∀x, person(x) ⇒ likes(x, McDonalds)
  – Infer person(John) ⇒ likes(John, McDonalds)
• Existential Instantiation:
  – Given ∃x, likes(x, McDonalds)
  – Infer likes(S1, McDonalds)
  – S1 is a “Skolem constant” that is not found anywhere else in the KB and refers to (one of) the individuals that likes McDonalds.

Page 5: chapter09


Universal Instantiation
• Every instantiation of a universally quantified sentence is entailed by it:
  – Subst({v/g}, α), for any variable v and ground term g
  – ground term…a term without variables
• Example: ∀x King(x) ∧ Greedy(x) ⇒ Evil(x) yields:
  • King(John) ∧ Greedy(John) ⇒ Evil(John)
  • King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
  • King(Father(John)) ∧ Greedy(Father(John)) ⇒ Evil(Father(John))
  • …
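The substitution Subst({v/g}, α) behind Universal Instantiation can be sketched in a few lines of Python. The nested-tuple term representation and the function name are conventions of this sketch, not from the text.

```python
# A tiny illustration of Universal Instantiation: replace the universally
# quantified variable with a ground term everywhere in the sentence.

def subst(term, var, ground):
    """Replace every occurrence of var in term with the ground term."""
    if term == var:
        return ground
    if isinstance(term, tuple):
        return tuple(subst(t, var, ground) for t in term)
    return term

# ∀x King(x) ∧ Greedy(x) ⇒ Evil(x), instantiated with x/John:
rule = ('implies', ('and', ('King', 'x'), ('Greedy', 'x')), ('Evil', 'x'))
print(subst(rule, 'x', 'John'))
# ('implies', ('and', ('King', 'John'), ('Greedy', 'John')), ('Evil', 'John'))

# The ground term may itself be compound, e.g. Father(John):
print(subst(rule, 'x', ('Father', 'John')))
```

Because a ground term may be any variable-free term, Father(John) is as legal a substitution as the constant John, which is why the instantiations of one universal sentence can be unbounded.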

Page 6: chapter09


Existential Instantiation
• For any sentence α, variable v, and constant k that does not appear in the KB:
  – Subst({v/k}, α)
• Example: ∃x Crown(x) ∧ OnHead(x, John) yields:
  • Crown(C1) ∧ OnHead(C1, John)
  • provided C1 is a new constant (Skolem)

Page 7: chapter09


Existential Instantiation
• UI can be applied several times to add new sentences
  – The new KB is logically equivalent to the old
• EI can be applied once to replace the existential sentence
  – The new KB is not equivalent to the old, but is satisfiable iff the old KB was satisfiable

Page 8: chapter09


Reduction to Propositional Inference

• Use instantiation rules to create relevant propositional facts in the KB, then use propositional reasoning
• Problems:
  – May generate infinitely many sentences when it cannot prove
  – Will generate many irrelevant sentences along the way!

Page 9: chapter09


Reduction to Propositional Inference

• Suppose the KB contains the following sentences:
  ∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
  King(John)
  Greedy(John)
  Brother(Richard, John)

Page 10: chapter09


Reduction to Propositional Inference

• Instantiating the universal sentence in all possible ways:
  King(John) ∧ Greedy(John) ⇒ Evil(John)
  King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
  King(John)
  Greedy(John)
  Brother(Richard, John)
• The new KB is propositionalized: the propositional symbols are King(John), Greedy(John), Evil(John), King(Richard), etc.

Page 11: chapter09


Problems with Propositionalization

• Propositionalization tends to generate lots of irrelevant sentences
• Example:
  ∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
  King(John)
  ∀y Greedy(y)
  Brother(Richard, John)
  – It is obvious that Evil(John) is true, but the fact Greedy(Richard) is irrelevant
• With p k-ary predicates and n constants, there are p · n^k instantiations!
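The p · n^k blowup is easy to see by brute-force enumeration. The predicate and constant names below are just an illustration (with mixed arities, the count is the sum of n^k over the predicates, which reduces to p · n^k when all arities are equal).

```python
# Enumerate every ground atom obtainable from a set of predicates
# (with arities) and a set of constants.
from itertools import product

predicates = {'King': 1, 'Greedy': 1, 'Sells': 3}   # name -> arity k
constants = ['John', 'Richard', 'Nono']             # n = 3

instantiations = [(p, *args)
                  for p, k in predicates.items()
                  for args in product(constants, repeat=k)]

# 3^1 + 3^1 + 3^3 = 33 ground atoms from just 3 predicates and 3 constants
print(len(instantiations))   # 33
```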

Page 12: chapter09


Unification
• Unification: the process of finding all legal substitutions that make logical expressions look identical
• This is a recursive algorithm
  – See the text and online source code for details!

Page 13: chapter09


Unification
• We can get the inference immediately if we can find a substitution θ such that King(x) and Greedy(x) match King(John) and Greedy(y)
• θ = {x/John, y/John} works
• Unify(α, β) = θ if αθ = βθ

Page 14: chapter09


Unification

Page 15: chapter09


Generalized Modus Ponens
• This is a general inference rule for FOL that does not require instantiation
• Given:
  p1′, p2′, …, pn′ and (p1 ∧ … ∧ pn ⇒ q)
  Subst(θ, pi′) = Subst(θ, pi) for all i
• Conclude:
  – Subst(θ, q)

Page 16: chapter09


GMP in “CS terms”
• Given a rule containing variables
• If there is a consistent set of bindings for all of the variables on the left side of the rule (before the arrow)
• Then you can derive the result of substituting all of the same variable bindings into the right side of the rule

Page 17: chapter09


GMP Example
∀x,y,z Parent(x,y) ∧ Parent(y,z) ⇒ GrandParent(x,z)
• Parent(James, John), Parent(James, Richard), Parent(Harry, James)
• We can derive:
  – GrandParent(Harry, John), bindings: ((x Harry) (y James) (z John))
  – GrandParent(Harry, Richard), bindings: ((x Harry) (y James) (z Richard))

Page 18: chapter09


Base Cases for Unification
• If two expressions are identical, the result is (NIL) (succeed with empty unifier set)
• If two expressions are different constants, the result is NIL (fail)
• If one expression is a variable x and x is not contained in the other, the result is ((x other-exp))

Page 19: chapter09


Recursive Case
• If both expressions are lists,
  – Combine the results of unifying the CAR with unifying the CDR
• In Lisp:
  (cons (unify (car list1) (car list2))
        (unify (cdr list1) (cdr list2)))
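The same base and recursive cases can be sketched in Python. This version also threads a binding set through the recursion, the bookkeeping discussed on the next slide. The '?'-variable convention and tuple term representation are assumptions of this sketch, and there is no occurs-check.

```python
# A minimal recursive unification sketch: variables are strings starting
# with '?'; compound terms are tuples; constants are plain strings.

def is_var(t):
    return isinstance(t, str) and t.startswith('?')

def substitute(t, bindings):
    """Replace already-bound variables in t (follows binding chains)."""
    if is_var(t) and t in bindings:
        return substitute(bindings[t], bindings)
    if isinstance(t, tuple):
        return tuple(substitute(x, bindings) for x in t)
    return t

def unify(a, b, bindings=None):
    """Return a binding dict that makes a and b identical, or None on failure."""
    if bindings is None:
        bindings = {}
    a, b = substitute(a, bindings), substitute(b, bindings)
    if a == b:                          # identical: succeed, no new bindings
        return bindings
    if is_var(a):                       # variable: bind it (no occurs-check)
        return {**bindings, a: b}
    if is_var(b):
        return {**bindings, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):          # unify CAR, then CDR, threading bindings
            bindings = unify(x, y, bindings)
            if bindings is None:
                return None
        return bindings
    return None                         # different constants: fail

# King(x), Greedy(x) against King(John), Greedy(y):
print(unify(('King', '?x'), ('King', 'John')))            # {'?x': 'John'}
print(unify((('King', '?x'), ('Greedy', '?y')),
            (('King', 'John'), ('Greedy', '?x'))))
```

The second call reproduces θ = {x/John, y/John} from the earlier slide: once ?x is bound to John, substituting that binding before unifying Greedy(?y) against Greedy(?x) forces ?y to John as well.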

Page 20: chapter09


A few more details…
• Don’t reuse variable names
  – Before actually unifying, give each rule a separate set of variables
  – The Lisp function gentemp creates uniquely numbered variables
• Keep track of bindings
  – If a variable is already bound to something, it must retain the same value throughout the computation
  – This requires substituting each successful binding in the remainder of the expression

Page 21: chapter09


Storage and retrieval
• Most systems don’t use variables on predicates
• Therefore, hash statements by predicate for quick retrieval (predicate indexing)
• Use a subsumption lattice for efficiency (see p. 279)

Page 22: chapter09


Forward Chaining
• Start with atomic sentences in the KB and apply Modus Ponens in the forward direction, adding new atomic sentences, until no further inferences can be made.

Page 23: chapter09


Forward Chaining
• Given a new fact, generate all consequences
• Assumes all rules are of the form
  – C1 and C2 and C3 and … --> Result
• Each rule & binding generates a new fact
• This new fact will “trigger” other rules
• Keep going until the desired fact is generated
• (Semi-decidable, as is FOL inference in general)
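The loop just described can be sketched as a toy forward chainer. The tuple/'?'-variable encoding and all function names are conventions of this sketch, not the text's implementation; the knowledge base encodes the Col. West example from the following slides.

```python
# A toy forward chainer over Horn rules: repeatedly fire every rule under
# every consistent binding, adding new facts until a fixed point is reached.

def match(pattern, fact, b):
    """Extend bindings b so pattern equals fact, or return None."""
    if len(pattern) != len(fact):
        return None
    b = dict(b)
    for p, f in zip(pattern, fact):
        if isinstance(p, str) and p.startswith('?'):
            if b.get(p, f) != f:
                return None
            b[p] = f
        elif p != f:
            return None
    return b

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:                      # keep going until nothing new fires
        changed = False
        for premises, conclusion in rules:
            bindings = [{}]             # all consistent bindings for premises
            for prem in premises:
                bindings = [b2 for b in bindings for f in facts
                            if (b2 := match(prem, f, b)) is not None]
            for b in bindings:
                new = tuple(b.get(t, t) for t in conclusion)
                if new not in facts:
                    facts.add(new)
                    changed = True
    return facts

rules = [
    ([('American', '?x'), ('Weapon', '?y'),
      ('Sells', '?x', '?y', '?z'), ('Hostile', '?z')], ('Criminal', '?x')),
    ([('Missile', '?m')], ('Weapon', '?m')),
    ([('Missile', '?m'), ('Owns', 'Nono', '?m')],
     ('Sells', 'West', '?m', 'Nono')),
    ([('Enemy', '?e', 'America')], ('Hostile', '?e')),
]
facts = [('American', 'West'), ('Missile', 'M1'),
         ('Owns', 'Nono', 'M1'), ('Enemy', 'Nono', 'America')]
print(('Criminal', 'West') in forward_chain(facts, rules))   # True
```

The first pass derives Weapon(M1), Sells(West, M1, Nono), and Hostile(Nono); those new facts "trigger" the crime rule on the second pass, which concludes Criminal(West).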

Page 24: chapter09


FC: Example Knowledge Base

• The law says that it is a crime for an American to sell weapons to hostile nations. The country Nono, an enemy of America, has some missiles, and all of its missiles were sold to it by Col. West, who is an American.

• Prove that Col. West is a criminal.

Page 25: chapter09


FC: Example Knowledge Base

• …it is a crime for an American to sell weapons to hostile nations:
  American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) ⇒ Criminal(x)
• Nono…has some missiles:
  ∃x Owns(Nono, x) ∧ Missile(x)
  Owns(Nono, M1) and Missile(M1)
• …all of its missiles were sold to it by Col. West:
  Missile(x) ∧ Owns(Nono, x) ⇒ Sells(West, x, Nono)
• Missiles are weapons:
  Missile(x) ⇒ Weapon(x)

Page 26: chapter09


FC: Example Knowledge Base

• An enemy of America counts as “hostile”:
  Enemy(x, America) ⇒ Hostile(x)
• Col. West, who is an American…
  American(West)
• The country Nono, an enemy of America…
  Enemy(Nono, America)

Page 27: chapter09


FC: Example Knowledge Base

Page 28: chapter09


FC: Example Knowledge Base

Page 29: chapter09


FC: Example Knowledge Base

Page 30: chapter09


Efficient Forward Chaining
• Order conjuncts appropriately
  – E.g., most constrained variable first
• Don’t generate redundant facts; each new fact should depend on at least one newly generated fact.
  – Production systems
  – RETE matching
  – CLIPS

Page 31: chapter09


Forward Chaining Algorithm

Page 32: chapter09


OPS
• Facts
  – Type, attributes & values
  – (goal put-on yellow-block red-block)
• Rules
  – If conditions, then action.
  – Variables (<x>, <y>, etc.) can be bound
  – If (goal put-on <x> <y>) AND (clear <x>) THEN add (goal clear <y>)

Page 33: chapter09


RETE Network
• Based only on left sides (conditions) of rules
• Each condition (test) appears once in the network
• Tests combined with “AND” are connected with “JOIN”
  – Join means all tests work with the same bindings

Page 34: chapter09


Example Rules
If (goal put-on <x> <y>) AND (clear <x>) AND (clear <y>)
THEN add (on <x> <y>), delete (clear <x>)

If (goal clear <x>) AND (on <y> <x>) AND (clear <y>)
THEN add (clear <x>), add (on <y> table), delete (on <y> <x>)

If (goal clear <x>) AND (on <y> <x>)
THEN add (goal clear <y>)

If (goal put-on <x> <y>) AND (clear <x>)
THEN add (goal clear <y>)

If (goal put-on <x> <y>) AND (clear <y>)
THEN add (goal clear <x>)

Page 35: chapter09


RETE Network

[Diagram: the RETE network for the five rules above. Test nodes (goal type is put-on?, goal type is clear?, type is clear?, type is on?) feed binding nodes for x and y, which feed JOIN nodes that report when each of rules 1–5 is satisfied.]

Page 36: chapter09


Using the RETE Network
• Each time a fact comes in…
  – Update bindings for the relevant node(s)
  – Update join(s) below those bindings
  – Note new rules satisfied
• Each processing cycle
  – Choose a satisfied rule

Page 37: chapter09


Example (Facts)
1. (goal put-on yellow-block red-block)
2. (on blue-block yellow-block)
3. (on yellow-block table)
4. (on red-block table)
5. (clear red-block)
6. (clear blue-block)

Page 38: chapter09


Why RETE is Efficient
• Rules are “pre-compiled”
• Facts are dealt with as they come in
  – Only rules connected to a matching node are considered
  – Once a test fails, no nodes below are considered
  – Similar rules share structure
• In a typical system, when rules “fire”, new facts are created/deleted incrementally
  – This incrementally adds/deletes rules (with bindings) to the conflict set

Page 39: chapter09


CLIPS
• CLIPS is another forward-chaining production system
• Important commands
  – (assert fact), (deffacts fact1 fact2 …)
  – (defrule rule-name rule)
  – (reset) – eliminates all facts except “initial-fact”
  – (load file), (load-facts file)
  – (run)
  – (watch all)
  – (exit)

Page 40: chapter09


CLIPS Rule Example

(defrule putting-on
  ?g <- (goal put-on ?x ?y)
  (clear ?x)
  ?bottomclear <- (clear ?y)
  =>
  (assert (on ?x ?y))
  (retract ?g)
  (retract ?bottomclear))

Page 41: chapter09


Backward Chaining
• Consider the item to be proven a goal
• Find a rule whose head is the goal (and record the bindings)
• Apply the bindings to the body, and prove these (subgoals) in turn
• If you prove all the subgoals, increasing the binding set as you go, you will prove the item.
• Logic Programming (cprolog, on CS)
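The steps above can be sketched as a toy backward chainer for ground queries. The tuple/'?'-variable representation, the function names, and the grandparent KB (taken from the GMP example earlier) are conventions of this sketch; a real system would also standardize variables apart.

```python
# A toy backward chainer: find a rule whose head matches the goal, apply
# the bindings to the body, and recursively prove the subgoals in turn.

def match(pattern, fact, bindings):
    """Bind '?'-variables in pattern so it equals fact, or return None."""
    if len(pattern) != len(fact):
        return None
    bindings = dict(bindings)
    for p, f in zip(pattern, fact):
        if isinstance(p, str) and p.startswith('?'):
            if bindings.get(p, f) != f:
                return None
            bindings[p] = f
        elif p != f:
            return None
    return bindings

def prove(goal, facts, rules):
    """Yield once for each way the goal can be proved from the KB."""
    for f in facts:                      # base case: goal matches a fact
        b = match(goal, f, {})
        if b is not None:
            yield b
    for head, body in rules:             # rule whose head matches the goal
        b = match(head, goal, {})
        if b is None:
            continue
        def prove_all(subgoals, b):
            if not subgoals:             # all subgoals proved
                yield b
                return
            first = tuple(b.get(t, t) for t in subgoals[0])
            for b2 in prove(first, facts, rules):
                merged = {**b, **b2}     # grow the binding set as we go
                yield from prove_all(subgoals[1:], merged)
        yield from prove_all(body, b)

facts = [('Parent', 'James', 'John'), ('Parent', 'James', 'Richard'),
         ('Parent', 'Harry', 'James')]
rules = [(('GrandParent', '?x', '?z'),
          [('Parent', '?x', '?y'), ('Parent', '?y', '?z')])]
print(any(prove(('GrandParent', 'Harry', 'John'), facts, rules)))   # True
print(any(prove(('GrandParent', 'John', 'Harry'), facts, rules)))   # False
```

Proving GrandParent(Harry, John) binds x/Harry and z/John from the head, then proves the subgoals Parent(Harry, y) (binding y/James) and Parent(James, John) in turn, exactly the depth-first search that Prolog performs.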

Page 42: chapter09


Backward Chaining Example

Page 43: chapter09


Backward Chaining Example

Page 44: chapter09


Backward Chaining Example

Page 45: chapter09


Backward Chaining Example

Page 46: chapter09


Backward Chaining Example

Page 47: chapter09


Backward Chaining Example

Page 48: chapter09


Backward Chaining Example

Page 49: chapter09


Backward Chaining Algorithm

Page 50: chapter09


Properties of Backward Chaining

• Depth-first recursive proof search: space is linear in the size of the proof
• Incomplete due to infinite loops
  – Fix by checking the current goal against every subgoal on the stack
• Inefficient due to repeated subgoals (both success and failure)
  – Fix using caching of previous results (extra space)
• Widely used without improvements for logic programming

Page 51: chapter09


Logic Programming
• Logic Programming
  – Identify problem
  – Assemble information
  – Tea break
  – Encode information in KB
  – Encode problem instance as facts
  – Ask queries
  – Find false facts
• Ordinary Programming
  – Identify problem
  – Assemble information
  – Figure out solution
  – Program solution
  – Encode problem instance as data
  – Apply program to data
  – Debug procedural errors

Page 52: chapter09


Logic Programming
• Basis: backward chaining with Horn clauses + lots of bells and whistles
  – Widely used in Europe and Japan
  – Basis of 5th Generation languages and projects
• Compilation techniques -> 60 million LIPS
• Program = set of clauses
  head :- literal1, …, literaln
  criminal(X) :- american(X), weapon(Y), sells(X, Y, Z), hostile(Z).

Page 53: chapter09


Logic Programming
• Rule example:
  puton(X,Y) :- cleartop(X), cleartop(Y), takeoff(X,Y).
• Capital letters are variables
• Three parts to the rule
  – Head (thing to prove)
  – Neck :-
  – Body (subgoals, separated by ,)
• Rules end with .

Page 54: chapter09


Logic Programming
• Efficient unification by open coding
• Efficient retrieval of matching clauses by direct linking
• Depth-first, left-to-right backward chaining
• Built-in predicates for arithmetic, e.g., X is Y*Z+2
• Closed-world assumption (“negation as failure”)
  – e.g., given alive(X) :- not dead(X).
  – alive(joe) succeeds if dead(joe) fails

Page 55: chapter09


Logic Programming
• These notes are for gprolog (available on the departmental servers)
• To read a file, consult('file').
• To enter data directly, consult(user). Type control-D when done.
• Every statement must end in a period. If you forget, put it on the next line.
• To prove a fact, enter the fact directly at the command line. gprolog will respond Yes, No, or give you a binding set. If you want another answer, type ; otherwise press return.
• trace(predicate) or trace(all) will let you watch the backward chaining process.

Page 56: chapter09


Logic Programming
• Depth-first search from start state X:
  dfs(X) :- goal(X).
  dfs(X) :- successor(X,S), dfs(S).
• No need to loop over S: successor succeeds for each S

Page 57: chapter09


Logic Programming
• Example: appending two lists to produce a third
  append([], Y, Y).
  append([X|L], Y, [X|Z]) :- append(L, Y, Z).
• Query: append(A, B, [1,2]).
• Answers:
  A=[] B=[1,2]
  A=[1] B=[2]
  A=[1,2] B=[]

Page 58: chapter09


Inference Methods
• Unification (prerequisite)
• Forward Chaining
  – Production systems
  – RETE method (OPS)
• Backward Chaining
  – Logic Programming (Prolog)
• Resolution
  – Transform to CNF
  – Generalization of propositional logic resolution

Page 59: chapter09


Resolution
• Convert everything to CNF
• Resolve, with unification
• If resolution is successful, the proof succeeds
• If there was a variable in the item to prove, return the variable’s value from the unification bindings

Page 60: chapter09


Resolution (Review)
• Resolution allows a complete inference mechanism (search-based) using only one rule of inference
• Resolution rule:
  – Given: P1 ∨ P2 ∨ P3 ∨ … ∨ Pn, and ¬P1 ∨ Q1 ∨ … ∨ Qm
  – Conclude: P2 ∨ P3 ∨ … ∨ Pn ∨ Q1 ∨ … ∨ Qm
  – Complementary literals P1 and ¬P1 “cancel out”
• To prove a proposition F by resolution,
  – Start with ¬F
  – Resolve with a rule from the knowledge base (that contains F)
  – Repeat until all propositions have been eliminated
  – If this can be done, a contradiction has been derived and the original proposition F must be true.
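The refutation procedure above can be sketched for the propositional case. The string-based clause encoding ('~' marks negation) and function names are conventions of this sketch; the KB is the snow example from the next slide.

```python
# A minimal propositional resolution-refutation sketch: negate the query,
# then resolve clauses until the empty clause (a contradiction) appears.

def resolve(c1, c2):
    """All resolvents of two clauses (literals are strings; '~' negates)."""
    out = []
    for lit in c1:
        comp = lit[1:] if lit.startswith('~') else '~' + lit
        if comp in c2:                    # complementary literals cancel out
            out.append(frozenset((c1 - {lit}) | (c2 - {comp})))
    return out

def entails(kb, query):
    """True iff kb |= query, by resolution refutation."""
    clauses = {frozenset(c) for c in kb} | {frozenset({'~' + query})}
    while True:
        new = set()
        cl = list(clauses)
        for i in range(len(cl)):
            for j in range(i + 1, len(cl)):
                for r in resolve(cl[i], cl[j]):
                    if not r:             # empty clause: contradiction found
                        return True
                    new.add(r)
        if new <= clauses:                # no progress: query not entailed
            return False
        clauses |= new

# cold ∧ precipitation ⇒ snow, january ⇒ cold, clouds ⇒ precipitation,
# plus the facts january and clouds:
kb = [{'~cold', '~precipitation', 'snow'},
      {'~january', 'cold'},
      {'~clouds', 'precipitation'},
      {'january'}, {'clouds'}]
print(entails(kb, 'snow'))   # True
```

The run mirrors the derivation on the next slide: ¬snow resolves down to ¬clouds, which cancels against the fact clouds to yield the empty clause.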

Page 61: chapter09


Propositional Resolution Example

• Rules
  – Cold and precipitation -> snow
    ¬cold ∨ ¬precipitation ∨ snow
  – January -> cold
    ¬January ∨ cold
  – Clouds -> precipitation
    ¬clouds ∨ precipitation
• Facts
  – January, clouds
• Prove
  – snow

Page 62: chapter09


Propositional Resolution Example

¬snow                      resolved with  ¬cold ∨ ¬precipitation ∨ snow
¬cold ∨ ¬precipitation     resolved with  ¬January ∨ cold
¬January ∨ ¬precipitation  resolved with  ¬clouds ∨ precipitation
¬January ∨ ¬clouds         resolved with  January
¬clouds                    resolved with  clouds
(empty clause: contradiction, so snow is proved)

Page 63: chapter09


Resolution Theorem Proving (FOL)

• Convert everything to CNF
• Resolve, with unification
  – Save bindings as you go!
• If resolution is successful, the proof succeeds
• If there was a variable in the item to prove, return the variable’s value from the unification bindings

Page 64: chapter09


Converting to CNF
1. Replace implication (A ⇒ B) by ¬A ∨ B
2. Move ¬ “inwards”
   • ¬∀x P(x) is equivalent to ∃x ¬P(x), and vice versa
3. Standardize variables
   • ∀x P(x) ∨ ∃x Q(x) becomes ∀x P(x) ∨ ∃y Q(y)
4. Skolemize
   • ∃x P(x) becomes P(A)
5. Drop universal quantifiers
   • Since all remaining quantifiers are ∀, we don’t need them
6. Apply the distributive law
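Applied to a generic sentence with an existential nested inside a universal (an illustrative example, not from the text), the six steps look like this; note that step 4 needs a Skolem function of the enclosing universal variable, not just a constant, when the ∃ lies inside a ∀:

```
∀x (P(x) ⇒ ∃y Q(x,y))
1. Replace implication:    ∀x (¬P(x) ∨ ∃y Q(x,y))
2. Move ¬ inwards:         (¬ is already innermost)
3. Standardize variables:  (x and y are already distinct)
4. Skolemize:              ∀x (¬P(x) ∨ Q(x, F(x)))   (y depends on x, so use a Skolem function F(x))
5. Drop ∀:                 ¬P(x) ∨ Q(x, F(x))
6. Distributive law:       (already a single clause)
```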

Page 65: chapter09


Convert to FOPL, then CNF
1. John likes all kinds of food.
2. Apples are food.
3. Chicken is food.
4. Anything that anyone eats and isn’t killed by is food.
5. Bill eats peanuts and is still alive.
6. Sue eats everything Bill eats.

Page 66: chapter09


Prove Using Resolution
1. John likes peanuts.
2. Sue eats peanuts.
3. Sue eats apples.
4. What does Sue eat?
   • Translate to Sue eats X
   • The result is a valid binding for X in the proof

Page 67: chapter09


Another Example
• Steve only likes easy courses
• Science courses are hard
• All the courses in the basket weaving department are easy
• BK301 is a basket weaving course
• What course would Steve like?

Page 68: chapter09


Another Resolution Example

Page 69: chapter09


Final Thoughts on Resolution

• Resolution is complete. If you don’t want to take this on faith, study pp. 300–303
• Strategies (heuristics) for efficient resolution include
  – Unit preference. If a clause has only one literal, use it first.
  – Set of support. Identify “useful” rules and ignore the rest. (p. 305)
  – Input resolution. Intermediately generated sentences can only be combined with original inputs or original rules. (We used this strategy in our examples.)
  – Subsumption. Prune unnecessary facts from the database.