
22c:145 Artificial Intelligence, Fall 2005

Logical Agents

Cesare Tinelli

The University of Iowa

Copyright 2001-05 — Cesare Tinelli and Hantao Zhang. These notes are copyrighted material and may not be used in other course settings outside of the University of Iowa in their current or modified form without the express written permission of the copyright holders.


Readings

Chap. 7 of [Russell and Norvig, 2003]


Reasoning Agents

Remember our goal-based agent.

[Figure: the goal-based agent architecture. Sensors deliver percepts from the environment; the agent combines its state ("what the world is like now"), its models of how the world evolves and of what its actions do ("what it will be like if I do action A"), and its goals to decide "what action I should do now", which the actuators carry out.]


(Knowledge-based) Reasoning Agents

Know about the world. They maintain a collection of facts (sentences) about the world, their Knowledge Base (KB), expressed in some formal language.

Reason about the world. They are able to derive new facts from those in the KB using some inference mechanism.

Act upon the world. They map percepts to actions by querying and updating the KB.


Automated Reasoning

Main Assumption (or the “Church Thesis” of AI)

1. Facts about the world can be represented as particular configurations of symbols, i.e., physical entities such as marks on a piece of paper, states in a computer's memory, and so on.

2. Reasoning about the world can be achieved by mere symbol manipulation.

Most AI researchers believe that reasoning is symbol manipulation, nothing else. (After all, the human brain is a physical system itself.)


Abstraction Levels

We can describe every reasoning agent (natural or not) at two different abstraction levels.

Knowledge level: what the agent knows and what the agent's goals are.

Symbol (or implementation) level: what symbols the agent manipulates and how.

[Figure: the two levels side by side. At the knowledge level, the agent's knowledge and goals change by reasoning; at the symbol level, its internal configuration of symbols changes by symbol manipulation.]



At least for artificial agents, the knowledge level is a metaphor for explaining the behavior of the agent, which really operates at the symbol level.


A simple knowledge-based agent

function KB-AGENT(percept) returns an action
    static: KB, a knowledge base
            t, a counter, initially 0, indicating time

    TELL(KB, MAKE-PERCEPT-SENTENCE(percept, t))
    action ← ASK(KB, MAKE-ACTION-QUERY(t))
    TELL(KB, MAKE-ACTION-SENTENCE(action, t))
    t ← t + 1
    return action

The agent must be able to:
Represent states, actions, etc.
Incorporate new percepts
Update internal representations of the world
Deduce hidden properties of the world
Deduce appropriate actions
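
For concreteness, here is a minimal Python sketch of the same TELL-ASK-TELL loop. The KnowledgeBase class, its tell/ask methods, and the sentence-building helpers are illustrative placeholders (they are not from the slides and do no real inference):

# A minimal sketch of the KB-AGENT loop above; the KnowledgeBase class and the
# sentence-building helpers are placeholders, not a real inference engine.
class KnowledgeBase:
    def __init__(self):
        self.sentences = []

    def tell(self, sentence):
        self.sentences.append(sentence)          # add a new fact to the KB

    def ask(self, query):
        # A real agent would run an inference procedure over self.sentences;
        # this stub returns a default action so that the loop is runnable.
        return "NoOp"

def make_percept_sentence(percept, t):
    return ("percept", percept, t)

def make_action_query(t):
    return ("action?", t)

def make_action_sentence(action, t):
    return ("did", action, t)

KB = KnowledgeBase()
t = 0

def kb_agent(percept):
    """One TELL-ASK-TELL cycle of the knowledge-based agent."""
    global t
    KB.tell(make_percept_sentence(percept, t))
    action = KB.ask(make_action_query(t))
    KB.tell(make_action_sentence(action, t))
    t = t + 1
    return action

print(kb_agent({"stench": False, "breeze": False, "glitter": False}))   # -> NoOp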


A Motivating Example: The Wumpus World!

Performance measure: gold +1000, death -1000, -1 per step, -10 for using the arrow.

Environment: Squares adjacent to the wumpus are smelly. Squares adjacent to a pit are breezy. Glitter iff the gold is in the same square. Shooting kills the wumpus if you are facing it. Shooting uses up the only arrow. Grabbing picks up the gold if in the same square. Releasing drops the gold in the same square.

Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot.

Sensors: Breeze, Glitter, Smell.

[Figure: the 4x4 Wumpus world grid. The agent starts in square (1,1), marked START; the gold, the wumpus (with stench in adjacent squares), and three pits (with breezes in their adjacent squares) occupy other squares.]



Wumpus World Characterization

Observable? No—only local perception

Deterministic? Yes—outcomes exactly specified

Episodic? No—sequential at the level of actions

Static? Yes—Wumpus and Pits do not move

Discrete? Yes

Single-agent? Yes—Wumpus is essentially a natural feature


Exploring a Wumpus World

[Figure: two snapshots of the agent's map. Legend: A = Agent, B = Breeze, G = Glitter/Gold, P = Pit, S = Stench, W = Wumpus, OK = safe square, V = visited. In (a) the agent is in (1,1) with (1,2) and (2,1) marked OK; in (b), after moving to (2,1) and perceiving a breeze, (2,2) and (3,1) are marked P?.]

What are the safe moves from (1,1)? Move to (1,2), (2,1), or stay in (1,1).

Move to (2,1) then. What are the safe moves from (2,1)?

B in (2,1) ⇒ P in (2,2) or (3,1) or (1,1). Since (1,1) is already known to be safe, a pit may be in (2,2) or (3,1). Move to (1,2) then.
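
The safety argument above is an instance of entailment over possible worlds: a square is provably safe exactly when no placement of pits consistent with the percepts puts a pit in it. Here is a minimal Python sketch of that check for the situation above (it ignores the wumpus and uses only the two breeze percepts gathered so far; the names are illustrative):

from itertools import product

FRINGE = [(1, 2), (2, 2), (3, 1)]           # unvisited squares the agent is reasoning about
PERCEPTS = {(1, 1): False, (2, 1): True}    # breeze perceived in each visited (pit-free) square

def adjacent(a, b):
    return abs(a[0] - b[0]) + abs(a[1] - b[1]) == 1

def consistent(pits):
    """True iff this placement of pits predicts exactly the observed breezes."""
    return all(any(adjacent(sq, p) for p in pits) == breezy
               for sq, breezy in PERCEPTS.items())

# Enumerate every possible way of placing pits on the fringe squares.
worlds = [frozenset(sq for sq, has_pit in zip(FRINGE, choice) if has_pit)
          for choice in product([False, True], repeat=len(FRINGE))]
models = [w for w in worlds if consistent(w)]

for sq in FRINGE:
    status = "provably safe" if all(sq not in w for w in models) else "possibly a pit"
    print(sq, "->", status)
# Expected: (1, 2) is provably safe; (2, 2) and (3, 1) are possibly pits.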


Exploring a Wumpus World

[Figure: two further snapshots, with the same legend. In (a), from (1,2) the agent perceives a stench but no breeze, so it marks W! in (1,3), P! in (3,1), and (2,2) as OK; in (b) the agent has moved through (2,2) to (2,3), where it perceives stench, breeze, and glitter.]

S in (1,2) ⇒ W in (1,1) or (2,2) or (1,3).

Survived in (1,1) and no S in (2,1) ⇒ W in (1,3).

No B in (1,2) ⇒ P in (3,1). Move to (2,2), then to (2,3). G in (2,3). Grab G and come home.


Knowledge Representation

An (artificial) agent represents knowledge as a collection of sentences in some formal language, the knowledge representation language.

A knowledge representation language is defined by its

syntax, which describes all the possible symbol configurations that constitute a sentence, and its

semantics, which determines which fact of the world each sentence denotes.

Ex: Arithmetic. x + y > 3 is a sentence; x+ > y is not. The semantics of x + y > 3 is either the fact “true” or the fact “false”, depending on the values of x and y.
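
As an added illustration of the semantics side (not from the slides), the following Python sketch evaluates the same sentence in two different worlds, that is, under two different assignments to its variables; eval stands in for the parser and interpreter a real representation language would provide:

def holds(sentence, world):
    """Evaluate an arithmetic sentence under an assignment of values to its variables."""
    return eval(sentence, {}, world)    # for illustration only

print(holds("x + y > 3", {"x": 2, "y": 5}))   # True  in the world where x = 2 and y = 5
print(holds("x + y > 3", {"x": 1, "y": 1}))   # False in the world where x = 1 and y = 1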


Knowledge Representation and Reasoning

At the semantical level, reasoning is the process of deriving new facts from previous ones.

At the syntactical level, this process is mirrored by that of producing new sentences from previous ones.

The production of sentences from previous ones should not be arbitrary. Only entailed sentences should be derivable.


Entailment

Informally, a sentence ϕ is entailed by a set of sentences Γ iff the fact denoted by ϕ follows from the facts denoted by Γ.

[Figure: sentences and facts. At the representation level, a set of sentences entails a sentence; in the world, the corresponding facts follow from the facts; semantics links each sentence to the fact it denotes.]


Entailment

Notation: Γ |= ϕ if the set of sentences Γ entails the sentence ϕ.

Intuitive reading of Γ |= ϕ: whenever Γ is true in the world, ϕ is also true.

Examples: let Γ consist of the axioms of arithmetic.

{x = y, y = z} |= x = z

Γ ∪ {x + y ≥ 0} |= x ≥ −y

Γ ∪ {x + y = 3, x − y = 1} |= x = 2

Γ ∪ {x + y = 3} ⊭ x = 2
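
The reading "whenever Γ is true, ϕ is also true" can be checked mechanically over a finite sample of worlds. The Python sketch below samples integer assignments in a small range, so it only illustrates the definition of |= on the examples above; it does not prove entailment:

from itertools import product

def entails_over(worlds, gamma, phi):
    """True iff phi holds in every sampled world in which all sentences of gamma hold."""
    return all(phi(w) for w in worlds if all(g(w) for g in gamma))

R = range(-5, 6)
worlds = [dict(zip("xyz", values)) for values in product(R, repeat=3)]

# {x = y, y = z} |= x = z
print(entails_over(worlds,
                   [lambda w: w["x"] == w["y"], lambda w: w["y"] == w["z"]],
                   lambda w: w["x"] == w["z"]))          # True

# {x + y = 3} does not entail x = 2: some sampled world satisfies the premise but not x = 2.
print(entails_over(worlds,
                   [lambda w: w["x"] + w["y"] == 3],
                   lambda w: w["x"] == 2))               # False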


Inference Systems

At the knowledge representation level, reasoning is achieved by an inference system I, a computational device able to derive new sentences from previous ones.

Notation: Γ ⊢I ϕ if I can derive the sentence ϕ from the set Γ.

To be useful at all, an inference system must be sound:

if Γ ⊢I ϕ then Γ |= ϕ holds as well.

Ideally, an inference system is also complete:

if Γ |= ϕ then Γ ⊢I ϕ holds as well.


Inference Rules

An inference system is typically described as a set of inference (or derivation) rules.

Each derivation rule has the form:

  P1, . . . , Pn    ← premises
  --------------
        C           ← conclusion
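
As a concrete instance, the Python sketch below implements one standard rule of this form, Modus Ponens (from P and P → Q, derive Q), over a toy KB; the tuple encoding of implications is an assumption made only for this illustration:

def modus_ponens(kb):
    """Return the sentences derivable from kb by one application of Modus Ponens."""
    derived = set()
    for s in kb:
        if isinstance(s, tuple) and s[0] == "->":   # an implication P -> Q is the tuple ("->", P, Q)
            _, p, q = s
            if p in kb:
                derived.add(q)
    return derived - kb

kb = {"Rain", ("->", "Rain", "WetGrass"), ("->", "WetGrass", "Slippery")}
print(modus_ponens(kb))            # {'WetGrass'}
kb |= modus_ponens(kb)
print(modus_ponens(kb))            # {'Slippery'}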


Derivation Rules and Soundness

A derivation rule is sound if it derives true conclusions from true premises.

All men are mortal
Aristotle is a man
-------------------
Aristotle is mortal

Sound Inference

All men are mortal
Aristotle is mortal
---------------------
All men are Aristotle

Unsound Inference!
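
For propositional rules, soundness can be checked mechanically: a rule is sound iff its conclusion is true in every truth assignment that makes all of its premises true. The Python sketch below uses propositional analogues of the two syllogisms above (an added illustration, not the slide's first-order example):

from itertools import product

def sound(premises, conclusion):
    """A rule is sound iff its conclusion holds whenever all of its premises hold."""
    return all(conclusion(p, q)
               for p, q in product([False, True], repeat=2)
               if all(prem(p, q) for prem in premises))

implies = lambda a, b: (not a) or b

# Modus Ponens: from p and p -> q, conclude q (sound)
print(sound([lambda p, q: p, lambda p, q: implies(p, q)],
            lambda p, q: q))      # True

# Affirming the consequent: from q and p -> q, conclude p (unsound)
print(sound([lambda p, q: q, lambda p, q: implies(p, q)],
            lambda p, q: p))      # False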


Knowledge Representation Languages

Why don't we use natural language (e.g., English) to represent knowledge?

Natural language is certainly expressive enough!

But it is also too ambiguous for automated reasoning.

Ex: I saw the boy on the hill with the telescope.

Why don’t we use programming languages?

They are well-defined and unambiguous.

But they are not expressive enough.


Knowledge Representation and Logic

The field of Mathematical Logic provides powerful, formal knowledge representation languages and inference systems to build reasoning agents.

We will consider two languages, and associated inference systems, from mathematical logic:

Propositional Logic

First-order Logic
