Knowledge Representation


Transcript of Knowledge Representation

Page 1: Knowledge Representation

Knowledge Representation

1

Page 2: Knowledge Representation

Outline

2

General ontology
Categories and objects
Events and processes
Reasoning systems
Internet shopping world
Summary

Page 3: Knowledge Representation

Ontologies

3

An ontology is a “vocabulary” and a “theory” of a certain “part of reality”

Special-purpose ontologies apply to restricted domains (e.g. electronic circuits)

General-purpose ontologies have wider applicability across domains, i.e.
Must include concepts that cover many subdomains
Cannot use special “short-cuts” (such as ignoring time)
Must allow unification of different types of knowledge

GP ontologies are useful in widening the applicability of reasoning systems, e.g. by including time

Page 4: Knowledge Representation

Ontological engineering

4

Representing a general-purpose ontology is a difficult task called ontological engineering

Existing GP ontologies have been created in different ways:
By a team of trained ontologists
By importing concepts from database(s)
By extracting information from text documents
By inviting anybody to enter commonsense knowledge

Ontological engineering has only been partially successful, and few large AI systems are based on GP ontologies (most use special-purpose ontologies instead)

Page 5: Knowledge Representation

Elements of a general ontology

5

Categories of objects
Measures of quantities
Composite objects
Time, space, and change
Events and processes
Physical objects
Substances
Mental objects and beliefs

Page 6: Knowledge Representation

Top-level ontology of the world

6

The tree of top-level categories, with indentation showing subcategory links:

Anything
    AbstractObjects
        Sets
        Numbers
        RepresentationObjects
            Categories
            Sentences
            Measurements
                Weights
                Times
    Events
        Intervals
            Moments
        Places
        PhysicalObjects
            Things
                Animals
                Agents
                    Humans
            Stuff
                Solids
                Liquid
                Gas
        Processes

Page 7: Knowledge Representation

Upper Ontology

• The general framework of concepts is called an upper ontology because of the convention of drawing graphs with the general concepts at the top and the more specific concepts below them.
• Of what use is an upper ontology? Consider the ontology for circuits that we studied. It makes many simplifying assumptions: time is omitted completely; signals are fixed and do not propagate; the structure of the circuit remains constant.
• A more general ontology would consider signals at particular times, and would include the wire lengths and propagation delays.
• This would allow us to simulate the timing properties of the circuit, and indeed such simulations are often carried out by circuit designers.

7

Page 8: Knowledge Representation

Categories and objects

8

Categories are used to classify objects according to common properties or definitions:
∀x x ∈ Tomatoes ⇒ Red(x) ∧ Round(x)

Categories can be represented by
Predicates: Tomato(x)
Objects: the constant Tomatoes represents the set of tomatoes (reification)

Roles of category representations
Instance relations (is-a): x1 ∈ Tomatoes
Taxonomical hierarchies (Subset): Tomatoes ⊂ Fruit
Inheritance of properties
(Exhaustive) decompositions
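
The slide's two representation choices can be mirrored directly in executable form. Below is an illustrative Python sketch (not from the slides; the individuals tomato1, tomato2, apple1, and banana1 are invented):

    Tomatoes = {"tomato1", "tomato2"}          # reified category: a constant naming a set
    Fruit = Tomatoes | {"apple1", "banana1"}   # taxonomy: Tomatoes is a subset of Fruit

    def tomato(x):                             # predicate form: Tomato(x)
        return x in Tomatoes

    print("tomato1" in Tomatoes)               # instance relation: tomato1 ∈ Tomatoes
    print(Tomatoes <= Fruit)                   # subset relation: Tomatoes ⊂ Fruit
    print(all(tomato(x) for x in Tomatoes))    # a property asserted for every member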

Page 9: Knowledge Representation

Categories using FOL

9

Page 10: Knowledge Representation

Properties of categories

• We say that two or more categories are disjoint if they have no members in common.
• An exhaustive decomposition of a category into subcategories means that every member of the category belongs to at least one of the subcategories.
• A disjoint exhaustive decomposition is known as a partition.
• The predicates used to define these concepts are Disjoint, ExhaustiveDecomposition, and Partition.
• For example, a bachelor is an unmarried adult male:
∀x x ∈ Bachelors ⇔ Unmarried(x) ∧ x ∈ Adults ∧ x ∈ Males

10
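
When categories are finite sets, the three properties can be checked mechanically. An illustrative Python sketch (the function names follow the slide's predicates; the example categories are invented):

    def disjoint(categories):
        """Disjoint(s): no two categories in s share a member."""
        cats = list(categories)
        return all(not (cats[i] & cats[j])
                   for i in range(len(cats)) for j in range(i + 1, len(cats)))

    def exhaustive_decomposition(categories, whole):
        """ExhaustiveDecomposition(s, c): every member of c is in some category of s."""
        return set(whole) <= set().union(*categories)

    def partition(categories, whole):
        """Partition(s, c): a disjoint exhaustive decomposition."""
        return disjoint(categories) and exhaustive_decomposition(categories, whole)

    animals = {"cat", "dog", "salmon"}
    print(partition([{"cat", "dog"}, {"salmon"}], animals))    # True
    print(disjoint([{"cat", "dog"}, {"dog", "salmon"}]))       # False: they share "dog"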

Page 11: Knowledge Representation

Objects and substance

11

Need to distinguish between substance and discrete objects

Substance (“stuff”)
Mass nouns - not countable
Intrinsic properties
Part of a substance is (still) the same substance

Discrete objects (“things”)
Count nouns - countable
Extrinsic properties
Parts are (generally) not of the same category

Page 12: Knowledge Representation

Composite objects

12

A composite object is an object that has other objects as parts

The PartOf relation defines object containment, and is transitive and reflexive:
PartOf(x, y) ∧ PartOf(y, z) ⇒ PartOf(x, z)
PartOf(x, x)

Objects can be grouped in PartOf hierarchies, similar to Subset hierarchies

The structure of the composite object describes how the parts are related
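
A quick Python sketch of PartOf as a fact table plus its reflexive, transitive closure (the place names are an invented example, and the base facts are assumed to contain no cycles):

    part_of_facts = {("Bucharest", "Romania"),
                     ("Romania", "EasternEurope"),
                     ("EasternEurope", "Europe")}

    def part_of(x, y):
        """PartOf(x, y) derived from the facts above."""
        if x == y:                                   # reflexivity: PartOf(x, x)
            return True
        return any(a == x and part_of(b, y)          # transitivity via an intermediate part b
                   for a, b in part_of_facts)

    print(part_of("Bucharest", "Europe"))            # True, by transitivity
    print(part_of("Europe", "Bucharest"))            # False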

Page 13: Knowledge Representation

Composite objects

• For example, a biped has two legs attached to a body.
• For example, we might want to say “The apples in this bag weigh two pounds.” We need a new concept, which we will call a bunch.
• For example, if the apples are Apple1, Apple2, and Apple3, then BunchOf({Apple1, Apple2, Apple3}) denotes the composite object whose parts are the three apples. In general, BunchOf(s) is defined by
∀x x ∈ s ⇒ PartOf(x, BunchOf(s))
∀y [∀x x ∈ s ⇒ PartOf(x, y)] ⇒ PartOf(BunchOf(s), y)
• These axioms are an example of logical minimization, which means defining an object as the smallest one satisfying certain conditions.

13

Page 14: Knowledge Representation

Measurements

14

Need to be able to represent properties like height, mass, cost, etc. Values for such properties are measures

Unit functions represent and convert measures
Length(L1) = Inches(1.5) = Centimeters(3.81)
∀l Centimeters(2.54 × l) = Inches(l)

Measures can be used to describe objects
Mass(Tomato1) = Kilograms(0.16)
∀d d ∈ Days ⇒ Duration(d) = Hours(24)

Non-numerical measures can also be represented, but normally there is an order (e.g. >). Used in qualitative physics
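
One way to read the unit functions is as conversions into a single internal scale, so that Inches(1.5) and Centimeters(3.81) denote the same measure. A hedged Python sketch, assuming centimeters, kilograms, and seconds as the internal scales (my illustration, not the slides'):

    import math

    def centimeters(l):          # Centimeters(l): the internal length scale
        return float(l)

    def inches(l):               # Inches(l) = Centimeters(2.54 * l)
        return 2.54 * l

    def kilograms(m):            # Kilograms(m): internal mass scale
        return float(m)

    def hours(t):                # Hours(t), stored internally in seconds
        return 3600.0 * t

    length_L1 = inches(1.5)                              # Length(L1) = Inches(1.5)
    print(math.isclose(length_L1, centimeters(3.81)))    # True: the same measure
    mass_tomato1 = kilograms(0.16)                       # Mass(Tomato1) = Kilograms(0.16)
    duration_day = hours(24)                             # Duration(d) = Hours(24) for d ∈ Days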

Page 15: Knowledge Representation

Measurements

• Comparative difficulty

e1 ∈ Exercises ∧ e2 ∈ Exercises ∧ Wrote(Norvig, e1) ∧ Wrote(Russell, e2) ⇒ Difficulty(e1) > Difficulty(e2)

e1 ∈ Exercises ∧ e2 ∈ Exercises ∧ Difficulty(e1) > Difficulty(e2) ⇒ ExpectedScore(e1) < ExpectedScore(e2)

15

Page 16: Knowledge Representation

Objects: Things and stuff

• The real world can be seen as consisting of primitive objects (e.g., atomic particles) and composite objects built from them.
• There is, however, a significant portion of reality that seems to defy any obvious individuation, i.e. division into distinct objects. We give this portion the generic name stuff.
• Language reflects the distinction: count nouns, such as aardvarks, holes, and theorems, versus mass nouns, such as butter, water, and energy.
• To represent stuff properly, we begin with the obvious. We need to have as objects in our ontology at least the gross “lumps” of stuff we interact with.

b ∈ Butter ∧ PartOf(p, b) ⇒ p ∈ Butter
b ∈ Butter ⇒ MeltingPoint(b, Centigrade(30))

• Intrinsic properties and extrinsic properties

16

Page 17: Knowledge Representation

Event calculus

17

Event calculus: how to deal with change, based on representing points of time

Reifies fluents and events
A fluent: At(Bilal, Berkeley)
The fluent is true at time t: T(At(Bilal, IQRA), t)

Events are instances of event categories:
E1 ∈ Flyings ∧ Flyer(E1, Bilal) ∧ Origin(E1, SF) ∧ Destination(E1, KHI)

Event E1 took place over interval i: Happens(E1, i)

Time intervals are represented by (start, end) pairs: i = (t1, t2)

Page 18: Knowledge Representation

Event calculus predicates

18

T(f, t): fluent f is true at time t
Happens(e, i): event e happens over interval i
Initiates(e, f, t): event e causes fluent f to start at t
Terminates(e, f, t): event e causes f to cease at t
Clipped(f, i): fluent f ceases to be true in interval i
Restored(f, i): fluent f becomes true in interval i

Page 19: Knowledge Representation

Events

• We assume a distinguished event, Start, that describes the initial state by saying which fluents are initiated or terminated at the start time.
• We define T by saying that a fluent holds at a point in time if the fluent was initiated by an event at some time in the past and was not made false (clipped) by an intervening event.
• A fluent does not hold if it was terminated by an event and not made true (restored) by another event.
• Happens(e, (t1, t2)) ∧ Initiates(e, f, t1) ∧ ¬Clipped(f, (t1, t)) ∧ t1 < t ⇒ T(f, t)
• Happens(e, (t1, t2)) ∧ Terminates(e, f, t1) ∧ ¬Restored(f, (t1, t)) ∧ t1 < t ⇒ ¬T(f, t)

where Clipped and Restored are defined by
• Clipped(f, (t1, t2)) ⇔ ∃e, t, t3 Happens(e, (t, t3)) ∧ t1 ≤ t < t2 ∧ Terminates(e, f, t)
• Restored(f, (t1, t2)) ⇔ ∃e, t, t3 Happens(e, (t, t3)) ∧ t1 ≤ t < t2 ∧ Initiates(e, f, t)
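
Read operationally, the T axiom says: find an event that initiated the fluent before t and check that nothing clipped it since. A rough Python sketch of that reading (the event data is invented, times are plain numbers, and the Happens intervals are folded into the Initiates/Terminates timestamps for brevity):

    # Initiates(e, f, t1) and Terminates(e, f, t1) facts as (event, fluent, time) triples.
    initiates = {("E1", "At(Bilal, KHI)", 1)}
    terminates = {("E2", "At(Bilal, KHI)", 7)}

    def clipped(f, t1, t2):
        """Clipped(f, (t1, t2)): some event terminates f at a time t with t1 <= t < t2."""
        return any(ff == f and t1 <= t < t2 for (_, ff, t) in terminates)

    def T(f, t):
        """T(f, t): f was initiated at some t1 < t and not clipped in (t1, t)."""
        return any(ff == f and t1 < t and not clipped(f, t1, t)
                   for (_, ff, t1) in initiates)

    print(T("At(Bilal, KHI)", 5))    # True: initiated at 1, not yet terminated
    print(T("At(Bilal, KHI)", 9))    # False: clipped by E2 at 7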

19

Page 20: Knowledge Representation

Processes

• The events we have seen so far are what we call discrete events.
• Categories of events with sub-intervals are called process categories or liquid event categories.

20

Page 21: Knowledge Representation

Time intervals

21

Time intervals are partitioned into moments (zero duration) and extended intervals

Partition({Moments, ExtendedIntervals}, Intervals)
∀i i ∈ Intervals ⇒ (i ∈ Moments ⇔ Duration(i) = 0)

Functions Start and End delimit intervals
∀i Interval(i) ⇒ Duration(i) = Time(End(i)) - Time(Start(i))

May use e.g. January 1, 1900 as arbitrary time 0
Time(Start(AD1900)) = Seconds(0)

Page 22: Knowledge Representation

Relations between time intervals

22

[Diagram: each relation illustrated with bars for intervals i and j: Meet(i, j); Before(i, j) / After(j, i); During(i, j); Overlap(i, j) / Overlap(j, i)]

Can be expressed logically, e.g.

∀i, j Meet(i, j) ⇔ Time(End(i)) = Time(Start(j))
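
With an interval represented as a (start, end) pair, each relation above becomes a one-line test. An illustrative Python sketch (not from the slides):

    def meet(i, j):     return i[1] == j[0]               # Time(End(i)) = Time(Start(j))
    def before(i, j):   return i[1] < j[0]                # equivalently After(j, i)
    def during(i, j):   return j[0] < i[0] and i[1] < j[1]
    def overlap(i, j):  return i[0] < j[0] < i[1] < j[1]  # Overlap(j, i) is the mirror case

    print(meet((1, 3), (3, 6)))        # True
    print(before((1, 2), (4, 5)))      # True
    print(during((2, 3), (1, 6)))      # True
    print(overlap((1, 4), (3, 6)))     # True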

Page 23: Knowledge Representation

Mental events and mental objects

Need to represent beliefs in self and other agents, e.g. for controlling reasoning, or for planning actions that involve others

How are beliefs represented?
• Beliefs are reified as mental objects
• Mental objects are represented as strings in a language
• Inference rules for this language can be defined

Rules for reasoning about how logical agents use their beliefs:
• ∀a, p, q LogicalAgent(a) ∧ Believes(a, p) ∧ Believes(a, "p ⇒ q") ⇒ Believes(a, q)
• ∀a, p LogicalAgent(a) ∧ Believes(a, p) ⇒ Believes(a, "Believes(Name(a), p)")

23

Page 24: Knowledge Representation

Mental events

• Propositional attitudes are attitudes that an agent can have toward mental objects, such as Believes, Knows, Wants, Intends, and Informs.
• For example, suppose we try to assert that Lois knows that Superman can fly:
Knows(Lois, CanFly(Superman))
• If it is true that Superman is Clark Kent, then we must conclude that Lois knows that Clark can fly:
(Superman = Clark) ∧ Knows(Lois, CanFly(Superman)) |= Knows(Lois, CanFly(Clark))
• This property is called referential transparency; for propositional attitudes it is unwanted, since Lois may not know that Clark is Superman.

24

Page 25: Knowledge Representation

Modal Logic

• Modal logic is designed to address this problem.
• Regular logic is concerned with a single modality, the modality of truth, allowing us to express “P is true.”
• Modal logic includes special modal operators that take sentences (rather than terms) as arguments.
• For example, “A knows P” is represented with the notation K_A P, where K is the modal operator for knowledge. It takes two arguments, an agent (written as the subscript) and a sentence.

25

Page 26: Knowledge Representation

Semantic networks

26

Graph representation of categories, objects, relations, etc. (i.e. essentially FOL)

Natural representation of inheritance and default values

∀x x ∈ Persons ⇒ [∀y HasMother(x, y) ⇒ y ∈ FemalePersons]

∀x x ∈ Persons ⇒ Legs(x, 2)
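
Inheritance with defaults is what makes semantic networks convenient: a query walks up the is-a links until a value is found, and a more specific value overrides an inherited default. An illustrative Python sketch (the one-legged exception is invented purely to show overriding):

    is_a = {"Joe": "Boy", "Kay": "Woman", "Boy": "HumanBeing", "Woman": "HumanBeing"}
    attrs = {
        "HumanBeing": {"Legs": 2},    # default: ∀x x ∈ Persons ⇒ Legs(x, 2)
        "Joe": {"Legs": 1},           # invented exception that overrides the default
    }

    def value(node, attr):
        """Follow is-a links upward; the most specific value wins."""
        while node is not None:
            if attr in attrs.get(node, {}):
                return attrs[node][attr]
            node = is_a.get(node)
        return None

    print(value("Kay", "Legs"))    # 2, inherited from HumanBeing
    print(value("Joe", "Legs"))    # 1, the local exception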

Page 27: Knowledge Representation

Semantic Network

[Diagram of a semantic network with nodes Joe, Kay, Boy, Woman, HumanBeing, Food, and School: “Is a” links connect Joe to Boy, Kay to Woman, and Boy and Woman to HumanBeing; the remaining links are “Needs” (Food), “Goes to” (School), and “Has a child”.]

27

Page 28: Knowledge Representation

Other reasoning systems for categories

28

Description logics
Derived from semantic networks, but more formal
Supports subsumption, classification and consistency

Circumscription and default logic
Formalizes reasoning about default values
Assumes the default in the absence of other input; must be able to retract the assumption if new evidence occurs

Truth maintenance systems
Supports belief revision in systems where retracting beliefs is permitted

Page 29: Knowledge Representation

Internet shopping world

29

An agent that understands and acts in an internet shopping environment

The task is to shop for a product on the Web, given the user’s product description

The product description may be precise, in which case the agent should find the best price

In other cases the description is only partial, and the agent has to compare products

The shopping agent depends on having product knowledge, incl. category hierarchies

Page 30: Knowledge Representation

PEAS specification of shopping agent

30

Performance goal: recommend product(s) to match the user’s description

Environment: all of the Web

Actions: follow links, retrieve page contents

Sensors: Web pages (HTML, XML)

Page 31: Knowledge Representation

Outline of agent behavior

31

Start at the home page of known web store(s)
Must have knowledge of relevant web addresses, such as www.amazon.com etc.

Spread out from the home page, following links to relevant pages containing product offers
Must be able to identify page relevance, using product category ontologies, as well as parse page contents to detect product offers

Having located one or more product offers, the agent must compare and recommend a product
Comparisons range from simple price ranking to complex tradeoffs in several dimensions

Page 32: Knowledge Representation

Following links

• The agent will have knowledge of a number of stores, for example:
Amazon ∈ OnlineStores ∧ Homepage(Amazon, “amazon.com”)
Ebay ∈ OnlineStores ∧ Homepage(Ebay, “ebay.com”)
ExampleStore ∈ OnlineStores ∧ Homepage(ExampleStore, “example.com”)
• A page is relevant to the query if it can be reached by a chain of zero or more relevant category links from a store’s home page, and then from one more link to the product offer.

Relevant(page, query) ⇔ ∃store, home store ∈ OnlineStores ∧ Homepage(store, home)
∧ ∃url, url2 RelevantChain(home, url2, query) ∧ Link(url2, url) ∧ page = Contents(url)

RelevantChain(start, end, query) ⇔ (start = end) ∨
(∃u, text LinkText(start, u, text) ∧ RelevantCategoryName(query, text) ∧ RelevantChain(u, end, query))

32
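
Read procedurally, RelevantChain is a recursive walk over links whose anchor text names a relevant category. A rough Python sketch under strong simplifications: the link graph, anchor texts, and relevance test below are invented stand-ins, and the graph is assumed acyclic:

    homepages = {"ExampleStore": "example.com"}                       # Homepage(store, home)
    links = {"example.com": ["example.com/laptops"],                  # Link(url2, url)
             "example.com/laptops": ["example.com/laptops/123.html"]}
    link_text = {("example.com", "example.com/laptops"): "laptops"}   # LinkText(start, u, text)

    def relevant_category_name(query, text):
        return bool(text) and text in query          # crude stand-in for the ontology lookup

    def relevant_chain(start, end, query):
        """Zero or more relevant category links from start to end."""
        if start == end:
            return True
        return any(relevant_category_name(query, link_text.get((start, u), "")) and
                   relevant_chain(u, end, query)
                   for u in links.get(start, []))

    def relevant_pages(query):
        """Pages one link beyond a relevant chain from some store's home page."""
        for store, home in homepages.items():
            for url2 in links:
                if relevant_chain(home, url2, query):
                    yield from links[url2]

    print(list(relevant_pages("cheap laptops")))     # both pages under example.com are reached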

Page 33: Knowledge Representation

Following Links

33

Page 34: Knowledge Representation

Comparing offers

∃c, offer c ∈ LaptopComputers ∧ offer ∈ ProductOffers
∧ Manufacturer(c, IBM) ∧ Model(c, ThinkBook970)
∧ ScreenSize(c, Inches(14)) ∧ ScreenType(c, ColorLCD)
∧ MemorySize(c, Gigabytes(2)) ∧ CPUSpeed(c, GHz(1.2))
∧ OfferedProduct(offer, c) ∧ Store(offer, GenStore)
∧ URL(offer, “example.com/computers/34356.html”)
∧ Price(offer, $(399)) ∧ Date(offer, Today)

34
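
Once offers are extracted into structured form like the assertion above, comparison can start as simple price ranking. A short illustrative Python sketch (the second offer is invented so there is something to compare):

    offers = [
        {"model": "ThinkBook970", "store": "GenStore",
         "url": "example.com/computers/34356.html", "price": 399},
        {"model": "ThinkBook970", "store": "ExampleStore",
         "url": "example.com/laptops/970.html", "price": 449},    # invented second offer
    ]

    def best_offer(offers, model):
        """Simple price ranking; a richer comparison would trade off several dimensions."""
        matching = [o for o in offers if o["model"] == model]
        return min(matching, key=lambda o: o["price"]) if matching else None

    print(best_offer(offers, "ThinkBook970")["store"])    # GenStore, the cheaper offer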

Page 35: Knowledge Representation

Summary

35

An ontology is an encoding of vocabulary and relationships. Special-purpose ontologies can be effective within limited domains

A general-purpose ontology needs to cover a wide variety of knowledge, and is based on categories and an event calculus

It covers structured objects, time and space, change, processes, substances, and beliefs

The general ontology can support agent reasoning in a wide variety of domains, including the Internet shopping world