Page 1:

Formal Ontology for NIEM

Adam Pease
Articulate Software
apease at articulatesoftware dot com
http://www.ontologyportal.org/
http://home.earthlink.net/~adampease/professional/

v 1.00

Page 2:

Imagine... your view of the web

CV
  name: Joe Smith
  education: BS Case Western Reserve, 1982; MS UC Davis, 1984
  work: 1985-1990 ACME Software, programmer
  private: Married, 2 children

Slide with thanks to Frank van Harmelen

Page 3:

...and the Computer's View (assuming you don’t read Chinese)

CV
  name
  education
  work
  private

Page 4:

But wait, we've got XML -

<job name="Joe Smith" title="Programmer">

Page 5:

But wait, we've got XML -

<job name="Joe Smith" title="Programmer">

<x83 m92="|||||||||" title="..............">

But the computer sees just meaningless symbols. It doesn't know what "job" means.

Page 6:

But wait, we've got Taxonomies -

[Slide figure: a taxonomy tree relating Mammal, Person, and JoeSmith]

Page 7:

But wait, we've got Taxonomies -

[Slide figure: the same taxonomy as the computer sees it, with opaque ids o4839, x931, i3729]

Page 8:

Wait, we've got semantics -

(instance JoeSmith Person)
(subclass Person Mammal)
implies
(instance JoeSmith Mammal)

Page 9:

Wait, we've got semantics -

(instance JoeSmith Person)
(subclass Person Mammal)
implies
(instance JoeSmith Mammal)

...and as the computer sees it:

(r53 p3489 u8475)
(r22 u8475 x9834)
implies
(r53 p3489 x9834)

The computer doesn't understand what "Person" means, but it can appear to, if it makes the same inference as a human.
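The inference on this slide can be sketched in a few lines of Python. This is a toy fixpoint loop, not SUMO's actual reasoner; the symbols are the ones from the slide, and the point is that the inference works identically whether the symbols are readable names or opaque ids:

```python
def infer_instances(instance_facts, subclass_facts):
    """Close the instance relation under subclass: if x is an instance of C
    and C is a subclass of D, derive that x is an instance of D. Repeats
    until no new facts appear (a fixpoint)."""
    derived = set(instance_facts)
    changed = True
    while changed:
        changed = False
        for (x, c) in list(derived):
            for (sub, sup) in subclass_facts:
                if sub == c and (x, sup) not in derived:
                    derived.add((x, sup))
                    changed = True
    return derived

# Human-readable symbols:
facts = infer_instances({("JoeSmith", "Person")}, {("Person", "Mammal")})
assert ("JoeSmith", "Mammal") in facts

# The computer's view: the same inference over meaningless ids.
facts = infer_instances({("p3489", "u8475")}, {("u8475", "x9834")})
assert ("p3489", "x9834") in facts
```

Nothing in the loop depends on what "Person" means, which is exactly the slide's point.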

Page 10:

Semantics Helps a Machine Appear Smart

• A “smart” machine should be able to make the same inferences we do

• (let's not debate the philosophy about whether it would actually be smart)

• The more sophisticated the logical language, the more inferences are possible

Page 11:

Suggested Upper Merged Ontology

• 1,000 terms, 4,000 axioms, 750 rules

• Mapped by hand to 100,000 English word senses of WordNet 1.6
  • then ported to 2.1
  • Also related to dozens of non-English languages, including Arabic

• Associated domain ontologies totalling 20,000 terms and 70,000 axioms

• Free
  • SUMO is owned by IEEE but basically public domain
  • Domain ontologies are released under GNU
  • www.ontologyportal.org

Page 12:

Formal Ontology Pilot

• Added hundreds of new terms to SUMO, covering NIEM concepts
  – Sophisticated formal representation with logical rules
  – Content now included in MILO.kif and Justice.kif

Page 13:

Example NIEM Problem

• Misuse of the subClassOf relation
  – Requirement subClassOf Technology
  – If asked to generate a list of known technologies (known instances of the class Technology), an OWL-compliant reasoner would produce a motley list containing individual requirements, air flow models, and barrier systems, with no obvious connection between them
  – OWL lacks a "partOf" relation, so subClassOf is often misused
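The problem above can be shown in a toy Python sketch. The class and instance names other than Requirement and Technology are invented for illustration; the query logic mirrors what any subclass-aware reasoner computes:

```python
# The misuse from the slide: Requirement is declared a subclass of
# Technology (it should be linked by some partOf-like relation instead).
subclass_of = {
    "Requirement": "Technology",
    "AirFlowModel": "Technology",
    "BarrierSystem": "Technology",
}
# Hypothetical individuals, one per class:
instance_of = {
    "Req-001": "Requirement",
    "WindTunnelModel7": "AirFlowModel",
    "FenceSystemA": "BarrierSystem",
}

def known_instances(cls):
    """All instances of cls, including instances of its subclasses --
    the standard semantics of subclass for any compliant reasoner."""
    subs = {c for c, sup in subclass_of.items() if sup == cls} | {cls}
    return sorted(i for i, c in instance_of.items() if c in subs)

# The motley list: an individual requirement shows up as a "technology".
print(known_instances("Technology"))
# -> ['FenceSystemA', 'Req-001', 'WindTunnelModel7']
```

The reasoner is behaving correctly; it is the modeling (subClassOf where a part-whole relation was meant) that produces the nonsense answer.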

Page 14:

Basic Issues with OWL

• Restriction to only binary relations ("B" is between "A" and "C" is a prototypical problematic example)

• Absence of functional terms
  – (GovernmentFn UnitedStates) instead of having to create GovernmentOfUnitedStates, GovernmentOfSweden, etc.

• Absence of a facility for negating statements (e.g. "Adam is not a blond Swede.")

• These are just the most obvious issues...
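The binary-relation restriction can be made concrete with a small Python sketch. The reification scheme and all the property names below are hypothetical (this is not OWL syntax); the point is the extra machinery that binary-only encodings force:

```python
# between(a, b, c) is inherently ternary. A logic with n-ary relations can
# state it as one fact:
ternary_facts = {("between", "B", "A", "C")}

# With only binary relations, the fact must be reified: invent a node
# ("sit1") standing for the situation, plus three binary links to it.
binary_facts = {
    ("middle", "sit1", "B"),
    ("endpoint1", "sit1", "A"),
    ("endpoint2", "sit1", "C"),
}

def between_binary(x, a, c, facts):
    """Recover 'x is between a and c' from the reified binary encoding."""
    for (_, sit, mid) in {f for f in facts if f[0] == "middle"}:
        if mid != x:
            continue
        ends = {o for (p, s, o) in facts
                if s == sit and p in ("endpoint1", "endpoint2")}
        if ends == {a, c}:
            return True
    return False

assert ("between", "B", "A", "C") in ternary_facts  # one direct lookup
assert between_binary("B", "A", "C", binary_facts)  # vs. a reified search
```

One ternary fact becomes an invented individual plus three triples, and every query against it must reassemble the pieces.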