
MASTERS THESIS DEFENSE

Solving Winograd Schema Challenge: Using Semantic Parsing, Automatic Knowledge Acquisition and Logical Reasoning

School of Computing, Informatics, and Decision Systems Engineering Arizona State University

BY

ARPIT SHARMA
ADVISOR: DR. CHITTA BARAL

OCTOBER 31ST 2014


Presentation Overview
- Background and Motivation
- Problem and Related Work
- The System
  - Semantic Parser & Pronoun Extractor
  - Automatic Background Knowledge Extractor
  - Logical Reasoning Engine
- System Evaluation and Error Analysis
- Contributions and Future Works


Background and Motivation


Background
- One of the goals of AI: simulation of human-level intelligence in machines
- The ability to think and reason, based on commonsense knowledge about things
- How to measure it? The Turing Test, 1950 (deceive humans in conversation)
- Not an ideal test

A conversation with Scott Joel Aaronson, a computer scientist at MIT


Background

Hector J. Levesque suggested the Winograd Schema Challenge as an alternative to the Turing test in 2011

Its aim is not to deceive humans, but to simulate a human-like reasoning process


The town councilors refused to give the demonstrators a permit because they feared violence.

The town councilors refused to give the demonstrators a permit because they advocated violence.

- Contains a pair of sentences that differ in only one or two words
- The sentences contain an ambiguity that is resolved in opposite ways in the two sentences
- Requires the use of world knowledge and reasoning for its resolution


Winograd Schema Example


- A question answering test
- A collection of 141 Winograd schemas (282 total sentences)
- A question about each sentence


The Winograd Schema Challenge

The town councilors refused to give the demonstrators a permit because they feared violence.

Who feared violence?

The town councilors refused to give the demonstrators a permit because they advocated violence.

Who advocated violence?

Example


Helpful in:
- Text Summarization
- Reading Comprehension
- Deep Question Answering
- Ultimate Thinking Machines


Motivation


Problem and Related Work


The Problem

The fish ate the worm because it was hungry. Who was hungry?

"Hey Peter, can you answer the above question based on the sentence?"
"Of course Quagmire, the answer is 'the fish'."
"How did you know? The sentence does not mention it."
"Ooo!!!! Does that mean I am GOD!!!!"
"No Peter!!! You are just a fat HUMAN!!"


The Problem

Humans have commonsense or background knowledge about things and events

How do humans get this knowledge? And, from where?


Resolving Complex Cases of Definite Pronouns: The Winograd Schema Challenge

By Altaf Rahman and Vincent Ng, Human Language Technology Research Institute, 2012
- Used statistical techniques and a machine learning framework to combine their results (ranking-based approach)
- Created a new Winograd Schema Challenge-like corpus: 941 Winograd schemas (30% test set)
- 73% accuracy
- The corpus contains redundancy, e.g.:
  John shot Bill and he died.
  The man shot his friend and he died.

Related Work


Google
Lions eat zebras because they are predators
Queries: "lions are predators", "zebras are predators"
What if the sentence is, "Lions eat zebras because they are hungry"?

Resolving Complex Cases of Definite Pronouns: The Winograd Schema Challenge

Related Work


Narrative Chains: "partially ordered set of events centered around a common protagonist" - Nathanael Chambers, 2010

borrow-s invest-s spend-s pay-s raise-s lend-s

Drawbacks:
- Only events (verbs)
- Few in number

The fish ate the worm because it was hungry. Who was hungry?

Resolving Complex Cases of Definite Pronouns: The Winograd Schema Challenge

Related Work


By Peter Schuller, Marmara University, Department of Computer Engineering, 2014
- Converted the given sentence into a dependency graph
- Manually created a background knowledge graph
- Combined both graphs to get the answer
- Shows usability on 4 Winograd schemas

Background Knowledge

Tackling Winograd Schemas by Formalizing Relevance Theory in Knowledge Graphs

Related Work


The System


The Workflow

[Workflow diagram: the given sentence and question go through the Semantic Parser and the Pronoun Extractor, producing a semantic representation of the sentence and question and the pronoun to be resolved; the Automatic Background Knowledge Extractor retrieves a background sentence, which is also given a semantic representation; the Logical Reasoning Module combines these to produce the answer.]


Semantic Parser & Pronoun Extractor


Semantic Parser
Represent text in an expressive formal representation:
- Preserve grammatical structure: syntactic dependency parse
- Distinguish words with the same conceptual sense: ontology (WordNet)
- Use a general set of relations: Knowledge Machine (KM) slot dictionary


Semantic Parser

Stanford Dependency Parse of “The man loves his wife”

Syntactic Dependency Parse

[Dependency graph:
det(man-NN, The-DT)
nsubj(loves-VBZ, man-NN)
dobj(loves-VBZ, wife-NN)
poss(wife-NN, his-PRP$)]


Semantic Parser

Semantic Parse of “The man loves his wife”

Knowledge Machine Slot Dictionary Mapping

[Semantic graph:
agent(loves-VBZ, man-NN)
recipient(loves-VBZ, wife-NN)
possesed_by(wife-NN, his-PRP$)]

Mapping Stanford dependency relations to the KM slot dictionary using intuitive rules.
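To make this mapping step concrete, here is a minimal Python sketch of such an intuitive rule table; the rules shown are an illustrative subset, not the parser's actual dictionary.

# Minimal sketch of mapping Stanford dependency relations to KM slots using
# intuitive rules. The rule table is an illustrative subset, not the parser's
# actual dictionary (the real parser is available at www.kparser.org).

# (governor POS prefix, dependency relation) -> KM slot
RULES = {
    ("VB", "nsubj"): "agent",
    ("VB", "dobj"): "recipient",
    ("NN", "poss"): "possesed_by",  # relation name as it appears on the slides
}

def to_km_slot(gov_pos, dep_rel):
    """Return the KM slot for a (governor POS, dependency relation) pair, if any."""
    for (pos_prefix, rel), slot in RULES.items():
        if gov_pos.startswith(pos_prefix) and dep_rel == rel:
            return slot
    return None

# Stanford dependencies of "The man loves his wife":
# (relation, governor, governor POS, dependent)
deps = [("det", "man", "NN", "The"),
        ("nsubj", "loves", "VBZ", "man"),
        ("dobj", "loves", "VBZ", "wife"),
        ("poss", "wife", "NN", "his")]

for rel, gov, pos, dep in deps:
    slot = to_km_slot(pos, rel)
    if slot is not None:
        print(f"{slot}({gov}, {dep})")
# agent(loves, man)
# recipient(loves, wife)
# possesed_by(wife, his)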



Semantic Parser

Ontology Addition to the Semantic Parse of “The man loves his wife because she loved him”

Ontology Addition

[Semantic graph with ontology: agent(loves_3, man_2), recipient(loves_3, wife_5), possesed_by(wife_5, his_4), caused_by(loves_3, loved_8), agent(loved_8, she_7), recipient(loved_8, him_9); instance_of links the nodes to the classes man, wife, his, person and love, and superclass links these classes to person and emotion.]


Pronoun Extractor

The man could not lift his son because he is so weak.

Who is weak?

[Semantic graphs of the sentence and the question: agent(lift_5, man_2), recipient(lift_5, son_7), possesed_by(son_7, his_6), negative(lift_5, not_4), trait(he_9, weak_12), with instance_of and superclass links to classes such as man, son, his, he, person, lift and weak; the question graph contains the node q_1 with trait weak_3 and a participant link to lift.]
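The slides do not spell out the extraction procedure in code, so the following is only a plausible sketch: pick the pronoun whose relation-and-concept slot in the sentence graph matches the slot of the wh-node (q_1) in the question graph, as in the figure above where both q_1 and he_9 carry the trait weak.

# Hedged sketch of the pronoun extraction idea: choose the pronoun that fills,
# in the sentence graph, the same (relation, concept) slot that the wh-node
# fills in the question graph. Edges are (source, relation, target) triples;
# this is an illustration, not the thesis implementation.

PRONOUNS = {"he", "she", "it", "they", "him", "her", "them"}

def base(node):
    """Strip the positional suffix, e.g. 'he_9' -> 'he'."""
    return node.rsplit("_", 1)[0]

def pronoun_to_resolve(sentence_edges, question_edges):
    # Slots attached to the question's wh-node q_1, e.g. ('trait', 'weak').
    wh_slots = {(rel, base(tgt))
                for src, rel, tgt in question_edges if base(src) == "q"}
    for src, rel, tgt in sentence_edges:
        if base(src) in PRONOUNS and (rel, base(tgt)) in wh_slots:
            return src
    return None

sentence_edges = [("lift_5", "agent", "man_2"),
                  ("lift_5", "recipient", "son_7"),
                  ("son_7", "possesed_by", "his_6"),
                  ("he_9", "trait", "weak_12")]
question_edges = [("q_1", "trait", "weak_3")]

print(pronoun_to_resolve(sentence_edges, question_edges))  # he_9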


Automatic Background Knowledge Extractor


The idea is to learn the usage of English words and the contexts in which they are used


Automatic Background Knowledge Extractor

That is done by:
- Creating queries using the formal representation of the given sentence and the question
- Extracting background knowledge sentences from a big source of raw text


Winograd Schema categories: Causal, Non-Causal, Temporal, Locative


Automatic Background Knowledge Extractor

The fish ate the worm because it was tasty.

Mary took out her flute and played one of her favorite pieces. She has had it since she was a child.

Jackson was greatly influenced by Arnold, though he lived two centuries earlier.

Sam’s drawing was hung just above Tina’s and it did look much better with another one above it.

Categorization of Winograd Schema


Two subtypes of the Causal category are solved by the system.


Type1: Direct Causal Events
Example: Ann asked Mary what time the library closes, but she had forgotten.
Who had forgotten?

Type2: Causal Attributive
Example: The man could not lift his son because he is so weak.
Who is weak?

Automatic Background Knowledge Extractor


Automatic Background Knowledge Extractor: Creating Queries

The man could not lift his son because he was so weak .

Who was weak?

Query Set 1 (Q1):

“.*not.*lift.*because.*weak.*”

“.*not.*lift.*because.*so.*weak.*”

Queries Type1: Use the semantic graph of the given sentence and the question
- Trace all nodes of the question into the given sentence (except "Wh" nodes)
- Extract semantically important words (except entities)
- Also consider the connective words
- Combine the words in their order of occurrence in the sentence and join them using the wildcard (.*) and quotes (""), as in the sketch below
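A minimal Python sketch of this query construction, assuming the important words and connectives have already been traced from the semantic graphs (the word lists below are copied from the slide example):

# Sketch of Type1 query creation: join the selected words, in their order of
# occurrence in the sentence, with the .* wildcard and wrap the result in
# quotes. Selecting the words from the semantic graphs is not shown here.

def make_query(words):
    """e.g. ['not', 'lift', 'because', 'weak'] -> '".*not.*lift.*because.*weak.*"'"""
    return '"' + ".*" + ".*".join(words) + ".*" + '"'

q1 = [make_query(["not", "lift", "because", "weak"]),
      make_query(["not", "lift", "because", "so", "weak"])]
print(q1[0])  # ".*not.*lift.*because.*weak.*"
print(q1[1])  # ".*not.*lift.*because.*so.*weak.*"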


Automatic Background Knowledge Extractor: Creating Queries

[Semantic graphs of the sentence and the question: sentence nodes lift_4, man_2, son_7, his_6, he_9, weak_12, not_3, so_11 and question nodes q_1, weak_3, connected by agent, recipient, possesed_by, negative, trait, instance_of and superclass relations to classes such as man, son, person, he, his, so, not and weak.]


Queries Type2: Replace the verbs in the Type1 queries with their synonyms; consider all combinations (a sketch follows below).
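A sketch of the Type2 expansion, using WordNet verb synonyms through NLTK as one possible synonym source; the slides do not name the synonym source, and nltk.download('wordnet') is assumed to have been run.

# Sketch of Type2 query creation: replace each verb in a Type1 query with its
# synonyms and keep all combinations. WordNet via NLTK is used here only as an
# illustrative synonym source.

from itertools import product

from nltk.corpus import wordnet as wn

def verb_synonyms(verb, limit=3):
    """A few single-word verb synonyms of `verb` from WordNet."""
    names = {lemma.name() for synset in wn.synsets(verb, pos=wn.VERB)
             for lemma in synset.lemmas()}
    return sorted(name for name in names if "_" not in name and name != verb)[:limit]

def expand(words, verbs):
    """All combinations of the query words with verbs replaced by their synonyms."""
    choices = [[w] + (verb_synonyms(w) if w in verbs else []) for w in words]
    return ['"' + ".*" + ".*".join(combo) + ".*" + '"' for combo in product(*choices)]

queries = expand(["not", "lift", "because", "weak"], verbs={"lift"})
print(queries[0])  # the original query: ".*not.*lift.*because.*weak.*"
# The remaining queries have "lift" replaced by one of its WordNet synonyms.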


Automatic Background Knowledge Extractor: Creating Queries

The man could not lift his son because he was so weak .

Who was weak?

A query among Q1 = “.*not.*lift.*because.*weak.*”

Query Set 2 (Q2):

“.*not.*pick.*because.*weak.*”

Final Queries: Final Set of Queries (Q) = Q1 ∪ Q2

Final Query Set (Q):

“.*not.*lift.*because.*weak.*”

“.*not.*lift.*because.*so.*weak.*”

“.*not.*pick.*because.*weak.*”


- Using a big source of raw text
- Using a search engine


Automatic Background Knowledge Extractor: Extracting Background Knowledge Sentences


Two ways in which sentences are extracted from WWW

Example Query: “.*not.*lift.*because.*weak.*”


Automatic Background Knowledge Extractor: Extracting Background Knowledge Sentences


Filtering the extracted sentences (see the sketch below):
- Should not contain the original sentence
- Should contain all the words in the query (in any form)
- Should not contain partial sentences
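The following is a sketch of this search-and-filter step over a local raw-text corpus; the corpus file, the sentence splitting, and the partial-sentence heuristic are assumptions made for illustration (the actual system also uses a web search engine).

# Sketch: scan raw text with a query and apply the three filters above.
# The corpus file, sentence splitting and partial-sentence heuristic are
# illustrative assumptions, not the thesis implementation.

import re

def extract(query, original, corpus_path="corpus.txt"):
    # Queries are written as ".*word1.*word2.*"; drop the quotes for re.search.
    pattern = re.compile(query.strip('"'), re.IGNORECASE)
    words = [w for w in query.strip('"').strip(".*").split(".*") if w]
    kept = []
    with open(corpus_path, encoding="utf-8") as corpus:
        for line in corpus:
            for sent in re.split(r"(?<=[.!?])\s+", line.strip()):
                if not pattern.search(sent):
                    continue
                if original.lower() in sent.lower():       # filter 1: not the original
                    continue
                if not all(w.lower() in sent.lower() for w in words):  # filter 2
                    continue
                if not sent or sent[0].islower():          # filter 3 (crude heuristic)
                    continue
                kept.append(sent)
    return kept

# Example (assuming a corpus.txt file exists):
# extract('".*not.*lift.*because.*weak.*"',
#         "The man could not lift his son because he was so weak .")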


Automatic Background Knowledge Extractor: Extracting Background Knowledge Sentences

The man could not lift his son because he was so weak .

Query: “.*not.*lift.*because.*weak.*”

Filtered sentences:
- She could not lift it off the floor because she is a weak girl
- She could not even lift her head because she was so weak
- I could not even lift my leg to turn over because the muscles were weak after surgery
- ...


Automatic Background Knowledge Extractor: Parsing the Background Sentences

She could not lift it off the floor because she is a weak girl


Logical Reasoning Engine


Logical Reasoning Engine

[Diagram: the given sentence, the background knowledge sentences and the pronoun to be resolved are fed to the Logical Reasoning Engine (ASP rules), which produces the answer.]

Answer Set Programming
- Represent the semantic representations of the given sentence and the background sentence as ASP predicates
- Use ASP reasoning rules


Logical Reasoning Engine


Winograd Sentence: Ann asked Mary what time the library closes, but she had forgotten

has(winograd,asked_2,agent,ann_1).
has(winograd,asked_2,recipient,mary_3).
has(winograd,asked_2,instance_of,ask).
...


Logical Reasoning Engine: Representing the Winograd and the Background Sentences

Background Sentence: But you asked me the security question but I forgotten

has(background,asked_103,agent,you_102).
has(background,asked_103,instance_of,ask).
has(background,ask,superclass,communication).
...
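As an illustration of this representation step, here is a small Python sketch that serializes semantic-graph edges into has/4 facts of the kind shown above; the emitter itself is hypothetical, and the edge list is the slide example.

# Sketch: turn semantic-graph edges into has(Context,Node,Relation,Value) ASP
# facts like the ones shown above. The emitter is illustrative only.

def to_asp_facts(context, edges):
    """edges: iterable of (node, relation, value) triples."""
    return [f"has({context},{node},{relation},{value})."
            for node, relation, value in edges]

winograd_edges = [
    ("asked_2", "agent", "ann_1"),
    ("asked_2", "recipient", "mary_3"),
    ("asked_2", "instance_of", "ask"),
]

print("\n".join(to_asp_facts("winograd", winograd_edges)))
# has(winograd,asked_2,agent,ann_1).
# has(winograd,asked_2,recipient,mary_3).
# has(winograd,asked_2,instance_of,ask).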



Logical Reasoning Engine: ASP rules to capture general properties in the Background and Winograd sentences
- Reachability (transitivity within a context)
- Cross-context siblings (words belonging to the same class in different contexts)
- Negative words (words with a negative word associated with them)


Logical Reasoning Engine: Reachability

Background

reachableFrom(background,asked_3,forgotten_10)


Basic transitivity relationship between event nodes in a particular context.

reachableFrom(C,X,Y) :- has(C,X,REL,Y), context(C), eventRelation(REL).

reachableFrom(C,X,Z) :- reachableFrom(C,X,Y), has(C,Y,REL,Z), context(C), eventRelation(REL), X!=Y, Y!=Z.


Logical Reasoning Engine: Reachability

Event Relations from KM

causes

caused_by

defeats

defeated_by

enables

enabled_by

inhibits

inhibited_by

... (15 more)


Logical Reasoning Engine: Cross-Context Siblings

Winograd Background

crossContextSiblings(asked_2,asked_3)


If words in different sentences (Winograd or Background) are instances of the same conceptual class, then they are defined as cross-context siblings.

crossContextSiblings(E1,E2) :- has(background,E1,instance_of,C),
                               has(winograd,E2,instance_of,C),
                               E1!=E2.


Logical Reasoning Engine: Cross-Context Siblings


Logical Reasoning Engine: Negative Polarity

negativePolarity(lift_4)


Words associated with a negation word like "not" are defined by the negativePolarity predicate.

negativePolarity(E) :- has(C,E,negative,N1), context(C).


Logical Reasoning Engine: Negative Polarity


Logical Reasoning Engine: Type Specific Reasoning

Type1: Direct Causal Events

[Schematic: in the Winograd sentence, entities A, B and the pronoun P are attached to EVENT1 and EVENT2; in the Background sentence, entities X and Y are attached to the matching events EVENT1' and EVENT2' through the corresponding relations rel1 to rel4.]

Ann asked Mary what time the library closes, but she had forgotten.

Who had forgotten?


Logical Reasoning Engine: Type Specific Reasoning

Type1: Direct Causal Events

matchingEvents(asked_2,forgotten_13,asked_3,forgotten_10)

Winograd Background


matchingEvents(A,B,A1,B1) :- crossContextSiblings(A,A1),
                             reachableFrom(winograd,A,B),
                             crossContextSiblings(B,B1),
                             reachableFrom(background,A1,B1),
                             negativePolarity(A), not negativePolarity(B),
                             negativePolarity(A1), not negativePolarity(B1).

Logical Reasoning Engine: Type1: Direct Causal Events

Step 1: A and B are reachable event nodes in the Winograd sentence graph that have A1 and B1, respectively, as cross-context sibling, reachable events in the Background sentence graph.


Pronoun to be resolved


eventSubgraph(winograd,forgotten_13,agent,she_11)

Logical Reasoning Engine: Type1: Direct Causal Events

Winograd


Step 2: Extract the subgraph of the Winograd sentence that contains the pronoun to be resolved, the event in which it participates, and their relation.

eventSubgraph(winograd,A,S,X) :- matchingEvents(A,B,C,D),
                                 has(winograd,A,S,X),
                                 toBeResolved(X).


Logical Reasoning Engine: Type1: Direct Causal Events


Background
[Background subgraph: agent(forgotten_110, entity1_109), instance_of(forgotten_110, forget)]

eventSubgraph(background,forgotten_110,agent,entity1_109)

Logical Reasoning Engine: Type1: Direct Causal Events


Step 3: Extract the subgraph of the Background sentence that contains the matching event of the event to which the pronoun to be resolved is related in the Winograd sentence.

eventSubgraph(background,A1,S,X1) :- eventSubgraph(winograd,A,S,X),
                                     matchingEvents(A,B,A1,B1),
                                     has(background,A1,S,X1).


Logical Reasoning Engine: Type1: Direct Causal Events


Background
[Background graph fragment: agent(forgotten_110, entity1_109), recipient(asked_103, entity1_104), next_event(asked_103, forgotten_110), with entity1_109 and entity1_104 both instance_of entity1 and forgotten_110 instance_of forget.]

eventPronounRelation(background,asked_103,recipient)

Logical Reasoning Engine: Type1: Direct Causal Events


Step 4: Extract the event and relation from the Background graph; this is helpful in getting the final answer.

eventPronounRelation(background,D,S1) :- matchingEvents(A,B,C,D),
                                         eventSubgraph(background,C,S,X1),
                                         has(background,D,S1,X2),
                                         has(background,X1,instance_of,X),
                                         has(background,X2,instance_of,X).


Logical Reasoning Engine: Type1: Direct Causal Events


Logical Reasoning Engine: Type1: Direct Causal Events

hasCoreferent(she_11,mary_3)

Winograd


Step 5: Extract the coreferent of the pronoun to be resolved from the Winograd sentence graph.

hasCoreferent(P,X) :- eventPronounRelation(background,C,S),
                      matchingEvents(A,B,C,D),
                      has(winograd,A,S,X),
                      toBeResolved(P), P!=X.


Logical Reasoning Engine: Type1: Direct Causal Events


The ASP implementation is similar to the Type1 implementation.

Some additional type-specific rules are used along with the general rules.


Logical Reasoning Engine: Type2: Causal Attributive


System Evaluation & Error Analysis


System Evaluation

- The WSC contains 282 sentences in total; the Causal category has more than 200
- The Causal sub-categories, Type1 and Type2, combined have 100 sentences

Results:

Total Number of Sentences Evaluated: 100
Answered: 80
Background Knowledge Not Found: 20
Answered Correctly: 70
Answered Incorrectly: 10
Percentage Correct: 87.5


Error Analysis

20 out of 100 not answered: suitable background knowledge was not found

Mark ceded the presidency to John because he was less popular


Error Analysis

10 out of 80 incorrectly answered: deeper analysis of the background knowledge is required

Winograd Sentence: Bob paid for Charlie’s college education, he is very grateful
Background Sentence: I paid the price for my stupidity. How grateful I am.


Contributions and Future Works


Contributions

Implemented a system to solve the Winograd Schema Challenge by using Background Knowledge

Implemented an approach to automatically extract commonsense knowledge

Co-Implemented a new semantic representation system (available at www.kparser.org)


Future Works

Solving other WSC categories

Participate in NUANCE’s competition

Creating a commonsense Knowledge Base

Solve Reading Comprehension and other problems


THANK YOU!!!
