Commonsense Reasoning in and over Natural Language
Hugo Liu, Push Singh
Media Laboratory of MIT
The 8th International Conference on Knowledge-Based Intelligent Information & Engineering Systems (KES'04)
Abstract
ConceptNet is a very large semantic network of commonsense knowledge suitable for making various kinds of practical inferences over text.
To meet the dual challenge of encoding complex higher-order concepts while maintaining ease of use, we introduce a novel use of semi-structured natural language fragments as the knowledge representation of commonsense concepts.
Introduction
What is ConceptNet?
- The largest freely available, machine-usable commonsense resource.
- Structured as a network of semi-structured natural language fragments.
- Consists of over 250,000 elements of commonsense knowledge.
- Inspired dually by the range of commonsense concepts and relations in Cyc, and by the ease of use of WordNet.
Introduction
The authors' work on ConceptNet:
1. Extending WordNet's lexical notion of nodes to a conceptual notion of nodes: semi-structured natural language fragments conforming to an ontology of allowable syntactic patterns.
2. Extending WordNet's small ontology of semantic relations to include a richer set of relations appropriate to concept-level nodes.
3. Supplementing the ConceptNet semantic network with a methodology for reasoning.
4. Supplementing the ConceptNet semantic network with a toolkit and API that support making practical commonsense inferences about text.
Fig 1. An excerpt from ConceptNet's semantic network of commonsense knowledge
Origin of ConceptNet
ConceptNet is mined out of the Open Mind Commonsense (OMCS) corpus:
- A collection of nearly 700,000 semi-structured English sentences of commonsense facts.
- An automatic process applies a set of 'commonsense extraction rules'.
- A pattern-matching parser uses roughly 40 mapping rules.
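The extraction step can be pictured as pattern-matching rules that map OMCS sentences to relational triples. This is only an illustrative sketch: the two rule patterns and the `extract` helper below are hypothetical stand-ins for the roughly 40 actual mapping rules.

```python
import re

# Hypothetical OMCS-style extraction rules; the real parser's ~40 mapping
# rules and their exact patterns may differ from these illustrations.
EXTRACTION_RULES = [
    # "a knife is used for cutting bread" -> ('UsedFor', 'knife', 'cutting bread')
    (re.compile(r"^(?:an? )?(.+?) is used (?:for|to) (.+)$"), "UsedFor"),
    # "an apple is a fruit" -> ('IsA', 'apple', 'fruit')
    (re.compile(r"^(?:an? )?(.+?) is (?:an? )?(.+)$"), "IsA"),
]

def extract(sentence):
    """Apply the first matching rule and emit a (relation, arg1, arg2) triple."""
    s = sentence.lower().strip(". ")
    for pattern, relation in EXTRACTION_RULES:
        m = pattern.match(s)
        if m:
            return (relation, m.group(1), m.group(2))
    return None  # sentence matched no extraction rule

print(extract("A knife is used for cutting bread."))
print(extract("An apple is a fruit."))
```

Note the rule ordering matters: the more specific "is used for" pattern must be tried before the generic "is a" pattern.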
Structure of ConceptNet
ConceptNet nodes are fragments semi-structured to conform to preferred syntactic patterns, in 3 general classes:
- Noun Phrases: things, places, people
- Attributes: modifiers
- Activity Phrases: actions and actions compounded with an NP or PP
ConceptNet edges are described by an ontology of 19 binary relations. The syntactic and/or semantic types of the arguments are not formally constrained.
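A minimal sketch of this node/edge structure: natural-language fragments as nodes, labeled weighted edges between them. The class and method names are illustrative assumptions, not the actual toolkit API.

```python
from collections import defaultdict

class ConceptNet:
    """Toy semantic network: nodes are natural-language fragments,
    edges carry one of the binary relation labels plus a weight."""

    def __init__(self):
        # node -> list of (relation, target_node, weight)
        self.edges = defaultdict(list)

    def add_edge(self, relation, source, target, weight=1.0):
        self.edges[source].append((relation, target, weight))

    def neighbors(self, node):
        return self.edges[node]

kb = ConceptNet()
kb.add_edge("UsedFor", "knife", "cut bread")   # UsedFor and IsA are two of
kb.add_edge("IsA", "knife", "tool")            # the relation labels
print(kb.neighbors("knife"))
```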
Table 2. Semantic relation types currently in ConceptNet
Methodology for Reasoning over Natural Language Concepts
- Computing Conceptual Similarity
- Flexible Inference
  - Context finding
  - Inference chaining
  - Conceptual analogy
Computing Conceptual Similarity (1/3)
1. The concept is decomposed into first-order atomic concepts to compute its meaning.
   Ex: "buy good cheese" → "buy", "good", "cheese"
2. Each atom is situated within the conceptual frameworks of several resources:
   - WordNet
   - Longman's Dictionary of Contemporary English (LDOCE)
   - Beth Levin's English Verb Classes
   - FrameNet
Computing Conceptual Similarity (2/3)
3. Within each resource, a similarity score is produced for each pair of corresponding atoms (verb matched with verb, etc.).
   - The similarity score is inversely proportional to inference distance in the inheritance structures of WordNet, LDOCE, and FrameNet.
   - In Levin's Verb Classes, the similarity score is proportional to the percentage of alternation classes shared.
4. A weighted sum of the similarity scores is produced for each atom across the resources.
   - The weight on each resource is proportional to the predictive accuracy of that resource.
5. The weight on an atom is proportional to the relative importance of its atom type.
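Steps 3-5 above amount to a doubly weighted sum: per-resource scores weighted by resource accuracy, then per-atom scores weighted by atom type. In this sketch all resource weights, atom-type weights, and similarity scores are made-up placeholders, not values from the paper.

```python
# Placeholder weights: per-resource (predictive accuracy) and per-atom-type
# (relative importance). The real values would be tuned empirically.
RESOURCE_WEIGHTS = {"wordnet": 0.4, "ldoce": 0.2, "levin": 0.2, "framenet": 0.2}
ATOM_TYPE_WEIGHTS = {"verb": 0.5, "noun": 0.35, "modifier": 0.15}

def atom_similarity(scores_by_resource):
    """Step 4: weighted sum of one atom pair's per-resource scores."""
    return sum(RESOURCE_WEIGHTS[r] * s for r, s in scores_by_resource.items())

def concept_similarity(atom_pairs):
    """Step 5: atom_pairs is a list of (atom_type, scores_by_resource)
    for the aligned atoms of the two concepts being compared."""
    return sum(ATOM_TYPE_WEIGHTS[t] * atom_similarity(s) for t, s in atom_pairs)

# e.g. "buy good cheese" vs a similar concept: verb/modifier/noun atoms align.
pairs = [
    ("verb",     {"wordnet": 0.9, "ldoce": 0.8, "levin": 0.95, "framenet": 0.9}),
    ("modifier", {"wordnet": 0.7, "ldoce": 0.6, "levin": 0.0,  "framenet": 0.5}),
    ("noun",     {"wordnet": 0.8, "ldoce": 0.7, "levin": 0.0,  "framenet": 0.6}),
]
print(round(concept_similarity(pairs), 3))
```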
Computing Conceptual Similarity (3/3)
Computing conceptual similarity using lexical inferential distance is very difficult, so we can only make heuristic approximations.
Table 3. Some pairwise similarities in ConceptNet
Flexible Inference
One of the strengths of representing concepts in natural language is the ability to add flexibility and fuzziness to improve inference.
Inferences in semantic networks are based on graph reasoning methods like spreading activation, structure mapping, and network traversal.
Basic spreading activation:
activation_score(B) = activation_score(A) * weight(edge(A, B))
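The rule above can be sketched as a simple graph walk that propagates scores outward from a source node; the toy graph, its weights, and the `max_depth` cutoff here are illustrative assumptions.

```python
def spread_activation(graph, source, max_depth=2):
    """graph: node -> list of (neighbor, edge_weight).
    Returns node -> activation score, with
    activation_score(B) = activation_score(A) * weight(edge(A, B))."""
    scores = {source: 1.0}
    frontier = [source]
    for _ in range(max_depth):
        next_frontier = []
        for a in frontier:
            for b, w in graph.get(a, []):
                score = scores[a] * w
                if score > scores.get(b, 0.0):  # keep the strongest activation
                    scores[b] = score
                    next_frontier.append(b)
        frontier = next_frontier
    return scores

graph = {
    "buy food": [("have food", 0.9), ("grocery store", 0.8)],
    "have food": [("eat food", 0.9)],
}
print(spread_activation(graph, "buy food"))
```

Activation decays multiplicatively with distance, so nodes two hops away (like "eat food") score lower than direct neighbors.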
Flexible Inference – Context Finding
Determining the context around a concept, or around the intersection of several concepts, is useful.
The contextual neighborhood around a node is found by performing spreading activation from that source node.
- Pairwise similarity of nodes leads to a more accurate estimation of the contextual neighborhood.
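One way to sketch context finding: spread activation from the source until scores fall below a cutoff, then return the surviving nodes, ranked, as the contextual neighborhood. The toy graph and the 0.1 threshold are illustrative assumptions.

```python
def contextual_neighborhood(graph, source, threshold=0.1):
    """graph: node -> list of (neighbor, edge_weight).
    Spread activation from source; keep nodes whose score stays above
    the threshold, ranked by activation."""
    scores = {source: 1.0}
    frontier = [source]
    while frontier:
        next_frontier = []
        for a in frontier:
            for b, w in graph.get(a, []):
                s = scores[a] * w
                if s > threshold and s > scores.get(b, 0.0):
                    scores[b] = s
                    next_frontier.append(b)
        frontier = next_frontier
    del scores[source]  # the source itself is not part of its neighborhood
    return sorted(scores.items(), key=lambda kv: -kv[1])

graph = {
    "wedding": [("bride", 0.9), ("cake", 0.6), ("love", 0.5)],
    "bride": [("dress", 0.7)],
    "cake": [("eat cake", 0.4)],
}
print(contextual_neighborhood(graph, "wedding"))
```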
Flexible Inference – Inference Chaining
An inference chain is a basic type of inference on a graph: traversing the graph from one node to another via some path.
A temporal chain between "buy food" and "fall asleep":
"buy food" → "have food" → "eat food" → "feel full" → "feel sleepy" → "fall asleep"
Pairwise conceptual similarity is particularly crucial to the robustness of inference chaining.
Ex: "buy steak" instead of "buy food"
(Liu, 2003) used inference chaining for affective text classification.
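Inference chaining can be sketched as a breadth-first path search between two concept nodes. The toy graph below mirrors the temporal chain from the slide; the relation labels ConceptNet would attach to these edges are omitted here.

```python
from collections import deque

def inference_chain(graph, start, goal):
    """Breadth-first search for a path of concepts from start to goal.
    graph: node -> list of neighbor nodes. Returns the path or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no chain connects the two concepts

graph = {
    "buy food": ["have food"],
    "have food": ["eat food"],
    "eat food": ["feel full"],
    "feel full": ["feel sleepy"],
    "feel sleepy": ["fall asleep"],
}
print(" -> ".join(inference_chain(graph, "buy food", "fall asleep")))
```

Pairwise conceptual similarity would let the search start from "buy steak" and still reach this chain by matching it to the nearby node "buy food".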
Flexible Inference – Conceptual Analogy
Structural analogy is not just a measure of semantic distance.
Ex: "wedding" is much more like "funeral" than "bride"
Structure-mapping methods are employed to generate simple conceptual analogies.
We can emphasize functional similarity versus temporal similarity by biasing the weights of particular semantic relations.
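A simple structure-mapping sketch: score two concepts by the (relation, target) structure they share, with per-relation weights that can be biased toward, say, functional similarity. The facts and weights below are made up for illustration, not ConceptNet data.

```python
# Biasing UsedFor over other relations emphasizes functional similarity.
RELATION_WEIGHTS = {"UsedFor": 2.0, "LocationOf": 1.0, "PropertyOf": 1.0}

def analogy_score(kb, a, b):
    """Sum the relation weights of the (relation, target) pairs that
    concepts a and b have in common."""
    shared = kb.get(a, set()) & kb.get(b, set())
    return sum(RELATION_WEIGHTS.get(rel, 1.0) for rel, _ in shared)

kb = {
    "wedding": {("PropertyOf", "ceremony"), ("LocationOf", "church"),
                ("UsedFor", "gather family")},
    "funeral": {("PropertyOf", "ceremony"), ("LocationOf", "church"),
                ("UsedFor", "gather family")},
    "bride":   {("LocationOf", "church")},
}
print(analogy_score(kb, "wedding", "funeral"))  # shares much structure
print(analogy_score(kb, "wedding", "bride"))    # shares little
```

This captures the slide's point: "wedding" and "funeral" are structurally analogous (shared ceremony/location/function structure) even though "bride" is semantically closer to "wedding".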
Some Applications of ConceptNet
- MAKEBELIEVE: A story generator that allows a person to interactively invent a story with the system.
- GloBuddy: A dynamic foreign-language phrasebook.
- AAA: A profiling and recommendation system that recommends products from Amazon.com by using ConceptNet to reason about a person's goals and desires.
Conclusion
- ConceptNet is presently the largest freely available commonsense resource, with a set of tools to support several kinds of practical inferences over text.
- ConceptNet maintains an easy-to-use knowledge representation while incorporating more complex higher-order commonsense concepts and relations.
- A novel methodology for computing the pairwise similarity of concepts is presented.
- ConceptNet has been widely used in a number of research projects.