A Brief Introduction to Semantics
CMSC 473/673
UMBC
Outline
- Recap: dependency grammars and arc-standard dependency parsing
- Meaning from Syntax
- Structured Meaning: Semantic Frames and Roles
  - What problem do they solve?
  - Theory
  - Computational resources: FrameNet, VerbNet, PropBank
  - Computational task: Semantic Role Labeling
- Selectional Restrictions
  - What problem do they solve?
  - Computational resources: WordNet
  - Some simple approaches
Labeled Dependencies
Word-to-word labeled relations. Example: "Chris ate", with an nsubj arc from the governor (head) "ate" to the dependent "Chris"
Constituency trees/analyses (PCFGs): based on hierarchical structure
Dependency analyses: based on word relations
(Labeled) Dependency Parse
Directed graphs:
- Vertices: linguistic blobs in a sentence
- Edges: (labeled) arcs
Often directed trees:
1. A single root node with no incoming arcs
2. Each vertex except the root has exactly one incoming arc
3. A unique path from the root node to each vertex
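These three conditions can be checked mechanically. Below is a minimal illustrative sketch (not from the slides), encoding a parse as a list `heads` where `heads[i]` is the index of token i's head and `-1` marks the root:

```python
def is_dependency_tree(heads):
    """Check the directed-tree conditions for a dependency parse.

    heads[i] = index of the head of token i, or -1 if token i is the root.
    Condition 2 (exactly one incoming arc per non-root vertex) holds by
    construction of the heads list.
    """
    n = len(heads)
    # Condition 1: exactly one root.
    if sum(1 for h in heads if h == -1) != 1:
        return False
    # Condition 3: every vertex must reach the root, making the
    # root-to-vertex path exist and be unique; a cycle would make
    # this upward walk never terminate.
    for i in range(n):
        j, steps = i, 0
        while heads[j] != -1:
            j = heads[j]
            steps += 1
            if steps > n:  # must be a cycle
                return False
    return True
```

For "Chris ate" with root "ate", `is_dependency_tree([1, -1])` holds, while a two-token cycle like `[1, 0]` is rejected.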
Are CFGs for Naught?
Nope! Simple algorithm from Xia and Palmer (2001):
1. Mark the head child of each node in a phrase structure, using "appropriate" head rules.
2. In the dependency structure, make the head of each non-head child depend on the head of the head-child.
[Figure: phrase-structure tree for "Papa ate the caviar with a spoon" (POS row: NP V D N P D N; phrases NP, PP, VP, S), with each node annotated by its lexical head ("ate", "caviar", "spoon"), yielding a dependency structure rooted at "ate".]
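The two steps above can be sketched directly. This is a toy illustration: `HEAD_RULES` is a tiny hand-written stand-in for real head-rule tables, and trees are nested `(label, children)` tuples with `(pos, word)` leaves:

```python
# Toy head rules: for each phrase label, which child label supplies the head.
HEAD_RULES = {"S": "VP", "VP": "V", "NP": "N", "PP": "P"}

def to_dependencies(tree, deps):
    """Return the lexical head of `tree`; append (head, dependent) arcs.

    tree is (label, [children]) for phrases or (pos, word) for leaves.
    """
    label, rest = tree
    if isinstance(rest, str):  # leaf: its own word is its head
        return rest
    # Step 1: mark the head child using the head rules.
    head_child = next(c for c in rest if c[0] == HEAD_RULES[label])
    head = to_dependencies(head_child, deps)
    # Step 2: the head of each non-head child depends on the head
    # of the head child.
    for child in rest:
        if child is not head_child:
            deps.append((head, to_dependencies(child, deps)))
    return head
```

For "Papa ate the caviar", this yields the arcs ate→Papa, ate→caviar, and caviar→the.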
Shift-Reduce Dependency Parsing
Tools: input words, a special root symbol ($), and a stack to hold configurations
- Shift: move tokens onto the stack
- Reduce: decide whether the top two elements of the stack form a valid (good) grammatical dependency; if there is a valid relation, place the head on the stack
Decide how? Search problem!
What is valid? Learn it!
What are the possible actions?
Arc Standard Parsing

state ← {[root], [words], [] }
while state ≠ {[root], [], [(deps)]} {
    t ← ORACLE(state)
    state ← APPLY(t, state)
}
return state
| Action | Possibility | Meaning |
|---|---|---|
| LEFTARC | Assign the current word as the head of some previously seen word | Assert a head-dependent relation between the word at the top of the stack and the word directly beneath it; remove the lower word from the stack |
| RIGHTARC | Assign some previously seen word as the head of the current word | Assert a head-dependent relation between the second word on the stack and the word at the top; remove the word at the top of the stack |
| SHIFT | Defer processing the current word; add it for later | Remove the word from the front of the input buffer and push it onto the stack |
Arc Standard Parsing

state ← {[root], [words], [] }
while state ≠ {[root], [], [(deps)]} {
    t ← ORACLE(state)
    state ← APPLY(t, state)
}
return state
Q: What is the time complexity?
A: Linear
Q: What’s potentially problematic?
A: This is a greedy algorithm
Learning An Oracle (Predictor)
Training data: dependency treebank
Input: configuration
Output: {LEFTARC, RIGHTARC, SHIFT}

t ← ORACLE(state)

- Choose LEFTARC if it produces a correct head-dependent relation given the reference parse and the current configuration
- Choose RIGHTARC if:
  - it produces a correct head-dependent relation given the reference parse, and
  - all of the dependents of the word at the top of the stack have already been assigned
- Otherwise, choose SHIFT
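Putting the transition system and these oracle rules together, a toy arc-standard parser can be sketched as below. The index conventions and helper names are our own, and it assumes a projective reference parse, with `gold_head[i]` giving the head of token `i` and 0 standing for the root:

```python
def oracle(stack, buffer, deps, gold_head):
    """Training-time oracle implementing the three rules above."""
    if len(stack) >= 2:
        top, below = stack[-1], stack[-2]
        # LEFTARC: top is the head of the word beneath it (never remove root).
        if below != 0 and gold_head[below] == top:
            return "LEFTARC"
        # RIGHTARC: the word beneath is the head of top, and every reference
        # dependent of top has already been attached.
        if gold_head[top] == below and all(
            (top, d) in deps
            for d in range(1, len(gold_head)) if gold_head[d] == top
        ):
            return "RIGHTARC"
    return "SHIFT"

def parse(gold_head):
    """Run arc-standard parsing with the oracle; return the (head, dep) arcs."""
    stack, buffer, deps = [0], list(range(1, len(gold_head))), set()
    while buffer or len(stack) > 1:
        t = oracle(stack, buffer, deps, gold_head)
        if t == "LEFTARC":
            deps.add((stack[-1], stack[-2])); del stack[-2]
        elif t == "RIGHTARC":
            deps.add((stack[-2], stack[-1])); stack.pop()
        else:
            stack.append(buffer.pop(0))
    return deps
```

For "Chris ate" (token 1 = Chris, token 2 = ate, ate headed by root), `parse([None, 2, 0])` yields the arcs {(2, 1), (0, 2)}.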
Training the Predictor
Predict action t given configuration s: t = φ(s)
Extract features of the configuration. Examples: word forms, lemmas, POS, morphological features
How? Perceptron, maxent, support vector machines, multilayer perceptrons, neural networks
Take CMSC 478 (678) to learn more about these
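As a toy illustration of training such a predictor, here is a bare-bones multiclass perceptron over a few invented configuration features (real systems use far richer feature sets):

```python
def features(stack, buffer):
    """Invented example features: top two stack items and next buffer item."""
    return {f"s1={stack[-1] if stack else 'NONE'}",
            f"s2={stack[-2] if len(stack) > 1 else 'NONE'}",
            f"b1={buffer[0] if buffer else 'NONE'}"}

ACTIONS = ["LEFTARC", "RIGHTARC", "SHIFT"]

def predict(weights, feats):
    # Score each action by summing its feature weights; pick the best.
    return max(ACTIONS, key=lambda a: sum(weights.get((a, f), 0.0)
                                          for f in feats))

def update(weights, feats, gold, guess, lr=1.0):
    # Standard perceptron update: reward the gold action, punish the guess.
    if gold != guess:
        for f in feats:
            weights[(gold, f)] = weights.get((gold, f), 0.0) + lr
            weights[(guess, f)] = weights.get((guess, f), 0.0) - lr
```

A single mistake-driven update is enough to flip the prediction on the same configuration, which is the core of the training loop.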
Semantics
Represent the “meaning” of an utterance
Papa ate the caviar with a spoon.
What does this mean?
Some Approaches for Representing Meaning
1. Extract it directly from syntax ➔ Open Information Extraction
2. Add interpretation rules to syntax, and extract meaning from them ➔ logical form parsing; CCG parsing
3. Create new tree-/graph-like semantic parses ➔ semantic role labeling; {FrameNet, PropBank, VerbNet} parsing
4. Develop/obtain lexical resources and use them to represent semantic features of things ➔ leverage WordNet; selectional preferences
Outline
- Recap: dependency grammars and arc-standard dependency parsing
- Meaning from Syntax
- Structured Meaning: Semantic Frames and Roles
  - What problem do they solve?
  - Theory
  - Computational resources: FrameNet, VerbNet, PropBank
  - Computational task: Semantic Role Labeling
- Selectional Restrictions
  - What problem do they solve?
  - Computational resources: WordNet
  - Some simple approaches
From Dependencies to Shallow Semantics
Core idea: a syntactic parse already encodes some amount of meaning
"Papa" is the subject; "the caviar" is the object
…
From Syntax to Shallow Semantics
Angeli et al. (2015)
“Open Information Extraction”
From Syntax to Shallow Semantics
http://corenlp.run/ (constituency & dependency)
https://github.com/hltcoe/predpatt
http://openie.allenai.org/
http://www.cs.rochester.edu/research/knext/browse/ (constituency trees)
http://rtw.ml.cmu.edu/rtw/
Angeli et al. (2015)
“Open Information Extraction”
a sampling of efforts
Logical Forms of Sentences
Core idea: find a (first order) logical form that corresponds to the sentence and evaluates to TRUE
"Papa ate the caviar"
This means assigning/learning a (partial) logical form for each word
∃e. Eating(e) ∧ Agent(e, Papa) ∧ Theme(e, caviar)
(Or instantiated….)
Get Logical Forms from Parses
[Figure, built up over four slides: the constituency tree for "Papa ate the caviar" (tags NP V D N; phrases NP, VP, S), with each node annotated by its lexical head "ate"; composing the logical form of "ate" up the tree yields:]
∃e. Eating(e) ∧ Agent(e, Papa) ∧ Theme(e, caviar)
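One way to picture how the logical form is assembled up the tree (a toy lambda encoding of our own, not a real logical-form parser):

```python
# Each word contributes a (partial) logical form; composition follows the
# parse: V combines with the object NP to form the VP, then with the subject.
ate = lambda subj, obj: f"∃e. Eating(e) ∧ Agent(e, {subj}) ∧ Theme(e, {obj})"
vp = lambda subj: ate(subj, "caviar")  # VP = V applied to the object NP
s = vp("Papa")                         # S = VP applied to the subject NP
```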
Outline
- Recap: dependency grammars and arc-standard dependency parsing
- Meaning from Syntax
- Structured Meaning: Semantic Frames and Roles
  - What problem do they solve?
  - Theory
  - Computational resources: FrameNet, VerbNet, PropBank
  - Computational task: Semantic Role Labeling
- Selectional Restrictions
  - What problem do they solve?
  - Computational resources: WordNet
  - Some simple approaches
Semantic Roles: Applications
Question & answer systems: Who did what to whom, where?
The police officer detained the suspect at the scene of the crime
ARG0/Agent: the police officer; V/Predicate: detained; ARG2/Theme: the suspect; AM-LOC/Location: at the scene of the crime
Following slides adapted from SLP3
Predicate Alternations
XYZ corporation bought the stock.
They sold the stock to XYZ corporation.
The stock was bought by XYZ corporation.
The purchase of the stock by XYZ corporation...
The stock purchase by XYZ corporation...
A Shallow Semantic Representation: Semantic Roles
Predicates (bought, sold, purchase) represent a situation
Semantic (thematic) roles express the abstract role that arguments of a predicate can take in the event
Different schemes/annotation styles have different specificities, from more specific to more general: buyer, agent, proto-agent. These terms are labels different annotation schemes might use.
Thematic roles
Sasha broke the window
Pat opened the door
Subjects of break and open: Breaker and Opener, specific to each event
Breaker and Opener have something in common!
- Volitional actors
- Often animate
- Direct causal responsibility for their events
Thematic roles are a way to capture this semantic commonality between Breakers and Openers.
They are both AGENTS.
The BrokenThing and OpenedThing are THEMES: prototypically inanimate objects affected in some way by the action
Modern formulation from Fillmore (1966, 1968) and Gruber (1965)
Fillmore was influenced by Lucien Tesnière's (1959) Éléments de Syntaxe Structurale, the book that introduced dependency grammar
“Standard” Thematic Roles
Thematic Roles Help Capture Verb Alternations (Diathesis Alternations)
Levin (1993): 47 semantic classes ("Levin classes") for 3100 English verbs and their alternations, collected in the online resource VerbNet
Break: AGENT, INSTRUMENT, or THEME as subject
Give: THEME and GOAL in either order
Issues with Thematic Roles
Hard to create (define) a standard set of roles
Role fragmentation. For example, Levin and Rappaport Hovav (2005) distinguish two kinds of INSTRUMENTS:
- intermediary instruments that can appear as subjects:
  The cook opened the jar with the new gadget.
  The new gadget opened the jar.
- enabling instruments that cannot:
  Shelly ate the sliced banana with a fork.
  *The fork ate the sliced banana.
Alternatives to Thematic Roles
1. Fewer roles: generalized semantic roles, defined as prototypes (Dowty, 1991)
PROTO-AGENT
PROTO-PATIENT
2. More roles: Define roles specific to a group of predicates
FrameNet
PropBank
PropBank Frame Files
Palmer, Martha, Daniel Gildea, and Paul Kingsbury. 2005. The Proposition Bank: An Annotated Corpus of Semantic Roles. Computational Linguistics, 31(1):71–106
View Commonalities Across Sentences
Human Annotated PropBank Data
Penn English TreeBank, OntoNotes 5.0: total ~2 million words
Penn Chinese TreeBank; Hindi/Urdu PropBank; Arabic PropBank

2013 Verb Frames Coverage by Language: current count of senses (lexical units)

| Language | Final Count | Estimated Coverage in Running Text |
|---|---|---|
| English | 10,615* | 99% |
| Chinese | 24,642 | 98% |
| Arabic | 7,015 | 99% |

*Only 111 English adjectives
From Martha Palmer 2013 Tutorial
FrameNet
Roles in PropBank are specific to a verb
Roles in FrameNet are specific to a frame: a background knowledge structure that defines a set of frame-specific semantic roles, called frame elements
Frames can be related (inheritance, demonstrating alternations, etc.)
Each frame can be triggered by different "lexical units"
See: Baker et al. 1998, Fillmore et al. 2003, Fillmore and Baker 2009, Ruppenhofer et al. 2006
Example: The "Change position on a scale" Frame
This frame consists of words that indicate the change of an ITEM's position on a scale (the ATTRIBUTE) from a starting point (INITIAL VALUE) to an end point (FINAL VALUE)
Lexical Triggers: Vocabulary Items that Instantiate a Frame
The “Change position on a scale” Frame
Frame Roles (Elements)
The “Change position on a scale” Frame
FrameNet and PropBank representations
PropBank annotations are layered on CFG parses
FrameNet annotations can be layered on either CFG or dependency parses
Automatic Semantic Parses

| | English Gigaword, v5 | Annotated NYT | English Wikipedia | Total |
|---|---|---|---|---|
| Documents | 8.74M | 1.81M | 5.06M | 15.61M |
| Sentences | 170M | 70M | 154M | 422M |
| Tokens | 4.3B | 1.4B | 2.3B | 8B |
| Vocabulary (≥ 100) | 225K | 120K | 264K | 91K |
| Semantic Frames | 2.6B | 780M | 1.1B | 4.4B |

2x FrameNet, 1x PropBank
Ferraro et al. (2014)
https://goo.gl/BrsG4x (or Globus; talk to me)
Outline
- Recap: dependency grammars and arc-standard dependency parsing
- Meaning from Syntax
- Structured Meaning: Semantic Frames and Roles
  - What problem do they solve?
  - Theory
  - Computational resources: FrameNet, VerbNet, PropBank
  - Computational task: Semantic Role Labeling
- Selectional Restrictions
  - What problem do they solve?
  - Computational resources: WordNet
  - Some simple approaches
Semantic Role Labeling (SRL)
Find the semantic roles of each argument of each predicate in a sentence.
Why Semantic Role Labeling
A useful shallow semantic representation
Improves NLP tasks:
question answering (Shen and Lapata 2007, Surdeanu et al. 2011)
machine translation (Liu and Gildea 2010, Lo et al. 2013)
A Simple Parse-Based Algorithm
Input: sentence
Output: labeled tree

parse = GETPARSE(sentence)
for each predicate in parse {
    for each node in parse {
        fv = EXTRACTFEATURES(node, predicate, parse)
        CLASSIFYNODE(node, fv, parse)
    }
}
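A minimal runnable rendering of this loop, where the parser, feature extractor, and classifier are toy stand-ins rather than real SRL components:

```python
def get_parse(sentence):
    # Stand-in for a real parser: one predicate and a few constituents.
    return {"predicates": ["detained"],
            "nodes": ["the police officer", "the suspect",
                      "at the scene of the crime"]}

def extract_features(node, predicate, parse):
    return {"first_word": node.split()[0],
            "length": len(node.split()),
            "predicate": predicate}

def classify_node(node, fv, parse):
    # Toy rule standing in for a trained classifier.
    return "AM-LOC" if fv["first_word"] == "at" else "ARG"

def label_arguments(sentence):
    parse = get_parse(sentence)
    labels = {}
    for predicate in parse["predicates"]:
        for node in parse["nodes"]:
            fv = extract_features(node, predicate, parse)
            labels[node] = classify_node(node, fv, parse)
    return labels
```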
Simple Predicate Prediction
PropBank: choose all verbs
FrameNet: choose every word that was labeled as a target in training data
SRL Features

| Feature | Example value |
|---|---|
| Headword of constituent | Examiner |
| Headword POS | NNP |
| Voice of the clause | Active |
| Subcategorization of predicate | VP → VBD NP PP |
| Named entity type of constituent | ORGANIZATION |
| First and last words of constituent | The, Examiner |
| Linear position relative to predicate | before |
| Path features | (next slide) |
Path Features
Path in the parse tree from the constituent to the predicate
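A small sketch of computing this feature (our own encoding: nodes are ids with `parent` and `label` maps; the path climbs to the lowest common ancestor, then descends to the predicate):

```python
def path_feature(node, pred, parent, label):
    """Path from `node` up to the lowest common ancestor with `pred`,
    then down to `pred`, e.g. "NP↑S↓VP↓VBD"."""
    def ancestors(n):
        chain = [n]
        while parent.get(n) is not None:
            n = parent[n]
            chain.append(n)
        return chain
    up = ancestors(node)       # node ... root
    down = ancestors(pred)     # pred ... root
    common = next(n for n in up if n in down)
    up_part = up[:up.index(common) + 1]          # node ... common ancestor
    down_part = down[:down.index(common)][::-1]  # below common ... pred
    path = "↑".join(label[n] for n in up_part)
    if down_part:
        path += "↓" + "↓".join(label[n] for n in down_part)
    return path
```

For a tree S → (NP, VP), VP → VBD, the path from the NP to the verb is "NP↑S↓VP↓VBD", as in the frequent-path examples on the next slide.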
Frequent Path Features
Palmer, Gildea, Xue (2010)
3-step SRL
1. Pruning: use simple heuristics to prune unlikely constituents.
2. Identification: a binary classification of each node as an argument to be labeled or NONE.
3. Classification: a 1-of-N classification of all the constituents that were labeled as arguments by the previous stage.

Pruning & identification: prune the very unlikely constituents first, then use a classifier to get rid of the rest. Very few of the nodes in the tree could possibly be arguments of that one predicate, so there is an imbalance between positive samples (constituents that are arguments of the predicate) and negative samples (constituents that are not).
Features for Frame Identification
Das et al. (2014)
Joint-Inference SRL: Reranking
Stage 1: SRL system produces multiple possible labels for each constituent
Stage 2: Find the best global label for all constituents
Joint-Inference SRL: Factor Graph
Make a large, probabilistic factor graph
Run (loopy) belief propagation
Take CMSC 678/691 to learn more
Joint-Inference SRL: Neural/Deep SRL
Make a large (deep) neural network
Run back propagation
Take CMSC 678/691 to learn more
PropBank: Not Just English
Not Just Verbs: NomBank
Meyers et al. 2004
Figure from Jiang and Ng 2006
Additional Issues for Nouns
Features: nominalization lexicon (employment → employ), morphological stem
Different positions: most arguments of nominal predicates occur inside the NP; others are introduced by support verbs, especially light verbs ("X made an argument", "Y took a nap")
Outline
- Recap: dependency grammars and arc-standard dependency parsing
- Meaning from Syntax
- Structured Meaning: Semantic Frames and Roles
  - What problem do they solve?
  - Theory
  - Computational resources: FrameNet, VerbNet, PropBank
  - Computational task: Semantic Role Labeling
- Selectional Restrictions
  - What problem do they solve?
  - Computational resources: WordNet
  - Some simple approaches
Selectional Restrictions
I want to eat someplace nearby.
(a) and (b): two candidate syntactic analyses of the sentence (shown as parse trees on the slides); under (b), "someplace nearby" is the thing being eaten
How do we know the speaker didn't mean (b)?
The THEME of eating tends to be something edible
Selectional Restrictions and Word Senses
The restaurant serves green-lipped mussels. (THEME is some kind of food)
Which airlines serve Denver? (THEME is an appropriate location)
One Way to Represent Selectional Restrictions
…but do we have a large knowledge base of facts about edible things?!
(do we know a hamburger is edible? sort of)
WordNet
Knowledge graph containing concept relations; e.g., hamburger, hero, and gyro are all kinds of sandwich
- hypernym: specific to general (a hamburger is-a sandwich)
- hyponym: general to specific
Other relationships too:
- meronymy, holonymy (part of whole, whole of part)
- troponymy (describing the manner of an event)
- entailment (what else must happen in an event)
WordNet Knows About Hamburgers
hamburger → sandwich → snack food → dish → nutriment → food → substance → matter → physical entity → entity
WordNet Synsets for Selectional Restrictions
"The THEME of eat must be WordNet synset {food, nutrient}"
Similarly:
- THEME of imagine: synset {entity}
- THEME of lift: synset {physical entity}
- THEME of diagonalize: synset {matrix}
Allows: imagine a hamburger, lift a hamburger
Correctly rules out: *diagonalize a hamburger
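This check is just a walk up the hypernym chain. A toy sketch using a hand-built is-a table (an illustrative stand-in for WordNet, following the hamburger chain above):

```python
# Toy is-a hierarchy: each concept maps to its immediate hypernym.
IS_A = {
    "hamburger": "sandwich", "sandwich": "snack food", "snack food": "dish",
    "dish": "nutriment", "nutriment": "food", "food": "substance",
    "substance": "matter", "matter": "physical entity",
    "physical entity": "entity",
}

# Required THEME synset for a few predicates, as on the slide.
THEME_RESTRICTION = {"eat": "food", "imagine": "entity",
                     "lift": "physical entity", "diagonalize": "matrix"}

def satisfies(predicate, noun):
    """Does `noun` (or any of its hypernyms) satisfy the predicate's
    THEME restriction?"""
    required = THEME_RESTRICTION[predicate]
    concept = noun
    while concept is not None:
        if concept == required:
            return True
        concept = IS_A.get(concept)  # climb one hypernym step
    return False
```

Since "food" is on hamburger's hypernym chain but "matrix" is not, this allows "eat a hamburger" and rules out "diagonalize a hamburger".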
Selectional Preferences
Initially: strict constraints (Katz and Fodor 1963)
Eat [+FOOD]
which turned into preferences (Wilks 1975)
“But it fell apart in 1931, perhaps because people realized you can’t eat gold for lunch if you’re hungry.”
Computing Selectional Association (Resnik 1993)
A probabilistic measure of the strength of association between a predicate and a semantic class of its argument:
- Parse a corpus
- Count all the times each predicate appears with each argument word
- Assume each word is a partial observation of all the WordNet concepts associated with that word
Some high and low associations are shown on the slide.
A Simpler Model of Selectional Association (Brockmann and Lapata, 2003)
Model just the association of predicate v with a single noun n:
- Parse a huge corpus
- Count how often a noun n occurs in relation r with verb v: log count(n, v, r) (or the probability)
See: Bergsma, Lin, Goebel (2008) for evaluation/comparison
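A minimal sketch of this count-based score over hypothetical parsed (noun, verb, relation) triples (the triples and relation names below are invented for illustration):

```python
import math
from collections import Counter

# Hypothetical parsed (noun, verb, relation) triples from a corpus.
triples = [("gold", "eat", "dobj"), ("lunch", "eat", "dobj"),
           ("lunch", "eat", "dobj"), ("lunch", "eat", "dobj")]

counts = Counter(triples)
total = sum(counts.values())

def association(n, v, r):
    """log count(n, v, r); unseen triples get negative infinity."""
    c = counts[(n, v, r)]
    return math.log(c) if c else float("-inf")

def probability(n, v, r):
    """Or the relative-frequency probability instead of the log count."""
    return counts[(n, v, r)] / total
```

On these made-up counts, "eat lunch" scores higher than "eat gold", matching the preference (rather than hard restriction) view.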
Outline
- Recap: dependency grammars and arc-standard dependency parsing
- Meaning from Syntax
- Structured Meaning: Semantic Frames and Roles
  - What problem do they solve?
  - Theory
  - Computational resources: FrameNet, VerbNet, PropBank
  - Computational task: Semantic Role Labeling
- Selectional Restrictions
  - What problem do they solve?
  - Computational resources: WordNet
  - Some simple approaches