Transcript of slides: An Inference-rules based Categorial Grammar Learner (cs.jhu.edu/~xuchen/paper/Yao2009benelearn.slides.pdf)
Introduction Learning by Inference Rules Experiment Conclusion
An Inference-rules based Categorial Grammar Learner for Simulating Language Acquisition
Xuchen Yao, Jianqiang Ma, Sergio Duarte, Çağrı Çöltekin
University of Groningen
18 May 2009
Outline

Introduction
  Combinatory Categorial Grammar
  Language Acquisition
Learning by Inference Rules
  Grammar Induction by Inference Rules
  The Learning Architecture
Experiment
  Learning an Artificial Grammar
  Learning Auxiliary Verb Fronting
  Learning Correct Word Order
Conclusion
Categorial Grammar

• Basic categories: S (sentence), NP (noun phrase), N (noun)
• Complex categories: NP/N, S\NP and (S\NP)\(S\NP)
• Slash operators: / and \

Peter := NP, saw := (S\NP)/NP, a := NP/N, book := N
a book → NP (>); saw [a book] → S\NP (>); Peter [saw a book] → S (<)

Figure: Example derivation for the sentence "Peter saw a book" (shown on the slide next to the phrase-structure tree NP VP → S).
Different Operation Rules
• Function application rules (CG)
  Forward:  A/B B → A   (>)
  Backward: B A\B → A   (<)
• Function composition rules (CCG)
  Forward:  A/B B/C → A/C   (>B)
  Backward: B\C A\B → A\C   (<B)
• Type raising rules (CCG)
  Forward:  A → T/(T\A)   (>T)
  Backward: A → T\(T/A)   (<T)
• Substitution rules (CCG)
  Forward:  (A/B)/C B/C → A/C   (>S)
  Backward: B\C (A\B)\C → A\C   (<S)
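As a minimal sketch (my own encoding, not the authors' implementation), the application and composition rules above can be written over categories represented as nested tuples:

```python
# A minimal sketch of the CG/CCG rules above (not the authors' code).
# Categories are nested tuples: basic ('S',), ('/', A, B) for A/B, ('\\', A, B) for A\B.

S, NP, N = ('S',), ('NP',), ('N',)

def fapp(left, right):
    """Forward application (>): A/B B -> A."""
    if left[0] == '/' and left[2] == right:
        return left[1]

def bapp(left, right):
    """Backward application (<): B A\\B -> A."""
    if right[0] == '\\' and right[2] == left:
        return right[1]

def fcomp(left, right):
    """Forward composition (>B): A/B B/C -> A/C."""
    if left[0] == '/' and right[0] == '/' and left[2] == right[1]:
        return ('/', left[1], right[2])

# "Peter saw a book": Peter := NP, saw := (S\NP)/NP, a := NP/N, book := N
saw, a = ('/', ('\\', S, NP), NP), ('/', NP, N)
obj = fapp(a, N)          # a book        -> NP
vp = fapp(saw, obj)       # saw [a book]  -> S\NP
assert bapp(NP, vp) == S  # Peter [...]   -> S
```

Each function returns None when its rule does not apply, mirroring how a chart parser would simply skip an inapplicable combinator.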
Nativist vs. Empiricist

• Auxiliary Verb Fronting
  • Peter is awake.
  • Is Peter awake?
  • Peter who is sleepy is awake.
  • Is Peter who is sleepy awake?
  • *Is Peter who sleepy is awake?
• Word Order
  • I should go.
  • I have gone.
  • I am going.
  • I have been going.
  • I should have gone.
  • I should be going.
  • I should have been going.
  • *I have should been going.
Research Questions
1. Can we give a computational simulation of the acquisition of syntactic structures?
   • How do we derive the category of an unknown word in a sentence?
2. Can we give a judgement on the Nativist-Empiricist debate from the perspective of CCG?
   • How important is experience? Or is innate ability more important?
Level 0/1 Inference Rules
• Level 0 inference rules
  B/A X → B ⇒ X = A, if A ≠ S
  X B\A → B ⇒ X = A, if A ≠ S
• Level 1 inference rules
  A X → B ⇒ X = B\A, if A ≠ S
  X A → B ⇒ X = B/A, if A ≠ S

Peter := NP, works := X
Level 1 rule: NP X → S ⇒ X = S\NP; Peter works → S (<)

Figure: Example of level 1 inference rules: "Peter works".
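Using the same tuple encoding of categories as the earlier sketch, the level 0/1 rules are only a few lines (again a sketch, not the authors' code):

```python
# Sketch of the level 0/1 inference rules (my encoding, not the authors' code).
# Categories are nested tuples: ('S',) for S, ('/', B, A) for B/A, ('\\', B, A) for B\A.

S, NP, N = ('S',), ('NP',), ('N',)

def infer_level0(known, result):
    """B/A X -> B (or X B\\A -> B) gives X = A, provided A != S."""
    if known[0] in '/\\' and known[1] == result and known[2] != S:
        return known[2]

def infer_level1(known, result, known_first):
    """A X -> B gives X = B\\A; X A -> B gives X = B/A; only if A != S."""
    if known != S:
        return ('\\', result, known) if known_first else ('/', result, known)

# "Peter works": Peter := NP is known, and the whole sentence derives S.
works = infer_level1(NP, S, known_first=True)
assert works == ('\\', S, NP)   # works := S\NP
```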
Level 2 Inference Rules

• Level 2 side inference rules
  X A B → C ⇒ X = (C/B)/A
  A B X → C ⇒ X = (C\A)\B
• Level 2 middle inference rule
  A X B → C ⇒ X = (C\A)/B

Peter := NP, saw := X, a := NP/N, book := N
a book → NP (>); middle rule: NP X NP → S ⇒ X = (S\NP)/NP;
saw [a book] → S\NP (>); Peter [...] → S (<)

Figure: Example of level 2 inference rules: "Peter saw a book".
Level 3 Inference Rules
• Level 3 side inference rules
  X A B C → D ⇒ X = ((D/C)/B)/A
  A B C X → D ⇒ X = ((D\A)\B)\C
• Level 3 middle inference rules
  A X B C → D ⇒ X = ((D\A)/C)/B
  A B X C → D ⇒ X = ((D\A)\B)/C
• Inference rules of up to level 3 can derive most categories of common English words.
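The level 2 and 3 rules all follow one pattern: left neighbours become backward arguments (innermost first), right neighbours become forward arguments. A sketch generalizing this (my reading of the slides, not the authors' code):

```python
# Sketch generalizing the level 2/3 inference rules (my reading, not the authors' code).
# Categories as nested tuples; `cats` lists the neighbours' categories with
# None in the unknown word's position.

S, NP, N = ('S',), ('NP',), ('N',)

def infer(cats, target):
    i = cats.index(None)
    x = target
    for a in cats[:i]:                # left neighbours become backward arguments
        x = ('\\', x, a)
    for b in reversed(cats[i + 1:]):  # right neighbours become forward arguments
        x = ('/', x, b)
    return x

# Level 2 middle rule: A X B -> C gives X = (C\A)/B
assert infer([NP, None, NP], S) == ('/', ('\\', S, NP), NP)
# Level 3 side rule: X A B C -> D gives X = ((D/C)/B)/A
assert infer([None, NP, NP, N], S) == ('/', ('/', ('/', S, N), NP), NP)
```

Instantiating `infer` for one, two, or three neighbours reproduces exactly the level 1, 2, and 3 rules listed above.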
The Learning Architecture
The edge generator feeds each sentence from the corpus to the recursive learner, which applies level 0 and 1 inference rules recursively and then asks in turn whether level 0, 1, 2, or 3 inference rules can parse the sentence. Candidate categories are filtered by the SCP principle and the right-combining rule, and the output selector adds the result to the generation rule set and the test rule set. If no level can parse the sentence, it cannot be learned.

Figure: Learning process using inference rules
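The cascade in the figure can be sketched as a small loop (a rough sketch of my reading of the flowchart; `try_level` is a hypothetical callback, not part of the authors' system):

```python
# Rough sketch of the cascade in the figure (not the authors' code): try inference
# rules of increasing level until one yields a category for the unknown word.

def learn(try_level, sentence):
    """`try_level(k, sentence)` is a hypothetical callback returning an inferred
    category for the sentence's unknown word, or None if level-k rules fail."""
    for level in range(4):            # level 0, 1, 2, 3 in turn
        category = try_level(level, sentence)
        if category is not None:
            return level, category
    return None, None                 # no level applies: cannot learn

# Example with a stub learner that only succeeds at level 2:
stub = lambda level, s: '(S\\NP)/NP' if level == 2 else None
assert learn(stub, ['Peter', 'saw', 'Mary']) == (2, '(S\\NP)/NP')
```

Trying lower levels first encodes a preference for the simplest category that makes the sentence parse.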
Target Grammar
Peter := NP          with := (N\N)/NP
Mary := NP           with := (NP\NP)/NP
big := N/N           with := ((S\NP)\(S\NP))/NP
colorless := N/N     sleep := S\NP
book := N            a := NP/N
telescope := N       give := ((S\NP)/NP)/NP
the := NP/N          saw := (S\NP)/NP
run := S\NP          read := (S\NP)/NP
big := N/N           furiously := (S\NP)\(S\NP)

Table: Target Grammar Rules

• Recursive & ambiguous
• Assume only NP and N are known to the learner
Result
Peter := NP, saw := (S\NP)/NP, Mary := NP, a := NP/N, big := N/N, telescope := N

(a) with := (NP\NP)/NP (modifying Mary):
big telescope → N (>); big [big telescope] → N (>); a [big big telescope] → NP (>);
with [...] → NP\NP (>); Mary [with ...] → NP (<); saw [...] → S\NP (>); Peter [...] → S (<)

(b) with := ((S\NP)\(S\NP))/NP (modifying saw Mary):
saw Mary → S\NP (>); big telescope → N (>); big [...] → N (>); a [...] → NP (>);
with [...] → (S\NP)\(S\NP) (>); [saw Mary] [with ...] → S\NP (<); Peter [...] → S (<)

Figure: Two ambiguous parses of the sentence
Learning Auxiliary Verb Fronting 1
Peter := NP, is := (S\NP)/(Sadj\NP), sleepy := Sadj\NP, awake := Sadj\NP,
Is := (Sq/(Sadj\NP))/NP, who := (NP\NP)/(S\NP)

(a) Peter is sleepy:
is sleepy → S\NP (>); Peter [is sleepy] → S (<)

(b) Is Peter awake:
Is Peter → Sq/(Sadj\NP) (>); [Is Peter] awake → Sq (>)

(c) Peter who is sleepy is awake:
is sleepy → S\NP (>); who [is sleepy] → NP\NP (>); Peter [who is sleepy] → NP (<);
is awake → S\NP (>); [Peter who is sleepy] [is awake] → S (<)

Figure: Learning Auxiliary Verb Fronting 1
Learning Auxiliary Verb Fronting 2
Is Peter who is sleepy awake:
is sleepy → S\NP (>); who [is sleepy] → NP\NP (>); Peter [who is sleepy] → NP (<);
Is [Peter who is sleepy] → Sq/(Sadj\NP) (>); [...] awake → Sq (>)

Figure: Learning Auxiliary Verb Fronting 2

• is := (S\NP)/(Sadj\NP)
• Is := (Sq/(Sadj\NP))/NP
Learning Correct Word Order

• I should go.
• I have gone.
• I am going.
• I have been going.
• I should have gone.
• I should be going.
• I should have been going.
• *I have should been going.

should := (Ss\NP)/(S\NP)
should := (Ss\NP)/(Sh\NP)
should := (Ss\NP)/(Sb\NP)
have := (Sh\NP)/(S\NP)
have := (Sh\NP)/(Sb\NP)
be := (Sb\NP)/(S\NP)

I := NP, should := (Ss\NP)/(Sh\NP), have := (Sh\NP)/(Sb\NP), been := (Sb\NP)/(S\NP), going := S\NP
been going → Sb\NP (>); have [been going] → Sh\NP (>); should [...] → Ss\NP (>); I [...] → Ss (<)

Figure: Learning Correct Word Order
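The subcategorized S-features above fix the auxiliary order mechanically. As a sketch (my own simplification of the slide's lexicon, tracking only the feature each verb projects and the features its complement may carry):

```python
# Sketch (my simplification of the slide's lexicon, not the authors' code):
# FEAT is the S-feature a verb projects ('0' = plain S\NP);
# COMP is the set of complement features an auxiliary accepts.
COMP = {'should': {'0', 'h', 'b'}, 'have': {'0', 'b'}, 'been': {'0'}}
FEAT = {'should': 's', 'have': 'h', 'been': 'b',
        'go': '0', 'gone': '0', 'going': '0'}

def grammatical(verbs):
    """Each auxiliary must select the feature of the verb that follows it,
    and the chain must end in a plain (non-auxiliary) verb form."""
    for w, nxt in zip(verbs, verbs[1:]):
        if w not in COMP or FEAT[nxt] not in COMP[w]:
            return False
    return FEAT[verbs[-1]] == '0'

assert grammatical(['should', 'have', 'been', 'going'])      # I should have been going.
assert not grammatical(['have', 'should', 'been', 'going'])  # *I have should been going.
```

Here "have" rejects "should" simply because COMP['have'] lacks the feature 's', which is exactly what the category (Sh\NP)/(Sb\NP) encodes.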
Conclusion

1. Can we give a computational simulation of the acquisition of syntactic structures?
   • How do we derive the category of an unknown word in a sentence?
   • This paper presents a simple and intuitive method to achieve this.
2. Can we give a judgement on the Nativist-Empiricist debate from the perspective of CCG?
   • How important is experience? Or is innate ability more important?
   • Simple and intuitive logical rules can also help account for these celebrated linguistic phenomena.
   • Logic offers a third way besides experience and innateness.