NLP_session-3_Alexandra

Transcript of NLP_session-3_Alexandra

NLP Training – Session 3

Dr. Alexandra M. Liguori

Incubio – The Big Data Academy

Barcelona, April 22, 2015

Welcome back!!!

Outline

1 Clarification about corpus
2 Recap: Typical NLP tasks
3 Automatic Question Answering
4 Reference resolution
5 Named Entity Recognition (NER)
6 Keyword / topic / information extraction

NLP: Ambiguities and Solutions

Corpus

Definition: a corpus is a large and structured set of texts.

In NLP, two types of corpora are used:

Training corpus ↔ to derive the list of rules or to gather the statistical data
Test corpus ↔ to test the results obtained with the training corpus
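
A minimal sketch of such a split in plain Python (the 80/20 ratio, the seed, and the in-memory document list are illustrative assumptions, not from the slides):

import random

def split_corpus(documents, train_fraction=0.8, seed=42):
    """Shuffle the documents and split them into a training and a test corpus."""
    docs = list(documents)
    random.Random(seed).shuffle(docs)
    cut = int(len(docs) * train_fraction)
    return docs[:cut], docs[cut:]

corpus = ["text one ...", "text two ...", "text three ...", "text four ...", "text five ..."]
train_corpus, test_corpus = split_corpus(corpus)
print(len(train_corpus), "training documents,", len(test_corpus), "test documents")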

Typical NLP tasks: Basic and simpler tasks

Tokenization ↔ RegEx
Sentence splitting ↔ RegEx
POS-tagging ↔ POS-tagging algorithms and tag sets
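
A minimal Python sketch of the two RegEx-based tasks above (the patterns are illustrative assumptions, not the ones used in the training):

import re

def tokenize(text):
    """Split text into word and punctuation tokens with a simple regular expression."""
    return re.findall(r"\w+|[^\w\s]", text, flags=re.UNICODE)

def split_sentences(text):
    """Split text into sentences on ., ! or ? followed by whitespace (a naive heuristic)."""
    return [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]

text = "Cristiano chuta el balon. Pedro se comió una tarta de chocolate."
print(split_sentences(text))
print(tokenize("Cristiano chuta el balon."))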

Typical NLP tasks: Complex tasks

Lemmatization or Stemming ↔ implementations of the Porter Stemmer (e.g. in Java), the Stanford NLP tools, GATE, ...
Syntactic parsing ↔ Earley algorithm, CYK algorithm, GHR algorithm, Stanford Parser (a Java implementation of a probabilistic algorithm)
Question answering, topic extraction, NER, semantic analysis, ... ↔ ad hoc tools, e.g. dictionaries, ontologies, Frames, GATE, NLTK, the Lappin & Leass algorithm, ...
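
As a quick illustration, the Porter stemmer is also available in NLTK (one of the toolkits named above); a minimal sketch, assuming NLTK is installed:

from nltk.stem import PorterStemmer

stemmer = PorterStemmer()
for word in ["running", "cakes", "tokenization", "parsed"]:
    # The Porter stemmer strips common English suffixes; it is a stemmer, not a lemmatizer.
    print(word, "->", stemmer.stem(word))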

Question Answering

Video on Bush Jr. and Condoleezza Rice from Who’s on first

Simple Question Answering

ELIZA

User_1: Men are all alike.
ELIZA_1: IN WHAT WAY
User_2: They’re always bugging us about something or other.
ELIZA_2: CAN YOU THINK OF A SPECIFIC EXAMPLE
User_3: Well, my boyfriend made me come here.
ELIZA_3: YOUR BOYFRIEND MADE YOU COME HERE
User_4: Yes, he says I’m depressed much of the time.
ELIZA_4: I AM SORRY TO HEAR YOU ARE DEPRESSED.

ELIZA

Regular expression substitutions

Change all instances of my to YOUR, and I’m to YOU ARE, etc., e.g.:

1 User_3: Well, my boyfriend made me come here.
  ELIZA_3: YOUR BOYFRIEND MADE YOU COME HERE
2 User_4: ... I’m depressed ... .
  ELIZA_4: ... YOU ARE DEPRESSED.

ELIZA

Regular expression substitutions: relevant patterns in the input → create an appropriate output; e.g.:

1 s/.* YOU ARE (depressed|sad) .*/I AM SORRY TO HEAR YOU ARE \1/
2 s/.* YOU ARE (depressed|sad) .*/WHY DO YOU THINK YOU ARE \1/
3 s/.* all .*/IN WHAT WAY/
4 s/.* always .*/CAN YOU THINK OF A SPECIFIC EXAMPLE/
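
A minimal Python sketch of this style of rule application, using the substitutions above (the pronoun-swapping pass, the rule ordering, and the fallback response are illustrative assumptions):

import re

# Pronoun-swapping pass, applied before the response rules.
REFLECTIONS = [(r"\bmy\b", "YOUR"), (r"\bI'm\b", "YOU ARE"), (r"\bme\b", "YOU")]

# Response rules: (pattern, template), tried in order; \1 copies the captured word.
RULES = [
    (r".* YOU ARE (depressed|sad) .*", r"I AM SORRY TO HEAR YOU ARE \1"),
    (r".* all .*", "IN WHAT WAY"),
    (r".* always .*", "CAN YOU THINK OF A SPECIFIC EXAMPLE"),
]

def eliza_reply(utterance):
    text = utterance
    for pattern, repl in REFLECTIONS:
        text = re.sub(pattern, repl, text, flags=re.IGNORECASE)
    for pattern, template in RULES:
        if re.match(pattern, text):
            return re.sub(pattern, template, text)
    return "PLEASE GO ON"  # fallback response (an assumption, not from the slides)

print(eliza_reply("Men are all alike."))                            # IN WHAT WAY
print(eliza_reply("Yes, he says I'm depressed much of the time."))  # I AM SORRY TO HEAR YOU ARE depressed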


Quizlyse Example

1) Input: an affirmative sentence, e.g.

Cristiano chuta el balon. (Cristiano kicks the ball.)

2) Intermediate output: the parsed text:

Cristiano/NPMS000 chuta/VMIS3S0 el/DI0MS0 balon/NCMS000 ./.
Cristiano/SUBJ chuta/VERB [el balon]/DIRECT-OBJ ./.

Quizlyse Example

3) Substitutions: relevant patterns in the input → create an appropriate output; e.g.:

1 s/.* (NPMS000) (VMIS3S0) (DI0MS0 NCMS000) .*/Qué \2 \1?/
2 SUBJ VERB DIRECT-OBJ → Qué VERB SUBJ?

4) Final output: an automatically generated question, e.g.:

Qué chuta Cristiano? (What does Cristiano kick?)
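
A minimal sketch of this substitution over the word/tag pairs from the intermediate output (the tag handling is deliberately simplified; the actual Quizlyse pipeline is not shown in the slides):

import re

# Tagged sentence as produced by the parser (EAGLES-style tags, as in the slides).
tagged = "Cristiano/NPMS000 chuta/VMIS3S0 el/DI0MS0 balon/NCMS000 ./."

# Capture the subject and the verb, skip the direct object:
# SUBJ VERB DIRECT-OBJ -> Qué VERB SUBJ ?
pattern = r"(\w+)/NPMS000 (\w+)/VMIS3S0 \w+/DI0MS0 \w+/NCMS000"
match = re.search(pattern, tagged)
if match:
    subject, verb = match.group(1), match.group(2)
    print(f"Qué {verb} {subject}?")  # Qué chuta Cristiano?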

Reference resolution

Discourse

Gracie: Oh yeah... And then Mr. and Mrs. Jones were having matrimonial trouble, and my brother was hired to watch Mrs. Jones.
George: Well, I imagine she was a very attractive woman.
Gracie: She was, and my brother watched her day and night for six months.
George: Well, what happened?
Gracie: She finally got a divorce.
George: Mrs. Jones?
Gracie: No, my brother’s wife.

Jordi se fué al restaurante de Xavi para comer pescado. Este estaba fresco y le gustó.
(Jordi went to Xavi's restaurant to eat fish. It was fresh and he liked it.)

Reference resolution

1 Reference phenomena
2 Constraints on coreference
3 Preferences in pronoun interpretation
4 Example of an algorithm for pronoun resolution

Reference resolution

Reference phenomena

1 Indefinite noun phrases ↔ Pedro comió unos pasteles ayer.
2 Definite noun phrases ↔ Pedro comió unos pasteles ayer. Los pasteles eran muy dulces.
3 Pronouns ↔ Ayer Pedro comió unos pasteles que eran muy dulces.
4 Demonstratives ↔ Pedro hizo unos pasteles: estos son de chocolate, aquellos son de almendra.
5 Anaphora with uno/una/unos/unas ↔ Ayer Pedro hizo una tarta. Hoy quiero hacer una yo también.

Reference resolution

Constraints on coreference

1 Number agreement ↔ Los pasteles que comí ayer los hizo Ana. / Los pasteles que comí ayer lo hizo Ana.
2 Person and case agreement ↔ Ana y Carmen hicieron unos pasteles. Les gustan.
3 Gender agreement ↔ La tarta que comí ayer la hizo Ana. / La tarta que comí ayer lo hizo Ana.
4 Syntactic constraints ↔ Ana se hizo una tarta. / Ana le hizo una tarta.
5 Selectional restrictions ↔ Ana puso el pastel en el horno. Es redondo.
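
A sketch of how such agreement constraints might be encoded as a filter over candidate antecedents (the feature dictionaries are an illustrative assumption):

def agrees(pronoun, candidate):
    """Check number and gender agreement between a pronoun and a candidate antecedent."""
    return (pronoun["number"] == candidate["number"]
            and pronoun["gender"] in (candidate["gender"], "common"))

la = {"form": "la", "gender": "f", "number": "sg"}
tarta = {"form": "tarta", "gender": "f", "number": "sg"}
pedro = {"form": "Pedro", "gender": "m", "number": "sg"}
print(agrees(la, tarta), agrees(la, pedro))  # True False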

Reference resolution

Preferences in pronoun interpretation

1 Recency ↔ Pedro hizo un pastel. Juan hizo una tarta. A Ana le gusta.
2 Grammatical role ↔ Pedro hizo un pastel con Juan. Él se lo comió todo. / Juan hizo un pastel con Pedro. Él se lo comió todo.
3 Repeated mention ↔ Anne needed a car to drive to her new job. She decided she wanted something roomy. Carol went to the Honda dealership with her. She bought a Civic.
4 Parallelism ↔ Pedro llamó a Juan por la mañana. Carlos le llamó por la tarde.
5 Verb semantics ↔ Pedro hizo un pastel para Juan. Le gustan los dulces. / Pedro pidió un pastel a Juan. Le gustan los dulces.

Reference resolution

Algorithm for pronoun resolution (Lappin & Leass, 1994)

Divide the discourse into sentences and analyze one sentence at a time ↔ sentence splitting, tokenization
Parse the 1st sentence and identify nouns and pronouns ↔ POS-tagging
Assign weights to all nouns and pronouns ↔ Lappin & Leass weights
Refer each pronoun to the noun with the highest weight; otherwise, if there are no pronouns, divide all weights by 2 ↔ Lappin & Leass algorithm
Proceed to the 2nd sentence and repeat all steps as above, adding up the weights along the way
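
A much-simplified sketch of the salience bookkeeping just described (parsing, weighting, and agreement checking are passed in as callbacks; this is not the full Lappin & Leass implementation):

def resolve_pronouns(sentences, weigh, is_pronoun, agrees):
    """Greedy pronoun resolution in the spirit of Lappin & Leass (1994).

    sentences  : list of token lists, one per sentence (tokens must be hashable)
    weigh      : token -> salience weight within its sentence
    is_pronoun : token -> bool
    agrees     : (pronoun, candidate) -> bool (number/gender/person filter)
    """
    salience = {}   # candidate noun -> accumulated salience
    links = {}      # pronoun -> resolved antecedent noun
    for sentence in sentences:
        for token in sentence:
            if is_pronoun(token):
                candidates = [n for n in salience if agrees(token, n)]
                if candidates:
                    antecedent = max(candidates, key=salience.get)
                    links[token] = antecedent
                    # The pronoun's own weight reinforces its antecedent.
                    salience[antecedent] += weigh(token)
            else:
                salience[token] = salience.get(token, 0) + weigh(token)
        # Salience decays: halve all accumulated weights at each sentence boundary.
        for noun in salience:
            salience[noun] /= 2
    return links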

Algorithm for pronoun resolution

Weighting scheme ↔ recency and syntactic preferences (Lappin & Leass, 1994):

1 Sentence recency ↔ 100
2 Subject emphasis ↔ 80
  e.g. El pastel está en la mesa de la cocina.
3 Existential emphasis ↔ 70
  e.g. Hay un pastel en la mesa de la cocina.
4 Direct object emphasis ↔ 50
  e.g. Ana hizo un pastel ayer.
5 Indirect object emphasis ↔ 40
  e.g. Ana regaló el pastel a Carmen.
6 Non-adverbial emphasis ↔ 50
  e.g. Ana puso un poco de chocolate en el pastel.
7 Head noun emphasis ↔ 80
  e.g. El libro de recetas para el pastel de chocolate está en la mesa de la cocina.
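
As a quick check of the scheme, the factors that apply to Pedro in the first sentence of the worked example below (sentence recency, subject, non-adverbial, head noun) sum to the 310 shown in the tables:

WEIGHTS = {
    "sentence_recency": 100,
    "subject": 80,
    "existential": 70,
    "direct_object": 50,
    "indirect_object": 40,
    "non_adverbial": 50,
    "head_noun": 80,
}

# "Pedro" in "Pedro se comió una tarta de chocolate.":
pedro = sum(WEIGHTS[f] for f in ("sentence_recency", "subject", "non_adverbial", "head_noun"))
print(pedro)  # 310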

Algorithm for reference resolution

Example discourse

Pedro se comió una tarta de chocolate. Él se la había pedido a Juan. Le gustan los dulces.
(Pedro ate a chocolate cake. He had asked Juan for it. He likes sweets.)

Step 1

1 Take the first sentence: Pedro se comió una tarta de chocolate.
2 Parse this first sentence → parsing result:

Pedro/NP000P0 se/PP3CN000 comió/VMIS3S0 una/DI0FS0 tarta/NCFS000 de/SPS00 chocolate/NCMS000 ./.
Pedro/SUBJ [se comió]/VERB [una tarta]/OBJ de chocolate/COMPL ./.

3 Calculate the weights for all nouns and pronouns appearing in this first sentence:

Algorithm for reference resolution

Weights for the nouns and pronouns from the first sentence:

(PRO)NOUNS   Rec.  Subj.  Exist.  Obj.  Ind.-Obj.  Non-Adv.  Head N  TOT.
Pedro         100    80      0      0       0         50        80    310
tarta         100     0      0     50       0         50        80    280
chocolate     100     0      0      0       0          0        80    180

There are no pronouns whose reference needs to be resolved → divide all the totals by 2:

(PRO)NOUNS   TOT.
Pedro        310/2 = 155
tarta        280/2 = 140
chocolate    180/2 = 90

Algorithm for reference resolution

Example discourse

Pedro se comió una tarta de chocolate. Él se la había pedido a Juan. Le gustan los dulces.

Step 2

1 Take the second sentence: Él se la había pedido a Juan.
2 Parse this second sentence → parsing result:

Él/SUBJ [se la había pedido]/VERB [a Juan]/IND-OBJ ./.

3 Calculate the weights for all new nouns and pronouns appearing in this second sentence:

Algorithm for reference resolution

Weights for the nouns and pronouns from the second sentence:

(PRO)NOUNS   Rec.  Subj.  Exist.  Obj.  Ind.-Obj.  Non-Adv.  Head N  TOT.
Él            100    80      0      0       0         50        80    310
la            100     0      0     50       0         50        80    280
Juan          100     0      0      0      40         50        80    270

The two pronouns Él and la have to be referred to nouns from the first sentence.

Algorithm for reference resolution

Results from the first two sentences:

(PRO)NOUNS   TOT.
Pedro        310/2 = 155
tarta        280/2 = 140
chocolate    180/2 = 90

(PRO)NOUNS   TOT.
Él           310
la           280
Juan         270

1 The pronoun la is referred to the noun tarta because of gender constraints (these are the only feminine candidates here).
2 The pronoun Él is referred to the noun from the previous sentence with the highest value, i.e. Pedro.

Algorithm for reference resolution

Combined results of reference resolution from the first two sentences:

(PRO)NOUNS    TOT.
Pedro + Él    (155+310)/2 = 232.5
tarta + la    (140+280)/2 = 210
chocolate     180/2 = 90
Juan          270/2 = 135

Algorithm for reference resolution

Example discourse

Pedro se comió una tarta de chocolate. Él se la había pedido a Juan. Le gustan los dulces.

Step 3

1 Take the third sentence: Le gustan los dulces.
2 Parse this third sentence → parsing result:

Le/PP3CSD00 gustan/VMII3P0 los/DA0MP0 dulces/NCMP000 ./.
Le/IND-OBJ gustan/VERB [los dulces]/SUBJ ./.

3 Calculate the weights for all new nouns and pronouns appearing in this third sentence:

Algorithm for reference resolution

Weights for the nouns and pronouns from the third sentence:

(PRO)NOUNS   Rec.  Subj.  Exist.  Obj.  Ind.-Obj.  Non-Adv.  Head N  TOT.
Le            100     0      0      0      50         50        80    280
dulces        100   100      0      0       0         50        80    330

Only the pronoun Le needs to be referred to a previous noun.

Algorithm for reference resolution

Combined results of reference resolution from the first two sentences:

NOUNS         TOT.
Pedro + Él    (155+310)/2 = 232.5
tarta + la    (140+280)/2 = 210
chocolate     180/2 = 90
Juan          270/2 = 135

The singular pronoun Le (masculine or feminine) could in principle refer to any of the singular masculine or feminine nouns: Pedro, Juan, tarta, or chocolate.
We refer Le to the previous noun with the highest weight, i.e. Pedro.
Referencing is complete!

The Lappin & Leass algorithm has nearly 90% accuracy.

NER

Named Entity Recognition

Can be broken down into two distinct problems:

1 detection of names
2 classification of the names by the type of entity to which they refer → 4 standard types:
  1 person (e.g. "Carol", "Tom Hanks", "Pedro", "Juan Carlos I", etc.)
  2 organization (e.g. "WWF", "IBM", "El Mundo", etc.)
  3 location (e.g. "Madrid", "Washington D.C.", "L.A.", "Barcelona", etc.)
  4 other (e.g. "Hotel Catalunya", etc.)

NER

Tools for Named Entity Recognition

GATE: for English, Spanish, and many more languages, via a graphical interface and a Java API (developed at the University of Sheffield, UK)
https://gate.ac.uk/

NETagger: the Java-based Illinois Named Entity Recognizer (developed by the Cognitive Computation Group at the University of Illinois at Urbana-Champaign)
http://cogcomp.cs.illinois.edu/page/software_view/NETagger

OpenNLP: rule-based and statistical Named Entity Recognition (developed by Apache)
http://opennlp.apache.org/index.html

Stanford CoreNLP: Java-based CRF Named Entity Recognition (developed by the Stanford Natural Language Processing Group)
http://nlp.stanford.edu/software/CRF-NER.shtml
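
For a quick experiment, NLTK (mentioned earlier among the ad hoc tools) also ships an English-only chunker that performs both steps; a minimal sketch, assuming NLTK and its punkt, averaged_perceptron_tagger, maxent_ne_chunker, and words resources are installed:

import nltk

sentence = "Tom Hanks visited IBM in Washington D.C. last year."
tokens = nltk.word_tokenize(sentence)
tagged = nltk.pos_tag(tokens)
tree = nltk.ne_chunk(tagged)  # detection and classification in one pass

for subtree in tree:
    if isinstance(subtree, nltk.Tree):  # named-entity chunks carry a label
        name = " ".join(word for word, tag in subtree.leaves())
        print(subtree.label(), "->", name)  # e.g. PERSON -> Tom Hanks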

Keyword / topic / information extraction

Tools

Keyword extraction, e.g.:

1 GATE (ANNIE tool) for English, Spanish, and many more languages, via a graphical interface and a Java API
  https://gate.ac.uk/
  → simply using JAPE files for the LUs
2 pattern.vector module from CLiPS, in Python
  http://www.clips.ua.ac.be/pages/luceneapi_node/pattern.vector

Topic / information extraction: e.g. GATE (ANNIE tool) for English, Spanish, and many more languages, via a graphical interface and a Java API
→ using JAPE files for the LUs, FEs, and FRAMES
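
Neither GATE nor pattern.vector is demonstrated in the slides; as a stand-in, here is a minimal frequency-based keyword extraction sketch in plain Python (the stop-word list and the ranking criterion are illustrative assumptions):

import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "are", "for", "with", "by", "they", "which"}

def extract_keywords(text, top_n=5):
    """Rank words by raw frequency after dropping stop words and very short tokens."""
    words = re.findall(r"[a-záéíóúñü]+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return counts.most_common(top_n)

text = ("Named Entity Recognition detects names and classifies the names "
        "by the type of entity to which they refer.")
print(extract_keywords(text))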

What next?

Another practical session on GATE this summer?