Building WordSpaces via Random Indexing from simple to complex spaces


Page 1: Building WordSpaces via Random Indexing from simple to complex spaces

Building WordSpaces via Random Indexing

from simple to complex spaces

Pierpaolo Basile [email protected]

Panzerotti Talk, Dipartimento di Informatica, UNIBA, 6 March 2015

Page 2: Building WordSpaces via Random Indexing from simple to complex spaces

What's tezgüino?

Page 3: Building WordSpaces via Random Indexing from simple to complex spaces

A bottle of tezgüino is on the table.

Everyone likes tezgüino.

Tezgüino makes you drunk.

We make tezgüino out of corn.

What's tezgüino?

Page 4: Building WordSpaces via Random Indexing from simple to complex spaces

Tezgüino

Page 5: Building WordSpaces via Random Indexing from simple to complex spaces

Distributional Semantic Models (DSM)

You shall know a word by the company it keeps!

John Rupert Firth

The meaning of a word is determined by its usage.

Ludwig Wittgenstein

Methods in Structural Linguistics; Distributional Structure; Mathematical Structures of Language

Zellig Harris

Page 6: Building WordSpaces via Random Indexing from simple to complex spaces

Distributional Semantic Models

Page 7: Building WordSpaces via Random Indexing from simple to complex spaces

Distributional Semantic Models

• Computational models that build semantic representations of words by analyzing large corpora:

– Words are represented as points in a geometric space

– Vectors are built by taking into account the linguistic contexts in which words occur

Page 8: Building WordSpaces via Random Indexing from simple to complex spaces

The simplest DSM

• Matrix: words × linguistic contexts

        C1  C2  C3  C4  C5  C6  C7
dog      5   0  11   2   2   9   1
cat      4   1   7   1   1   7   2
bread    0  12   0   0   9   1   9
pasta    0   8   1   2  14   0  10
meat     0   7   1   1  11   1   8
mouse    4   0   8   0   1   8   1


Page 10: Building WordSpaces via Random Indexing from simple to complex spaces

The simplest DSM

[Plot of the example words on the two dimensions C3 and C5: dog, cat and mouse cluster together, apart from pasta]

Similar words are represented close together in the space
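To make the geometric view concrete, here is a minimal sketch (mine, not from the slides) that computes cosine similarities over the toy word-context matrix of the previous slides; semantically related words score higher.

```python
import numpy as np

# Toy word-context counts from the previous slide (rows: words, columns: C1..C7)
words = ["dog", "cat", "bread", "pasta", "meat", "mouse"]
M = np.array([
    [5, 0, 11, 2, 2, 9, 1],    # dog
    [4, 1, 7, 1, 1, 7, 2],     # cat
    [0, 12, 0, 0, 9, 1, 9],    # bread
    [0, 8, 1, 2, 14, 0, 10],   # pasta
    [0, 7, 1, 1, 11, 1, 8],    # meat
    [4, 0, 8, 0, 1, 8, 1],     # mouse
], dtype=float)

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Similar words end up close in the space: dog/cat score high, dog/bread low
print("dog ~ cat   :", round(cosine(M[0], M[1]), 2))
print("dog ~ bread :", round(cosine(M[0], M[2]), 2))
print("pasta ~ meat:", round(cosine(M[3], M[4]), 2))
```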

Page 11: Building WordSpaces via Random Indexing from simple to complex spaces

DSM generalization

• A DSM is defined as <T, C, R, W, M, d, S>

– T: target elements (words)

– C: contexts

– R: the relation between T and C

– W: weighting schema

– M: geometric space T×C

– d: matrix reduction M → M'

– S: similarity function in M’

Page 12: Building WordSpaces via Random Indexing from simple to complex spaces

Build a DSM

1. Corpus pre-processing

2. Identify words and contexts

3. Count co-occurrences (words in contexts)

4. Weight (optional)

5. Space reduction (optional)
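As an illustration of steps 1-3, here is a minimal sketch (my own, not from the slides) that builds word-by-word co-occurrence counts with a sliding window over a whitespace-tokenized toy corpus; the window size and the corpus are arbitrary choices.

```python
from collections import Counter, defaultdict

corpus = [
    "john eats a red apple",
    "the cat chases the mouse",
    "the dog chases the cat",
]
window = 2  # co-occurrence distance (arbitrary here; the talk uses 4)

# 1. pre-processing: simple whitespace tokenization
sentences = [s.split() for s in corpus]

# 2-3. identify words/contexts and count co-occurrences within the window
cooc = defaultdict(Counter)
for tokens in sentences:
    for i, w in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                cooc[w][tokens[j]] += 1

print(cooc["cat"])   # contexts observed around "cat"
```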

Page 13: Building WordSpaces via Random Indexing from simple to complex spaces

Parameters

• The definition of context

– surrounding words, phrase, sentence, paragraph, document, syntactic context

• Weighting schema

• Similarity function

Page 14: Building WordSpaces via Random Indexing from simple to complex spaces

1. Pre-processing

• Tokenization is necessary!

– PoS-tagging

– Lemmatization

– Parsing

• A deeper analysis:

– Introduces errors

– Requires additional parameters

– Is language dependent

Page 15: Building WordSpaces via Random Indexing from simple to complex spaces

2. The context

• Document

– the whole document, paragraph, sentence, passage

• Word

– the n most frequent words

– Where?

• surrounding words (window)

• pattern

• syntactic dependency

Page 16: Building WordSpaces via Random Indexing from simple to complex spaces

3. Weighting schema

• Occurrences

• log(occurrences): dampens the most frequent words

• Mutual Information, Log-Likelihood Ratio

• Tf-Idf, word-entropy, …

Page 17: Building WordSpaces via Random Indexing from simple to complex spaces

Point-wise Mutual Information

P(w) = freq(w) / N

P(w1, w2) = freq(w1, w2) / N

MI(w1, w2) = log2 [ P(w1, w2) / ( P(w1) · P(w2) ) ]

Local Mutual Information: LMI(w1, w2) = P(w1, w2) · MI(w1, w2)

P(drink) = 100/10^6
P(beer) = 25/10^6
P(water) = 150/10^6
P(drink, water) = 60/10^6
P(drink, beer) = 20/10^6

MI(drink, beer) = log2(2·10^6 / 250) = 12.96
MI(drink, water) = log2(6·10^6 / 1500) = 11.96

LMI(drink, beer) = 0.0002592
LMI(drink, water) = 0.0007176
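A small sketch (assuming the counts behind the slide's probabilities, i.e. a corpus of N = 10^6 tokens) that reproduces the MI and LMI values above:

```python
import math

N = 1_000_000                      # corpus size assumed from the slide
freq = {"drink": 100, "beer": 25, "water": 150}
pair_freq = {("drink", "beer"): 20, ("drink", "water"): 60}

def P(w):
    return freq[w] / N

def P2(w1, w2):
    return pair_freq[(w1, w2)] / N

def MI(w1, w2):
    return math.log2(P2(w1, w2) / (P(w1) * P(w2)))

def LMI(w1, w2):
    # local MI as used in the slide's example: joint probability times MI
    return P2(w1, w2) * MI(w1, w2)

print(MI("drink", "beer"), LMI("drink", "beer"))    # ~12.97, ~0.00026
print(MI("drink", "water"), LMI("drink", "water"))  # ~11.97, ~0.00072
```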

Page 18: Building WordSpaces via Random Indexing from simple to complex spaces

5. Matrix reduction

• DSMs are high dimensional and very sparse:

1. matrix reduction: LSI, PCA

2. Random Indexing
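A minimal sketch (not from the slides) of option 1, reducing the toy matrix with a truncated SVD, which is the core of LSI:

```python
import numpy as np

# Toy word-context matrix from the earlier slide
M = np.array([
    [5, 0, 11, 2, 2, 9, 1],
    [4, 1, 7, 1, 1, 7, 2],
    [0, 12, 0, 0, 9, 1, 9],
    [0, 8, 1, 2, 14, 0, 10],
    [0, 7, 1, 1, 11, 1, 8],
    [4, 0, 8, 0, 1, 8, 1],
], dtype=float)

k = 2                                   # target dimensionality (arbitrary)
U, s, Vt = np.linalg.svd(M, full_matrices=False)
reduced = U[:, :k] * s[:k]              # each row is a word in the reduced space

print(reduced.shape)   # (6, 2): 6 words, 2 latent dimensions
```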

Page 19: Building WordSpaces via Random Indexing from simple to complex spaces

Why sparse?

[Plot: word frequency vs. word rank (Zipf's law)]

Page 20: Building WordSpaces via Random Indexing from simple to complex spaces

Random Indexing

A kind of magic

Page 21: Building WordSpaces via Random Indexing from simple to complex spaces

The idea

• Assign a random vector to each context: random/context vector

• The semantic vector assigned to each word is the sum of the random/context vectors of the contexts in which the word occurs

Page 22: Building WordSpaces via Random Indexing from simple to complex spaces

Random/context vector

• sparse

• high dimensional

• ternary, values in {-1, 0, +1}

• a small number of non-zero elements, randomly distributed

0 0 0 0 0 0 0 -1 0 0 0 0 1 0 0 -1 0 1 0 0 0 0 1 0 0 0 0 -1
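A minimal sketch of how such a sparse ternary random vector can be generated (the dimension and the number of non-zero entries here are arbitrary choices, not the talk's settings):

```python
import numpy as np

def random_index_vector(dim=1000, seeds=10, rng=np.random.default_rng()):
    """Sparse ternary vector: `seeds` non-zero entries, half +1 and half -1."""
    v = np.zeros(dim, dtype=np.int8)
    positions = rng.choice(dim, size=seeds, replace=False)
    v[positions[: seeds // 2]] = 1
    v[positions[seeds // 2:]] = -1
    return v

r = random_index_vector()
print(int((r != 0).sum()), "non-zero entries out of", r.size)
```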

Page 23: Building WordSpaces via Random Indexing from simple to complex spaces

Random Indexing (formal)

B_{n,k} = A_{n,m} · R_{m,k}      with k << m

B approximately preserves the Euclidean distance between points (Johnson-Lindenstrauss lemma):

(1 - ε) · d(v, u) ≤ d_r(v, u) ≤ (1 + ε) · d(v, u)
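A quick empirical check (a sketch with arbitrary sizes and a ternary projection matrix) that a random projection roughly preserves pairwise Euclidean distances, as the lemma states:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 50, 2000, 200              # points, original dim, reduced dim (arbitrary)

A = rng.normal(size=(n, m))          # original points A (n x m)
R = rng.choice([-1.0, 0.0, 1.0], size=(m, k), p=[0.05, 0.9, 0.05])
B = A @ R / np.sqrt(k * 0.1)         # projected points B = A R (n x k), rescaled

def pairwise(X):
    d = X[:, None, :] - X[None, :, :]
    return np.sqrt((d ** 2).sum(-1))

iu = np.triu_indices(n, 1)
ratio = pairwise(B)[iu] / pairwise(A)[iu]
print(ratio.min(), ratio.max())      # ratios stay close to 1
```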

Page 24: Building WordSpaces via Random Indexing from simple to complex spaces

Example

John eats a red apple

Rjohn -> (0, 0, 0, 0, 0, 0, 1, 0, -1, 0)
Reat  -> (1, 0, 0, 0, -1, 0, 0, 0, 0, 0)
Rred  -> (0, 0, 0, 1, 0, 0, 0, -1, 0, 0)

SVapple = Rjohn + Reat + Rred = (1, 0, 0, 1, -1, 0, 1, -1, -1, 0)
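A sketch that reproduces the sum above with numpy (the random vectors are the ones given on the slide):

```python
import numpy as np

R_john = np.array([0, 0, 0, 0, 0, 0, 1, 0, -1, 0])
R_eat  = np.array([1, 0, 0, 0, -1, 0, 0, 0, 0, 0])
R_red  = np.array([0, 0, 0, 1, 0, 0, 0, -1, 0, 0])

# semantic vector of "apple": sum of the random vectors of its context words
SV_apple = R_john + R_eat + R_red
print(SV_apple)   # [ 1  0  0  1 -1  0  1 -1 -1  0]
```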

Page 25: Building WordSpaces via Random Indexing from simple to complex spaces

Permutation

• Permutation of the elements in the context vector is used to encode additional information

– Word order

– Syntactic dependencies

• The context vector is permuted before the sum, according to the context

Page 26: Building WordSpaces via Random Indexing from simple to complex spaces

Example (word order)

John eats a red apple

Rjohn  -> (0, 0, 0, 0, 0, 0, 1, 0, -1, 0)
Reat   -> (1, 0, 0, 0, -1, 0, 0, 0, 0, 0)
Rapple -> (0, 1, 0, 0, 0, 0, 0, 0, 0, -1)

SVred = Π-2(Rjohn) + Π-1(Reat) + Π+1(Rapple) =
(0, 0, 0, 0, 1, 0, -1, 0, 0, 0) + (0, 0, 0, -1, 0, 0, 0, 0, 0, 1) + (-1, 0, 1, 0, 0, 0, 0, 0, 0, 0) = (-1, 0, 1, -1, 1, 0, -1, 0, 0, 1)
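The same computation with numpy, where a rotation by n positions plays the role of the permutation Π±n (a sketch built on the vectors of the slide):

```python
import numpy as np

R_john  = np.array([0, 0, 0, 0, 0, 0, 1, 0, -1, 0])
R_eat   = np.array([1, 0, 0, 0, -1, 0, 0, 0, 0, 0])
R_apple = np.array([0, 1, 0, 0, 0, 0, 0, 0, 0, -1])

def perm(v, n):
    """Π+n as a right rotation by n positions (negative n rotates left)."""
    return np.roll(v, n)

# "John" is two positions to the left of "red", "eats" one to the left,
# "apple" one to the right
SV_red = perm(R_john, -2) + perm(R_eat, -1) + perm(R_apple, +1)
print(SV_red)   # [-1  0  1 -1  1  0 -1  0  0  1]
```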

Page 27: Building WordSpaces via Random Indexing from simple to complex spaces

Permutation (query)

• At query time, apply the inverse permutation according to the context

• Word order: the query "<t> ?" searches for the terms most likely to appear to the right of <t>

– the inverse permutation (Π-1) is applied to the random vector of t: Π-1(Rt)

• t must occur to the left

– compute the similarity of Π-1(Rt) with respect to all semantic vectors
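A sketch of the query step (the vocabulary and vectors below are hypothetical, just to show the mechanics): the random vector of t is inverse-permuted and compared, e.g. by cosine similarity, with every semantic vector.

```python
import numpy as np

rng = np.random.default_rng(1)
vocab = ["drink", "beer", "water", "table"]          # hypothetical vocabulary

# hypothetical random and semantic vectors (in practice built as on the slides)
random_vectors   = {w: rng.choice([-1, 0, 1], size=20) for w in vocab}
semantic_vectors = {w: rng.normal(size=20) for w in vocab}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def right_neighbours(t):
    """Rank words by similarity to the inverse-permuted random vector of t."""
    query = np.roll(random_vectors[t], -1)            # inverse permutation Π-1
    scores = {w: cosine(query, semantic_vectors[w]) for w in vocab if w != t}
    return sorted(scores.items(), key=lambda kv: -kv[1])

print(right_neighbours("drink"))
```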

Page 28: Building WordSpaces via Random Indexing from simple to complex spaces

SIMPLE DSMS AND SIMPLE OPERATORS

Page 29: Building WordSpaces via Random Indexing from simple to complex spaces

Simple DSM

• Term-term co-occurrence matrix

• Latent Semantic Analysis (LSA): relies on the Singular Value Decomposition (SVD) of the co-occurrence matrix

• Random Indexing (RI): based on the Random Projection

• Latent Semantic Analysis over Random Indexing (RILSA)

Page 30: Building WordSpaces via Random Indexing from simple to complex spaces

Simple operators…

Addition (+): pointwise sum of components

Multiplication (∘): pointwise multiplication of components

Addition and multiplication are commutative

– they do not take word order into account

Complex structures are represented by summing or multiplying the vectors of the words that compose them

Page 31: Building WordSpaces via Random Indexing from simple to complex spaces

…Simple operators

Given two word vectors u and v

– composition by sum p = u + v

– composition by multiplication p = u ∘ v

Can be applied to any sequence of words
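A minimal sketch of the two operators (any word vectors would do; the two below are hypothetical):

```python
import numpy as np

u = np.array([0.2, 0.5, -0.1, 0.7])   # hypothetical vector for one word
v = np.array([0.4, -0.3, 0.6, 0.1])   # hypothetical vector for another word

p_sum  = u + v        # composition by sum
p_mult = u * v        # composition by pointwise multiplication

print(p_sum, p_mult)  # both are commutative, so word order is lost
```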

Page 32: Building WordSpaces via Random Indexing from simple to complex spaces

SYNTACTIC DEPENDENCIES IN DSMS

Complex DSMs

Page 33: Building WordSpaces via Random Indexing from simple to complex spaces

Syntactic dependencies…

John eats a red apple.

[Dependency tree: eats → John (subject), eats → apple (object), apple → red (modifier)]

Page 34: Building WordSpaces via Random Indexing from simple to complex spaces

…Syntactic dependencies

John eats a red apple.

[The same dependency tree, with each relation annotated with its HEAD and its DEPENDENT: in obj(apple, eats), eats is the head and apple the dependent]

Page 35: Building WordSpaces via Random Indexing from simple to complex spaces

Representing dependencies

Use a filler/role binding approach to represent a dependency dep(u, v)

rd⊚u + rh⊚v

rd and rh are vectors which represent the roles of dependent and head, respectively

⊚ is a placeholder for a composition operator

Page 36: Building WordSpaces via Random Indexing from simple to complex spaces

Representing dependencies (example)

obj(apple, eat)

rd⊚apple + rh⊚eat

(rd and rh are the role vectors)

Page 37: Building WordSpaces via Random Indexing from simple to complex spaces

Structured DSMs

1. Vector permutation in RI (PERM) to encode dependencies

2. Circular convolution (CONV) as filler/binding operator to represent syntactic dependencies in DSMs

3. LSA applied over PERM and CONV produces two further spaces: PERMLSA and CONVLSA

Page 38: Building WordSpaces via Random Indexing from simple to complex spaces

Vector permutation in RI (PERM)

Using permutation of elements in context vectors to encode dependencies

– right rotation by n elements to encode dependents (permutation)

– left rotation by n elements to encode heads (inverse permutation)

Page 39: Building WordSpaces via Random Indexing from simple to complex spaces

PERM (method)

Create and assign a context vector to each term

Assign a rotation function Π+1 to the dependent and Π-1 to the head

Each term is represented by a vector which is

– the sum of the permuted vectors of all its dependent terms

– the sum of the inverse-permuted vectors of all its head terms

– the sum of the non-permuted vectors of both dependent and head words

Page 40: Building WordSpaces via Random Indexing from simple to complex spaces

PERM (example…)

John eats a red apple

John  -> (0, 0, 0, 0, 0, 0, 1, 0, -1, 0)
eat   -> (1, 0, 0, 0, -1, 0, 0, 0, 0, 0)
red   -> (0, 0, 0, 1, 0, 0, 0, -1, 0, 0)
apple -> (1, 0, 0, 0, 0, 0, 0, -1, 0, 0)

[Dependency arcs: eats → apple, apple → red]

TVapple = Π+1(CVred) + Π-1(CVeat) + CVred + CVeat

Page 41: Building WordSpaces via Random Indexing from simple to complex spaces

PERM (…example)

John eats a red apple

John  -> (0, 0, 0, 0, 0, 0, 1, 0, -1, 0)
eat   -> (1, 0, 0, 0, -1, 0, 0, 0, 0, 0)
red   -> (0, 0, 0, 1, 0, 0, 0, -1, 0, 0)
apple -> (1, 0, 0, 0, 0, 0, 0, -1, 0, 0)

TVapple = Π+1(CVred) + Π-1(CVeat) + CVred + CVeat =
= (0, 0, 0, 0, 1, 0, 0, 0, -1, 0) + (0, 0, 0, -1, 0, 0, 0, 0, 0, 1) +
+ (0, 0, 0, 1, 0, 0, 0, -1, 0, 0) + (1, 0, 0, 0, -1, 0, 0, 0, 0, 0)

(Π+1 is a right shift, Π-1 a left shift)

Page 42: Building WordSpaces via Random Indexing from simple to complex spaces

Convolution (CONV)

Create and assign a context vector to each term

Create two context vectors for head and dependent roles

Each term is represented by a vector which is

– the sum of the convolutions between its dependent terms and the dependent role vector

– the sum of the convolutions between its head terms and the head role vector

– the sum of the vectors of both dependent and head words

Page 43: Building WordSpaces via Random Indexing from simple to complex spaces

Circular convolution operator

Circular convolution: p = u ⊛ v, with p_k = Σ_i u_i · v_((k−i) mod n)

U = <1, 1, -1, -1, 1>
V = <1, -1, 1, -1, -1>

Product table (cell (i, j) = U_j · V_i):

      U0  U1  U2  U3  U4
V0     1   1  -1  -1   1
V1    -1  -1   1   1  -1
V2     1   1  -1  -1   1
V3    -1  -1   1   1  -1
V4    -1  -1   1   1  -1

Each component P_k of the result is the sum of the wrapped anti-diagonal with i + j ≡ k (mod 5):

P = <-1, 3, -1, -1, -1>

Page 44: Building WordSpaces via Random Indexing from simple to complex spaces

Circular convolution by FFTs

Circular convolution is computed in O(n^2)

– using FFTs it can be computed in O(n log n)

Given f, the discrete Fourier transform, and f^-1 its inverse

– u ⊛ v = f^-1( f(u) ∘ f(v) )
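A sketch checking both routes on the U and V of the previous slide (the direct O(n^2) definition and the FFT one; np.real strips the negligible imaginary part left by floating point):

```python
import numpy as np

def circular_convolution(u, v):
    """Direct O(n^2) definition: p_k = sum_i u_i * v_{(k-i) mod n}."""
    n = len(u)
    return np.array([sum(u[i] * v[(k - i) % n] for i in range(n)) for k in range(n)])

def circular_convolution_fft(u, v):
    """Same operation in O(n log n) via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(u) * np.fft.fft(v)))

U = np.array([1, 1, -1, -1, 1])
V = np.array([1, -1, 1, -1, -1])

print(circular_convolution(U, V))                # [-1  3 -1 -1 -1]
print(np.round(circular_convolution_fft(U, V)))  # same result
```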

Page 45: Building WordSpaces via Random Indexing from simple to complex spaces

CONV (example)

John eats a red apple

John  -> (0, 0, 0, 0, 0, 0, 1, 0, -1, 0)
eat   -> (1, 0, 0, 0, -1, 0, 0, 0, 0, 0)
red   -> (0, 0, 0, 1, 0, 0, 0, -1, 0, 0)
apple -> (1, 0, 0, 0, 0, 0, 0, -1, 0, 0)
rd    -> (0, 0, 1, 0, -1, 0, 0, 0, 0, 0)   (context vector for the dependent role)
rh    -> (0, -1, 1, 0, 0, 0, 0, 0, 0, 0)   (context vector for the head role)

apple = eat + red + (rd ⊛ red) + (rh ⊛ eat)

Page 46: Building WordSpaces via Random Indexing from simple to complex spaces

Complex operators

Based on filler/role binding taking into account syntactic role: rd ⊚ u + rh ⊚ v

– u and v could be recursive structures

Two vector operators to bind the role:

– convolution (⊛)

– tensor product (⊗)

A variant, convolution plus sum (⊛+), also adds the plain term vectors:

rd ⊛ u + rh ⊛ v + u + v

Page 47: Building WordSpaces via Random Indexing from simple to complex spaces

Complex operators (remarks)

Existing operators

– t1 ⊚ t2 ⊚ … ⊚ tn: does not take into account syntactic role

– t1 ⊛ t2 is commutative

– t1 ⊗ t2 ⊗ … ⊗ tn: the tensor order depends on the phrase length

• two phrases of different lengths are not comparable

– t1 ⊗ r1 ⊗ t2 ⊗ r2 ⊗ … ⊗ tn ⊗ rn: also depends on the sentence length

Page 48: Building WordSpaces via Random Indexing from simple to complex spaces

System setup

• Corpus

– WaCkypedia EN based on a 2009 dump of Wikipedia

– about 800 million tokens

– dependency-parsed with MaltParser

• DSMs

– 500-dimensional vectors (LSA/RI/RILSA)

– 1,000-dimensional vectors (PERM/CONV/PERMLSA/CONVLSA)

– 50,000 most frequent words

– co-occurrence distance: 4

Page 49: Building WordSpaces via Random Indexing from simple to complex spaces

Evaluation

• GEMS 2011 Shared Task for compositional semantics

– a list of pairs of word combinations, e.g.:

(support offer) (help provide) 7
(old person) (right hand) 1

– rated by humans (5,833 ratings)

– 3 types of combination: noun-noun (NN), adjective-noun (AN), verb-object (VO)

• GOAL: compare the system scores against the human scores

– Spearman correlation
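A sketch of the evaluation metric with scipy (the scores below are made up, just to show the call):

```python
from scipy.stats import spearmanr

# hypothetical human ratings and system similarity scores for a few pairs
human_scores  = [7, 1, 5, 6, 2]
system_scores = [0.61, 0.12, 0.44, 0.58, 0.20]

rho, p_value = spearmanr(human_scores, system_scores)
print(rho)   # rank correlation between system and human judgements
```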

Page 50: Building WordSpaces via Random Indexing from simple to complex spaces

Results (simple spaces)

Simple Semantic Spaces (Spearman correlation)

            NN                        AN                        VO
       TTM  LSA   RI  RILSA    TTM  LSA   RI  RILSA    TTM  LSA   RI  RILSA
+      .21  .36  .25  .42      .22  .35  .33  .41      .23  .31  .28  .31
∘      .31  .15  .23  .22      .21  .20  .22  .18      .13  .10  .18  .21
⊛      .21  .38  .26  .35      .20  .33  .31  .44      .15  .31  .24  .34
⊛+     .21  .34  .28  .43      .23  .32  .31  .37      .20  .31  .25  .29
⊗      .21  .38  .25  .39      .22  .38  .33  .43      .15  .34  .26  .32
human       .49                     .52                     .55

Page 51: Building WordSpaces via Random Indexing from simple to complex spaces

…Results (structured spaces)

Structured Semantic Spaces (Spearman correlation)

             NN                                AN                                VO
       CONV  PERM  CONVLSA  PERMLSA     CONV  PERM  CONVLSA  PERMLSA     CONV  PERM  CONVLSA  PERMLSA
+       .36   .39    .43      .42        .34   .39    .42      .45        .27   .23    .30      .31
∘       .22   .17    .10      .13        .23   .27    .13      .15        .20   .15    .06      .14
⊛       .31   .36    .37      .35        .39   .39    .45      .44        .28   .23    .27      .28
⊛+      .30   .36    .40      .36        .38   .32    .48      .44        .27   .22    .30      .32
⊗       .34   .37    .37      .40        .36   .40    .45      .45        .27   .24    .31      .32
human         .49                              .52                              .55

Page 52: Building WordSpaces via Random Indexing from simple to complex spaces

Final remarks

• Best results are obtained when complex operators/spaces (or both) are involved

• No single best combination of operator/space exists

– it depends on the type of relation (NN, AN, VO)

• Tensor product and convolution provide good results, in spite of previous results

– filler/role binding is effective

• Future work

– generate distinct rd and rh vectors for each kind of dependency

– apply this approach to other directed graph-based representations

Page 53: Building WordSpaces via Random Indexing from simple to complex spaces

Thank you for your attention

<[email protected]>