Aspect-Specific Polarity-Aware Summarization of Online Reviews

Transcript
Page 1: Aspect-Specific Polarity-Aware Summarization of Online Reviews

1Aspect-Specific Polarity-Aware

Summarization of Online Reviews

Gaoyan Ou ([email protected])

School of EECS, Peking University

Page 2: Aspect-Specific Polarity-Aware Summarization of Online Reviews

2 Outline

Motivation

Related work

The proposed APSM and ME-APSM model

Experiments and results

Conclusion

Page 3: Aspect-Specific Polarity-Aware Summarization of Online Reviews

3 Outline

Motivation

Related work

The proposed APSM and ME-APSM model

Experiments and results

Conclusion

Page 4: Aspect-Specific Polarity-Aware Summarization of Online Reviews

4 Motivation

A large number of online reviews contain people's opinions

However, there are too many reviews to read!

Techniques to discover and summarize aspects and sentiments from online reviews are needed

It is still a challenging task: manual analysis is useless because of the huge number of reviews, and the reviews are composed of unstructured text

Page 5: Aspect-Specific Polarity-Aware Summarization of Online Reviews

5 Problem Setup

[Figure] Example input and output. Input: reviews such as Review 1 (Michelle K, Busan, South Korea: "...Hilton Wangfujing made my stay in Beijing perfect! The location of the hotel is great. ... The room was large, luxurious and very comfortable...") through Review n. Output 1, aspect & sentiment extraction: e.g. Aspect 1 (room) with aspect words (bathroom, towels, bed, shower, ...), pos words (large, clean, safe, comfortable, ...) and neg words (dirty, small, uncomfortable, noise, ...); Aspect 2 (meal) with aspect words (breakfast, fruit, eggs, juice, ...), pos words (good, fresh, delicious, wonderful, ...) and neg words (cold, awful, terrible, poor, ...). Output 2, sentiment classification: overall sentiment plus aspect-specific sentiment (room, meal, staff) for each review.

Aspect and sentiment extraction

Aspect extraction

Aspect-specific sentiment extraction

Sentiment classification

Classify the overall review as positive or negative

Key advantage:

Figure out how sentiments are expressed according to different polarities for a particular aspect.

Page 6: Aspect-Specific Polarity-Aware Summarization of Online Reviews

6 Outline

Motivation

Related work

The proposed APSM and ME-APSM model

Experiments and results

Conclusion

Page 7: Aspect-Specific Polarity-Aware Summarization of Online Reviews

7 Related work

Aspect-based sentiment analysis: identify the aspects that have been evaluated (aspect extraction) and predict the sentiment for each extracted aspect (sentiment extraction)

Frequency-based methods (Hu et al. 2004; Popescu et al. 2005)

Use frequent pattern mining and a dependency parser to find frequent noun terms and the opinions cast on them.

Limitation: produce many non-aspects that match the patterns

Sequential labeling techniques (Jin et al. 2009; Jakob 2010; Choi and Cardie 2010)

Employ POS and lexical features on labeled data sets to train a CRF or HMM model

Limitation: Need manually labeled data for training.

LDA-based methods (ME-LDA (Zhao et al 2010), ME-SAS (Mukherjee and Liu 2012), ASUM (Jo and Oh 2011))

Unsupervised, can extract aspects and sentiments simultaneously.

Limitation: cannot extract polarity-aware sentiments for each aspect.

Page 8: Aspect-Specific Polarity-Aware Summarization of Online Reviews

8 Outline

Motivation

Related work

The proposed APSM and ME-APSM model

Experiments and results

Conclusion

Page 9: Aspect-Specific Polarity-Aware Summarization of Online Reviews

9 The Proposed Models

Two LDA-based aspect and sentiment models:

Aspect-specific Polarity-aware Sentiment Model (APSM)

Improved version of APSM (ME-APSM), which uses a maximum entropy component to better distinguish aspect words from sentiment words

Model inference

Integrate sentiment and aspect prior information via asymmetric Dirichlet priors

Page 10: Aspect-Specific Polarity-Aware Summarization of Online Reviews

10 APSM model

Generative process:

1. For each aspect $k = 1, \ldots, K$:
   i. Draw $\varphi^A_k \sim \mathrm{Dir}(\beta^A)$
   ii. For each sentiment $m = 1, \ldots, M$:
      Draw $\varphi^O_{k,m} \sim \mathrm{Dir}(\beta^O)$
2. For each review $d = 1, \ldots, D$:
   i. Draw $\theta_d \sim \mathrm{Dir}(\alpha)$
   ii. For each aspect $k$:
      Draw $\psi_{d,k} \sim \mathrm{Dir}(\gamma)$
   iii. For each sentence $s = 1, \ldots, S_d$:
      (a) Draw $z_{d,s} \sim \mathrm{Mult}(\theta_d)$
      (b) Draw $l_{d,s} \sim \mathrm{Mult}(\psi_{d,z_{d,s}})$
      (c) Draw $\pi_{d,s} \sim \mathrm{Dir}(\delta)$
      (d) For each term $w_{d,s,n}$, where $n = 1, \ldots, N_{d,s}$:
         I. Draw $r_{d,s,n} \sim \mathrm{Mult}(\pi_{d,s})$
         II. If $r_{d,s,n} = A$, draw $w_{d,s,n} \sim \mathrm{Mult}(\varphi^A_{z_{d,s}})$;
             else draw $w_{d,s,n} \sim \mathrm{Mult}(\varphi^O_{z_{d,s}, l_{d,s}})$

[Figure] Graphical representation of the APSM model: latent variables $z$, $l$, $r$, $w$; distributions $\theta$, $\psi$, $\pi$, $\varphi^A$, $\varphi^O$; priors $\alpha$, $\gamma$, $\delta$, $\beta^A$, $\beta^O$; plates $D$, $S_d$, $N_{d,s}$, $K$, $K \times M$.
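To make the generative story concrete, here is a minimal simulation sketch in Python. It follows the plate structure above under standard LDA-style assumptions (Dirichlet priors, multinomial draws); the sizes and hyperparameter values are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

K, M, V = 5, 2, 1000          # aspects, sentiment polarities, vocabulary size (illustrative)
D, S, N = 3, 4, 8             # reviews, sentences per review, terms per sentence (illustrative)
alpha, gamma = 0.1, 0.1       # hypothetical hyperparameter values
delta = np.array([1.0, 1.0])
beta_A, beta_O = 0.01, 0.01

# 1. Aspect word distributions phi^A_k and per-polarity sentiment distributions phi^O_{k,m}
phi_A = rng.dirichlet(np.full(V, beta_A), size=K)            # K x V
phi_O = rng.dirichlet(np.full(V, beta_O), size=(K, M))       # K x M x V

reviews = []
for d in range(D):
    theta = rng.dirichlet(np.full(K, alpha))                 # 2.i  aspect distribution of review d
    psi = rng.dirichlet(np.full(M, gamma), size=K)           # 2.ii per-aspect sentiment distribution
    sentences = []
    for s in range(S):
        z = rng.choice(K, p=theta)                           # (a) sentence aspect
        l = rng.choice(M, p=psi[z])                          # (b) sentence sentiment polarity
        pi = rng.dirichlet(delta)                            # (c) aspect-vs-sentiment switch distribution
        words = []
        for n in range(N):
            r = rng.choice(2, p=pi)                          # I.  indicator: 0 = aspect word, 1 = sentiment word
            w = rng.choice(V, p=phi_A[z] if r == 0 else phi_O[z, l])   # II. emit word
            words.append(w)
        sentences.append(words)
    reviews.append(sentences)
```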

Page 11: Aspect-Specific Polarity-Aware Summarization of Online Reviews

11 Employing MaxEnt (ME-APSM model)

Observation: aspect and sentiment terms play different syntactic roles in a sentence.

Aspects: nouns / noun phrases (room, front desk, etc.)

Sentiments: adjectives / adverbs (extremely, awesome, etc.)

Approach: train a MaxEnt classifier to label each term as an aspect or a sentiment word

Training data: obtained automatically using a sentiment lexicon

Features: lexical features and POS features

[Figure] Graphical representation of the ME-APSM model: same variables and plates as APSM, with a MaxEnt component (feature vector $x$, weights $\lambda$) governing the aspect/sentiment indicator $r$.
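A minimal sketch of the MaxEnt component, assuming it is realized as a logistic-regression classifier over lexical and POS features (the slides do not give the exact feature templates; the tiny training set, feature names, and the use of scikit-learn are illustrative assumptions).

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

# Tiny illustrative training set: words auto-labelled via a sentiment lexicon
# (lexicon hits -> "sentiment", remaining nouns -> "aspect"); the real training
# data would come from POS-tagged review sentences.
train = [
    ({"word": "room", "pos": "NN", "prev_pos": "DT"}, "aspect"),
    ({"word": "staff", "pos": "NN", "prev_pos": "DT"}, "aspect"),
    ({"word": "breakfast", "pos": "NN", "prev_pos": "DT"}, "aspect"),
    ({"word": "friendly", "pos": "JJ", "prev_pos": "RB"}, "sentiment"),
    ({"word": "awesome", "pos": "JJ", "prev_pos": "VBD"}, "sentiment"),
    ({"word": "extremely", "pos": "RB", "prev_pos": "VBD"}, "sentiment"),
]

vec = DictVectorizer()
X = vec.fit_transform([feats for feats, _ in train])
y = [label for _, label in train]

# Logistic regression is the standard realization of a maximum entropy classifier
maxent = LogisticRegression(max_iter=1000)
maxent.fit(X, y)

# In ME-APSM the classifier's probabilities steer the aspect/sentiment indicator r
print(maxent.predict_proba(vec.transform([{"word": "clean", "pos": "JJ", "prev_pos": "RB"}])))
```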

Page 12: Aspect-Specific Polarity-Aware Summarization of Online Reviews

12 Model inference

We use collapsed Gibbs sampling to infer the model parameters.

The sampling formulas for $z$ and $l$ are shared by APSM and ME-APSM; the two models differ in the sampler of the indicator variable $r$.

APSM:

$$p(r_{d,s,n} = O(A) \mid \ldots) \propto \left( \delta_{O(A)} + n^{O(A)}_{d,s,\neg(d,s,n)} \right) \cdot \frac{\beta^{O(A)}_{k,m,v} + n^{O(A)}_{k,m,v,\neg(d,s,n)}}{\sum_{v} \left( \beta^{O(A)}_{k,m,v} + n^{O(A)}_{k,m,v,\neg(d,s,n)} \right)}$$

ME-APSM (the switch-prior factor is replaced by the MaxEnt term):

$$p(r_{d,s,n} = O(A) \mid \ldots) \propto \frac{\exp\left( \lambda_{O(A)} \cdot x_{d,s,n} \right)}{\sum_{r} \exp\left( \lambda_{r} \cdot x_{d,s,n} \right)} \cdot \frac{\beta^{O(A)}_{k,m,v} + n^{O(A)}_{k,m,v,\neg(d,s,n)}}{\sum_{v} \left( \beta^{O(A)}_{k,m,v} + n^{O(A)}_{k,m,v,\neg(d,s,n)} \right)}$$
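The sketch below shows how the sampling step for the indicator $r$ could look in code, assuming the count arrays described in the docstring are maintained by the Gibbs sampler (all names are illustrative; the two branches correspond to the APSM and ME-APSM factors above).

```python
import numpy as np

def sample_r(v, k, m, delta, beta_A, beta_O, n_ds, n_kv_A, n_kmv_O, maxent_prob=None):
    """Sample r (aspect word vs. sentiment word) for one word position, with that
    position's counts already excluded from all count arrays.
    v: word id, k: sentence aspect, m: sentence sentiment polarity.
    n_ds[r]: per-sentence counts of aspect/sentiment words (index 0 = aspect, 1 = sentiment).
    n_kv_A[k, v]: aspect-word counts; n_kmv_O[k, m, v]: sentiment-word counts.
    maxent_prob: optional MaxEnt probabilities over r (ME-APSM); if None, use the
    Dirichlet(delta) switch prior (APSM)."""
    # word-likelihood terms for r = aspect and r = sentiment
    like_A = (beta_A[k, v] + n_kv_A[k, v]) / (beta_A[k].sum() + n_kv_A[k].sum())
    like_O = (beta_O[k, m, v] + n_kmv_O[k, m, v]) / (beta_O[k, m].sum() + n_kmv_O[k, m].sum())
    if maxent_prob is None:
        prior = delta + n_ds          # APSM: switch prior from Dirichlet(delta)
    else:
        prior = maxent_prob           # ME-APSM: switch prior from the MaxEnt classifier
    p = prior * np.array([like_A, like_O])
    p /= p.sum()
    return np.random.choice(2, p=p)
```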

Page 13: Aspect-Specific Polarity-Aware Summarization of Online Reviews

13 Incorporating Prior Knowledge

π›½π‘˜ ,𝑣𝐴 ={ 0 i f π‘£βˆˆπ‘€

0.1 if π‘£βˆˆΞ©π‘˜

0.01others

π›½π‘˜ ,β€²π‘π‘œ 𝑠′ ,𝑣

𝑂 ={ 0 i f π‘£βˆˆπ‘€π‘›

0.1𝑖𝑓 𝑣 βˆˆπ‘€π‘

0.2 if π‘£βˆ‰π‘€π‘βˆ§π‘£βˆˆΞ©π‘˜ , β€²π‘π‘œπ‘ β€²

0.01others

we expect that no negative word appears in each aspect’s positive sentiment model

positive word will be more likely to appear in each aspect’s positive model

sentiment seeds will get higher prior weights

words in the aspect seed list will get higher prior weights

sentiment words should unlikely appear in aspect model

Sentiment Prior

Aspect Prior

Asymmetric Dirichlet prior
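A minimal sketch of how these asymmetric priors could be assembled as matrices, assuming word ids, the lexicon sets M_p and M_n, and per-aspect seed sets Ω are available; the prior weights are the ones on the slide, everything else is illustrative.

```python
import numpy as np

def build_priors(V, K, pos_lex, neg_lex, aspect_seeds, pos_seeds):
    """pos_lex / neg_lex: sets of word ids from the sentiment lexicon (M_p, M_n);
    aspect_seeds[k] / pos_seeds[k]: per-aspect seed word-id sets (Omega_k, Omega_{k,pos})."""
    sent_lex = pos_lex | neg_lex

    beta_A = np.full((K, V), 0.01)            # aspect prior beta^A
    beta_O_pos = np.full((K, V), 0.01)        # sentiment prior beta^O for the 'pos' polarity
    for k in range(K):
        for v in range(V):
            if v in sent_lex:                 # sentiment words unlikely in the aspect model
                beta_A[k, v] = 0.0
            elif v in aspect_seeds[k]:        # aspect seeds get a higher prior weight
                beta_A[k, v] = 0.1

            if v in neg_lex:                  # no negative word in the positive sentiment model
                beta_O_pos[k, v] = 0.0
            elif v in pos_lex:                # positive lexicon words more likely
                beta_O_pos[k, v] = 0.1
            elif v in pos_seeds[k]:           # positive sentiment seeds get the highest weight
                beta_O_pos[k, v] = 0.2
    return beta_A, beta_O_pos
```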

Page 14: Aspect-Specific Polarity-Aware Summarization of Online Reviews

14 Outline

Motivation

Related work

The proposed APSM and ME-APSM model

Experiments and results

Conclusion

Page 15: Aspect-Specific Polarity-Aware Summarization of Online Reviews

15 Experimental setup

Datasets

TripAdvisor (2500+/2500-), Amazon product reviews (2000+/2000-)

Sentiment lexicon [Hu and Liu 2004]

2006 positive words and 4783 negative words

MaxEnt training

We randomly select 2000 sentences from both datasets

Then we use the sentiment lexicon to label the words as sentiment or aspect words

The Stanford POS Tagger is used to tag the reviews

Parameter settings

K = 30, M = 2; the remaining hyperparameters are set following [Mukherjee and Liu 2012]

Page 16: Aspect-Specific Polarity-Aware Summarization of Online Reviews

16 Qualitative Results

Staff
  APSM     Aspect:   staff, helpful, friendly, english, desk, front, good, extremely
           Senti(p): staff, friendly, courteous, helpful, attentive, clean, great, recommend
           Senti(n): unhelpful, poor, bad, noise, cold, problem, overpriced, disappointed
  ME-APSM  Aspect:   staff, helpful, friendly, english, desk, extremely, waiter, waitress
           Senti(p): good, great, helpful, friendly, excellent, wonderful, staff, clean
           Senti(n): rude, unfriendly, unhelpful, noise, poor, disappointed, cheap, hard

Meal
  APSM     Aspect:   breakfast, coffee, buffet, room, fruit, eggs, fresh, included
           Senti(p): breakfast, friendly, fresh, variety, good, great, delicious, nice
           Senti(n): cold, scrambled, problem, hard, bad, expensive, poor, die
  ME-APSM  Aspect:   breakfast, coffee, fruit, buffet, eggs, cheese, cereal, juice
           Senti(p): good, great, fresh, hot, wonderful, excellent, nice, fantastic
           Senti(n): cold, scrambled, awful, limited, terrible, bad, poor, disappointed

Example Aspects and Sentiments Extracted by APSM and ME-APSM

Both APSM and ME-APSM can extract coherent aspects and aspect-specific sentiments well.

β€œbreakfast”, β€œcoffee”, β€œbuffet”, β€œfruit” and β€œeggs” are all words related to the aspect meal.

In general, ME-APSM performs better than APSM.

APSM incorrectly identifies the aspect word β€œstaff” as a positive sentiment word.

ME-APSM can discover more specific negative sentiment words, such as β€œrude” and β€œunfriendly”.

Page 17: Aspect-Specific Polarity-Aware Summarization of Online Reviews

17 Aspect-Specific Sentiment Extraction

Aspect/Sentiment   ME-LDA               APSM                 ME-APSM
                   P@5   P@10  P@20     P@5   P@10  P@20     P@5   P@10  P@20
Staff/Pos          1.00  0.90  0.65     0.80  0.70  0.70     1.00  0.80  0.80
Staff/Neg          0.40  0.60  0.35     0.80  0.50  0.35     0.80  0.40  0.30
Room/Pos           0.80  0.60  0.70     1.00  0.90  0.80     1.00  0.80  0.75
Room/Neg           0.60  0.30  0.25     0.40  0.50  0.30     0.80  0.50  0.40
Meal/Pos           0.80  0.80  0.70     0.80  0.80  0.85     1.00  0.80  0.85
Meal/Neg           0.20  0.30  0.30     0.40  0.30  0.35     0.60  0.40  0.30
Avg./Pos           0.87  0.77  0.68     0.87  0.80  0.78     1.00  0.80  0.80
Avg./Neg           0.40  0.40  0.30     0.53  0.43  0.35     0.73  0.43  0.33

Aspect-specific Sentiment Extraction Performance

We use P@n as the metric to compare ME-LDA, APSM and ME-APSM (see the sketch after these bullets).

APSM and ME-APSM give better results than ME-LDA.

ME-APSM further outperforms APSM, which suggests the effectiveness of the MaxEnt component.
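For reference, P@n is the fraction of the top-n ranked sentiment words that are judged correct; a minimal helper, with the gold set assumed to come from manual judgment and the example words purely illustrative.

```python
def precision_at_n(ranked_words, gold_set, n):
    """Fraction of the top-n ranked words that are judged correct."""
    top = ranked_words[:n]
    return sum(w in gold_set for w in top) / n

# Example: 4 of the top 5 extracted positive words for 'meal' are correct -> P@5 = 0.8
ranked = ["good", "great", "fresh", "room", "delicious"]
gold = {"good", "great", "fresh", "delicious", "hot"}
print(precision_at_n(ranked, gold, 5))   # 0.8
```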

Page 18: Aspect-Specific Polarity-Aware Summarization of Online Reviews

18 Sentiment Classification

Method                       Hotel Data Set    Product Data Set
Lexicon-based Method         62.7%             60.2%
ASUM                         65.6%             64.5%
APSM                         69.7%             66.5%
ME-APSM                      72.9%             69.2%
APSM+                        70.3%             66.9%
ME-APSM+                     73.9%             70.1%
Supervised Classification    74.3%             70.7%

Sentiment Classification Accuracy

Lexicon-based Method: counting the positive and negative words in the review (a minimal sketch follows this list)

Supervised Classification (Denecke 2009): logistic regression

ASUM (Jo and Oh 2011)

APSM+: APSM with aspect and sentiment seeds

ME-APSM+: ME-APSM with aspect and sentiment seeds
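A minimal sketch of the lexicon-based baseline as described above (count positive vs. negative lexicon words in the review); whitespace tokenization and the tie-breaking rule are assumptions.

```python
def lexicon_classify(review_text, pos_lex, neg_lex):
    """Classify a review by counting positive vs. negative lexicon words."""
    tokens = review_text.lower().split()
    pos = sum(t in pos_lex for t in tokens)
    neg = sum(t in neg_lex for t in tokens)
    return "positive" if pos >= neg else "negative"   # ties resolved as positive (assumption)

print(lexicon_classify("the room was clean but the breakfast was awful and cold",
                       pos_lex={"clean", "good", "great"},
                       neg_lex={"awful", "cold", "dirty"}))   # -> "negative"
```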

Lexicon-based method performs worst

It cannot capture the aspect information of the sentiment words.

APSM and ME-APSM give better results than ASUM.

Separating aspects and sentiments improves sentiment classification accuracy.

ME-APSM further outperforms APSM, which suggests the effectiveness of the MaxEnt component.

APSM+, ME-APSM+ > APSM, ME-APSM

Incorporating sentiment and aspect priors improves performance.

Page 19: Aspect-Specific Polarity-Aware Summarization of Online Reviews

19 Effect of aspect numbers

[Figure] Sentiment Classification Accuracy with Different Aspect Numbers

Sentiment classification performance increases as K increases

This trend is more evident on the product data set.

Page 20: Aspect-Specific Polarity-Aware Summarization of Online Reviews

20 Outline

Motivation

Related work

The proposed APSM and ME-APSM model

Experiments and results

Conclusion

Page 21: Aspect-Specific Polarity-Aware Summarization of Online Reviews

21 Conclusion

In this paper, we focus on the problem of simultaneous aspect and sentiment extraction and sentiment classification of online reviews.

We proposed APSM and ME-APSM to address the problem. Key advantage: they extract aspect-specific, polarity-aware sentiments

Incorporate sentiment and aspect prior information

In the future, we plan to apply our models to more sentiment analysis tasks, such as aspect-level sentiment classification

Page 22: Aspect-Specific Polarity-Aware Summarization of Online Reviews

22 Reference

Hu, M., Liu, B.: Mining and summarizing customer reviews. In: KDD. (2004) 168–177

Jo, Y., Oh, A.H.: Aspect and sentiment unification model for online review analysis. In: WSDM. (2011) 815–824

Popescu, A.M., Nguyen, B., Etzioni, O.: Opine: Extracting product features and opinions from reviews. In: HLT/EMNLP. (2005)

Zhao, W.X., Jiang, J., Yan, H., Li, X.: Jointly modeling aspects and opinions with a maxent-lda hybrid. In: EMNLP. (2010) 56–65

Mukherjee, A., Liu, B.: Aspect extraction through semi-supervised modeling. In: ACL. (2012) 339–348

Jin, W., Ho, H.H.: A novel lexicalized hmm-based learning framework for web opinion mining. In: ICML. (2009) 465–472

Jakob, N., Gurevych, I.: Extracting opinion targets in a single and cross-domain setting with conditional random fields. In: EMNLP. (2010) 1035–1045

Choi, Y., Cardie, C.: Hierarchical sequential learning for extracting opinions and their attributes. In: ACL (Short Papers). (2010) 269–274

Denecke, K.: Are sentiwordnet scores suited for multi-domain sentiment classification? In: ICDIM. (2009) 33–38

Page 23: Aspect-Specific Polarity-Aware Summarization of Online Reviews

Thank you. Any questions?

23