Predicting Chinese Phrase-Level Sentiment Intensity in Valence-Arousal Dimensions With Linguistic Dependency Features

Yu Chih Deng, Cheng Yu Tsai, Yih-Ru Wang, Sin-Horng Chen, Lung Hao Lee*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Phrase-level sentiment intensity prediction is difficult because linguistic modifiers (e.g., negators, degree adverbs, and modals) can shift the intensity or reverse the polarity of the words they modify. This study develops a graph-based Chinese parser based on the deep biaffine attention model to obtain dependency structures and relations. The obtained dependency features are then used in our proposed Weighted-sum Tree GRU network to predict phrase-level sentiment intensity in the valence-arousal dimensions. Dependency parsing results on the Sinica Treebank indicate that our graph-based model outperforms transition-based methods such as MLP and stack-LSTM, consistent with findings reported for English dependency parsing. Experimental results on the Chinese EmoBank indicate that our Weighted-sum Tree GRU network outperforms transformer-based neural networks such as BERT, ALBERT, XLNet, and ELECTRA, reflecting the effectiveness of linguistic dependencies in phrase-level sentiment intensity prediction tasks. In quantitative analysis, our proposed model also requires fewer parameters and less inference time, making it relatively lightweight and efficient.
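As context for the model described in the abstract, the sketch below illustrates one plausible reading of a weighted-sum Tree GRU: the hidden states of a head word's dependents are combined by a learned weighted sum before a standard GRU update at the head, and the root state is regressed to a (valence, arousal) pair. All layer sizes, the gating details, and the weighting scheme here are illustrative assumptions, not the authors' exact formulation.

```python
# Hedged sketch of a weighted-sum Tree-GRU over a dependency tree.
# Assumptions: 300-d word embeddings, a softmax-weighted sum over child states,
# and a vanilla GRUCell update at each head word.
import torch
import torch.nn as nn


class WeightedSumTreeGRUCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.score = nn.Linear(hidden_size, 1)           # scores each child hidden state
        self.cell = nn.GRUCell(input_size, hidden_size)  # standard GRU update at the head

    def forward(self, x: torch.Tensor, child_h: torch.Tensor) -> torch.Tensor:
        # x: (input_size,) embedding of the head token
        # child_h: (num_children, hidden_size) hidden states of its dependents
        if child_h.numel() == 0:
            h_children = torch.zeros(self.cell.hidden_size)
        else:
            weights = torch.softmax(self.score(child_h).squeeze(-1), dim=0)
            h_children = (weights.unsqueeze(-1) * child_h).sum(dim=0)  # weighted sum
        return self.cell(x.unsqueeze(0), h_children.unsqueeze(0)).squeeze(0)


class PhraseVARegressor(nn.Module):
    """Runs the Tree-GRU bottom-up over a dependency tree and predicts (valence, arousal)."""

    def __init__(self, input_size: int = 300, hidden_size: int = 150):
        super().__init__()
        self.tree_cell = WeightedSumTreeGRUCell(input_size, hidden_size)
        self.out = nn.Linear(hidden_size, 2)  # two real-valued intensity scores

    def encode(self, node: int, embeddings: torch.Tensor, children: dict) -> torch.Tensor:
        child_states = [self.encode(c, embeddings, children) for c in children.get(node, [])]
        child_h = (torch.stack(child_states) if child_states
                   else torch.empty(0, self.out.in_features))
        return self.tree_cell(embeddings[node], child_h)

    def forward(self, embeddings: torch.Tensor, children: dict, root: int) -> torch.Tensor:
        return self.out(self.encode(root, embeddings, children))


# Toy usage: a three-token phrase whose dependency tree has token 2 as head
# and tokens 0 and 1 (e.g., a negator and a degree adverb) as its dependents.
emb = torch.randn(3, 300)
model = PhraseVARegressor()
va = model(emb, children={2: [0, 1]}, root=2)
print(va)  # tensor of shape (2,): predicted valence and arousal
```

In this reading, the learned child weights let a negator or degree adverb contribute more or less to the head's state, which is one way a dependency-aware model can capture the intensity shifts and polarity reversals mentioned in the abstract.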

Original language: English
Pages (from-to): 126612-126620
Number of pages: 9
Journal: IEEE Access
Volume: 10
DOIs
State: Published - 2022

Keywords

  • Dependency parsing
  • Affective computing
  • Deep learning
  • Dimensional sentiment analysis
