Toward Transformer Fusions for Chinese Sentiment Intensity Prediction in Valence-Arousal Dimensions

Yu Chih Deng, Yih-Ru Wang, Sin-Horng Chen, Lung Hao Lee*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

4 Scopus citations

Abstract

BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based neural network built from encoder layers with a self-attention mechanism. In this study, we develop a Chinese word-level BERT to learn contextual language representations and propose a transformer fusion framework for Chinese sentiment intensity prediction in the valence-arousal dimensions. Experimental results on the Chinese EmoBank indicate that our transformer-based fusion model outperforms other neural-network-based, regression-based, and lexicon-based methods, reflecting the effectiveness of integrating semantic representations at different levels of linguistic granularity. The proposed transformer fusion framework is also simple and easy to fine-tune for different downstream tasks.
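The abstract does not include implementation details, so the following is only a rough sketch of the general idea: BERT token representations passed through an additional transformer encoder layer before a two-output regression head for valence and arousal. The model name `bert-base-chinese` is a stand-in for the authors' Chinese word-level BERT, and the single fusion layer, mean pooling, and linear head are hypothetical choices, not the paper's architecture.

```python
# Minimal sketch of BERT-based valence-arousal regression (illustrative only;
# assumes bert-base-chinese in place of the authors' Chinese word-level BERT).
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class VARegressor(nn.Module):
    def __init__(self, model_name="bert-base-chinese"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # Hypothetical "fusion": one extra transformer encoder layer applied
        # over BERT's token representations before pooling.
        fusion_layer = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=8, batch_first=True)
        self.fusion = nn.TransformerEncoder(fusion_layer, num_layers=1)
        # Regression head: two outputs, valence and arousal intensities.
        self.head = nn.Linear(hidden, 2)

    def forward(self, input_ids, attention_mask):
        hidden_states = self.bert(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        fused = self.fusion(
            hidden_states,
            src_key_padding_mask=(attention_mask == 0))
        pooled = fused.mean(dim=1)   # simple mean pooling over tokens
        return self.head(pooled)     # shape (batch, 2): [valence, arousal]

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = VARegressor()
batch = tokenizer(["這部電影非常好看"], return_tensors="pt", padding=True)
va = model(batch["input_ids"], batch["attention_mask"])
print(va)  # predicted valence-arousal pair (untrained weights)
```

In practice such a model would be trained with a regression loss (e.g., MSE) against human-annotated valence-arousal ratings such as those in the Chinese EmoBank.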

Original language: English
Pages (from-to): 109974-109982
Number of pages: 9
Journal: IEEE Access
Volume: 11
DOIs
State: Published - 2023

Keywords

  • affective computing
  • Chinese word-level BERT
  • dimensional sentiment analysis
  • pre-trained language models
  • Transformer fusion
