Abstract
BERT (Bidirectional Encoder Representations from Transformers) is a transformer-based neural network built from stacked encoder layers with self-attention. In this study, we develop a Chinese word-level BERT to learn contextual language representations and propose a transformer fusion framework for Chinese sentiment intensity prediction in the valence-arousal dimensions. Experimental results on the Chinese EmoBank indicate that our transformer-based fusion model outperforms other neural-network-based, regression-based, and lexicon-based methods, reflecting the effectiveness of integrating semantic representations at different levels of linguistic granularity. Our proposed transformer fusion framework is also simple and easy to fine-tune across different downstream tasks.
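The abstract names the fusion framework without giving its details. As a rough illustration only, below is a minimal PyTorch sketch of one plausible reading: word-level and character-level encoder outputs (here random tensors standing in for real BERT models) are concatenated along the sequence axis, fused by a small transformer encoder, and pooled into a two-dimensional valence-arousal regression. Every architectural choice in the sketch (layer counts, head counts, mean pooling, sequence concatenation) is an assumption, not the authors' implementation.

```python
# Hypothetical sketch of a transformer fusion head for valence-arousal (VA)
# regression over representations at two levels of linguistic granularity.
import torch
import torch.nn as nn


class TransformerFusionVA(nn.Module):
    def __init__(self, hidden_size: int = 768, num_layers: int = 2, num_heads: int = 8):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=hidden_size, nhead=num_heads, batch_first=True
        )
        self.fusion = nn.TransformerEncoder(layer, num_layers=num_layers)
        # Two outputs: valence and arousal intensity scores.
        self.regressor = nn.Linear(hidden_size, 2)

    def forward(self, word_repr: torch.Tensor, char_repr: torch.Tensor) -> torch.Tensor:
        # Concatenate word-level and character-level token representations
        # along the sequence axis so self-attention can fuse them.
        fused = self.fusion(torch.cat([word_repr, char_repr], dim=1))
        # Pool over the fused sequence and regress the VA scores.
        return self.regressor(fused.mean(dim=1))


# Toy usage: random tensors stand in for the last hidden states of a
# word-level and a character-level Chinese BERT.
word_repr = torch.randn(4, 16, 768)   # batch of 4, 16 word tokens
char_repr = torch.randn(4, 32, 768)   # batch of 4, 32 character tokens
va = TransformerFusionVA()(word_repr, char_repr)
print(va.shape)  # torch.Size([4, 2]) -> (valence, arousal) per sentence
```

In a real setup, the stand-in tensors would come from the pre-trained encoders themselves, and the mean pooling could be replaced by a [CLS]-style summary token; the sketch fixes neither choice.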
| Original language | English |
| --- | --- |
| Pages (from-to) | 109974-109982 |
| Number of pages | 9 |
| Journal | IEEE Access |
| Volume | 11 |
| DOIs | |
| State | Published - 2023 |
Keywords
- affective computing
- Chinese word-level BERT
- dimensional sentiment analysis
- pre-trained language models
- transformer fusion