Syntax-Enhanced Pretrained Language Models for Aspect-Level Sentiment Classification

Li Yuan, Jin Wang*, Lung Hao Lee*, Xuejie Zhang

*Corresponding author for this work

Research output: Article › peer-review

Abstract

The main challenge of aspect-level sentiment classification (ASC) is associating target aspect terms with relevant contextual words. Existing methods improve ASC performance by incorporating syntactic dependencies through a graph convolution layer on top of BERT. However, these approaches often assign a fixed weight to edges with the same dependency type, overlooking the contextual nuances these dependencies can convey. To address this, we propose syntax-enhanced BERT (SE-BERT), which integrates syntactic distance embeddings, a syntax-enhanced transformer, aspect-specific masking, and a sentiment classification layer. SE-BERT advances previous methods in two main ways. First, SE-BERT weights edges with the same dependency type dynamically, conditioned on the source node's part-of-speech (POS) tag, which improves graph propagation and yields more precise edge associations. Second, rather than stacking additional graph convolutional layers on top of BERT, SE-BERT replaces BERT's final Transformer layers with the proposed SE-Transformer, which directly encodes syntactic information into the attention distribution and word representations within scaled dot-product attention. The model can be initialized from a pretrained checkpoint and fine-tuned for downstream tasks without requiring additional training parameters. Experimental results on five benchmark datasets demonstrate that SE-BERT outperforms existing ASC methods.
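
To make the idea of injecting syntax into scaled dot-product attention concrete, the following is a minimal sketch (not the authors' SE-Transformer) of a single-head attention layer whose logits are biased by a learned embedding of pairwise syntactic distances, so that dependency structure reshapes the attention distribution. All names here (SyntaxBiasedAttention, max_distance, syn_dist) are hypothetical illustrations, not identifiers from the paper.

```python
# Minimal sketch, assuming PyTorch and a precomputed matrix of dependency-tree
# hop counts between tokens; this is an illustration of syntax-biased
# attention, not the paper's exact SE-Transformer.
import math
import torch
import torch.nn as nn


class SyntaxBiasedAttention(nn.Module):
    """Single-head scaled dot-product attention with a syntactic-distance bias."""

    def __init__(self, hidden_size: int, max_distance: int = 16):
        super().__init__()
        self.query = nn.Linear(hidden_size, hidden_size)
        self.key = nn.Linear(hidden_size, hidden_size)
        self.value = nn.Linear(hidden_size, hidden_size)
        # One learned scalar bias per (clipped) syntactic distance.
        self.distance_bias = nn.Embedding(max_distance + 1, 1)
        self.max_distance = max_distance
        self.scale = math.sqrt(hidden_size)

    def forward(self, hidden_states: torch.Tensor, syn_dist: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq_len, hidden_size)
        # syn_dist: (batch, seq_len, seq_len) hop counts in the dependency tree
        q = self.query(hidden_states)
        k = self.key(hidden_states)
        v = self.value(hidden_states)
        logits = torch.matmul(q, k.transpose(-1, -2)) / self.scale
        # Add a syntax-dependent bias before softmax so nearby words in the
        # dependency tree can receive higher attention weight.
        bias = self.distance_bias(syn_dist.clamp(max=self.max_distance)).squeeze(-1)
        attn = torch.softmax(logits + bias, dim=-1)
        return torch.matmul(attn, v)


if __name__ == "__main__":
    batch, seq_len, hidden = 2, 6, 32
    layer = SyntaxBiasedAttention(hidden)
    states = torch.randn(batch, seq_len, hidden)
    dist = torch.randint(0, 5, (batch, seq_len, seq_len))
    print(layer(states, dist).shape)  # torch.Size([2, 6, 32])
```

Because the bias is added inside the existing attention computation, a layer of this form can in principle be initialized from a pretrained checkpoint, which is the design motivation the abstract describes.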

Original language: English
Journal: IEEE Transactions on Computational Social Systems
DOIs
Publication status: Accepted/In press - 2024
