Exploiting BERT for End-to-End Aspect-based Sentiment Analysis

Xin Li, Lidong Bing, Wenxuan Zhang, Wai Lam


Abstract
In this paper, we investigate the modeling power of contextualized embeddings from pre-trained language models, e.g., BERT, on the end-to-end aspect-based sentiment analysis (E2E-ABSA) task. Specifically, we build a series of simple yet insightful neural baselines for E2E-ABSA. The experimental results show that, even with a simple linear classification layer, our BERT-based architecture can outperform state-of-the-art works. We also standardize the comparative study by consistently using a hold-out validation dataset for model selection, a practice largely ignored by previous works. Our work can therefore serve as a BERT-based benchmark for E2E-ABSA.
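The architecture the abstract describes is a token-level sequence tagger: BERT produces a contextualized embedding for each token, and a single linear layer maps that embedding to a unified tag that jointly encodes aspect boundaries and sentiment (e.g. B-POS, I-NEG, O). Below is a minimal PyTorch sketch of such a baseline. It is not the authors' released implementation (see the code repository linked at the bottom of this page); the tag set, model name, dropout rate, and example sentence are illustrative assumptions.

# Minimal sketch (assumptions noted above), not the paper's exact code:
# BERT contextualized embeddings + one linear classification layer that
# predicts a unified E2E-ABSA tag per token.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

# Hypothetical unified tag set: aspect boundary (B/I/O) fused with sentiment.
TAGS = ["O", "B-POS", "I-POS", "B-NEG", "I-NEG", "B-NEU", "I-NEU"]

class BertLinearTagger(nn.Module):
    def __init__(self, num_tags: int, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        self.dropout = nn.Dropout(0.1)
        # The "simple linear classification layer" from the abstract.
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_tags)

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        hidden = self.dropout(outputs.last_hidden_state)  # (batch, seq, hidden)
        return self.classifier(hidden)                    # (batch, seq, num_tags)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertLinearTagger(num_tags=len(TAGS))
enc = tokenizer("The pizza was great but the service was slow.",
                return_tensors="pt")
logits = model(enc["input_ids"], enc["attention_mask"])
pred_tags = logits.argmax(dim=-1)  # per-token tag indices into TAGS

The point of the sketch is the one the abstract makes: once BERT supplies the token representations, a bare linear head (no CRF, no task-specific decoder) is already a strong E2E-ABSA baseline.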
Anthology ID: D19-5505
Volume: Proceedings of the 5th Workshop on Noisy User-generated Text (W-NUT 2019)
Month: November
Year: 2019
Address: Hong Kong, China
Editors: Wei Xu, Alan Ritter, Tim Baldwin, Afshin Rahimi
Venue: WNUT
Publisher: Association for Computational Linguistics
Pages: 34–41
URL: https://aclanthology.org/D19-5505
DOI: 10.18653/v1/D19-5505
Cite (ACL):
Xin Li, Lidong Bing, Wenxuan Zhang, and Wai Lam. 2019. Exploiting BERT for End-to-End Aspect-based Sentiment Analysis. In Proceedings of the 5th Workshop on Noisy User-generated Text (W-NUT 2019), pages 34–41, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Exploiting BERT for End-to-End Aspect-based Sentiment Analysis (Li et al., WNUT 2019)
PDF: https://aclanthology.org/D19-5505.pdf
Code: lixin4ever/BERT-E2E-ABSA (https://github.com/lixin4ever/BERT-E2E-ABSA)