Attention and Lexicon Regularized LSTM for Aspect-based Sentiment Analysis

Lingxian Bao, Patrik Lambert, Toni Badia


Abstract
Attention-based deep learning systems have been demonstrated to be the state-of-the-art approach for aspect-level sentiment analysis. However, end-to-end deep neural networks lack flexibility: one cannot easily adjust the network to fix an obvious problem, especially when more training data is not available — e.g. when it always predicts positive upon seeing the word disappointed. Meanwhile, it is less often noted that the attention mechanism is likely to “over-focus” on particular parts of a sentence while ignoring positions that provide key information for judging the polarity. In this paper, we describe a simple yet effective approach that leverages lexicon information so that the model becomes more flexible and robust. We also explore the effect of regularizing attention vectors to allow the network a broader “focus” on different parts of the sentence. The experimental results demonstrate the effectiveness of our approach.
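The idea in the abstract — augmenting the classification loss with a lexicon term and an attention-regularization term — can be sketched as follows. This is a minimal illustration, not the paper's published formulation: the exact loss form, the KL direction for the lexicon prior, the entropy penalty on attention, and the weights `lam_lex` and `lam_att` are all assumptions made for the sketch.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def regularized_loss(logits, label, lexicon_prior, attention,
                     lam_lex=0.1, lam_att=0.1):
    """Cross-entropy plus two hedged regularizers:
    - a lexicon term pulling the predicted distribution toward a
      lexicon-derived prior (e.g. 'disappointed' -> negative), so the
      model can be corrected without extra training data;
    - an attention term penalizing low-entropy ("over-focused")
      attention, encouraging a broader focus over the sentence.
    """
    probs = softmax(logits)
    ce = -np.log(probs[label] + 1e-12)               # standard cross-entropy
    # KL(prior || probs): small when prediction agrees with the lexicon
    lex = np.sum(lexicon_prior *
                 np.log((lexicon_prior + 1e-12) / (probs + 1e-12)))
    # negative entropy of attention weights: largest when attention is peaked
    att = np.sum(attention * np.log(attention + 1e-12))
    return ce + lam_lex * lex + lam_att * att
```

Under this sketch, a peaked attention vector incurs a higher loss than a uniform one over the same sentence, which is the "broader focus" effect the abstract describes.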
Anthology ID:
P19-2035
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Fernando Alva-Manchego, Eunsol Choi, Daniel Khashabi
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
253–259
URL:
https://aclanthology.org/P19-2035
DOI:
10.18653/v1/P19-2035
Cite (ACL):
Lingxian Bao, Patrik Lambert, and Toni Badia. 2019. Attention and Lexicon Regularized LSTM for Aspect-based Sentiment Analysis. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, pages 253–259, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Attention and Lexicon Regularized LSTM for Aspect-based Sentiment Analysis (Bao et al., ACL 2019)
PDF:
https://aclanthology.org/P19-2035.pdf
Code
 LingxB/atlx
Data
SemEval-2014 Task-4