Enhanced Aspect Level Sentiment Classification with Auxiliary Memory

Peisong Zhu, Tieyun Qian


Abstract
In aspect level sentiment classification, there are two common tasks: to identify the sentiment of an aspect (category) or of a term. As specific instances of aspects, terms explicitly occur in sentences, so it is beneficial for models to focus on nearby context words. In contrast, as high-level semantic concepts of terms, aspects usually have more generalizable representations. However, conventional methods cannot exploit the information of aspects and terms at the same time, because few datasets are annotated with both. In this paper, we propose a novel deep memory network with an auxiliary memory to address this problem. In our model, a main memory captures the context words that are important for sentiment classification. In addition, we build an auxiliary memory that implicitly converts aspects and terms to each other, and feed both to the main memory. Through the interaction between the two memories, the features of aspects and terms can be learnt simultaneously. We compare our model with state-of-the-art methods on four datasets from different domains, and the experimental results demonstrate the effectiveness of our model.
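The abstract describes an attention-based memory architecture. As a rough illustration only (not the authors' actual model), the core operation of a memory network is an attention "hop": a query vector attends over a memory of vectors and returns a weighted summary. The sketch below chains a hypothetical auxiliary-memory hop (refining an aspect/term query) into a main-memory hop over context-word vectors; all names, dimensions, and the two-hop wiring are illustrative assumptions.

```python
import numpy as np

def attention_hop(memory, query):
    """One attention hop: softmax-weighted read of `memory` given `query`.

    memory: (n, d) matrix of stored vectors (e.g. context-word embeddings)
    query:  (d,) query vector (e.g. an aspect or term representation)
    Returns the (d,) attention-weighted summary of the memory.
    """
    scores = memory @ query                  # (n,) dot-product relevance
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ memory                  # weighted sum of memory rows

# Hypothetical two-memory interaction (illustrative, not the paper's model):
# the auxiliary memory maps the given aspect to a refined query, which
# then attends over the main memory of sentence words.
rng = np.random.default_rng(0)
d = 4
main_memory = rng.normal(size=(6, d))  # context-word vectors of a sentence
aux_memory = rng.normal(size=(3, d))   # aspect/term vectors
aspect = rng.normal(size=d)            # query for the aspect of interest

refined = attention_hop(aux_memory, aspect)    # auxiliary hop
summary = attention_hop(main_memory, refined)  # main hop
assert summary.shape == (d,)
```

In a full model, `summary` would feed a classifier layer, and the hops would typically be stacked and trained end to end.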
Anthology ID:
C18-1092
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
1077–1087
URL:
https://aclanthology.org/C18-1092
Cite (ACL):
Peisong Zhu and Tieyun Qian. 2018. Enhanced Aspect Level Sentiment Classification with Auxiliary Memory. In Proceedings of the 27th International Conference on Computational Linguistics, pages 1077–1087, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Enhanced Aspect Level Sentiment Classification with Auxiliary Memory (Zhu & Qian, COLING 2018)
PDF:
https://aclanthology.org/C18-1092.pdf