EmotionX-DLC: Self-Attentive BiLSTM for Detecting Sequential Emotions in Dialogues

Linkai Luo, Haiqin Yang, Francis Y. L. Chin


Abstract
In this paper, we propose a self-attentive bidirectional long short-term memory (SA-BiLSTM) network to predict multiple emotions for the EmotionX challenge. The BiLSTM models word dependencies and extracts the features most relevant to emotion classification. Building on top of the BiLSTM, the self-attentive network models the contextual dependencies between utterances, which helps classify ambiguous emotions. We achieve unweighted accuracy scores of 59.6 and 55.0 on the Friends and EmotionPush test sets, respectively.
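The abstract describes the architecture only at a high level: a BiLSTM encodes each utterance and a self-attention layer relates the utterances of a dialogue to each other. Below is a minimal Python (PyTorch) sketch of that idea. The hyper-parameters, the max-pooling step, the dot-product attention form, and the label count are illustrative assumptions and are not taken from the paper.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SABiLSTM(nn.Module):
    """Simplified self-attentive BiLSTM for per-utterance emotion labels.

    Hyper-parameters and the exact attention formulation are illustrative
    assumptions, not the values or design used in the paper.
    """

    def __init__(self, vocab_size, num_emotions,
                 embed_dim=300, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # BiLSTM encodes the words of each utterance.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        d = 2 * hidden_dim
        # Dot-product self-attention across the utterances of a dialogue.
        self.query = nn.Linear(d, d)
        self.key = nn.Linear(d, d)
        self.value = nn.Linear(d, d)
        self.classifier = nn.Linear(d, num_emotions)

    def forward(self, dialogue):
        # dialogue: (num_utterances, max_words) token ids of one dialogue.
        states, _ = self.bilstm(self.embed(dialogue))     # (U, T, 2H)
        utt_vecs, _ = states.max(dim=1)                   # pool words -> (U, 2H)
        q, k, v = self.query(utt_vecs), self.key(utt_vecs), self.value(utt_vecs)
        attn = F.softmax(q @ k.t() / k.size(-1) ** 0.5, dim=-1)  # (U, U)
        context = attn @ v                                # context-aware utterances
        return self.classifier(context)                   # (U, num_emotions)

# Toy usage: one dialogue with 5 utterances of up to 12 tokens;
# 7 is an illustrative label count, not the challenge's exact label set.
model = SABiLSTM(vocab_size=10000, num_emotions=7)
logits = model(torch.randint(1, 10000, (5, 12)))
print(logits.shape)  # torch.Size([5, 7])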
Anthology ID:
W18-3506
Volume:
Proceedings of the Sixth International Workshop on Natural Language Processing for Social Media
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Lun-Wei Ku, Cheng-Te Li
Venue:
SocialNLP
Publisher:
Association for Computational Linguistics
Pages:
32–36
URL:
https://aclanthology.org/W18-3506
DOI:
10.18653/v1/W18-3506
Cite (ACL):
Linkai Luo, Haiqin Yang, and Francis Y. L. Chin. 2018. EmotionX-DLC: Self-Attentive BiLSTM for Detecting Sequential Emotions in Dialogues. In Proceedings of the Sixth International Workshop on Natural Language Processing for Social Media, pages 32–36, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
EmotionX-DLC: Self-Attentive BiLSTM for Detecting Sequential Emotions in Dialogues (Luo et al., SocialNLP 2018)
PDF:
https://aclanthology.org/W18-3506.pdf
Data
EmotionLines