Amobee at IEST 2018: Transfer Learning from Language Models

Alon Rozental, Daniel Fleischer, Zohar Kelrich


Abstract
This paper describes the system developed at Amobee for the WASSA 2018 implicit emotions shared task (IEST). The goal of this task was to predict the emotion expressed by missing words in tweets without an explicit mention of those words. We developed an ensemble system consisting of language models together with LSTM-based networks containing a CNN attention mechanism. Our approach represents a novel use of language models—specifically trained on a large Twitter dataset—to predict and classify emotions. Our system reached 1st place with a macro F1 score of 0.7145.
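The abstract mentions an ensemble of models scored with macro F1, the official IEST metric. As an illustrative sketch (not the authors' code), the following pure-Python snippet shows soft-voting over hypothetical per-model class probabilities for the six IEST emotion labels, scored with an unweighted per-class F1 average; the function names and data layout are assumptions for illustration.

```python
# Illustrative sketch, not the authors' implementation: soft-voting
# ensemble over per-model class-probability vectors, scored with
# macro F1 (the IEST shared-task metric).
LABELS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]

def ensemble_predict(prob_lists):
    """Average each tweet's probability vectors across models, then argmax."""
    preds = []
    for per_model in zip(*prob_lists):  # one tweet at a time
        avg = [sum(p[i] for p in per_model) / len(per_model)
               for i in range(len(LABELS))]
        preds.append(LABELS[avg.index(max(avg))])
    return preds

def macro_f1(y_true, y_pred):
    """Unweighted mean of the per-class F1 scores."""
    f1s = []
    for c in LABELS:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)
```

Macro F1 weights all six emotion classes equally regardless of their frequency in the test set, which is why it is the headline number (0.7145) reported for the task.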
Anthology ID:
W18-6207
Volume:
Proceedings of the 9th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis
Month:
October
Year:
2018
Address:
Brussels, Belgium
Editors:
Alexandra Balahur, Saif M. Mohammad, Veronique Hoste, Roman Klinger
Venue:
WASSA
Publisher:
Association for Computational Linguistics
Pages:
43–49
URL:
https://aclanthology.org/W18-6207
DOI:
10.18653/v1/W18-6207
Cite (ACL):
Alon Rozental, Daniel Fleischer, and Zohar Kelrich. 2018. Amobee at IEST 2018: Transfer Learning from Language Models. In Proceedings of the 9th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis, pages 43–49, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Amobee at IEST 2018: Transfer Learning from Language Models (Rozental et al., WASSA 2018)
PDF:
https://aclanthology.org/W18-6207.pdf