BERT at SemEval-2020 Task 8: Using BERT to Analyse Meme Emotions

Adithya Avvaru, Sanath Vobilisetty


Abstract
Sentiment analysis is one of the most sought-after research problems among Natural Language Processing (NLP) researchers, and the range of problems it addresses keeps growing. Most research to date focuses on predicting sentiment, or sentiment categories such as sarcasm, humor, offense and motivation, on text data; very little research focuses on predicting or analysing the sentiment of internet memes. We address this problem as part of “Task 8 of SemEval 2020: Memotion Analysis” and participated in all three tasks under Memotion Analysis. Our system, built on the state-of-the-art Transformer-based pre-trained Bidirectional Encoder Representations from Transformers (BERT), outperformed the baseline models on Tasks A and C and performed close to the baseline model on Task B. In this paper, we present the data used, our data cleaning and preparation steps, the fine-tuning process for the BERT-based model, and the final prediction of sentiment or sentiment categories. We found that sequence models such as Long Short-Term Memory (LSTM) and its variants performed below par in predicting the sentiments. We also performed a comparative analysis with other Transformer-based models such as DistilBERT and XLNet.
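
As a rough illustration of the approach the abstract describes (not the authors' exact configuration), the sketch below fine-tunes a pre-trained BERT sequence classifier on meme text using the Hugging Face transformers library. The checkpoint, label set, example captions and hyperparameters are assumptions made for this example only.

# Minimal sketch: fine-tuning BERT for meme sentiment classification.
# Assumptions (not from the paper): 'bert-base-uncased' checkpoint,
# three sentiment labels, toy captions, and a learning rate of 2e-5.
import torch
from torch.optim import AdamW
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=3)  # negative / neutral / positive

texts = ["when the wifi finally connects", "me waiting for the weekend"]
labels = torch.tensor([2, 1])  # hypothetical labels for illustration

# Tokenise the meme captions into BERT input tensors.
batch = tokenizer(texts, padding=True, truncation=True,
                  max_length=128, return_tensors="pt")

optimizer = AdamW(model.parameters(), lr=2e-5)
model.train()
outputs = model(**batch, labels=labels)  # one fine-tuning step
outputs.loss.backward()
optimizer.step()

# Inference: predicted sentiment class per meme caption.
model.eval()
with torch.no_grad():
    preds = model(**batch).logits.argmax(dim=-1)
print(preds.tolist())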
Anthology ID:
2020.semeval-1.144
Volume:
Proceedings of the Fourteenth Workshop on Semantic Evaluation
Month:
December
Year:
2020
Address:
Barcelona (online)
Editors:
Aurelie Herbelot, Xiaodan Zhu, Alexis Palmer, Nathan Schneider, Jonathan May, Ekaterina Shutova
Venue:
SemEval
SIG:
SIGLEX
Publisher:
International Committee for Computational Linguistics
Pages:
1094–1099
URL:
https://aclanthology.org/2020.semeval-1.144
DOI:
10.18653/v1/2020.semeval-1.144
Cite (ACL):
Adithya Avvaru and Sanath Vobilisetty. 2020. BERT at SemEval-2020 Task 8: Using BERT to Analyse Meme Emotions. In Proceedings of the Fourteenth Workshop on Semantic Evaluation, pages 1094–1099, Barcelona (online). International Committee for Computational Linguistics.
Cite (Informal):
BERT at SemEval-2020 Task 8: Using BERT to Analyse Meme Emotions (Avvaru & Vobilisetty, SemEval 2020)
PDF:
https://aclanthology.org/2020.semeval-1.144.pdf