Seq2Seq2Sentiment: Multimodal Sequence to Sequence Models for Sentiment Analysis

Hai Pham, Thomas Manzini, Paul Pu Liang, Barnabás Poczós


Abstract
Multimodal machine learning is a core research area spanning the language, visual, and acoustic modalities. The central challenge in multimodal learning is learning representations that can process and relate information from multiple modalities. In this paper, we propose two methods for unsupervised learning of joint multimodal representations using sequence to sequence (Seq2Seq) methods: a Seq2Seq Modality Translation Model and a Hierarchical Seq2Seq Modality Translation Model. We also explore several variations on the multimodal inputs and outputs of these Seq2Seq models. Our experiments on multimodal sentiment analysis using the CMU-MOSI dataset indicate that our methods learn informative multimodal representations that outperform the baselines, most notably in the bimodal case, where our model improves the F1 score by twelve points. We also discuss future directions for multimodal Seq2Seq methods.
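The core idea described in the abstract is to train a Seq2Seq model that translates one modality into another and to reuse the encoder's learned representation for sentiment prediction. The following is a minimal PyTorch sketch of that idea; the module structure, LSTM choice, text-to-audio translation direction, and feature dimensions are illustrative assumptions, not the authors' actual implementation.

```python
# Sketch of a Seq2Seq modality-translation model (assumptions: LSTM encoder/decoder,
# text -> acoustic translation, encoder final state reused as the joint representation).
import torch
import torch.nn as nn

class Seq2SeqModalityTranslation(nn.Module):
    """Encode a source modality, decode a target modality, and expose the
    encoder's final hidden state as a joint multimodal representation."""
    def __init__(self, src_dim, tgt_dim, hidden_dim=128):
        super().__init__()
        self.encoder = nn.LSTM(src_dim, hidden_dim, batch_first=True)
        self.decoder = nn.LSTM(tgt_dim, hidden_dim, batch_first=True)
        self.project = nn.Linear(hidden_dim, tgt_dim)  # reconstruct target-modality features
        self.sentiment = nn.Linear(hidden_dim, 1)      # sentiment regression head

    def forward(self, src_seq, tgt_seq):
        _, (h, c) = self.encoder(src_seq)           # h: (1, batch, hidden_dim)
        dec_out, _ = self.decoder(tgt_seq, (h, c))  # teacher-forced decoding
        recon = self.project(dec_out)               # predicted target modality
        sentiment = self.sentiment(h[-1])           # joint representation -> sentiment
        return recon, sentiment

# Hypothetical usage: translate text features to acoustic features, then predict sentiment.
model = Seq2SeqModalityTranslation(src_dim=300, tgt_dim=74)
text = torch.randn(8, 20, 300)   # (batch, time, features)
audio = torch.randn(8, 20, 74)
recon, sentiment = model(text, audio)
translation_loss = nn.functional.mse_loss(recon, audio)
```

In the paper's setup (per the abstract), the translation model is first trained without sentiment labels so the representation is learned unsupervised; a sentiment predictor is then trained on top of that representation.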
Anthology ID:
W18-3308
Volume:
Proceedings of Grand Challenge and Workshop on Human Multimodal Language (Challenge-HML)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Amir Zadeh, Paul Pu Liang, Louis-Philippe Morency, Soujanya Poria, Erik Cambria, Stefan Scherer
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
53–63
URL:
https://aclanthology.org/W18-3308
DOI:
10.18653/v1/W18-3308
Cite (ACL):
Hai Pham, Thomas Manzini, Paul Pu Liang, and Barnabás Poczós. 2018. Seq2Seq2Sentiment: Multimodal Sequence to Sequence Models for Sentiment Analysis. In Proceedings of Grand Challenge and Workshop on Human Multimodal Language (Challenge-HML), pages 53–63, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Seq2Seq2Sentiment: Multimodal Sequence to Sequence Models for Sentiment Analysis (Pham et al., ACL 2018)
PDF:
https://aclanthology.org/W18-3308.pdf