A Personalized Sentiment Model with Textual and Contextual Information

Siwen Guo, Sviatlana Höhn, Christoph Schommer


Abstract
In this paper, we look beyond traditional population-level sentiment modeling and consider the individuality in a person’s expressions by incorporating both textual and contextual information. In particular, we construct a hierarchical neural network that leverages valuable information from a person’s past expressions and offers a better understanding of sentiment from the expresser’s perspective. Additionally, we investigate how a person’s sentiment changes over time, so that recent incidents or opinions may have more effect on the person’s current sentiment than older ones. Psychological studies have also shown that individuals vary in how easily they change their sentiments. To model such traits, we develop a modified attention mechanism with a Hawkes process applied on top of a recurrent network for a user-specific design. Using automatically labeled Twitter data, the proposed model shows positive results with different input formulations for representing the concerned information.
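The abstract describes the mechanism only at a high level. As a rough illustration, the sketch below shows one way a Hawkes-style exponential time decay could modulate attention weights over a user's past message representations, with a per-user decay rate standing in for individual variation in how easily sentiment changes. The function name, the NumPy formulation, and the exact weighting scheme are assumptions for illustration, not the authors' implementation.

# Hypothetical sketch (not the paper's exact formulation): attention over a
# user's past message representations, where each content-based score is
# modulated by a Hawkes-style exponential time decay with a user-specific rate.
import numpy as np

def hawkes_attention(query, past_states, past_times, now, user_decay):
    """query: (d,) representation of the current message
    past_states: (n, d) representations of the user's past messages
    past_times:  (n,)   timestamps of those messages (e.g., in days)
    now:         scalar timestamp of the current message
    user_decay:  scalar > 0; larger means past opinions fade faster for this user
    Returns a (d,) context vector summarizing the relevant history."""
    scores = past_states @ query                      # content relevance
    decay = np.exp(-user_decay * (now - past_times))  # recency kernel
    weights = np.exp(scores - scores.max()) * decay   # time-decayed softmax numerator
    weights = weights / weights.sum()
    return weights @ past_states                      # weighted history summary

# Toy usage: three past messages at days 1, 5, 9; current message at day 10.
rng = np.random.default_rng(0)
past = rng.normal(size=(3, 4))
ctx = hawkes_attention(rng.normal(size=4), past, np.array([1.0, 5.0, 9.0]), 10.0, user_decay=0.3)
print(ctx.shape)  # (4,)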
Anthology ID: K19-1093
Volume: Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month: November
Year: 2019
Address: Hong Kong, China
Editors: Mohit Bansal, Aline Villavicencio
Venue: CoNLL
SIG: SIGNLL
Publisher: Association for Computational Linguistics
Pages: 992–1001
URL: https://aclanthology.org/K19-1093
DOI: 10.18653/v1/K19-1093
Cite (ACL): Siwen Guo, Sviatlana Höhn, and Christoph Schommer. 2019. A Personalized Sentiment Model with Textual and Contextual Information. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 992–1001, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal): A Personalized Sentiment Model with Textual and Contextual Information (Guo et al., CoNLL 2019)
PDF: https://aclanthology.org/K19-1093.pdf
Attachment: K19-1093.Attachment.zip
Supplementary material: K19-1093.Supplementary_Material.zip