Learning Sentence Representations over Tree Structures for Target-Dependent Classification

Junwen Duan, Xiao Ding, Ting Liu


Abstract
Target-dependent classification tasks, such as aspect-level sentiment analysis, perform fine-grained classification with respect to specific targets. Semantic composition over tree structures is promising for such tasks, as it can potentially capture long-distance interactions between targets and their contexts. However, previous work that operates on tree structures resorts to syntactic parsers or Treebank annotations, which are either subject to noise in informal texts or highly expensive to obtain. To address the above issues, we propose a reinforcement learning based approach, which automatically induces target-specific sentence representations over tree structures. The underlying model combines an RNN encoder-decoder that explores possible binary tree structures with a reward mechanism that encourages structures improving performance on downstream tasks. We evaluate our approach on two benchmark tasks: firm-specific cumulative abnormal return prediction (based on formal news texts) and aspect-level sentiment analysis (based on informal social media texts). Experimental results show that our model gives superior performance compared to previous work that operates on parsed trees. Moreover, our approach offers some intuition about how a target-specific sentence representation can be composed from its word constituents.
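The abstract's core idea, sampling binary merge actions to induce a tree and rewarding trees that help the downstream classifier, can be illustrated with a toy sketch. The snippet below is a hypothetical illustration, not the authors' implementation: the tanh composition, the linear policy scorer, and the reward definition are simplified assumptions (the paper uses an RNN encoder-decoder and target-specific representations), and all names (`compose`, `sample_tree`, `w_policy`, `w_clf`) are invented for this example.

```python
# Hypothetical sketch of RL-based tree induction (REINFORCE-style):
# sample which adjacent pair of nodes to merge until one root remains,
# then treat the downstream classifier's probability of the gold label
# as the reward for the sampled action sequence.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8                                               # toy embedding size

# Toy parameters (jointly learned in a real model).
W_comp = rng.normal(scale=0.1, size=(DIM, 2 * DIM))   # composition weights
w_policy = rng.normal(scale=0.1, size=2 * DIM)        # scores adjacent pairs
w_clf = rng.normal(scale=0.1, size=DIM)               # binary classifier

def compose(left, right):
    """Compose two child vectors into a parent vector (simple tanh cell)."""
    return np.tanh(W_comp @ np.concatenate([left, right]))

def sample_tree(word_vecs):
    """Sample merge actions until one root vector remains.

    Returns the root representation and the log-probability of the sampled
    action sequence, which a REINFORCE update would scale by the reward.
    """
    nodes = list(word_vecs)
    log_prob = 0.0
    while len(nodes) > 1:
        # Score every adjacent pair with the policy, then sample a merge point.
        scores = np.array([w_policy @ np.concatenate([nodes[i], nodes[i + 1]])
                           for i in range(len(nodes) - 1)])
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()
        i = rng.choice(len(probs), p=probs)
        log_prob += np.log(probs[i])
        nodes[i:i + 2] = [compose(nodes[i], nodes[i + 1])]
    return nodes[0], log_prob

# Toy "sentence": 5 random word embeddings with gold label 1.
sentence = [rng.normal(size=DIM) for _ in range(5)]
gold = 1
root, log_prob = sample_tree(sentence)
pred = 1.0 / (1.0 + np.exp(-w_clf @ root))            # classifier probability
reward = pred if gold == 1 else 1.0 - pred            # reward = prob. of gold label
print(f"log_prob={log_prob:.3f}  reward={reward:.3f}")
```

A real training loop would average `(reward - baseline) * log_prob` over sampled trees to update the policy while training the composition and classifier parameters on the downstream loss; this sketch only prints the quantities such an update would use.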
Anthology ID: N18-1051
Volume: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)
Month: June
Year: 2018
Address: New Orleans, Louisiana
Editors: Marilyn Walker, Heng Ji, Amanda Stent
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 551–560
URL: https://aclanthology.org/N18-1051
DOI: 10.18653/v1/N18-1051
Cite (ACL): Junwen Duan, Xiao Ding, and Ting Liu. 2018. Learning Sentence Representations over Tree Structures for Target-Dependent Classification. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), pages 551–560, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal): Learning Sentence Representations over Tree Structures for Target-Dependent Classification (Duan et al., NAACL 2018)
PDF: https://aclanthology.org/N18-1051.pdf