Implicit Discourse Relation Recognition using Neural Tensor Network with Interactive Attention and Sparse Learning

Fengyu Guo, Ruifang He, Di Jin, Jianwu Dang, Longbiao Wang, Xiangang Li

Abstract
Implicit discourse relation recognition aims to understand and annotate the latent relations between two discourse arguments, such as temporal and comparison relations. Most previous methods encode the two discourse arguments separately, and those that consider pair-specific clues ignore both the bidirectional interactions between the two arguments and the sparsity of pair patterns. In this paper, we propose a novel neural Tensor network framework with Interactive Attention and Sparse Learning (TIASL) for implicit discourse relation recognition. (1) We mine the most correlated word pairs from the two discourse arguments to model pair-specific clues, and integrate them as interactive attention into the argument representations produced by a bidirectional long short-term memory network. (2) We further propose a neural tensor network with a sparse constraint to explore deeper and more important pair patterns so as to fully recognize discourse relations. Experimental results on PDTB show that the proposed TIASL framework is effective.
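Below is a minimal PyTorch sketch of the pipeline the abstract outlines: BiLSTM argument encoders, an interactive (bidirectional) attention derived from a word-pair correlation matrix, and a neural tensor layer whose slices carry a sparsity penalty. It is not the authors' implementation; the dot-product correlation, max-pooled attention weights, layer sizes, L1 form of the sparse constraint, and the name TIASLSketch are all illustrative assumptions.

```python
# Sketch of a BiLSTM + interactive attention + sparse neural tensor classifier.
# All hyperparameters and design details are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TIASLSketch(nn.Module):
    def __init__(self, vocab_size, emb_dim=100, hidden=128, slices=4, n_rel=4):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        d = 2 * hidden
        # Tensor W: one bilinear slice per output dimension of the tensor layer.
        self.W = nn.Parameter(torch.randn(slices, d, d) * 0.01)
        self.V = nn.Linear(2 * d, slices)            # standard linear part
        self.out = nn.Linear(slices, n_rel)          # relation classifier

    def encode(self, x):
        h, _ = self.bilstm(self.emb(x))              # (B, T, 2*hidden)
        return h

    def forward(self, arg1, arg2):
        h1, h2 = self.encode(arg1), self.encode(arg2)
        # Word-pair correlation matrix between the two arguments.
        corr = torch.bmm(h1, h2.transpose(1, 2))     # (B, T1, T2)
        # Interactive attention: weight each word by its strongest
        # correlation with any word in the other argument (both directions).
        a1 = F.softmax(corr.max(dim=2).values, dim=1).unsqueeze(2)  # (B, T1, 1)
        a2 = F.softmax(corr.max(dim=1).values, dim=1).unsqueeze(2)  # (B, T2, 1)
        r1 = (a1 * h1).sum(dim=1)                    # (B, d)
        r2 = (a2 * h2).sum(dim=1)                    # (B, d)
        # Neural tensor layer: r1^T W[k] r2 per slice k, plus a linear term.
        bilinear = torch.einsum('bi,kij,bj->bk', r1, self.W, r2)
        t = torch.tanh(bilinear + self.V(torch.cat([r1, r2], dim=1)))
        return self.out(t)

    def sparsity_penalty(self):
        # L1 constraint on the tensor slices, encouraging sparse pair patterns.
        return self.W.abs().sum()
```

In training, the sparse constraint would be applied by adding `lambda_l1 * model.sparsity_penalty()` to the cross-entropy loss over the relation labels.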
Anthology ID:
C18-1046
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
547–558
URL:
https://aclanthology.org/C18-1046
Cite (ACL):
Fengyu Guo, Ruifang He, Di Jin, Jianwu Dang, Longbiao Wang, and Xiangang Li. 2018. Implicit Discourse Relation Recognition using Neural Tensor Network with Interactive Attention and Sparse Learning. In Proceedings of the 27th International Conference on Computational Linguistics, pages 547–558, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Implicit Discourse Relation Recognition using Neural Tensor Network with Interactive Attention and Sparse Learning (Guo et al., COLING 2018)
PDF:
https://aclanthology.org/C18-1046.pdf