Multi-task Attention-based Neural Networks for Implicit Discourse Relationship Representation and Identification

Man Lan, Jianxiang Wang, Yuanbin Wu, Zheng-Yu Niu, Haifeng Wang


Abstract
We present a novel multi-task attention-based neural network model that addresses implicit discourse relationship representation and identification through two types of representation learning: an attention-based neural network that learns a discourse relationship representation over the two arguments, and a multi-task framework that learns knowledge from both annotated and unannotated corpora. Extensive experiments have been performed on two benchmark corpora (i.e., the PDTB and CoNLL-2016 datasets). Experimental results show that our proposed model outperforms state-of-the-art systems on both benchmarks.
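To make the abstract's two components concrete, below is a minimal PyTorch sketch: an attention mechanism in which each argument's words are weighted against a summary of the other argument, and a multi-task setup in which two classifier heads share one encoder. Everything here is an illustrative assumption rather than the authors' exact design: the names (ArgPairEncoder, MultiTaskHeads), the BiLSTM encoder, the mean-pooled query, the layer sizes, and the choice of connective prediction as the auxiliary task trained on unannotated text.

```python
# Hypothetical sketch of the high-level design described in the abstract.
# Not the paper's exact architecture: encoder type, pooling, attention
# scoring, and the auxiliary task are all assumptions for illustration.
import torch
import torch.nn as nn


class ArgPairEncoder(nn.Module):
    """Encodes an argument pair into one joint relation representation."""

    def __init__(self, vocab_size, emb_dim=100, hid_dim=100):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hid_dim, batch_first=True,
                           bidirectional=True)
        # Scores each word of one argument against the other argument's
        # summary vector: input is [word state (2h); query (2h)] = 4h.
        self.att = nn.Linear(4 * hid_dim, 1)

    def encode(self, tokens):
        h, _ = self.rnn(self.emb(tokens))  # (batch, len, 2*hid)
        return h

    def attend(self, h, query):
        # query: (batch, 2*hid) summary of the *other* argument.
        q = query.unsqueeze(1).expand(-1, h.size(1), -1)
        scores = self.att(torch.cat([h, q], dim=-1)).squeeze(-1)
        alpha = torch.softmax(scores, dim=1)  # weights over words
        return (alpha.unsqueeze(-1) * h).sum(dim=1)

    def forward(self, arg1, arg2):
        h1, h2 = self.encode(arg1), self.encode(arg2)
        s1, s2 = h1.mean(dim=1), h2.mean(dim=1)  # pooled summaries as queries
        v1 = self.attend(h1, s2)  # arg1 words weighted against arg2
        v2 = self.attend(h2, s1)  # arg2 words weighted against arg1
        return torch.cat([v1, v2], dim=-1)  # joint representation (4*hid)


class MultiTaskHeads(nn.Module):
    """Two task-specific classifiers sharing one encoder: implicit relation
    labels on the annotated corpus, and (as an assumed auxiliary task)
    connective prediction on unannotated text."""

    def __init__(self, encoder, rep_dim, n_relations, n_connectives):
        super().__init__()
        self.encoder = encoder
        self.relation_head = nn.Linear(rep_dim, n_relations)
        self.connective_head = nn.Linear(rep_dim, n_connectives)

    def forward(self, arg1, arg2, task="relation"):
        rep = self.encoder(arg1, arg2)
        head = (self.relation_head if task == "relation"
                else self.connective_head)
        return head(rep)
```

Under this reading, training would alternate mini-batches between the two tasks, so gradients from the auxiliary head (fed by unannotated text) also update the shared encoder; that shared encoder is the channel through which the multi-task framework transfers knowledge to implicit relation identification.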
Anthology ID:
D17-1134
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1299–1308
URL:
https://aclanthology.org/D17-1134
DOI:
10.18653/v1/D17-1134
Cite (ACL):
Man Lan, Jianxiang Wang, Yuanbin Wu, Zheng-Yu Niu, and Haifeng Wang. 2017. Multi-task Attention-based Neural Networks for Implicit Discourse Relationship Representation and Identification. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 1299–1308, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Multi-task Attention-based Neural Networks for Implicit Discourse Relationship Representation and Identification (Lan et al., EMNLP 2017)
PDF:
https://aclanthology.org/D17-1134.pdf