Learning to Explicitate Connectives with Seq2Seq Network for Implicit Discourse Relation Classification

Wei Shi, Vera Demberg


Abstract
Implicit discourse relation classification is one of the most difficult steps in discourse parsing. The difficulty stems from the fact that the coherence relation must be inferred based on the content of the discourse relational arguments. Therefore, an effective encoding of the relational arguments is of crucial importance. We here propose a new model for implicit discourse relation classification, which consists of a classifier and a sequence-to-sequence model that is trained to generate a representation of the discourse relational arguments by trying to predict the relational arguments including a suitable implicit connective. Training is possible because such implicit connectives have been annotated as part of the PDTB corpus. Combined with a memory network, our model generates more refined representations for the task. On the now-standard 11-way classification, our method outperforms previous state-of-the-art systems on the PDTB benchmark in multiple settings, including cross validation.
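The architecture outlined in the abstract (an argument encoder whose representation is shared between an 11-way relation classifier and a sequence-to-sequence decoder that re-generates the arguments with an explicit connective) can be illustrated with a minimal sketch. Everything below, including layer types, sizes, and the joint-loss weighting, is an illustrative assumption rather than the authors' actual implementation; the memory-network component is omitted.

# Minimal PyTorch-style sketch of the joint architecture described in the
# abstract: a seq2seq model learns to re-generate the relational arguments
# with an explicit connective inserted, while the encoder's representation
# also feeds an 11-way relation classifier. All names, layer sizes, and the
# loss weighting are hypothetical, not the authors' configuration.
import torch
import torch.nn as nn

class ExplicitationModel(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hid_dim=256, n_relations=11):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder reads the concatenated relational arguments (Arg1, Arg2).
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        # Decoder is trained to emit the "explicitated" version of the pair,
        # i.e. the arguments with a suitable connective made explicit.
        self.decoder = nn.GRU(emb_dim, 2 * hid_dim, batch_first=True)
        self.gen_out = nn.Linear(2 * hid_dim, vocab_size)
        # Relation classifier operates on the encoder's final state.
        self.classifier = nn.Linear(2 * hid_dim, n_relations)

    def forward(self, src_tokens, tgt_tokens):
        _, enc_h = self.encoder(self.embed(src_tokens))
        # Concatenate last forward/backward hidden states as the argument representation.
        rep = torch.cat([enc_h[0], enc_h[1]], dim=-1)
        rel_logits = self.classifier(rep)
        dec_out, _ = self.decoder(self.embed(tgt_tokens), rep.unsqueeze(0))
        gen_logits = self.gen_out(dec_out)
        return rel_logits, gen_logits

# Joint training objective: classification loss plus generation loss
# (the weight alpha is a free hyperparameter in this sketch).
def joint_loss(rel_logits, rel_gold, gen_logits, gen_gold, alpha=1.0):
    ce = nn.CrossEntropyLoss()
    cls_loss = ce(rel_logits, rel_gold)
    gen_loss = ce(gen_logits.reshape(-1, gen_logits.size(-1)), gen_gold.reshape(-1))
    return cls_loss + alpha * gen_loss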
Anthology ID:
W19-0416
Volume:
Proceedings of the 13th International Conference on Computational Semantics - Long Papers
Month:
May
Year:
2019
Address:
Gothenburg, Sweden
Editors:
Simon Dobnik, Stergios Chatzikyriakidis, Vera Demberg
Venue:
IWCS
SIG:
SIGSEM
Publisher:
Association for Computational Linguistics
Pages:
188–199
URL:
https://aclanthology.org/W19-0416
DOI:
10.18653/v1/W19-0416
Cite (ACL):
Wei Shi and Vera Demberg. 2019. Learning to Explicitate Connectives with Seq2Seq Network for Implicit Discourse Relation Classification. In Proceedings of the 13th International Conference on Computational Semantics - Long Papers, pages 188–199, Gothenburg, Sweden. Association for Computational Linguistics.
Cite (Informal):
Learning to Explicitate Connectives with Seq2Seq Network for Implicit Discourse Relation Classification (Shi & Demberg, IWCS 2019)
PDF:
https://aclanthology.org/W19-0416.pdf