Discourse-Aware Semantic Self-Attention for Narrative Reading Comprehension

Todor Mihaylov, Anette Frank


Abstract
In this work, we propose to use linguistic annotations as a basis for a Discourse-Aware Semantic Self-Attention encoder that we employ for reading comprehension on narrative texts. We extract relations between discourse units, events and their arguments, as well as coreferring mentions, using available annotation tools. Our empirical evaluation shows that the investigated structures improve overall performance (by up to +3.4 ROUGE-L); the largest gains come from intra-sentential and cross-sentential discourse relations, sentence-internal semantic role relations, and long-distance coreference relations. We show that dedicating self-attention heads to intra-sentential relations and to relations connecting neighboring sentences is beneficial for finding answers to questions in longer contexts. Our findings encourage the use of discourse-semantic annotations to enhance the generalization capacity of self-attention models for reading comprehension.
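The central mechanism the abstract describes, restricting individual self-attention heads to token pairs that stand in an annotated relation (e.g. a semantic role link, a coreference chain, or a discourse relation), can be sketched as follows. This is a minimal, hypothetical PyTorch illustration, not the authors' implementation (see the linked repository for that); the function name, tensor shapes, and the diagonal self-attention fallback are all assumptions.

```python
# Sketch (not the paper's code) of one relation-masked self-attention head:
# the head may only attend along token pairs marked True in rel_mask.
import torch
import torch.nn.functional as F

def relation_masked_attention(q, k, v, rel_mask):
    """One attention head restricted to annotated relations.

    q, k, v:  (seq_len, d_head) query/key/value projections
    rel_mask: (seq_len, seq_len) bool; True where token i may attend to j
    """
    seq_len = q.size(0)
    # Always let a token attend to itself, so rows with no annotated
    # relation still produce a well-defined softmax.
    mask = rel_mask | torch.eye(seq_len, dtype=torch.bool)
    scores = (q @ k.transpose(-2, -1)) / (k.size(-1) ** 0.5)
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

# Toy usage: a 5-token span where tokens 0 and 3 corefer.
d = 8
q = k = v = torch.randn(5, d)
coref_mask = torch.zeros(5, 5, dtype=torch.bool)
coref_mask[0, 3] = coref_mask[3, 0] = True
out = relation_masked_attention(q, k, v, coref_mask)  # shape (5, 8)
```

In a full model, heads carrying different masks (SRL, coreference, intra- and cross-sentential discourse relations) would be concatenated alongside unrestricted heads, as in standard multi-head attention.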
Anthology ID:
D19-1257
Volume:
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
Venues:
EMNLP | IJCNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2541–2552
URL:
https://aclanthology.org/D19-1257
DOI:
10.18653/v1/D19-1257
Cite (ACL):
Todor Mihaylov and Anette Frank. 2019. Discourse-Aware Semantic Self-Attention for Narrative Reading Comprehension. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 2541–2552, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Discourse-Aware Semantic Self-Attention for Narrative Reading Comprehension (Mihaylov & Frank, EMNLP-IJCNLP 2019)
PDF:
https://aclanthology.org/D19-1257.pdf
Attachment:
 D19-1257.Attachment.pdf
Code:
 Heidelberg-NLP/discourse-aware-semantic-self-attention
Data:
NarrativeQA