Guided Neural Language Generation for Abstractive Summarization using Abstract Meaning Representation

Hardy Hardy, Andreas Vlachos


Abstract
Recent work on abstractive summarization has made progress with neural encoder-decoder architectures. However, such models are often challenged by their lack of explicit semantic modeling of the source document and its summary. In this paper, we extend previous work on abstractive summarization using Abstract Meaning Representation (AMR) with a neural language generation stage which we guide using the source document. We demonstrate that this guidance improves summarization results by 7.4 and 10.5 points in ROUGE-2 using gold standard AMR parses and parses obtained from an off-the-shelf parser, respectively. We also find that the summarization performance on the latter parses is 2 ROUGE-2 points higher than that of a well-established neural encoder-decoder approach trained on a larger dataset.
Anthology ID:
D18-1086
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
768–773
URL:
https://aclanthology.org/D18-1086
DOI:
10.18653/v1/D18-1086
Cite (ACL):
Hardy Hardy and Andreas Vlachos. 2018. Guided Neural Language Generation for Abstractive Summarization using Abstract Meaning Representation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 768–773, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Guided Neural Language Generation for Abstractive Summarization using Abstract Meaning Representation (Hardy & Vlachos, EMNLP 2018)
PDF:
https://aclanthology.org/D18-1086.pdf
Code:
sheffieldnlp/AMR2Text-summ