Improving Abstractive Document Summarization with Salient Information Modeling

Yongjian You, Weijia Jia, Tianyi Liu, Wenmian Yang


Abstract
Comprehensive document encoding and salient information selection are two major difficulties for generating summaries with adequate salient information. To tackle the above difficulties, we propose a Transformer-based encoder-decoder framework with two novel extensions for abstractive document summarization. Specifically, (1) to encode the documents comprehensively, we design a focus-attention mechanism and incorporate it into the encoder. This mechanism models a Gaussian focal bias on attention scores to enhance the perception of local context, which contributes to producing salient and informative summaries. (2) To distinguish salient information precisely, we design an independent saliency-selection network which manages the information flow from encoder to decoder. This network effectively reduces the influences of secondary information on the generated summaries. Experimental results on the popular CNN/Daily Mail benchmark demonstrate that our model outperforms other state-of-the-art baselines on the ROUGE metrics.
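The abstract describes two mechanisms: a Gaussian focal bias added to encoder self-attention scores, and a saliency gate that filters the information flowing from encoder to decoder. The sketch below illustrates these two ideas in minimal PyTorch; class names, the fixed Gaussian width `sigma`, and the sigmoid gating form are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of the two ideas described in the abstract.
# Names, shapes, and the exact form of the bias/gate are assumptions,
# not the paper's official code.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


class FocusAttention(nn.Module):
    """Scaled dot-product self-attention with a Gaussian focal bias
    that boosts scores for keys near each query position."""

    def __init__(self, d_model: int, sigma: float = 3.0):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.sigma = sigma  # width of the Gaussian window (assumed fixed here)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = q @ k.transpose(-2, -1) / math.sqrt(x.size(-1))

        # Gaussian focal bias: -(j - i)^2 / (2 * sigma^2) is largest when
        # key position j is close to query position i, so attention is
        # nudged toward the local context around each token.
        pos = torch.arange(x.size(1), device=x.device, dtype=x.dtype)
        dist = pos.unsqueeze(0) - pos.unsqueeze(1)          # (seq, seq)
        focal_bias = -(dist ** 2) / (2 * self.sigma ** 2)

        attn = F.softmax(scores + focal_bias, dim=-1)
        return attn @ v


class SaliencySelection(nn.Module):
    """Gate that rescales encoder states so that low-saliency tokens
    contribute less when the decoder attends to the source."""

    def __init__(self, d_model: int):
        super().__init__()
        self.gate = nn.Linear(d_model, 1)

    def forward(self, enc_states: torch.Tensor) -> torch.Tensor:
        # enc_states: (batch, seq_len, d_model)
        saliency = torch.sigmoid(self.gate(enc_states))     # (batch, seq, 1)
        return enc_states * saliency
```

In this reading, `FocusAttention` replaces a standard encoder self-attention head to strengthen local context, and `SaliencySelection` sits between encoder and decoder so that the generated summary draws less on secondary information.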
Anthology ID:
P19-1205
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2132–2141
URL:
https://aclanthology.org/P19-1205
DOI:
10.18653/v1/P19-1205
Cite (ACL):
Yongjian You, Weijia Jia, Tianyi Liu, and Wenmian Yang. 2019. Improving Abstractive Document Summarization with Salient Information Modeling. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 2132–2141, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Improving Abstractive Document Summarization with Salient Information Modeling (You et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1205.pdf
Data
CNN/Daily Mail