CLIFF: Contrastive Learning for Improving Faithfulness and Factuality in Abstractive Summarization

Shuyang Cao, Lu Wang


Abstract
We study generating abstractive summaries that are faithful and factually consistent with the given articles. A novel contrastive learning formulation is presented, which leverages both reference summaries, as positive training data, and automatically generated erroneous summaries, as negative training data, to train summarization systems that are better at distinguishing between them. We further design four types of strategies for creating negative samples, which resemble errors commonly made by two state-of-the-art models, BART and PEGASUS, as found in our new human annotations of summary errors. Experiments on XSum and CNN/Daily Mail show that our contrastive learning framework is robust across datasets and models. According to QA-based factuality evaluation, it consistently produces more factual summaries than strong baselines that use post-hoc error correction, entailment-based reranking, and unlikelihood training. Human judges echo this observation and find that our model's summaries correct more errors.
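The abstract describes the training objective only at a high level. The sketch below shows, in PyTorch, one way a contrastive term over positive (reference) and negative (corrupted) summary representations could be written. It is an illustration rather than the authors' released code: the pooling of summary representations, the temperature, and the assumption of at least two positive summaries per article are assumptions made here for clarity.

```python
# Minimal sketch (not the paper's released implementation) of a multi-positive
# contrastive loss over pooled summary representations: positives are pulled
# together and pushed away from all negatives.
import torch
import torch.nn.functional as F


def contrastive_loss(pos_reprs: torch.Tensor,
                     neg_reprs: torch.Tensor,
                     temperature: float = 0.5) -> torch.Tensor:
    """pos_reprs: (P, H) representations of faithful summaries (assumes P >= 2);
    neg_reprs: (N, H) representations of automatically corrupted summaries."""
    reprs = F.normalize(torch.cat([pos_reprs, neg_reprs], dim=0), dim=-1)
    num_pos = pos_reprs.size(0)
    sims = reprs @ reprs.t() / temperature  # pairwise cosine similarities
    # Mask self-similarities so a summary never acts as its own positive.
    self_mask = torch.eye(sims.size(0), dtype=torch.bool, device=sims.device)
    sims = sims.masked_fill(self_mask, float("-inf"))
    log_prob = sims - torch.logsumexp(sims, dim=1, keepdim=True)
    # Average log-probability of matching each positive with the other positives.
    pos_block = log_prob[:num_pos, :num_pos]
    pos_pairs = pos_block[~self_mask[:num_pos, :num_pos]].view(num_pos, num_pos - 1)
    return -pos_pairs.mean()


# Hypothetical usage with random stand-ins for model-produced representations.
pos = torch.randn(2, 768, requires_grad=True)
neg = torch.randn(4, 768, requires_grad=True)
loss = contrastive_loss(pos, neg)
loss.backward()
```

In the paper, this contrastive term is combined with the usual maximum-likelihood summarization objective; the sketch covers only the contrastive part.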
Anthology ID: 2021.emnlp-main.532
Volume: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2021
Address: Online and Punta Cana, Dominican Republic
Editors: Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 6633–6649
URL: https://aclanthology.org/2021.emnlp-main.532
DOI: 10.18653/v1/2021.emnlp-main.532
Cite (ACL): Shuyang Cao and Lu Wang. 2021. CLIFF: Contrastive Learning for Improving Faithfulness and Factuality in Abstractive Summarization. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 6633–6649, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal): CLIFF: Contrastive Learning for Improving Faithfulness and Factuality in Abstractive Summarization (Cao & Wang, EMNLP 2021)
PDF: https://aclanthology.org/2021.emnlp-main.532.pdf
Video: https://aclanthology.org/2021.emnlp-main.532.mp4
Code: makcedward/nlpaug (+ additional community code)