Simple Unsupervised Summarization by Contextual Matching

Jiawei Zhou, Alexander Rush


Abstract
We propose an unsupervised method for sentence summarization using only language modeling. The approach employs two language models, one generic (i.e., pretrained) and one specific to the target domain. We show that, combined under a product-of-experts criterion, these two models are enough to maintain continuous contextual matching while keeping the output fluent. Experiments on both abstractive and extractive sentence summarization datasets show promising results for our method without exposure to any paired data.
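The product-of-experts combination mentioned in the abstract can be illustrated with a minimal sketch: two next-word distributions, one from a contextual-matching expert and one from a domain-specific fluency language model, are multiplied and renormalized. The function name, the weighting exponent lam, and the toy probabilities below are illustrative assumptions, not details taken from the paper.

import numpy as np

def product_of_experts(p_contextual, p_fluency, lam=0.5):
    # Weighted geometric combination of the two experts' next-word
    # probabilities, renormalized over the candidate vocabulary.
    # lam = 0.5 weights both experts equally (an assumed default).
    combined = (p_contextual ** lam) * (p_fluency ** (1.0 - lam))
    return combined / combined.sum()

# Toy usage over a three-word candidate vocabulary.
p_contextual = np.array([0.6, 0.3, 0.1])  # from the contextual-matching expert
p_fluency = np.array([0.2, 0.5, 0.3])     # from the domain-specific fluency LM
print(product_of_experts(p_contextual, p_fluency))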
Anthology ID:
P19-1503
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5101–5106
URL:
https://aclanthology.org/P19-1503
DOI:
10.18653/v1/P19-1503
Cite (ACL):
Jiawei Zhou and Alexander Rush. 2019. Simple Unsupervised Summarization by Contextual Matching. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 5101–5106, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Simple Unsupervised Summarization by Contextual Matching (Zhou & Rush, ACL 2019)
PDF:
https://aclanthology.org/P19-1503.pdf
Video:
https://aclanthology.org/P19-1503.mp4
Code:
jzhou316/Unsupervised-Sentence-Summarization