Diverse Keyphrase Generation with Neural Unlikelihood Training

Hareesh Bahuleyan, Layla El Asri


Abstract
In this paper, we study sequence-to-sequence (S2S) keyphrase generation models from the perspective of diversity. Recent advances in neural natural language generation have enabled remarkable progress on the task of keyphrase generation, demonstrated through improvements on quality metrics such as F1-score. However, the importance of diversity in keyphrase generation has been largely ignored. We first analyze the extent of information redundancy in the outputs of a baseline model trained with maximum likelihood estimation (MLE). Our findings show that repetition of keyphrases is a major issue with MLE training. To alleviate this issue, we adopt a neural unlikelihood (UL) objective for training the S2S model. Our version of UL training operates at (1) the target token level, to discourage the generation of repeated tokens, and (2) the copy token level, to avoid copying repetitive tokens from the source text. Further, to encourage better model planning during decoding, we incorporate a K-step-ahead token prediction objective that computes both MLE and UL losses on future tokens as well. Through extensive experiments on datasets from three different domains, we demonstrate that the proposed approach attains considerable diversity gains while maintaining competitive output quality.
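The abstract describes the training objective only at a high level. As a rough illustration, the sketch below shows a token-level unlikelihood term in the spirit of the standard UL formulation (Welleck et al., 2019) that the paper adapts: tokens already emitted earlier in the target are treated as negative candidates, and the model is penalized for assigning them probability. The function name, tensor shapes, and normalization are illustrative assumptions, not the authors' implementation; see the linked repository for the official code.

```python
import torch
import torch.nn.functional as F

def token_unlikelihood_loss(logits, target_tokens, pad_id):
    """Minimal sketch of a token-level unlikelihood (UL) loss.

    logits:        (T, V) decoder logits over the vocabulary at each step
    target_tokens: (T,)   gold target token ids (used to build negative candidates)
    pad_id:        padding token id, excluded from the candidate set

    This is an assumed, simplified version for illustration only.
    """
    T, V = logits.size()
    probs = F.softmax(logits, dim=-1)

    # At step t, every distinct non-pad token seen at steps < t is a
    # negative candidate that the model should NOT repeat.
    neg_mask = torch.zeros(T, V, device=logits.device)
    seen = set()
    for t in range(T):
        for tok in seen:
            neg_mask[t, tok] = 1.0
        tok_id = target_tokens[t].item()
        if tok_id != pad_id:
            seen.add(tok_id)

    # UL term: -log(1 - p(c)) summed over negative candidates c,
    # averaged over the number of candidates.
    one_minus_p = (1.0 - probs).clamp(min=1e-6)
    loss = -(neg_mask * torch.log(one_minus_p)).sum() / neg_mask.sum().clamp(min=1.0)
    return loss
```

In the paper, a UL term of this kind is combined with the usual MLE cross-entropy loss, a second UL term is applied at the copy token level to discourage copying repetitive tokens from the source, and both losses are also computed on K-step-ahead future token predictions.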
Anthology ID:
2020.coling-main.462
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5271–5287
URL:
https://aclanthology.org/2020.coling-main.462
DOI:
10.18653/v1/2020.coling-main.462
Cite (ACL):
Hareesh Bahuleyan and Layla El Asri. 2020. Diverse Keyphrase Generation with Neural Unlikelihood Training. In Proceedings of the 28th International Conference on Computational Linguistics, pages 5271–5287, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Diverse Keyphrase Generation with Neural Unlikelihood Training (Bahuleyan & El Asri, COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.462.pdf
Code
 BorealisAI/keyphrase-generation
Data
KP20k, KPTimes, STACKEX