Improving the Naturalness and Diversity of Referring Expression Generation models using Minimum Risk Training

Nikolaos Panagiaris, Emma Hart, Dimitra Gkatzia


Abstract
In this paper, we consider the problem of optimizing neural Referring Expression Generation (REG) models with sequence-level objectives. Recently, reinforcement learning (RL) techniques have been adopted to train deep end-to-end systems to directly optimize sequence-level objectives. However, there are two issues associated with RL training: (1) effectively applying RL is challenging, and (2) the generated sentences lack diversity and naturalness due to deficiencies in the generated word distribution, a smaller vocabulary size, and the repetitiveness of frequent words and phrases. To alleviate these issues, we propose a novel strategy for training REG models that combines minimum risk training (MRT) with maximum likelihood estimation (MLE), and we show that our approach outperforms RL with respect to the naturalness and diversity of the output. Specifically, our approach achieves an increase in CIDEr scores of between 23% and 57% on two datasets. We further demonstrate the robustness of the proposed method through a detailed comparison with different REG models.
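As a rough illustration of the objective named in the abstract, the sketch below shows how an expected-risk (MRT) loss is typically computed over a set of sampled candidates, following the standard MRT formulation from neural sequence generation. This is a minimal sketch, not the paper's implementation: the PyTorch setting, the function names, the `alpha` smoothing parameter, the use of 1 − CIDEr as the per-sample risk, and the additive MLE + MRT combination are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def mrt_loss(sample_log_probs, sample_risks, alpha=0.005):
    """Expected-risk (MRT) loss over a set of sampled candidates.

    sample_log_probs: (num_samples,) tensor of total log-probabilities
        of each sampled referring expression under the current model.
    sample_risks: (num_samples,) tensor of per-sample task risks,
        e.g. 1 - CIDEr(sample, references) (an illustrative choice).
    alpha: sharpness of the renormalized distribution; small values
        smooth it, large values concentrate it on likely samples.
    """
    # Q(y) = p(y)^alpha / sum_y' p(y')^alpha, computed in log space:
    # p^alpha = exp(alpha * log p), so softmax over alpha * log_probs.
    q = F.softmax(alpha * sample_log_probs, dim=0)
    # Minimize the expected risk under Q.
    return torch.sum(q * sample_risks)

def combined_loss(mle_loss, sample_log_probs, sample_risks, lam=1.0):
    """Hypothetical MLE + MRT combination; the additive weighting with
    lam is an assumption, not necessarily the paper's exact recipe."""
    return mle_loss + lam * mrt_loss(sample_log_probs, sample_risks)
```

Note that the risks are constants with respect to the model parameters, so gradients flow only through the renormalized distribution Q; this renormalization over a sampled subset is what distinguishes MRT from plain policy-gradient RL.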
Anthology ID: 2020.inlg-1.7
Volume: Proceedings of the 13th International Conference on Natural Language Generation
Month: December
Year: 2020
Address: Dublin, Ireland
Editors: Brian Davis, Yvette Graham, John Kelleher, Yaji Sripada
Venue: INLG
SIG: SIGGEN
Publisher: Association for Computational Linguistics
Pages: 41–51
URL: https://aclanthology.org/2020.inlg-1.7
DOI: 10.18653/v1/2020.inlg-1.7
Cite (ACL): Nikolaos Panagiaris, Emma Hart, and Dimitra Gkatzia. 2020. Improving the Naturalness and Diversity of Referring Expression Generation models using Minimum Risk Training. In Proceedings of the 13th International Conference on Natural Language Generation, pages 41–51, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal): Improving the Naturalness and Diversity of Referring Expression Generation models using Minimum Risk Training (Panagiaris et al., INLG 2020)
PDF: https://aclanthology.org/2020.inlg-1.7.pdf
Data: MS COCO, RefCOCO