Enhanced Transformer Model for Data-to-Text Generation

Li Gong, Josep Crego, Jean Senellart


Abstract
Neural models have recently shown significant progress on data-to-text generation tasks, in which descriptive texts are generated conditioned on database records. In this work, we present a new Transformer-based data-to-text generation model that learns content selection and summary generation in an end-to-end fashion. We introduce two extensions to the baseline Transformer model: first, we modify the latent representation of the input, which significantly improves the content correctness of the output summary; second, we include an additional learning objective that accounts for content selection modelling. In addition, we propose two data augmentation methods that further improve the performance of the resulting generation models. Evaluation experiments show that our final model outperforms current state-of-the-art systems as measured by several metrics: BLEU, content selection precision, and content ordering. We have made the Transformer extension presented in this paper publicly available.
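As a rough illustration of the second extension, the sketch below shows one way a content-selection objective could be combined with the standard generation loss in a multi-task setup. The class name, the binary record-selection formulation, the padding index, and the weighting parameter lambda_cs are assumptions made for illustration, not the paper's actual formulation.

```python
import torch
import torch.nn as nn

class JointLoss(nn.Module):
    """Hypothetical joint objective: token-level cross-entropy for summary
    generation plus an auxiliary binary content-selection term that scores
    which input records should appear in the output (a sketch, not the
    paper's exact loss)."""

    def __init__(self, lambda_cs: float = 0.5):
        super().__init__()
        self.gen_loss = nn.CrossEntropyLoss(ignore_index=0)  # 0 = assumed padding id
        self.cs_loss = nn.BCEWithLogitsLoss()                 # record selected or not
        self.lambda_cs = lambda_cs                            # assumed auxiliary weight

    def forward(self, gen_logits, gold_tokens, cs_logits, cs_labels):
        # gen_logits: (batch, tgt_len, vocab); gold_tokens: (batch, tgt_len)
        # cs_logits:  (batch, n_records);      cs_labels:  (batch, n_records) in {0, 1}
        l_gen = self.gen_loss(gen_logits.transpose(1, 2), gold_tokens)
        l_cs = self.cs_loss(cs_logits, cs_labels.float())
        return l_gen + self.lambda_cs * l_cs
```

In such a setup the auxiliary term acts as a regularizer that pushes the encoder to identify salient records, while the generation term remains the primary training signal.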
Anthology ID:
D19-5615
Volume:
Proceedings of the 3rd Workshop on Neural Generation and Translation
Month:
November
Year:
2019
Address:
Hong Kong
Editors:
Alexandra Birch, Andrew Finch, Hiroaki Hayashi, Ioannis Konstas, Thang Luong, Graham Neubig, Yusuke Oda, Katsuhito Sudoh
Venue:
NGT
Publisher:
Association for Computational Linguistics
Pages:
148–156
URL:
https://aclanthology.org/D19-5615
DOI:
10.18653/v1/D19-5615
Cite (ACL):
Li Gong, Josep Crego, and Jean Senellart. 2019. Enhanced Transformer Model for Data-to-Text Generation. In Proceedings of the 3rd Workshop on Neural Generation and Translation, pages 148–156, Hong Kong. Association for Computational Linguistics.
Cite (Informal):
Enhanced Transformer Model for Data-to-Text Generation (Gong et al., NGT 2019)
PDF:
https://aclanthology.org/D19-5615.pdf