KB-NLG: From Knowledge Base to Natural Language Generation

Wen Cui, Minghui Zhou, Rongwen Zhao, Narges Norouzi

Abstract
We perform the natural language generation (NLG) task of mapping sets of Resource Description Framework (RDF) triples into text. First, we investigate the impact of increasing the number of entity types used in delexicalisation on generation quality. Second, we conduct experiments to evaluate two widely applied language generation architectures, an encoder-decoder with attention and the Transformer model, on a large benchmark dataset. We compare the models on automatic metrics as well as training time. To our knowledge, we are the first to apply the Transformer model to this task.
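
For concreteness, below is a minimal sketch of the kind of typed delexicalisation the abstract describes: entity mentions in the input triples and in the reference text are replaced with typed placeholders (e.g. PERSON-1, CITY-1) before training, and mapped back afterwards. The function name, placeholder scheme, type labels, and example triple are illustrative assumptions, not the authors' implementation.

# A minimal sketch of typed delexicalisation for RDF-to-text generation.
# The placeholder scheme, type labels, and example triple are illustrative
# assumptions, not the authors' implementation.

def delexicalise(triples, reference, entity_types):
    """Replace entity surface forms with typed placeholders.

    triples: list of (subject, predicate, object) strings
    reference: target sentence containing the entity mentions
    entity_types: dict mapping entity string -> type label, e.g. "CITY"
    """
    mapping = {}    # placeholder -> original entity, for re-lexicalisation
    counters = {}   # per-type counters keep repeated types distinct
    seen = {}       # entity -> placeholder already assigned
    delex_triples = []

    def placeholder(entity):
        etype = entity_types.get(entity, "ENTITY")
        counters[etype] = counters.get(etype, 0) + 1
        tag = "%s-%d" % (etype, counters[etype])
        mapping[tag] = entity
        return tag

    for s, p, o in triples:
        if s not in seen:
            seen[s] = placeholder(s)
        if o not in seen:
            seen[o] = placeholder(o)
        delex_triples.append((seen[s], p, seen[o]))

    delex_ref = reference
    # Replace longer mentions first so substrings are not clobbered.
    for entity in sorted(seen, key=len, reverse=True):
        delex_ref = delex_ref.replace(entity, seen[entity])

    return delex_triples, delex_ref, mapping

triples = [("Alan Bean", "birthPlace", "Wheeler, Texas")]
types = {"Alan Bean": "PERSON", "Wheeler, Texas": "CITY"}
print(delexicalise(triples, "Alan Bean was born in Wheeler, Texas.", types))
# -> ([("PERSON-1", "birthPlace", "CITY-1")],
#     "PERSON-1 was born in CITY-1.",
#     {"PERSON-1": "Alan Bean", "CITY-1": "Wheeler, Texas"})

Delexicalisation shrinks the vocabulary the generator must handle and lets it reuse the same surface patterns across entities; how many distinct entity types to use for the placeholders is exactly the variable the paper investigates.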
Anthology ID:
W19-3626
Volume:
Proceedings of the 2019 Workshop on Widening NLP
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Amittai Axelrod, Diyi Yang, Rossana Cunha, Samira Shaikh, Zeerak Waseem
Venue:
WiNLP
Publisher:
Association for Computational Linguistics
Pages:
80–82
URL:
https://aclanthology.org/W19-3626
Cite (ACL):
Wen Cui, Minghui Zhou, Rongwen Zhao, and Narges Norouzi. 2019. KB-NLG: From Knowledge Base to Natural Language Generation. In Proceedings of the 2019 Workshop on Widening NLP, pages 80–82, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
KB-NLG: From Knowledge Base to Natural Language Generation (Cui et al., WiNLP 2019)