Low-Resource Response Generation with Template Prior

Ze Yang, Wei Wu, Jian Yang, Can Xu, Zhoujun Li


Abstract
We study open-domain response generation with limited message-response pairs. The problem exists in real-world applications but is less explored by existing work. Since the paired data are no longer sufficient to train a neural generation model, we consider leveraging large-scale unpaired data, which are much easier to obtain, and propose response generation with both paired and unpaired data. The generation model is defined by an encoder-decoder architecture with templates as a prior, where the templates are estimated from the unpaired data by a neural hidden semi-Markov model. By this means, response generation learned from the small paired data can be aided by the semantic and syntactic knowledge in the large unpaired data. To balance the effect of the prior and the input message on response generation, we propose learning the whole generation model with an adversarial approach. Empirical studies on question response generation and sentiment response generation indicate that when only a few pairs are available, our model can significantly outperform several state-of-the-art response generation models in terms of both automatic and human evaluation.
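The sketch below is not the authors' implementation (see the linked repository TobeyYang/S2S_Temp for that); it only illustrates the high-level idea stated in the abstract: a sequence-to-sequence decoder that is additionally conditioned on a template representation, which in the paper is estimated from unpaired responses with a neural hidden semi-Markov model. All module names, dimensions, and the way the template vector is injected into the decoder are assumptions for illustration.

```python
# Minimal sketch of an encoder-decoder with a template prior (assumptions only;
# not the official S2S_Temp code). The "template" here is a learned embedding
# per latent template state; in the paper these states come from a neural
# hidden semi-Markov model trained on unpaired data.
import torch
import torch.nn as nn

class TemplatePriorSeq2Seq(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, n_templates=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Encoder over the input message
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # One vector per latent template state (hypothetical parameterization)
        self.template_emb = nn.Embedding(n_templates, hid_dim)
        # Decoder consumes the word embedding plus the template vector at each step
        self.decoder = nn.GRU(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, message, response_in, template_ids):
        # message: (B, Tm), response_in: (B, Tr), template_ids: (B, Tr)
        _, h = self.encoder(self.embed(message))     # message summary, (1, B, H)
        tmpl = self.template_emb(template_ids)       # per-step template vectors, (B, Tr, H)
        dec_in = torch.cat([self.embed(response_in), tmpl], dim=-1)
        dec_out, _ = self.decoder(dec_in, h)         # decoder initialized from the message state
        return self.out(dec_out)                     # (B, Tr, vocab_size) logits

# Toy usage with random token ids
model = TemplatePriorSeq2Seq(vocab_size=1000)
msg = torch.randint(0, 1000, (2, 7))
resp = torch.randint(0, 1000, (2, 9))
tmpl = torch.randint(0, 50, (2, 9))
print(model(msg, resp, tmpl).shape)  # torch.Size([2, 9, 1000])
```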
Anthology ID:
D19-1197
Volume:
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
Venues:
EMNLP | IJCNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1886–1897
URL:
https://aclanthology.org/D19-1197
DOI:
10.18653/v1/D19-1197
Cite (ACL):
Ze Yang, Wei Wu, Jian Yang, Can Xu, and Zhoujun Li. 2019. Low-Resource Response Generation with Template Prior. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 1886–1897, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Low-Resource Response Generation with Template Prior (Yang et al., EMNLP-IJCNLP 2019)
PDF:
https://aclanthology.org/D19-1197.pdf
Code:
TobeyYang/S2S_Temp