Variational Autoregressive Decoder for Neural Response Generation

Jiachen Du, Wenjie Li, Yulan He, Ruifeng Xu, Lidong Bing, Xuan Wang


Abstract
Combining the virtues of probabilistic graphical models and neural networks, the Conditional Variational Auto-encoder (CVAE) has shown promising performance in applications such as response generation. However, existing CVAE-based models often generate responses from a single latent variable, which may not be sufficient to model the high variability in responses. To solve this problem, we propose a novel model that sequentially introduces a series of latent variables to condition the generation of each word in the response sequence. In addition, the approximate posteriors of these latent variables are augmented with a backward Recurrent Neural Network (RNN), which allows the latent variables to capture long-term dependencies on future tokens during generation. To facilitate training, we supplement our model with an auxiliary objective that predicts the subsequent bag of words. Empirical experiments conducted on the OpenSubtitles and Reddit datasets show that the proposed model leads to significant improvements in both relevance and diversity over state-of-the-art baselines.
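The abstract describes three ingredients: a per-step latent variable conditioning each generated word, an approximate posterior augmented with a backward RNN over the response (so each latent sees a summary of future tokens), and an auxiliary bag-of-words objective. The sketch below is a minimal, hypothetical PyTorch rendering of these ideas, not the authors' released implementation; all names (VariationalAutoregressiveDecoder, prior_net, post_net, bow_proj, etc.) and the specific parameterizations are assumptions made for illustration only.

```python
# Illustrative sketch (not the paper's code): per-step latents z_t in the decoder,
# a posterior informed by a backward RNN over the gold response, and an auxiliary
# bag-of-words loss. Hyperparameters and module names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F


class VariationalAutoregressiveDecoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, z_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Forward decoder cell conditioned on the previous word and the sampled z_t.
        self.decoder_cell = nn.GRUCell(emb_dim + z_dim, hid_dim)
        # Backward RNN over the response; its state b_t summarizes future tokens
        # and augments the approximate posterior.
        self.backward_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Diagonal-Gaussian prior p(z_t | h_{t-1}) and posterior q(z_t | h_{t-1}, b_t).
        self.prior_net = nn.Linear(hid_dim, 2 * z_dim)
        self.post_net = nn.Linear(2 * hid_dim, 2 * z_dim)
        self.out_proj = nn.Linear(hid_dim, vocab_size)
        # Auxiliary head: z_t predicts the bag of subsequent words.
        self.bow_proj = nn.Linear(z_dim, vocab_size)

    @staticmethod
    def reparameterize(mu, logvar):
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    @staticmethod
    def kl(mu_q, logvar_q, mu_p, logvar_p):
        # KL(q || p) between two diagonal Gaussians, summed over latent dims.
        return 0.5 * (
            logvar_p - logvar_q
            + (logvar_q.exp() + (mu_q - mu_p) ** 2) / logvar_p.exp()
            - 1.0
        ).sum(-1)

    def forward(self, response, context_state):
        """response: (batch, T) gold token ids; context_state: (batch, hid_dim)."""
        batch, T = response.shape
        emb = self.embed(response)                      # (batch, T, emb_dim)
        # Backward states: reverse, run the RNN, reverse back so b_t covers tokens >= t.
        b_states, _ = self.backward_rnn(emb.flip(1))
        b_states = b_states.flip(1)
        h = context_state
        recon, kl_total, bow_total = 0.0, 0.0, 0.0
        for t in range(T - 1):
            mu_p, logvar_p = self.prior_net(h).chunk(2, dim=-1)
            mu_q, logvar_q = self.post_net(
                torch.cat([h, b_states[:, t + 1]], dim=-1)
            ).chunk(2, dim=-1)
            z = self.reparameterize(mu_q, logvar_q)
            h = self.decoder_cell(torch.cat([emb[:, t], z], dim=-1), h)
            logits = self.out_proj(h)
            recon = recon + F.cross_entropy(logits, response[:, t + 1], reduction="none")
            kl_total = kl_total + self.kl(mu_q, logvar_q, mu_p, logvar_p)
            # Auxiliary bag-of-words objective: z_t should score all remaining words.
            bow_logits = self.bow_proj(z)
            future = response[:, t + 1:]
            bow_total = bow_total + F.cross_entropy(
                bow_logits.unsqueeze(1)
                .expand(-1, future.size(1), -1)
                .reshape(-1, bow_logits.size(-1)),
                future.reshape(-1),
                reduction="none",
            ).view(batch, -1).mean(-1)
        # Negative ELBO plus the auxiliary loss, averaged over the batch.
        return (recon + kl_total + bow_total).mean()


if __name__ == "__main__":
    model = VariationalAutoregressiveDecoder(vocab_size=1000)
    tokens = torch.randint(0, 1000, (4, 12))
    ctx = torch.zeros(4, 256)
    loss = model(tokens, ctx)
    loss.backward()
    print(loss.item())
```

The key structural point the sketch tries to convey is that the posterior for each z_t conditions on a backward RNN state over the remaining response, so the latent can anticipate future tokens, while the bag-of-words head gives the latent a direct training signal about those tokens. Details such as KL annealing, attention over the dialogue context, and inference-time decoding are omitted.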
Anthology ID:
D18-1354
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3154–3163
URL:
https://aclanthology.org/D18-1354
DOI:
10.18653/v1/D18-1354
Cite (ACL):
Jiachen Du, Wenjie Li, Yulan He, Ruifeng Xu, Lidong Bing, and Xuan Wang. 2018. Variational Autoregressive Decoder for Neural Response Generation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 3154–3163, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Variational Autoregressive Decoder for Neural Response Generation (Du et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1354.pdf
Video:
https://aclanthology.org/D18-1354.mp4