Sequence to Backward and Forward Sequences: A Content-Introducing Approach to Generative Short-Text Conversation

Lili Mou, Yiping Song, Rui Yan, Ge Li, Lu Zhang, Zhi Jin


Abstract
Using neural networks to generate replies in human-computer dialogue systems has attracted increasing attention over the past few years. However, the performance is not satisfactory: the neural network tends to generate safe, universally relevant replies which carry little meaning. In this paper, we propose a content-introducing approach to neural network-based generative dialogue systems. We first use pointwise mutual information (PMI) to predict a noun as a keyword, reflecting the main gist of the reply. We then propose seq2BF, a “sequence to backward and forward sequences” model, which generates a reply containing the given keyword. Experimental results show that our approach significantly outperforms traditional sequence-to-sequence models in terms of human evaluation and the entropy measure, and that the predicted keyword can appear at an appropriate position in the reply.
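
The two stages the abstract describes can be illustrated with a minimal Python sketch: PMI-based keyword prediction from query-reply co-occurrence counts, and assembly of a seq2BF reply around the predicted keyword. The counting scheme, the candidate-noun list, and the backward_decoder/forward_decoder callables (stand-ins for the paper's two trained RNN decoders) are assumptions for illustration, not the authors' implementation.

    import math
    from collections import Counter
    from itertools import product

    def train_pmi(pairs):
        """Build a PMI scorer from (query_tokens, reply_tokens) pairs,
        counting pair-level co-occurrence of query words and reply words."""
        q_counts, r_counts, joint = Counter(), Counter(), Counter()
        for q, r in pairs:
            q_set, r_set = set(q), set(r)
            q_counts.update(q_set)
            r_counts.update(r_set)
            joint.update(product(q_set, r_set))
        n = len(pairs)

        def pmi(wq, wr):
            # PMI(wq, wr) = log[ p(wq, wr) / (p(wq) p(wr)) ];
            # unseen pairs get -inf so they are never preferred.
            c = joint[(wq, wr)]
            if c == 0:
                return float("-inf")
            return math.log(c * n / (q_counts[wq] * r_counts[wr]))

        return pmi

    def predict_keyword(query_tokens, candidate_nouns, pmi):
        """Pick the candidate reply noun with the highest total PMI
        against the words of the query."""
        return max(candidate_nouns,
                   key=lambda w: sum(pmi(wq, w) for wq in set(query_tokens)))

    def seq2bf_reply(keyword, backward_decoder, forward_decoder):
        """Assemble a reply around the keyword: first decode the words that
        precede it (hypothetical backward_decoder emits them in reverse
        order), then decode the rest of the sentence conditioned on the
        already-fixed first half (hypothetical forward_decoder)."""
        preceding = list(reversed(backward_decoder(keyword)))
        rest = forward_decoder(preceding + [keyword])
        return preceding + [keyword] + rest

Because the backward decoder runs first, the keyword is not forced to the start of the sentence: the forward decoder only completes what the backward pass has already placed before the keyword, which is how the paper lets the keyword land at an arbitrary position in the reply.
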
Anthology ID: C16-1316
Volume: Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers
Month: December
Year: 2016
Address: Osaka, Japan
Editors: Yuji Matsumoto, Rashmi Prasad
Venue: COLING
Publisher: The COLING 2016 Organizing Committee
Pages: 3349–3358
URL: https://aclanthology.org/C16-1316
Cite (ACL): Lili Mou, Yiping Song, Rui Yan, Ge Li, Lu Zhang, and Zhi Jin. 2016. Sequence to Backward and Forward Sequences: A Content-Introducing Approach to Generative Short-Text Conversation. In Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, pages 3349–3358, Osaka, Japan. The COLING 2016 Organizing Committee.
Cite (Informal): Sequence to Backward and Forward Sequences: A Content-Introducing Approach to Generative Short-Text Conversation (Mou et al., COLING 2016)
PDF: https://aclanthology.org/C16-1316.pdf