Towards Implicit Content-Introducing for Generative Short-Text Conversation Systems

Lili Yao, Yaoyuan Zhang, Yansong Feng, Dongyan Zhao, Rui Yan


Abstract
Human-computer conversation systems are a hot research topic nowadays. One of the prevailing methods for building such systems is the generative Sequence-to-Sequence (Seq2Seq) model based on neural networks. However, the standard Seq2Seq model is prone to generating trivial responses. In this paper, we aim to generate more meaningful and informative replies to a given question. We propose an implicit content-introducing method that incorporates additional information into the Seq2Seq model in a flexible way. Specifically, we fuse the general decoding and the auxiliary cue word information through our proposed hierarchical gated fusion unit. Experiments on real-life data demonstrate that our model consistently outperforms a set of competitive baselines in terms of BLEU scores and human evaluation.
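The core idea of the fusion step can be illustrated with a minimal sketch: an element-wise gate blends the ordinary decoder hidden state with a hidden state driven by the cue word, so the cue word influences generation without being forced into a fixed position. This is a simplified illustration, not the paper's exact hierarchical gated fusion unit; the weight shapes and the single-gate formulation here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy hidden size

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical gate parameters (the paper's HGFU uses its own
# parameterization; this is only a sketch of gated fusion).
W = rng.standard_normal((d, 2 * d)) * 0.1
b = np.zeros(d)

def gated_fusion(h_dec, h_cue):
    """Blend decoder state and cue-word state with a gate k in (0, 1).

    Each output element is a convex combination of the two inputs,
    so the model can lean on either signal per dimension.
    """
    k = sigmoid(W @ np.concatenate([h_dec, h_cue]) + b)
    return k * h_dec + (1.0 - k) * h_cue

h_dec = rng.standard_normal(d)   # standard decoding hidden state
h_cue = rng.standard_normal(d)   # hidden state carrying cue word information
h_fused = gated_fusion(h_dec, h_cue)
```

Because the gate is element-wise and bounded in (0, 1), each dimension of the fused state lies between the corresponding dimensions of the two inputs, letting the decoder softly interpolate rather than hard-switch between its own prediction and the cue word signal.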
Anthology ID:
D17-1233
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2190–2199
URL:
https://aclanthology.org/D17-1233
DOI:
10.18653/v1/D17-1233
Cite (ACL):
Lili Yao, Yaoyuan Zhang, Yansong Feng, Dongyan Zhao, and Rui Yan. 2017. Towards Implicit Content-Introducing for Generative Short-Text Conversation Systems. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 2190–2199, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
Towards Implicit Content-Introducing for Generative Short-Text Conversation Systems (Yao et al., EMNLP 2017)
PDF:
https://aclanthology.org/D17-1233.pdf