Learning a Simple and Effective Model for Multi-turn Response Generation with Auxiliary Tasks

Yufan Zhao, Can Xu, Wei Wu


Abstract
We study multi-turn response generation for open-domain dialogues. The existing state of the art addresses the problem with deep neural architectures. While these models have improved response quality, their complexity also hinders their application in real systems. In this work, we pursue a model that has a simple structure yet can effectively leverage conversation contexts for response generation. To this end, we propose four auxiliary tasks, including word order recovery, utterance order recovery, masked word recovery, and masked utterance recovery, and optimize the objectives of these tasks together with maximizing the likelihood of generation. In this way, the auxiliary tasks, which relate to context understanding, can guide the learning of the generation model toward a better local optimum. Empirical studies on three benchmarks indicate that our model significantly outperforms state-of-the-art generation models in terms of response quality under both automatic evaluation and human judgment, while at the same time enjoying a much faster decoding process.
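The joint objective described above can be sketched as a weighted sum of the generation loss and the four auxiliary-task losses. This is a minimal illustrative sketch, not the paper's exact formulation: the task names, the uniform weight, and the function signature are assumptions.

```python
# Illustrative names for the four auxiliary tasks from the abstract;
# the exact identifiers are an assumption, not taken from the paper.
AUX_TASKS = (
    "word_order_recovery",
    "utterance_order_recovery",
    "masked_word_recovery",
    "masked_utterance_recovery",
)


def joint_loss(generation_nll, aux_losses, aux_weight=1.0):
    """Combine the generation loss with the auxiliary-task losses.

    generation_nll: negative log-likelihood of the response (a float).
    aux_losses: dict mapping each task name in AUX_TASKS to its loss.
    aux_weight: scalar weight on each auxiliary term (a uniform weight
        is an assumption; the paper may weight tasks differently).
    """
    total = generation_nll
    for task in AUX_TASKS:
        total += aux_weight * aux_losses[task]
    return total
```

For example, with a generation NLL of 2.5, each auxiliary loss at 0.5, and a weight of 0.1, the combined objective is 2.5 + 4 × 0.05 = 2.7.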
Anthology ID:
2020.emnlp-main.279
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3472–3483
URL:
https://aclanthology.org/2020.emnlp-main.279
DOI:
10.18653/v1/2020.emnlp-main.279
Cite (ACL):
Yufan Zhao, Can Xu, and Wei Wu. 2020. Learning a Simple and Effective Model for Multi-turn Response Generation with Auxiliary Tasks. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 3472–3483, Online. Association for Computational Linguistics.
Cite (Informal):
Learning a Simple and Effective Model for Multi-turn Response Generation with Auxiliary Tasks (Zhao et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.279.pdf
Video:
https://slideslive.com/38938759
Data
DailyDialog, UDC