Learning to Customize Model Structures for Few-shot Dialogue Generation Tasks

Yiping Song, Zequn Liu, Wei Bi, Rui Yan, Ming Zhang


Abstract
Training generative models on a minimal corpus is one of the critical challenges in building open-domain dialogue systems. Existing methods tend to use the meta-learning framework, which pre-trains the parameters on all non-target tasks and then fine-tunes them on the target task. However, fine-tuning distinguishes tasks from the parameter perspective but ignores the model-structure perspective, resulting in similar dialogue models for different tasks. In this paper, we propose an algorithm that can customize a unique dialogue model for each task in the few-shot setting. In our approach, each dialogue model consists of a shared module, a gating module, and a private module. The first two modules are shared among all the tasks, while the third one will differentiate into different network structures to better capture the characteristics of the corresponding task. Extensive experiments on two datasets show that our method outperforms all the baselines in terms of task consistency, response quality, and diversity.
Anthology ID: 2020.acl-main.517
Volume: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2020
Address: Online
Editors: Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 5832–5841
URL: https://aclanthology.org/2020.acl-main.517
DOI: 10.18653/v1/2020.acl-main.517
Cite (ACL): Yiping Song, Zequn Liu, Wei Bi, Rui Yan, and Ming Zhang. 2020. Learning to Customize Model Structures for Few-shot Dialogue Generation Tasks. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 5832–5841, Online. Association for Computational Linguistics.
Cite (Informal): Learning to Customize Model Structures for Few-shot Dialogue Generation Tasks (Song et al., ACL 2020)
PDF: https://aclanthology.org/2020.acl-main.517.pdf
Video: http://slideslive.com/38928970
Code: zequnl/CMAML