Learning Low-Resource End-To-End Goal-Oriented Dialog for Fast and Reliable System Deployment

Yinpei Dai, Hangyu Li, Chengguang Tang, Yongbin Li, Jian Sun, Xiaodan Zhu


Abstract
Existing end-to-end dialog systems perform less effectively when data is scarce. To achieve acceptable success in real-life online services with only a handful of training examples, both fast adaptability and reliable performance are highly desirable for dialog systems. In this paper, we propose the Meta-Dialog System (MDS), which combines the advantages of meta-learning approaches and human-machine collaboration. We evaluate our methods on a new extended-bAbI dataset and a transformed MultiWOZ dataset for low-resource goal-oriented dialog learning. Experimental results show that MDS significantly outperforms non-meta-learning baselines and achieves per-turn accuracy above 90% with only 10 dialogs on the extended-bAbI dataset.
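
The abstract refers to meta-learning for fast adaptation but the page itself does not describe the algorithm. For orientation only, below is a minimal first-order (Reptile-style) meta-learning sketch for adapting a dialog model to a new domain from a few dialogs; the ResponseRanker model, the data shapes, and the hyperparameters are hypothetical illustrations and this is not the authors' MDS implementation.

# Hypothetical sketch of the meta-learning side of low-resource dialog adaptation.
# NOT the authors' MDS code; model, data, and hyperparameters are assumed for illustration.
import torch
import torch.nn as nn

class ResponseRanker(nn.Module):
    """Toy stand-in for a dialog model that scores candidate responses."""
    def __init__(self, feat_dim=32, num_responses=10):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(feat_dim, 64), nn.ReLU(), nn.Linear(64, num_responses))

    def forward(self, x):
        return self.net(x)

def reptile_meta_step(model, tasks, inner_lr=1e-2, meta_lr=1e-3, inner_steps=5):
    """One first-order (Reptile-style) meta-update over a batch of dialog domains."""
    loss_fn = nn.CrossEntropyLoss()
    meta_params = [p.detach().clone() for p in model.parameters()]
    for task in tasks:                          # task = list of (context_features, response_id) batches
        with torch.no_grad():                   # reset to the current meta-parameters
            for p, mp in zip(model.parameters(), meta_params):
                p.copy_(mp)
        opt = torch.optim.SGD(model.parameters(), lr=inner_lr)
        for x, y in task[:inner_steps]:         # adapt on a handful of in-domain examples
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
        with torch.no_grad():                   # move meta-parameters toward the adapted ones
            for p, mp in zip(model.parameters(), meta_params):
                mp += meta_lr * (p - mp)
    with torch.no_grad():                       # write the updated meta-parameters back into the model
        for p, mp in zip(model.parameters(), meta_params):
            p.copy_(mp)

# Usage with random stand-in data: two "domains", three batches each.
model = ResponseRanker()
tasks = [[(torch.randn(8, 32), torch.randint(0, 10, (8,))) for _ in range(3)] for _ in range(2)]
reptile_meta_step(model, tasks)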
Anthology ID: 2020.acl-main.57
Volume: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2020
Address: Online
Editors: Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 609–618
URL: https://aclanthology.org/2020.acl-main.57
DOI: 10.18653/v1/2020.acl-main.57
Cite (ACL): Yinpei Dai, Hangyu Li, Chengguang Tang, Yongbin Li, Jian Sun, and Xiaodan Zhu. 2020. Learning Low-Resource End-To-End Goal-Oriented Dialog for Fast and Reliable System Deployment. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 609–618, Online. Association for Computational Linguistics.
Cite (Informal): Learning Low-Resource End-To-End Goal-Oriented Dialog for Fast and Reliable System Deployment (Dai et al., ACL 2020)
PDF: https://aclanthology.org/2020.acl-main.57.pdf
Dataset: 2020.acl-main.57.Dataset.zip
Video: http://slideslive.com/38929454