Towards Low-Resource Semi-Supervised Dialogue Generation with Meta-Learning

Yi Huang, Junlan Feng, Shuo Ma, Xiaoyu Du, Xiaoting Wu


Abstract
In this paper, we propose a meta-learning based semi-supervised explicit dialogue state tracker (SEDST) for neural dialogue generation, denoted as MEDST. Our main motivation is to further bridge the gap between the need for a high-accuracy dialogue state tracker and the common reality that only scarce annotated data is available for most real-life dialogue tasks. Specifically, MEDST has two core steps: meta-training on adequate unlabelled data in an automatic way, and meta-testing on a small amount of annotated data via supervised learning. In particular, we enhance SEDST via entropy regularization, and investigate semi-supervised learning frameworks based on model-agnostic meta-learning (MAML) that reduce the amount of required intermediate state labelling. We find that by leveraging un-annotated data in this meta-learning fashion, the amount of dialogue state annotations can be reduced below 10% while maintaining equivalent system performance. Experimental results show that MEDST outperforms SEDST substantially, by 18.7% joint goal accuracy and 14.3% entity match rate, on the KVRET corpus with 2% labelled data in semi-supervision.
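The abstract's two-step procedure follows the standard MAML pattern: an inner loop adapts parameters on a per-task support set (here, the automatic meta-training signal from unlabelled dialogues), and an outer loop updates the shared initialization against query-set loss (here, the small annotated set). A minimal sketch of that inner/outer update, using a toy scalar model and quadratic loss rather than the paper's actual tracker (all names and values below are hypothetical illustrations, not the authors' code):

```python
# Toy MAML inner/outer loop (hypothetical sketch, not the paper's implementation).
# Model: a single scalar parameter theta; each "task" supplies a support target
# (analogous to auto-generated supervision on unlabelled dialogues) and a query
# target (analogous to the few annotated dialogues used for the meta-update).

def loss(theta, target):
    return (theta - target) ** 2

def grad(theta, target):
    return 2.0 * (theta - target)

def maml_step(theta, tasks, inner_lr=0.1, outer_lr=0.05):
    """One meta-update over a batch of tasks; each task is (support, query)."""
    meta_grad = 0.0
    for support, query in tasks:
        # Inner loop: one gradient step on the task's support target.
        adapted = theta - inner_lr * grad(theta, support)
        # Outer gradient through the inner step (chain rule); for this
        # quadratic loss, d(adapted)/d(theta) = 1 - 2 * inner_lr.
        meta_grad += grad(adapted, query) * (1.0 - 2.0 * inner_lr)
    return theta - outer_lr * meta_grad / len(tasks)

theta = 0.0
tasks = [(1.0, 1.2), (2.0, 1.8), (3.0, 3.1)]
for _ in range(200):
    theta = maml_step(theta, tasks)
# theta converges to the initialization that, after one inner step per task,
# minimizes the average query loss (about 2.04 for these toy targets).
```

In the paper's setting the inner-loop "support" signal comes from unlabelled data processed automatically, so the scarce annotated dialogues are spent only on the outer meta-update, which is what lets the annotation budget drop below 10%.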
Anthology ID:
2020.findings-emnlp.368
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4123–4128
URL:
https://aclanthology.org/2020.findings-emnlp.368
DOI:
10.18653/v1/2020.findings-emnlp.368
Cite (ACL):
Yi Huang, Junlan Feng, Shuo Ma, Xiaoyu Du, and Xiaoting Wu. 2020. Towards Low-Resource Semi-Supervised Dialogue Generation with Meta-Learning. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4123–4128, Online. Association for Computational Linguistics.
Cite (Informal):
Towards Low-Resource Semi-Supervised Dialogue Generation with Meta-Learning (Huang et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.368.pdf