Slot-consistent NLG for Task-oriented Dialogue Systems with Iterative Rectification Network

Yangming Li, Kaisheng Yao, Libo Qin, Wanxiang Che, Xiaolong Li, Ting Liu


Abstract
Data-driven approaches using neural networks have achieved promising performance in natural language generation (NLG). However, neural generators are prone to mistakes, e.g., neglecting an input slot value or generating a redundant slot value. Prior works refer to this as the hallucination phenomenon. In this paper, we study slot consistency for building reliable NLG systems in which all slot values of the input dialogue act (DA) are properly generated in output sentences. We propose the Iterative Rectification Network (IRN), which improves general NLG systems to produce both correct and fluent responses. It applies a bootstrapping algorithm to sample training candidates and uses reinforcement learning to incorporate a discrete reward related to slot inconsistency into training. Comprehensive studies conducted on multiple benchmark datasets show that the proposed methods significantly reduce the slot error rate (ERR) for all strong baselines. Human evaluations also confirm its effectiveness.
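The slot error rate (ERR) mentioned above counts slots the generator misses or produces redundantly, relative to the number of slots in the input DA. A minimal sketch of that metric, assuming delexicalized slot placeholders like `[name]` in the output tokens (the slot names and placeholder convention here are illustrative, not from the paper):

```python
from collections import Counter

def slot_error_rate(da_slots, generated_tokens):
    """ERR = (p + q) / N, with p missing slots, q redundant slots,
    and N the number of slots required by the dialogue act.

    da_slots: slot placeholders required by the DA, e.g. ["[name]", "[area]"].
    generated_tokens: tokenized, delexicalized output sentence.
    """
    required = Counter(da_slots)
    # Count every placeholder token the generator actually emitted.
    produced = Counter(t for t in generated_tokens
                       if t.startswith("[") and t.endswith("]"))
    missing = sum((required - produced).values())    # p: slots never realized
    redundant = sum((produced - required).values())  # q: extra realizations
    n = sum(required.values())
    return (missing + redundant) / n if n else 0.0

# One required slot ([pricerange]) is missing from the output: ERR = 1/3.
sent = "[name] is a cheap restaurant in the [area]".split()
print(slot_error_rate(["[name]", "[area]", "[pricerange]"], sent))
```

A slot-consistent generation, under this sketch, is simply one whose ERR is zero for its input DA.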
Anthology ID:
2020.acl-main.10
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
97–106
URL:
https://aclanthology.org/2020.acl-main.10
DOI:
10.18653/v1/2020.acl-main.10
Cite (ACL):
Yangming Li, Kaisheng Yao, Libo Qin, Wanxiang Che, Xiaolong Li, and Ting Liu. 2020. Slot-consistent NLG for Task-oriented Dialogue Systems with Iterative Rectification Network. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 97–106, Online. Association for Computational Linguistics.
Cite (Informal):
Slot-consistent NLG for Task-oriented Dialogue Systems with Iterative Rectification Network (Li et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.10.pdf
Video:
http://slideslive.com/38928822