A Dual-Attention Hierarchical Recurrent Neural Network for Dialogue Act Classification

Ruizhe Li, Chenghua Lin, Matthew Collinson, Xiao Li, Guanyi Chen


Abstract
Recognising dialogue acts (DA) is important for many natural language processing tasks such as dialogue generation and intention recognition. In this paper, we propose a dual-attention hierarchical recurrent neural network for DA classification. Our model is partially inspired by the observation that conversational utterances are normally associated with both a DA and a topic, where the former captures the social act and the latter describes the subject matter. However, such a dependency between DAs and topics has not been utilised by most existing systems for DA classification. With a novel dual task-specific attention mechanism, our model captures, for each utterance, information about both its DA and its topic, as well as the interactions between them. Experimental results show that by modelling topic as an auxiliary task, our model can significantly improve DA classification, yielding performance better than or comparable to the state-of-the-art method on three public datasets.
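Below is a minimal, illustrative sketch (not the authors' implementation) of the kind of dual-attention hierarchical recurrent model the abstract describes: a word-level bidirectional GRU encodes each utterance, two learned task-specific attention queries (one for DAs, one for topics) pool the word states into two utterance views, and a conversation-level GRU contextualises these before separate DA and topic classification heads, with topic classification serving as the auxiliary task. All layer names, sizes, and the exact attention formulation here are assumptions.

import torch
import torch.nn as nn

class DualAttentionHierRNN(nn.Module):
    """Hypothetical dual-attention hierarchical GRU for joint DA/topic tagging."""

    def __init__(self, vocab_size, emb_dim=100, hid_dim=128,
                 num_da_classes=10, num_topic_classes=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Utterance-level encoder: word embeddings -> word states
        self.word_rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        # Conversation-level encoder: utterance vectors -> contextual states
        self.utt_rnn = nn.GRU(4 * hid_dim, hid_dim, batch_first=True, bidirectional=True)
        # Dual task-specific attention: one learned query per task
        self.da_query = nn.Parameter(torch.randn(2 * hid_dim))
        self.topic_query = nn.Parameter(torch.randn(2 * hid_dim))
        self.da_head = nn.Linear(2 * hid_dim, num_da_classes)
        self.topic_head = nn.Linear(2 * hid_dim, num_topic_classes)

    @staticmethod
    def attend(states, query):
        # states: (batch, seq, 2*hid); query: (2*hid,) -> weighted sum (batch, 2*hid)
        weights = torch.softmax(states @ query, dim=1)
        return (weights.unsqueeze(-1) * states).sum(dim=1)

    def forward(self, dialog):
        # dialog: (batch, num_utts, num_words) of padded word ids
        b, u, w = dialog.shape
        word_states, _ = self.word_rnn(self.embed(dialog.view(b * u, w)))
        # Task-specific views of each utterance (DA attention vs. topic attention)
        da_utt = self.attend(word_states, self.da_query)
        topic_utt = self.attend(word_states, self.topic_query)
        utt_vecs = torch.cat([da_utt, topic_utt], dim=-1).view(b, u, -1)
        ctx, _ = self.utt_rnn(utt_vecs)  # contextual utterance states
        return self.da_head(ctx), self.topic_head(ctx)

# Toy usage: topic classification acts as the auxiliary task in a joint loss.
model = DualAttentionHierRNN(vocab_size=5000)
dialog = torch.randint(1, 5000, (2, 6, 20))  # 2 dialogues, 6 utterances, 20 words
da_logits, topic_logits = model(dialog)
da_gold = torch.randint(0, 10, (2, 6))
topic_gold = torch.randint(0, 10, (2, 6))
loss = (nn.functional.cross_entropy(da_logits.view(-1, 10), da_gold.view(-1))
        + nn.functional.cross_entropy(topic_logits.view(-1, 10), topic_gold.view(-1)))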
Anthology ID:
K19-1036
Volume:
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Mohit Bansal, Aline Villavicencio
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
383–392
URL:
https://aclanthology.org/K19-1036
DOI:
10.18653/v1/K19-1036
Cite (ACL):
Ruizhe Li, Chenghua Lin, Matthew Collinson, Xiao Li, and Guanyi Chen. 2019. A Dual-Attention Hierarchical Recurrent Neural Network for Dialogue Act Classification. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 383–392, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
A Dual-Attention Hierarchical Recurrent Neural Network for Dialogue Act Classification (Li et al., CoNLL 2019)
PDF:
https://aclanthology.org/K19-1036.pdf
Data:
DailyDialog, Switchboard-1 Corpus