Modeling with Recurrent Neural Networks for Open Vocabulary Slots

Jun-Seong Kim, Junghoe Kim, SeungUn Park, Kwangyong Lee, Yoonju Lee


Abstract
Dealing with ‘open-vocabulary’ slots remains a challenge in natural language processing. While recent attention-based recurrent neural network (RNN) models have performed well on language-related tasks such as spoken language understanding and dialogue systems, few attempts have been made to fill slots whose values are drawn from a virtually unlimited set. In this paper, we propose a new RNN model built around a key idea: the perceived role of a word can vary according to how long a reader focuses on a particular part of a sentence. The proposed model combines a long-term aware attention structure, a positional encoding based primarily on the relative distance between words, and multi-task learning with a character-based language model and an intent detection model. We show that the model outperforms existing RNN models at discovering ‘open-vocabulary’ slots without any external information, such as a named entity database or knowledge base. In particular, by evaluating the models on datasets from several domains, we confirm that its advantage grows as a dataset contains more slots, including unknown words. In addition, the proposed model also achieves superior performance on intent detection.
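The abstract does not spell out the attention formulation, so the following is only a minimal NumPy sketch of the general idea it names: attention over RNN hidden states whose weights depend on the relative distance between words. The Gaussian distance penalty (in the style of Luong et al.'s local attention), the function name, and the sigma parameter are illustrative assumptions, not the paper's method.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def relative_distance_attention(hidden_states, query, sigma=2.0):
    """Attention over RNN hidden states with a positional bias that
    decays with the relative distance from the query word.

    hidden_states: (T, d) array of encoder states, one per word
    query: index of the word whose slot label is being predicted
    sigma: width of the distance penalty (illustrative hyperparameter)
    """
    T, d = hidden_states.shape
    h_t = hidden_states[query]
    # Content score: scaled dot product between the query state and every state.
    content = hidden_states @ h_t / np.sqrt(d)
    # Positional term: penalize states far from the query position,
    # so nearby words receive more attention by default.
    positions = np.arange(T)
    distance_bias = -((positions - query) ** 2) / (2.0 * sigma ** 2)
    weights = softmax(content + distance_bias)
    # Context vector: distance-aware weighted sum of hidden states.
    return weights @ hidden_states, weights

# Toy usage: 5 "words" with 4-dimensional hidden states.
rng = np.random.default_rng(0)
H = rng.normal(size=(5, 4))
context, alpha = relative_distance_attention(H, query=2)
print(alpha)  # attention mass concentrated near position 2
```

In a distance-aware scheme like this, nearby words dominate the context vector unless a distant word's content score is strong enough to overcome the positional penalty.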
Anthology ID: C18-1235
Volume: Proceedings of the 27th International Conference on Computational Linguistics
Month: August
Year: 2018
Address: Santa Fe, New Mexico, USA
Editors: Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue: COLING
Publisher: Association for Computational Linguistics
Pages: 2778–2790
URL: https://aclanthology.org/C18-1235
Cite (ACL): Jun-Seong Kim, Junghoe Kim, SeungUn Park, Kwangyong Lee, and Yoonju Lee. 2018. Modeling with Recurrent Neural Networks for Open Vocabulary Slots. In Proceedings of the 27th International Conference on Computational Linguistics, pages 2778–2790, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal): Modeling with Recurrent Neural Networks for Open Vocabulary Slots (Kim et al., COLING 2018)
PDF: https://aclanthology.org/C18-1235.pdf
Data: ATIS