Bag of Experts Architectures for Model Reuse in Conversational Language Understanding

Rahul Jha, Alex Marin, Suvamsh Shivaprasad, Imed Zitouni


Abstract
Slot tagging, the task of detecting entities in input user utterances, is a key component of natural language understanding systems for personal digital assistants. Since each new domain requires a different set of slots, the cost of annotating training data for slot tagging models increases rapidly as the number of domains grows. To tackle this, we describe Bag of Experts (BoE) architectures that enable model reuse for both LSTM- and CRF-based models. Extensive experimentation over a dataset of 10 domains drawn from data relevant to our commercial personal digital assistant shows that our BoE models outperform the baseline models by a statistically significant average margin of 5.06% in absolute F1-score when training with 2000 instances per domain, and achieve an even higher improvement of 12.16% when only 25% of the training data is used.
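To make the BoE idea concrete, below is a minimal PyTorch sketch of a slot tagger that reuses pre-trained domain experts. It assumes the experts are frozen BiLSTM encoders from previously trained domains whose outputs are concatenated with word embeddings before a target-domain encoder; the class names, hyperparameters, and wiring are illustrative assumptions, not the paper's exact architecture.

import torch
import torch.nn as nn


class BoESlotTagger(nn.Module):
    """Sketch of a Bag-of-Experts style slot tagger (hypothetical wiring)."""

    def __init__(self, vocab_size, emb_dim, hidden_dim, num_tags, experts):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # Frozen experts: BiLSTM encoders reused from other domains.
        # Assumption: each expert takes emb_dim-sized inputs.
        self.experts = nn.ModuleList(experts)
        for expert in self.experts:
            for p in expert.parameters():
                p.requires_grad = False
        expert_dim = sum(2 * e.hidden_size for e in self.experts)
        # Target-domain BiLSTM consumes word embeddings plus expert features.
        self.encoder = nn.LSTM(emb_dim + expert_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        emb = self.embedding(token_ids)                    # (B, T, emb_dim)
        expert_feats = [e(emb)[0] for e in self.experts]   # each (B, T, 2*h_e)
        features = torch.cat([emb] + expert_feats, dim=-1)
        hidden, _ = self.encoder(features)                 # (B, T, 2*hidden_dim)
        return self.classifier(hidden)                     # per-token tag scores


# Usage with two hypothetical experts, a 5000-word vocabulary, and 20 slot tags.
experts = [nn.LSTM(100, 64, batch_first=True, bidirectional=True) for _ in range(2)]
tagger = BoESlotTagger(vocab_size=5000, emb_dim=100, hidden_dim=128,
                       num_tags=20, experts=experts)
scores = tagger(torch.randint(0, 5000, (4, 12)))  # batch of 4 utterances, 12 tokens
print(scores.shape)  # torch.Size([4, 12, 20])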
Anthology ID:
N18-3019
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 3 (Industry Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Srinivas Bangalore, Jennifer Chu-Carroll, Yunyao Li
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
153–161
URL:
https://aclanthology.org/N18-3019
DOI:
10.18653/v1/N18-3019
Cite (ACL):
Rahul Jha, Alex Marin, Suvamsh Shivaprasad, and Imed Zitouni. 2018. Bag of Experts Architectures for Model Reuse in Conversational Language Understanding. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 3 (Industry Papers), pages 153–161, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Bag of Experts Architectures for Model Reuse in Conversational Language Understanding (Jha et al., NAACL 2018)
PDF:
https://aclanthology.org/N18-3019.pdf
Video:
https://aclanthology.org/N18-3019.mp4