XtremeDistil: Multi-stage Distillation for Massive Multilingual Models

Subhabrata Mukherjee, Ahmed Hassan Awadallah


Abstract
Deep and large pre-trained language models are the state of the art for various natural language processing tasks. However, the huge size of these models could be a deterrent to using them in practice. Some recent works use knowledge distillation to compress these huge models into shallow ones. In this work we study knowledge distillation with a focus on multilingual Named Entity Recognition (NER). In particular, we study several distillation strategies and propose a stage-wise optimization scheme, agnostic of teacher architecture, that leverages teacher internal representations, and show that it outperforms strategies employed in prior works. Additionally, we investigate the role of several factors, including the amount of unlabeled data, annotation resources, model architecture, and inference latency. We show that our approach leads to massive compression of teacher models like mBERT, by up to 35x in terms of parameters and 51x in terms of latency for batch inference, while retaining 95% of its F1-score for NER over 41 languages.
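To make the stage-wise idea concrete, below is a minimal, hypothetical PyTorch sketch of the two-stage scheme described in the abstract: Stage 1 regresses a projected student layer onto a teacher internal layer, and Stage 2 distills the teacher's temperature-softened label predictions. This is not the authors' released code; the dimensions, layer choice, and temperature value are illustrative assumptions.

# Minimal sketch of multi-stage distillation (illustrative, not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

TEACHER_DIM, STUDENT_DIM, NUM_LABELS = 768, 300, 9  # assumed sizes

# Linear map from student space to teacher space, trained in Stage 1.
projection = nn.Linear(STUDENT_DIM, TEACHER_DIM)

def stage1_loss(student_hidden, teacher_hidden):
    """Stage 1: fit projected student states to a teacher internal layer (MSE)."""
    return F.mse_loss(projection(student_hidden), teacher_hidden)

def stage2_loss(student_logits, teacher_logits, temperature=2.0):
    """Stage 2: KL divergence between temperature-softened label distributions."""
    t = temperature
    return F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.softmax(teacher_logits / t, dim=-1),
        reduction="batchmean",
    ) * (t * t)

# Toy tensors standing in for per-token states of a small batch of sequences.
student_h = torch.randn(2, 16, STUDENT_DIM)
teacher_h = torch.randn(2, 16, TEACHER_DIM)
print(stage1_loss(student_h, teacher_h))

student_logits = torch.randn(2, 16, NUM_LABELS)
teacher_logits = torch.randn(2, 16, NUM_LABELS)
print(stage2_loss(student_logits, teacher_logits))

In a stage-wise schedule, the student would first be optimized on the Stage 1 representation loss (over unlabeled multilingual text), and only then on the Stage 2 loss over the teacher's soft predictions; the projection layer keeps the scheme agnostic to the teacher's hidden size.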
Anthology ID:
2020.acl-main.202
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2221–2234
URL:
https://aclanthology.org/2020.acl-main.202
DOI:
10.18653/v1/2020.acl-main.202
Bibkey:
Cite (ACL):
Subhabrata Mukherjee and Ahmed Hassan Awadallah. 2020. XtremeDistil: Multi-stage Distillation for Massive Multilingual Models. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 2221–2234, Online. Association for Computational Linguistics.
Cite (Informal):
XtremeDistil: Multi-stage Distillation for Massive Multilingual Models (Mukherjee & Hassan Awadallah, ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.202.pdf
Video:
http://slideslive.com/38929189
Data
AG News, GLUE, IMDb Movie Reviews