Improving Word Embedding Factorization for Compression Using Distilled Nonlinear Neural Decomposition

Vasileios Lioutas, Ahmad Rashid, Krtin Kumar, Md. Akmal Haidar, Mehdi Rezagholizadeh


Abstract
Word embeddings are vital components of Natural Language Processing (NLP) models and have been extensively explored. However, they consume a lot of memory, which poses a challenge for edge deployment. Embedding matrices typically contain most of the parameters of language models and about a third of those of machine translation systems. In this paper, we propose Distilled Embedding, an (input/output) embedding compression method based on low-rank matrix decomposition and knowledge distillation. First, we initialize the weights of our decomposed matrices by learning to reconstruct the full pre-trained word embedding, and then fine-tune end-to-end, employing knowledge distillation on the factorized embedding. We conduct extensive experiments with various compression rates on machine translation and language modeling, using different datasets with a shared word-embedding matrix for both the embedding and vocabulary projection matrices. We show that the proposed technique is simple to replicate, with one fixed parameter controlling compression size, and achieves a higher BLEU score on translation and lower perplexity on language modeling than complex, difficult-to-tune state-of-the-art methods.
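The sketch below illustrates the general idea described in the abstract: replace a full V x d embedding matrix with a small V x r lookup table followed by a projection back to d dimensions, pre-trained to reconstruct the original embedding before end-to-end fine-tuning. The rank, the choice of nonlinearity, and the reconstruction objective are illustrative assumptions, not the paper's exact architecture or training recipe (the paper additionally applies knowledge distillation during fine-tuning, which is omitted here).

```python
# Minimal sketch of low-rank embedding factorization with a nonlinear
# projection and reconstruction-based initialization. All sizes, the ReLU
# nonlinearity, and the MSE reconstruction loss are assumptions for
# illustration only.
import torch
import torch.nn as nn

class FactorizedEmbedding(nn.Module):
    def __init__(self, pretrained: torch.Tensor, rank: int):
        super().__init__()
        vocab_size, dim = pretrained.shape
        self.low_rank = nn.Embedding(vocab_size, rank)   # V x r lookup table
        self.project = nn.Sequential(                     # r -> d projection
            nn.Linear(rank, dim),
            nn.ReLU(),                                    # assumed nonlinearity
            nn.Linear(dim, dim),
        )

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # Compose the compressed lookup with the shared projection.
        return self.project(self.low_rank(token_ids))

def pretrain_reconstruction(model, pretrained, steps=200, lr=1e-3):
    """Initialize the factorized embedding by reconstructing the full matrix."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    ids = torch.arange(pretrained.size(0))
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(ids), pretrained)
        loss.backward()
        opt.step()
    return model

# Usage with hypothetical sizes: compress a 32k x 512 embedding to rank 64,
# then the module would be plugged into the NLP model and fine-tuned end-to-end.
E = torch.randn(32000, 512)
emb = pretrain_reconstruction(FactorizedEmbedding(E, rank=64), E)
```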
Anthology ID:
2020.findings-emnlp.250
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2774–2784
URL:
https://aclanthology.org/2020.findings-emnlp.250
DOI:
10.18653/v1/2020.findings-emnlp.250
Cite (ACL):
Vasileios Lioutas, Ahmad Rashid, Krtin Kumar, Md. Akmal Haidar, and Mehdi Rezagholizadeh. 2020. Improving Word Embedding Factorization for Compression Using Distilled Nonlinear Neural Decomposition. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 2774–2784, Online. Association for Computational Linguistics.
Cite (Informal):
Improving Word Embedding Factorization for Compression Using Distilled Nonlinear Neural Decomposition (Lioutas et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.250.pdf
Optional supplementary material:
2020.findings-emnlp.250.OptionalSupplementaryMaterial.zip
Data
WikiText-103, WikiText-2