Scalable Syntax-Aware Language Models Using Knowledge Distillation

Adhiguna Kuncoro, Chris Dyer, Laura Rimell, Stephen Clark, Phil Blunsom


Abstract
Prior work has shown that, on small amounts of training data, syntactic neural language models learn structurally sensitive generalisations more successfully than sequential language models. However, their computational complexity renders scaling difficult, and it remains an open question whether structural biases are still necessary when sequential models have access to ever larger amounts of training data. To answer this question, we introduce an efficient knowledge distillation (KD) technique that transfers knowledge from a syntactic language model trained on a small corpus to an LSTM language model, hence enabling the LSTM to develop a more structurally sensitive representation of the larger training data it learns from. On targeted syntactic evaluations, we find that, while sequential LSTMs perform much better than previously reported, our proposed technique substantially improves on this baseline, yielding a new state of the art. Our findings and analysis affirm the importance of structural biases, even in models that learn from large amounts of data.
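The distillation technique described in the abstract amounts to training the LSTM student on a soft target that mixes the observed next word with the syntactic teacher's predictive distribution. The sketch below is illustrative only and is not the authors' implementation: the function name distillation_loss, the PyTorch framing, and the interpolation weight alpha are assumptions made for exposition.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_probs, gold_ids, alpha=0.5):
    """Word-level distillation loss (illustrative sketch).

    student_logits: (batch, vocab) unnormalised next-word scores from the LSTM student
    teacher_probs:  (batch, vocab) next-word distribution from the syntactic teacher
    gold_ids:       (batch,) indices of the observed next words
    alpha:          interpolation weight between teacher and gold targets (assumed value)
    """
    log_q = F.log_softmax(student_logits, dim=-1)                 # student log-probabilities
    one_hot = F.one_hot(gold_ids, student_logits.size(-1)).float()  # gold next word as a one-hot target
    target = alpha * teacher_probs + (1.0 - alpha) * one_hot        # mixed soft target
    return -(target * log_q).sum(dim=-1).mean()                     # cross-entropy against the mixed target

In this picture, teacher_probs would come from the syntactic language model trained on the smaller corpus, while the student is the sequential LSTM trained on the larger corpus; see the paper for the actual objective and training setup.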
Anthology ID:
P19-1337
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
3472–3484
URL:
https://aclanthology.org/P19-1337
DOI:
10.18653/v1/P19-1337
Cite (ACL):
Adhiguna Kuncoro, Chris Dyer, Laura Rimell, Stephen Clark, and Phil Blunsom. 2019. Scalable Syntax-Aware Language Models Using Knowledge Distillation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 3472–3484, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Scalable Syntax-Aware Language Models Using Knowledge Distillation (Kuncoro et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1337.pdf
Video:
https://aclanthology.org/P19-1337.mp4