The Unstoppable Rise of Computational Linguistics in Deep Learning

James Henderson


Abstract
In this paper, we trace the history of neural networks applied to natural language understanding tasks, and identify key contributions which the nature of language has made to the development of neural network architectures. We focus on the importance of variable binding and its instantiation in attention-based models, and argue that Transformer is not a sequence model but an induced-structure model. This perspective leads to predictions of the challenges facing research in deep learning architectures for natural language understanding.
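
The abstract's central claim concerns attention as an instantiation of variable binding. For readers who want a concrete point of reference, the sketch below is a minimal NumPy implementation of standard scaled dot-product attention (Vaswani et al., 2017), the mechanism the abstract refers to. It is a generic illustration under that standard formulation, not code from Henderson's paper, and all names in it are hypothetical.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention (Vaswani et al., 2017).

    Q: (n_queries, d)  query vectors
    K: (n_keys, d)     key vectors
    V: (n_keys, d_v)   value vectors
    Returns an (n_queries, d_v) array of attention-weighted values.
    """
    d = Q.shape[-1]
    # Similarity of each query to each key, scaled to stabilize the softmax.
    scores = Q @ K.T / np.sqrt(d)
    # Softmax over keys: each query distributes one unit of "binding" mass.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Soft key-value lookup: the differentiable analogue of variable binding.
    return weights @ V

# Toy usage: two queries softly bind to three key-value pairs.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 4)
```

In this reading, the softmax weights act like a soft pointer from each query to the stored key-value entries, which is why attention can be seen as inducing structure over a set rather than processing a sequence.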
Anthology ID: 2020.acl-main.561
Volume: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2020
Address: Online
Editors: Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 6294–6306
URL: https://aclanthology.org/2020.acl-main.561
DOI: 10.18653/v1/2020.acl-main.561
Cite (ACL): James Henderson. 2020. The Unstoppable Rise of Computational Linguistics in Deep Learning. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 6294–6306, Online. Association for Computational Linguistics.
Cite (Informal): The Unstoppable Rise of Computational Linguistics in Deep Learning (Henderson, ACL 2020)
PDF: https://aclanthology.org/2020.acl-main.561.pdf
Video: http://slideslive.com/38929007