Language Production Dynamics with Recurrent Neural Networks

Jesús Calvillo, Matthew Crocker


Abstract
We present an analysis of the internal mechanism of the recurrent neural model of sentence production presented by Calvillo et al. (2016). The results reveal clear patterns of computation associated with each layer of the network, allowing us to infer an algorithmic account: the semantic input activates semantically related words; each word generated at a given time step then activates syntactic and semantic constraints on possible continuations; and the recurrence preserves information through time. We propose that these insights could generalize to other models with similar architectures, including some used in computational linguistics for language modeling, machine translation, and image caption generation.
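
To make the described architecture concrete, the following is a minimal sketch of an Elman-style recurrent production network of the kind the abstract outlines: a semantic vector and the previously produced word feed a recurrent hidden layer, which outputs a distribution over the next word. All dimensions, weight names, and the greedy decoding loop are illustrative assumptions, not the exact model of Calvillo et al. (2016).

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for vocabulary, semantic input, and hidden layer.
vocab_size, sem_dim, hid_dim = 50, 30, 40

# Randomly initialized (untrained) weights: semantics -> hidden,
# previous word -> hidden, hidden -> hidden (recurrence), hidden -> output.
W_sem = rng.normal(0, 0.1, (hid_dim, sem_dim))
W_word = rng.normal(0, 0.1, (hid_dim, vocab_size))
W_rec = rng.normal(0, 0.1, (hid_dim, hid_dim))
W_out = rng.normal(0, 0.1, (vocab_size, hid_dim))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def produce(semantics, max_len=10):
    """Greedily decode a word-index sequence from a semantic vector."""
    h = np.zeros(hid_dim)        # recurrent state carries earlier context
    word = np.zeros(vocab_size)  # one-hot vector of the previous word
    sentence = []
    for _ in range(max_len):
        # Semantics and the previous word jointly constrain the next word,
        # while the recurrent term preserves information through time.
        h = np.tanh(W_sem @ semantics + W_word @ word + W_rec @ h)
        probs = softmax(W_out @ h)
        idx = int(probs.argmax())
        sentence.append(idx)
        word = np.zeros(vocab_size)
        word[idx] = 1.0
    return sentence

print(produce(rng.normal(size=sem_dim)))

The recurrent term W_rec @ h is what lets earlier words constrain later choices, mirroring the role the abstract attributes to the recurrence.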
Anthology ID: W18-2803
Volume: Proceedings of the Eighth Workshop on Cognitive Aspects of Computational Language Learning and Processing
Month: July
Year: 2018
Address: Melbourne
Editors: Marco Idiart, Alessandro Lenci, Thierry Poibeau, Aline Villavicencio
Venue: CogACLL
Publisher: Association for Computational Linguistics
Pages: 17–26
URL: https://aclanthology.org/W18-2803
DOI: 10.18653/v1/W18-2803
Cite (ACL): Jesús Calvillo and Matthew Crocker. 2018. Language Production Dynamics with Recurrent Neural Networks. In Proceedings of the Eighth Workshop on Cognitive Aspects of Computational Language Learning and Processing, pages 17–26, Melbourne. Association for Computational Linguistics.
Cite (Informal): Language Production Dynamics with Recurrent Neural Networks (Calvillo & Crocker, CogACLL 2018)
PDF: https://aclanthology.org/W18-2803.pdf