Word-order Biases in Deep-agent Emergent Communication

Rahma Chaabouni, Eugene Kharitonov, Alessandro Lazaric, Emmanuel Dupoux, Marco Baroni


Abstract
Sequence-processing neural networks have led to remarkable progress on many NLP tasks. As a consequence, there has been increasing interest in understanding to what extent they process language as humans do. We aim here to uncover which biases such models display with respect to “natural” word-order constraints. We train models to communicate about paths in a simple gridworld, using miniature languages that reflect or violate various natural language trends, such as the tendency to avoid redundancy or to minimize long-distance dependencies. We study how the controlled characteristics of our miniature languages affect individual learning and their stability across multiple network generations. The results draw a mixed picture. On the one hand, neural networks show a strong tendency to avoid long-distance dependencies. On the other hand, there is no clear preference for the efficient, non-redundant encoding of information that is widely attested in natural language. We thus suggest inoculating a notion of “effort” into neural networks, as a possible way to make their linguistic behavior more human-like.
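
To make the described setup concrete, here is a minimal, purely hypothetical sketch (not the authors' implementation; their code is in the facebookresearch/brica repository listed under "Code" below) of how a toy gridworld path could be verbalized in a non-redundant versus a redundant miniature language, the kind of contrast the paper manipulates.

# Illustrative sketch only: toy gridworld paths and two assumed miniature
# languages (non-redundant vs. redundant). Not the authors' actual grammars.
import random

DIRECTIONS = ["left", "right", "up", "down"]

def sample_path(num_segments=2, max_steps=3):
    # A path is a sequence of (direction, number-of-steps) segments.
    return [(random.choice(DIRECTIONS), random.randint(1, max_steps))
            for _ in range(num_segments)]

def describe_non_redundant(path):
    # Efficient encoding: each direction word appears once, followed by a count.
    return " ".join(f"{d} {n}" for d, n in path)

def describe_redundant(path):
    # Redundant encoding: the direction word is repeated once per step taken.
    return " ".join(" ".join([d] * n) for d, n in path)

if __name__ == "__main__":
    random.seed(0)
    path = sample_path()
    print("path:          ", path)
    print("non-redundant: ", describe_non_redundant(path))
    print("redundant:     ", describe_redundant(path))

In the paper's setup, descriptions of this kind are the messages the networks must learn to produce and interpret, and learning speed and stability across network generations are compared across the controlled language variants.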
Anthology ID:
P19-1509
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5166–5175
URL:
https://aclanthology.org/P19-1509
DOI:
10.18653/v1/P19-1509
Bibkey:
Cite (ACL):
Rahma Chaabouni, Eugene Kharitonov, Alessandro Lazaric, Emmanuel Dupoux, and Marco Baroni. 2019. Word-order Biases in Deep-agent Emergent Communication. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 5166–5175, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Word-order Biases in Deep-agent Emergent Communication (Chaabouni et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1509.pdf
Supplementary:
P19-1509.Supplementary.pdf
Video:
https://aclanthology.org/P19-1509.mp4
Code:
facebookresearch/brica