Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data

Emily M. Bender, Alexander Koller


Abstract
The success of the large neural language models on many NLP tasks is exciting. However, we find that these successes sometimes lead to hype in which these models are being described as “understanding” language or capturing “meaning”. In this position paper, we argue that a system trained only on form has a priori no way to learn meaning. In keeping with the ACL 2020 theme of “Taking Stock of Where We’ve Been and Where We’re Going”, we argue that a clear understanding of the distinction between form and meaning will help guide the field towards better science around natural language understanding.
Anthology ID: 2020.acl-main.463
Volume: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month: July
Year: 2020
Address: Online
Editors: Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 5185–5198
URL: https://aclanthology.org/2020.acl-main.463
DOI: 10.18653/v1/2020.acl-main.463
Award: Best Theme Paper
Cite (ACL): Emily M. Bender and Alexander Koller. 2020. Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 5185–5198, Online. Association for Computational Linguistics.
Cite (Informal): Climbing towards NLU: On Meaning, Form, and Understanding in the Age of Data (Bender & Koller, ACL 2020)
PDF: https://aclanthology.org/2020.acl-main.463.pdf
Video: http://slideslive.com/38929214