Correlating Neural and Symbolic Representations of Language

Grzegorz Chrupała, Afra Alishahi


Abstract
Analysis methods which enable us to better understand the representations and functioning of neural models of language are increasingly needed as deep learning becomes the dominant approach in NLP. Here we present two methods based on Representational Similarity Analysis (RSA) and Tree Kernels (TK) which allow us to directly quantify how strongly the information encoded in neural activation patterns corresponds to information represented by symbolic structures such as syntax trees. We first validate our methods on the case of a simple synthetic language for arithmetic expressions with clearly defined syntax and semantics, and show that they exhibit the expected pattern of results. We then apply our methods to correlate neural representations of English sentences with their constituency parse trees.
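The core idea can be sketched in a few lines: compute pairwise similarities among sentences under two views, one neural (e.g. cosine similarity of activation vectors) and one symbolic (a tree kernel over parse trees), and then correlate the two resulting similarity lists. The sketch below is illustrative only and is not the paper's implementation (see the linked `gchrupala/ursa` repository for that); it uses a simplified production-counting kernel as a stand-in for the tree kernels in the paper, a plain Pearson correlation as the second-order RSA statistic, and nested tuples as a hypothetical tree encoding.

```python
import itertools
import math
from collections import Counter

def cosine(u, v):
    # Cosine similarity between two activation vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def productions(tree):
    # Collect the context-free productions of a tree encoded as nested
    # tuples, e.g. ("S", ("NP", "dog"), ("VP", "runs")).
    if isinstance(tree, str):
        return []
    label, *children = tree
    head = (label, tuple(c[0] if isinstance(c, tuple) else c for c in children))
    prods = [head]
    for c in children:
        prods.extend(productions(c))
    return prods

def tree_kernel(t1, t2):
    # Simplified tree kernel: count shared productions (a stand-in for
    # the subset-tree kernels used in the paper).
    c1, c2 = Counter(productions(t1)), Counter(productions(t2))
    return sum(min(c1[p], c2[p]) for p in c1)

def pearson(x, y):
    # Pearson correlation between two equal-length similarity lists.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def rsa(embeddings, trees):
    # Second-order correlation: how well do neural pairwise similarities
    # track symbolic (tree-kernel) pairwise similarities?
    pairs = list(itertools.combinations(range(len(embeddings)), 2))
    neural = [cosine(embeddings[i], embeddings[j]) for i, j in pairs]
    symbolic = [tree_kernel(trees[i], trees[j]) for i, j in pairs]
    return pearson(neural, symbolic)
```

For instance, sentences with near-identical (hypothetical) embeddings and overlapping parse productions should yield an RSA score near 1, whereas unrelated pairs pull it toward 0.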
Anthology ID:
P19-1283
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2952–2962
URL:
https://aclanthology.org/P19-1283
DOI:
10.18653/v1/P19-1283
Cite (ACL):
Grzegorz Chrupała and Afra Alishahi. 2019. Correlating Neural and Symbolic Representations of Language. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 2952–2962, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Correlating Neural and Symbolic Representations of Language (Chrupała & Alishahi, ACL 2019)
PDF:
https://aclanthology.org/P19-1283.pdf
Video:
https://aclanthology.org/P19-1283.mp4
Code:
gchrupala/ursa