Open Sesame: Getting inside BERT’s Linguistic Knowledge

Yongjie Lin, Yi Chern Tan, Robert Frank

Abstract
How and to what extent does BERT encode syntactically-sensitive hierarchical information or positionally-sensitive linear information? Recent work has shown that contextual representations like BERT perform well on tasks that require sensitivity to linguistic structure. We present here two studies which aim to provide a better understanding of the nature of BERT’s representations. The first of these focuses on the identification of structurally-defined elements using diagnostic classifiers, while the second explores BERT’s representation of subject-verb agreement and anaphor-antecedent dependencies through a quantitative assessment of self-attention vectors. In both cases, we find that BERT encodes positional information about word tokens well on its lower layers, but switches to a hierarchically-oriented encoding on higher layers. We conclude then that BERT’s representations do indeed model linguistically relevant aspects of hierarchical structure, though they do not appear to show the sharp sensitivity to hierarchical structure that is found in human processing of reflexive anaphora.
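A minimal sketch of the first study's general recipe, using off-the-shelf tools (this is not the authors' released code; the probing task, sentences, and labels below are toy assumptions for illustration): freeze BERT, then train one diagnostic classifier per layer on token representations to identify a structurally defined element, here the main auxiliary of the sentence.

```python
# Sketch of a layer-wise diagnostic classifier, assuming the HuggingFace
# transformers and scikit-learn libraries. All data here is toy.
import torch
from transformers import BertModel, BertTokenizerFast
from sklearn.linear_model import LogisticRegression

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

# Toy probing data: the label marks which word is the main auxiliary.
sentences = [
    ("the cat that chased the dogs is hungry", 6),   # word 6 = "is"
    ("the dogs that the cat chased are tired", 6),   # word 6 = "are"
]

def word_vectors(sentence, layer):
    """One vector per word from the given layer (first subtoken per word)."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).hidden_states[layer][0]  # (seq_len, 768)
    first = {}
    for pos, wid in enumerate(enc.word_ids(0)):
        if wid is not None and wid not in first:
            first[wid] = pos
    return hidden[[first[w] for w in sorted(first)]].numpy()

for layer in range(1, 13):  # hidden_states[0] is the embedding layer
    X, y = [], []
    for sent, aux in sentences:
        vecs = word_vectors(sent, layer)
        X.extend(vecs)
        y.extend(int(i == aux) for i in range(len(vecs)))
    probe = LogisticRegression(max_iter=1000).fit(X, y)
    print(f"layer {layer:2d}: train accuracy = {probe.score(X, y):.2f}")
```

A real probing experiment would of course evaluate on held-out sentences rather than reporting training accuracy; the paper's released code (linked below) implements the actual experiments.

For the second study, a similarly hedged sketch of the general idea (not necessarily the paper's exact metric): compare, layer by layer, how much head-averaged self-attention an agreeing verb directs to its subject versus an intervening distractor noun.

```python
# Sketch of the self-attention comparison (same model and tokenizer as
# above). Word indices are toy assumptions; heads are averaged per layer.
def attention_weight(sentence, src_word, tgt_word):
    """Mean over heads of attention from src_word to tgt_word, per layer."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        atts = model(**enc, output_attentions=True).attentions
    ids = enc.word_ids(0)
    src, tgt = ids.index(src_word), ids.index(tgt_word)
    return [layer[0, :, src, tgt].mean().item() for layer in atts]

sent = "the cat that chased the dogs is hungry"
to_subject = attention_weight(sent, src_word=6, tgt_word=1)     # "is" -> "cat"
to_distractor = attention_weight(sent, src_word=6, tgt_word=5)  # "is" -> "dogs"
for l, (s, d) in enumerate(zip(to_subject, to_distractor), start=1):
    print(f"layer {l:2d}: subject {s:.3f} vs distractor {d:.3f}")
```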
Anthology ID:
W19-4825
Volume:
Proceedings of the 2019 ACL Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Tal Linzen, Grzegorz Chrupała, Yonatan Belinkov, Dieuwke Hupkes
Venue:
BlackboxNLP
Publisher:
Association for Computational Linguistics
Pages:
241–253
URL:
https://aclanthology.org/W19-4825
DOI:
10.18653/v1/W19-4825
Cite (ACL):
Yongjie Lin, Yi Chern Tan, and Robert Frank. 2019. Open Sesame: Getting inside BERT’s Linguistic Knowledge. In Proceedings of the 2019 ACL Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, pages 241–253, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Open Sesame: Getting inside BERT’s Linguistic Knowledge (Lin et al., BlackboxNLP 2019)
PDF:
https://aclanthology.org/W19-4825.pdf
Code:
yongjie-lin/bert-opensesame (https://github.com/yongjie-lin/bert-opensesame)