Mediators in Determining what Processing BERT Performs First

Aviv Slobodkin, Leshem Choshen, Omri Abend


Abstract
Probing neural models for the ability to perform downstream tasks using their activation patterns is often used to localize which parts of the network specialize in which tasks. However, little work has addressed potential mediating factors in such comparisons. As a test-case mediating factor, we consider the prediction’s context length, namely the length of the span whose processing is minimally required to perform the prediction. We show that failing to control for context length may lead to contradictory conclusions as to the localization patterns of the network, depending on the distribution of the probing dataset. Indeed, when probing BERT with seven tasks, we find that it is possible to get 196 different rankings between them when manipulating the distribution of context lengths in the probing dataset. We conclude by presenting best practices for conducting such comparisons in the future.
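The abstract's central claim is that the ranking of probed tasks depends on the context-length distribution of the probing dataset. A minimal sketch of that effect, using entirely synthetic accuracy functions (the task names, numbers, and threshold here are illustrative assumptions, not figures from the paper):

```python
# Illustrative sketch with synthetic numbers (NOT results from the paper):
# how the context-length distribution of a probing dataset can flip the
# apparent ranking of two probed tasks at a fixed layer.

# Hypothetical per-task probing accuracy as a function of context length
# (in tokens). Task A degrades on long contexts; task B is length-invariant.
def accuracy_task_a(ctx_len):
    return 0.9 if ctx_len <= 3 else 0.6

def accuracy_task_b(ctx_len):
    return 0.7

def dataset_accuracy(acc_fn, ctx_lengths):
    """Average probing accuracy over a dataset's context lengths."""
    return sum(acc_fn(c) for c in ctx_lengths) / len(ctx_lengths)

# Two probing datasets for the same tasks, differing only in how their
# examples' context lengths are distributed.
short_heavy = [1, 2, 2, 3, 3, 8]    # mostly short contexts
long_heavy = [2, 6, 7, 8, 9, 10]    # mostly long contexts

for name, dist in [("short-heavy", short_heavy), ("long-heavy", long_heavy)]:
    a = dataset_accuracy(accuracy_task_a, dist)
    b = dataset_accuracy(accuracy_task_b, dist)
    ranking = "A > B" if a > b else "B > A"
    print(f"{name}: task A={a:.2f}, task B={b:.2f} -> ranking: {ranking}")
```

Under the short-heavy distribution task A appears "easier" for the network than task B; under the long-heavy one the ranking inverts, even though nothing about the model changed. This is the confound the paper argues must be controlled for when comparing localization across tasks.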
Anthology ID:
2021.naacl-main.8
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
86–93
URL:
https://aclanthology.org/2021.naacl-main.8
DOI:
10.18653/v1/2021.naacl-main.8
Cite (ACL):
Aviv Slobodkin, Leshem Choshen, and Omri Abend. 2021. Mediators in Determining what Processing BERT Performs First. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 86–93, Online. Association for Computational Linguistics.
Cite (Informal):
Mediators in Determining what Processing BERT Performs First (Slobodkin et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.8.pdf
Video:
https://aclanthology.org/2021.naacl-main.8.mp4
Code:
lovodkin93/BERT-context-distance