Too Much in Common: Shifting of Embeddings in Transformer Language Models and its Implications

Daniel Biś, Maksim Podkorytov, Xiuwen Liu


Abstract
The success of language models based on the Transformer architecture appears to be inconsistent with observed anisotropic properties of representations learned by such models. We resolve this by showing, contrary to previous studies, that the representations do not occupy a narrow cone, but rather drift in common directions. At any training step, all of the embeddings except for the ground-truth target embedding are updated with a gradient in the same direction. Compounded over the training set, the embeddings drift and share common components, which manifests in the shape of the embedding space in all the models we have empirically tested. Our experiments show that isotropy can be restored using a simple transformation.
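The abstract does not specify the transformation; a natural candidate is mean-centering, which removes a shared component from all embeddings. The following minimal sketch (not the authors' released code; the drift simulation and the centering choice are illustrative assumptions) shows how a common drift direction drives average pairwise cosine similarity toward 1 and how centering restores near-isotropy.

import numpy as np

def avg_pairwise_cosine(E: np.ndarray) -> float:
    """Average cosine similarity over all distinct pairs of rows of E."""
    normed = E / np.linalg.norm(E, axis=1, keepdims=True)
    sims = normed @ normed.T
    n = E.shape[0]
    # Exclude the diagonal (self-similarity of 1.0).
    return (sims.sum() - n) / (n * (n - 1))

rng = np.random.default_rng(0)
n_tokens, dim = 1000, 64

# Simulate drifted embeddings: isotropic noise plus a shared "drift" vector,
# standing in for the common component the gradient updates introduce.
drift = 5.0 * rng.standard_normal(dim)
E = rng.standard_normal((n_tokens, dim)) + drift

print(f"before centering: {avg_pairwise_cosine(E):.3f}")   # close to 1.0 (anisotropic)
E_centered = E - E.mean(axis=0, keepdims=True)              # remove the common component
print(f"after centering:  {avg_pairwise_cosine(E_centered):.3f}")  # close to 0.0 (near-isotropic)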
Anthology ID:
2021.naacl-main.403
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
5117–5130
URL:
https://aclanthology.org/2021.naacl-main.403
DOI:
10.18653/v1/2021.naacl-main.403
Cite (ACL):
Daniel Biś, Maksim Podkorytov, and Xiuwen Liu. 2021. Too Much in Common: Shifting of Embeddings in Transformer Language Models and its Implications. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 5117–5130, Online. Association for Computational Linguistics.
Cite (Informal):
Too Much in Common: Shifting of Embeddings in Transformer Language Models and its Implications (Biś et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.403.pdf
Video:
https://aclanthology.org/2021.naacl-main.403.mp4
Code:
danielbis/toomuchincommon