Explaining and Improving BERT Performance on Lexical Semantic Change Detection

Severin Laicher, Sinan Kurtyigit, Dominik Schlechtweg, Jonas Kuhn, Sabine Schulte im Walde

Abstract
Type- and token-based embedding architectures are still competing in lexical semantic change detection. The recent success of type-based models in SemEval-2020 Task 1 has raised the question of why the success of token-based models on a variety of other NLP tasks does not translate to our field. We investigate the influence of a range of variables on clusterings of BERT vectors and show that BERT’s low performance is largely due to orthographic information about the target word, which is encoded even in the higher layers of its representations. By reducing the influence of orthography, we considerably improve BERT’s performance.
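For readers unfamiliar with the setup, the following is a minimal sketch of the token-based approach the abstract refers to: encode each usage of a target word with BERT, take the hidden states of the target’s subword tokens, and cluster the resulting usage vectors. This is not the authors’ exact pipeline; the model name, layer choice, clustering algorithm, and example sentences are illustrative assumptions.

# Minimal sketch (illustrative assumptions, not the paper's configuration):
# one contextualized vector per usage of a target word, then clustering.
import torch
from sklearn.cluster import AgglomerativeClustering
from transformers import BertModel, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

def usage_vector(sentence, target, layer=12):
    # Average the chosen layer's hidden states over the target's subtokens.
    enc = tokenizer(sentence, return_tensors="pt", return_offsets_mapping=True)
    offsets = enc.pop("offset_mapping")[0].tolist()
    start = sentence.index(target)  # first occurrence only, for illustration
    end = start + len(target)
    with torch.no_grad():
        hidden = model(**enc).hidden_states[layer][0]
    idx = [i for i, (s, e) in enumerate(offsets) if s >= start and e <= end and e > s]
    return hidden[idx].mean(dim=0)

usages = [
    "She deposited the cheque at the bank.",
    "They opened a new bank account last week.",
    "We had a picnic on the grassy bank of the river.",
]
vectors = torch.stack([usage_vector(s, "bank") for s in usages]).numpy()

# Cluster the usage vectors; the number of clusters would normally be tuned.
labels = AgglomerativeClustering(n_clusters=2).fit_predict(vectors)
print(labels)

In the spirit of the paper’s finding, one way to reduce the influence of orthography in such a pipeline would be, for example, to replace each inflected occurrence of the target with its lemma before encoding, so that surface-form variation does not dominate the clustering.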
Anthology ID: 2021.eacl-srw.25
Volume: Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop
Month: April
Year: 2021
Address: Online
Editors: Ionut-Teodor Sorodoc, Madhumita Sushil, Ece Takmaz, Eneko Agirre
Venue: EACL
Publisher: Association for Computational Linguistics
Pages: 192–202
URL: https://aclanthology.org/2021.eacl-srw.25
DOI: 10.18653/v1/2021.eacl-srw.25
Cite (ACL): Severin Laicher, Sinan Kurtyigit, Dominik Schlechtweg, Jonas Kuhn, and Sabine Schulte im Walde. 2021. Explaining and Improving BERT Performance on Lexical Semantic Change Detection. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Student Research Workshop, pages 192–202, Online. Association for Computational Linguistics.
Cite (Informal): Explaining and Improving BERT Performance on Lexical Semantic Change Detection (Laicher et al., EACL 2021)
PDF: https://aclanthology.org/2021.eacl-srw.25.pdf