Multilingual LAMA: Investigating Knowledge in Multilingual Pretrained Language Models

Nora Kassner, Philipp Dufter, Hinrich Schütze


Abstract
Recently, it has been found that monolingual English language models can be used as knowledge bases. Instead of structured knowledge base queries, masked sentences such as “Paris is the capital of [MASK]” are used as probes. We translate the established benchmarks T-REx and Google-RE into 53 languages. Working with mBERT, we investigate three questions. (i) Can mBERT be used as a multilingual knowledge base? Most prior work only considers English. Extending research to multiple languages is important for diversity and accessibility. (ii) Is mBERT’s performance as a knowledge base language-independent, or does it vary from language to language? (iii) A multilingual model is trained on more text, e.g., mBERT is trained on 104 Wikipedias. Can mBERT leverage this for better performance? We find that using mBERT as a knowledge base yields varying performance across languages, and that pooling predictions across languages improves performance. Conversely, mBERT exhibits a language bias; e.g., when queried in Italian, it tends to predict Italy as the country of origin.
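The cloze-style probing setup described in the abstract can be illustrated with a minimal sketch, assuming the HuggingFace transformers library and the public bert-base-multilingual-cased checkpoint; this is not the authors' mLAMA code (see norakassner/mlama), only an example of how the same fact is queried in different languages.

# Minimal sketch of cloze-style knowledge probing with mBERT.
# Assumes the HuggingFace `transformers` library and the public
# bert-base-multilingual-cased checkpoint; the query strings are illustrative.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-multilingual-cased")

# The same fact queried in two languages; [MASK] stands for the object slot.
queries = [
    "Paris is the capital of [MASK].",      # English
    "Parigi è la capitale della [MASK].",   # Italian
]

for query in queries:
    print(query)
    for prediction in fill_mask(query, top_k=3):
        print(f"  {prediction['token_str']}  (score={prediction['score']:.3f})")

Comparing the top predictions for the two queries gives a simple view of the language bias discussed in the abstract, e.g., whether the Italian query shifts probability mass toward Italia.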
Anthology ID:
2021.eacl-main.284
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
3250–3258
URL:
https://aclanthology.org/2021.eacl-main.284
DOI:
10.18653/v1/2021.eacl-main.284
Award:
Best Short Paper
Cite (ACL):
Nora Kassner, Philipp Dufter, and Hinrich Schütze. 2021. Multilingual LAMA: Investigating Knowledge in Multilingual Pretrained Language Models. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 3250–3258, Online. Association for Computational Linguistics.
Cite (Informal):
Multilingual LAMA: Investigating Knowledge in Multilingual Pretrained Language Models (Kassner et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.284.pdf
Code
 norakassner/mlama
Data
ConceptNet, LAMA, T-REx