Relational World Knowledge Representation in Contextual Language Models: A Review

Tara Safavi, Danai Koutra


Abstract
Relational knowledge bases (KBs) are commonly used to represent world knowledge in machines. However, while advantageous for their high degree of precision and interpretability, KBs are usually organized according to manually-defined schemas, which limit their expressiveness and require significant human effort to engineer and maintain. In this review, we take a natural language processing perspective on these limitations, examining how they may be addressed in part by training deep contextual language models (LMs) to internalize and express relational knowledge in more flexible forms. We propose to organize knowledge representation strategies in LMs by the level of KB supervision provided, from no KB supervision at all to entity- and relation-level supervision. Our contributions are threefold: (1) We provide a high-level, extensible taxonomy for knowledge representation in LMs; (2) Within our taxonomy, we highlight notable models, evaluation tasks, and findings, in order to provide an up-to-date review of current knowledge representation capabilities in LMs; and (3) We suggest future research directions that build upon the complementary aspects of LMs and KBs as knowledge representations.
Anthology ID:
2021.emnlp-main.81
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1053–1067
URL:
https://aclanthology.org/2021.emnlp-main.81
DOI:
10.18653/v1/2021.emnlp-main.81
Cite (ACL):
Tara Safavi and Danai Koutra. 2021. Relational World Knowledge Representation in Contextual Language Models: A Review. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 1053–1067, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Relational World Knowledge Representation in Contextual Language Models: A Review (Safavi & Koutra, EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.81.pdf
Video:
https://aclanthology.org/2021.emnlp-main.81.mp4
Data:
ConceptNet, GLUE, LAMA