Highly Efficient Knowledge Graph Embedding Learning with Orthogonal Procrustes Analysis

Xutan Peng, Guanyi Chen, Chenghua Lin, Mark Stevenson


Abstract
Knowledge Graph Embeddings (KGEs) have been intensively explored in recent years due to their promise for a wide range of applications. However, existing studies focus on improving the final model performance without acknowledging the computational cost of the proposed approaches, in terms of execution time and environmental impact. This paper proposes a simple yet effective KGE framework which can reduce the training time and carbon footprint by orders of magnitude compared with state-of-the-art approaches, while producing competitive performance. We highlight three technical innovations: full batch learning via relational matrices, closed-form Orthogonal Procrustes Analysis for KGEs, and non-negative-sampling training. In addition, as the first KGE method whose entity embeddings also store full relation information, our trained models encode rich semantics and are highly interpretable. Comprehensive experiments and ablation studies involving 13 strong baselines and two standard datasets verify the effectiveness and efficiency of our algorithm.
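The closed-form Orthogonal Procrustes Analysis mentioned above refers to the classical problem of finding the orthogonal matrix that best maps one set of vectors onto another, which admits an exact SVD-based solution (Schönemann, 1966). The sketch below illustrates that generic closed-form solution, not the authors' exact training pipeline; the matrices X and Y are hypothetical stand-ins for two embedding matrices related by an orthogonal transformation.

```python
import numpy as np

# Orthogonal Procrustes problem: find orthogonal W minimising ||X @ W - Y||_F.
# Closed-form solution: take the SVD of the cross-covariance X^T Y = U S V^T,
# then the optimal orthogonal map is W = U @ V^T.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 16))                        # illustrative source embeddings
W_true = np.linalg.qr(rng.standard_normal((16, 16)))[0]   # a random orthogonal map
Y = X @ W_true                                            # targets related by that map

U, _, Vt = np.linalg.svd(X.T @ Y)   # SVD of the cross-covariance matrix
W = U @ Vt                          # optimal orthogonal matrix in closed form

# W is orthogonal and, in this noiseless setting, recovers W_true exactly.
assert np.allclose(W @ W.T, np.eye(16), atol=1e-8)
assert np.allclose(X @ W, Y, atol=1e-6)
```

Because the solution is a single SVD rather than an iterative optimisation, this step requires no gradient descent, which is central to the efficiency claim in the abstract.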
Anthology ID:
2021.naacl-main.187
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2364–2375
URL:
https://aclanthology.org/2021.naacl-main.187
DOI:
10.18653/v1/2021.naacl-main.187
Cite (ACL):
Xutan Peng, Guanyi Chen, Chenghua Lin, and Mark Stevenson. 2021. Highly Efficient Knowledge Graph Embedding Learning with Orthogonal Procrustes Analysis. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 2364–2375, Online. Association for Computational Linguistics.
Cite (Informal):
Highly Efficient Knowledge Graph Embedding Learning with Orthogonal Procrustes Analysis (Peng et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.187.pdf
Video:
https://aclanthology.org/2021.naacl-main.187.mp4
Code:
Pzoom522/ProcrustEs-KGE
Data:
FB15k-237