Global Textual Relation Embedding for Relational Understanding

Zhiyu Chen, Hanwen Zha, Honglei Liu, Wenhu Chen, Xifeng Yan, Yu Su


Abstract
Pre-trained embeddings such as word embeddings and sentence embeddings are fundamental tools facilitating a wide range of downstream NLP tasks. In this work, we investigate how to learn a general-purpose embedding of textual relations, defined as the shortest dependency path between entities. Textual relation embedding provides a level of knowledge between word/phrase level and sentence level, and we show that it can facilitate downstream tasks requiring relational understanding of the text. To learn such an embedding, we create the largest distant supervision dataset by linking the entire English ClueWeb09 corpus to Freebase. We use global co-occurrence statistics between textual and knowledge base relations as the supervision signal to train the embedding. Evaluation on two relational understanding tasks demonstrates the usefulness of the learned textual relation embedding. The data and code can be found at https://github.com/czyssrs/GloREPlus.
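As context for the abstract: a textual relation here is the lexicalized shortest path between two entity mentions in a sentence's dependency parse. The following minimal sketch, which is not the authors' code, shows one way such a path can be extracted with spaCy and networkx; the example sentence, the entity-matching heuristic, and the "token/dep" lexicalization are illustrative assumptions.

    # Minimal sketch (not the authors' implementation) of extracting a
    # textual relation: the shortest dependency path between two entities.
    import spacy
    import networkx as nx

    nlp = spacy.load("en_core_web_sm")  # assumes the small English model is installed

    def shortest_dependency_path(sentence, head_entity, tail_entity):
        """Return the lexicalized shortest dependency path between two entity mentions."""
        doc = nlp(sentence)
        # Build an undirected graph over token indices from the dependency parse.
        graph = nx.Graph()
        for token in doc:
            for child in token.children:
                graph.add_edge(token.i, child.i)
        # Locate the first token of each entity mention (simplistic string match;
        # a real pipeline would use entity linking, as the paper does with Freebase).
        head = next(t.i for t in doc if t.text == head_entity)
        tail = next(t.i for t in doc if t.text == tail_entity)
        path = nx.shortest_path(graph, source=head, target=tail)
        # Lexicalize the path as "token/dependency-label" segments, a stand-in
        # for the paper's textual relation representation.
        return [f"{doc[i].text}/{doc[i].dep_}" for i in path]

    print(shortest_dependency_path("Obama was born in Hawaii.", "Obama", "Hawaii"))
    # e.g. ['Obama/nsubjpass', 'born/ROOT', 'in/prep', 'Hawaii/pobj']

In the paper, embeddings of such paths are trained against the empirical co-occurrence distribution over knowledge base relations, estimated globally from the ClueWeb09-to-Freebase linking, rather than from individual distantly labeled sentences.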
Anthology ID:
P19-1127
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1324–1330
URL:
https://aclanthology.org/P19-1127
DOI:
10.18653/v1/P19-1127
Cite (ACL):
Zhiyu Chen, Hanwen Zha, Honglei Liu, Wenhu Chen, Xifeng Yan, and Yu Su. 2019. Global Textual Relation Embedding for Relational Understanding. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 1324–1330, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Global Textual Relation Embedding for Relational Understanding (Chen et al., ACL 2019)
PDF:
https://aclanthology.org/P19-1127.pdf
Code
czyssrs/GloREPlus
Data
GloREPlus, Kinetics, Kinetics 400