Exploring the zero-shot limit of FewRel

Alberto Cetoli


Abstract
This paper proposes a general-purpose relation extractor that uses Wikidata descriptions to represent a relation's surface form. The approach is evaluated on the FewRel 1.0 dataset, which provides an excellent framework for training and testing the proposed zero-shot learning system in English. The architecture exploits the implicit knowledge of a language model through a question-answering approach.
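The abstract describes casting relation extraction as extractive question answering, with each candidate relation represented by its Wikidata description. The sketch below is only an illustration of that idea using off-the-shelf tools; the model name, the question template, and the paraphrased relation descriptions are assumptions rather than the paper's exact setup (the official implementation is in fractalego/fewrel_zero_shot).

```python
# Illustrative sketch only: rank candidate Wikidata relations for a sentence
# by framing each relation description as a reading-comprehension question.
# Model name, question template, and descriptions are assumptions; see
# fractalego/fewrel_zero_shot for the paper's actual code.
from transformers import pipeline

# A SQuAD-trained extractive QA model stands in for the language model.
qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

# Candidate relations, each represented by a (paraphrased) Wikidata description.
RELATIONS = {
    "P131": "located in the administrative territorial entity",
    "P175": "performer associated with this work",
    "P57": "director of this film or series",
}

def rank_relations(sentence, head_entity):
    """Score each candidate relation by how confidently the QA model can
    extract a tail-entity span when the relation description is the query."""
    scored = []
    for pid, description in RELATIONS.items():
        question = f"What is the {description} of {head_entity}?"
        result = qa(question=question, context=sentence)
        scored.append((pid, result["score"], result["answer"]))
    return sorted(scored, key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    sentence = "The Hateful Eight was directed by Quentin Tarantino in 2015."
    for pid, score, answer in rank_relations(sentence, "The Hateful Eight"):
        print(f"{pid}: score={score:.3f}, predicted tail='{answer}'")
```

In this toy setup the relation whose description yields the highest answer score is taken as the prediction, which mirrors the zero-shot setting: unseen relations can be added simply by supplying their descriptions, with no retraining.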
Anthology ID: 2020.coling-main.124
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Donia Scott, Nuria Bel, Chengqing Zong
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 1447–1451
URL: https://aclanthology.org/2020.coling-main.124
DOI: 10.18653/v1/2020.coling-main.124
Cite (ACL): Alberto Cetoli. 2020. Exploring the zero-shot limit of FewRel. In Proceedings of the 28th International Conference on Computational Linguistics, pages 1447–1451, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): Exploring the zero-shot limit of FewRel (Cetoli, COLING 2020)
PDF: https://aclanthology.org/2020.coling-main.124.pdf
Code: fractalego/fewrel_zero_shot
Data: FewRel, SQuAD