Meta-Learning Improves Lifelong Relation Extraction

Abiola Obamuyide, Andreas Vlachos


Abstract
Most existing relation extraction models assume a fixed set of relations and are unable to exploit newly available supervision data to extract new relations. To alleviate this problem, there is a need for approaches that enable relation extraction models to adapt and learn continuously. We investigate and present results for such an approach, based on a combination of ideas from lifelong learning and optimization-based meta-learning. We evaluate the proposed approach on two recent lifelong relation extraction benchmarks, and demonstrate that it markedly outperforms current state-of-the-art approaches.
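The abstract mentions optimization-based meta-learning as a core ingredient. As a rough illustration of that family of methods (not the paper's exact algorithm, whose details are in the PDF), the sketch below implements a first-order Reptile-style meta-update on toy linear-regression "tasks": the meta-parameters are nudged toward each task's adapted solution, yielding an initialization from which new tasks can be learned quickly. All function names and the toy task setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def task_loss_grad(w, X, y):
    # Gradient of mean squared error for the linear model y_hat = X @ w
    return 2 * X.T @ (X @ w - y) / len(y)

def inner_adapt(w, X, y, lr=0.05, steps=10):
    # Task-specific adaptation: a few gradient steps from the meta-parameters
    w = w.copy()
    for _ in range(steps):
        w -= lr * task_loss_grad(w, X, y)
    return w

def reptile_meta_train(tasks, meta_lr=0.5, epochs=50):
    # Reptile meta-update: move the meta-parameters a fraction of the way
    # toward each task's adapted solution, one task at a time
    w_meta = np.zeros(2)
    for _ in range(epochs):
        for X, y in tasks:
            w_task = inner_adapt(w_meta, X, y)
            w_meta += meta_lr * (w_task - w_meta)
    return w_meta

def make_task():
    # Toy "relations": linear problems whose true weights are small
    # perturbations of a shared underlying solution [1, -1]
    w_true = np.array([1.0, -1.0]) + 0.1 * rng.standard_normal(2)
    X = rng.standard_normal((40, 2))
    return X, X @ w_true

tasks = [make_task() for _ in range(5)]
w_meta = reptile_meta_train(tasks)
```

After meta-training, `w_meta` sits close to the shared solution underlying all tasks, so a handful of inner-loop steps suffices to fit any new task drawn from the same family; in the lifelong setting of the paper, an analogous update is applied as relations arrive sequentially.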
Anthology ID:
W19-4326
Volume:
Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019)
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Isabelle Augenstein, Spandana Gella, Sebastian Ruder, Katharina Kann, Burcu Can, Johannes Welbl, Alexis Conneau, Xiang Ren, Marek Rei
Venue:
RepL4NLP
SIG:
SIGREP
Publisher:
Association for Computational Linguistics
Pages:
224–229
URL:
https://aclanthology.org/W19-4326
DOI:
10.18653/v1/W19-4326
Cite (ACL):
Abiola Obamuyide and Andreas Vlachos. 2019. Meta-Learning Improves Lifelong Relation Extraction. In Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019), pages 224–229, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Meta-Learning Improves Lifelong Relation Extraction (Obamuyide & Vlachos, RepL4NLP 2019)
PDF:
https://aclanthology.org/W19-4326.pdf
Data
SimpleQuestions