End-to-End Neural Relation Extraction with Global Optimization

Meishan Zhang, Yue Zhang, Guohong Fu


Abstract
Neural networks have shown promising results for relation extraction. State-of-the-art models cast the task as an end-to-end problem, solved incrementally using a local classifier. Yet previous work using statistical models has demonstrated that global optimization can achieve better performance than local classification. We build a globally optimized neural model for end-to-end relation extraction, proposing novel LSTM features in order to better learn context representations. In addition, we present a novel method to integrate syntactic information to facilitate global learning, which requires little background in syntactic grammar and is thus easy to extend. Experimental results show that our proposed model is highly effective, achieving the best performance on two standard benchmarks.
Anthology ID:
D17-1182
Volume:
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month:
September
Year:
2017
Address:
Copenhagen, Denmark
Editors:
Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1730–1740
URL:
https://aclanthology.org/D17-1182
DOI:
10.18653/v1/D17-1182
Cite (ACL):
Meishan Zhang, Yue Zhang, and Guohong Fu. 2017. End-to-End Neural Relation Extraction with Global Optimization. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 1730–1740, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal):
End-to-End Neural Relation Extraction with Global Optimization (Zhang et al., EMNLP 2017)
PDF:
https://aclanthology.org/D17-1182.pdf
Data:
ACE 2005, CoNLL, CoNLL04