Easy First Relation Extraction with Information Redundancy

Shuai Ma, Gang Wang, Yansong Feng, Jinpeng Huai


Abstract
Many existing relation extraction (RE) models make decisions globally using integer linear programming (ILP). However, using ILP as a black-box solver for RE is nontrivial: its time and memory costs can become unacceptable as the data scale grows, and redundant information must be encoded carefully for ILP. In this paper, we propose an easy-first approach to relation extraction that exploits the information redundancy embedded in the results produced by local sentence-level extractors, resolving conflicting decisions with domain and uniqueness constraints. Information redundancy is leveraged to support both easy-first collective inference for easy decisions in a first stage and ILP for hard decisions in a subsequent stage. Experimental study shows that our approach improves both the efficiency and the accuracy of RE, outperforming both ILP-based and neural network-based methods.
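The two-stage procedure described above lends itself to a compact illustration. The following is a minimal sketch, not the authors' implementation: the (subject, relation, object, confidence) candidate format, the easy_threshold parameter, and the exhaustive search standing in for the ILP stage are all assumptions made for illustration, and the domain (type) constraints are omitted for brevity.

```python
from itertools import product

# Hypothetical sketch of the two-stage control flow from the abstract:
# commit easy, high-confidence decisions greedily, then resolve the
# remaining hard, conflicting ones globally.

def easy_first_extract(candidates, easy_threshold=0.9):
    """candidates: list of (subject, relation, object, confidence)."""
    committed = {}  # (subject, relation) -> object, enforcing uniqueness
    hard = []       # low-confidence candidates deferred to stage 2

    # Stage 1: easy-first collective inference. Visit candidates from
    # most to least confident; commit a decision if it clears the
    # threshold and its (subject, relation) slot is still free.
    for subj, rel, obj, conf in sorted(candidates, key=lambda c: -c[3]):
        key = (subj, rel)
        if key in committed:
            continue  # an easier (more confident) decision already won
        if conf >= easy_threshold:
            committed[key] = obj
        else:
            hard.append((subj, rel, obj, conf))

    # Stage 2: global resolution of the hard decisions. The paper uses
    # ILP here; for a handful of candidates, exhaustively scoring every
    # keep/drop assignment under the uniqueness constraint is equivalent.
    best_score, best_subset = 0.0, ()
    for mask in product((0, 1), repeat=len(hard)):
        chosen = [c for c, keep in zip(hard, mask) if keep]
        keys = [(s, r) for s, r, _, _ in chosen]
        if len(keys) != len(set(keys)):
            continue  # violates the uniqueness constraint
        score = sum(conf for *_, conf in chosen)
        if score > best_score:
            best_score, best_subset = score, chosen

    for subj, rel, obj, _ in best_subset:
        committed[(subj, rel)] = obj
    return committed

if __name__ == "__main__":
    cands = [
        ("Obama", "born_in", "Honolulu", 0.95),    # easy: committed first
        ("Obama", "born_in", "Chicago", 0.60),     # conflicts with above
        ("Obama", "works_in", "Washington", 0.70), # hard: below threshold
    ]
    print(easy_first_extract(cands))
    # {('Obama', 'born_in'): 'Honolulu', ('Obama', 'works_in'): 'Washington'}
```

In practice the hard set would be handed to an ILP solver; the brute-force loop above only indicates which objective and constraints such a solver would encode.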
Anthology ID:
D19-1398
Volume:
Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
Venues:
EMNLP | IJCNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
3851–3861
URL:
https://aclanthology.org/D19-1398
DOI:
10.18653/v1/D19-1398
Cite (ACL):
Shuai Ma, Gang Wang, Yansong Feng, and Jinpeng Huai. 2019. Easy First Relation Extraction with Information Redundancy. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 3851–3861, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Easy First Relation Extraction with Information Redundancy (Ma et al., EMNLP-IJCNLP 2019)
PDF:
https://aclanthology.org/D19-1398.pdf