Deep Residual Learning for Weakly-Supervised Relation Extraction

Yi Yao Huang, William Yang Wang


Abstract
Deep residual learning (ResNet) is a new method for training very deep neural networks using identity mappings for shortcut connections. ResNet won the ImageNet ILSVRC 2015 classification task and has achieved state-of-the-art performance in many computer vision tasks. However, the effect of residual learning on noisy natural language processing tasks is still not well understood. In this paper, we design a novel convolutional neural network (CNN) with residual learning, and investigate its impact on the task of distantly supervised noisy relation extraction. Contrary to the popular belief that ResNet only works well for very deep networks, we found that even with 9 layers of CNNs, using identity mappings significantly improves performance for distantly supervised relation extraction.
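
The abstract's central idea, an identity shortcut added around stacked convolutional layers, can be sketched in a few lines. The following is a minimal illustration assuming PyTorch and same-width 1-D convolutions (so the input can be added back unchanged); the class name ResidualConvBlock and all layer sizes are illustrative choices, not the authors' exact architecture.

    import torch
    import torch.nn as nn

    class ResidualConvBlock(nn.Module):
        """Two 1-D convolutions over a token sequence with an identity shortcut.

        A minimal sketch of residual learning for text: same-width convolutions
        keep the shape fixed, so the input can be added back unchanged
        (identity mapping). Not the exact architecture of Huang & Wang (2017).
        """
        def __init__(self, channels: int, kernel_size: int = 3):
            super().__init__()
            padding = kernel_size // 2  # keep the sequence length unchanged
            self.conv1 = nn.Conv1d(channels, channels, kernel_size, padding=padding)
            self.conv2 = nn.Conv1d(channels, channels, kernel_size, padding=padding)
            self.relu = nn.ReLU()

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, channels, seq_len)
            residual = x                      # identity shortcut
            out = self.relu(self.conv1(x))
            out = self.conv2(out)
            return self.relu(out + residual)  # add the shortcut, then activate

    if __name__ == "__main__":
        # Toy input: batch of 2 sentences, 64-dim word features, 20 tokens each.
        x = torch.randn(2, 64, 20)
        block = ResidualConvBlock(channels=64)
        print(block(x).shape)  # torch.Size([2, 64, 20])

Stacking several such blocks yields a deep text CNN (e.g., the 9-layer setting mentioned above) while the identity shortcuts keep gradients flowing through the unmodified path.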
Anthology ID: D17-1191
Volume: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month: September
Year: 2017
Address: Copenhagen, Denmark
Editors: Martha Palmer, Rebecca Hwa, Sebastian Riedel
Venue: EMNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 1803–1807
URL: https://aclanthology.org/D17-1191
DOI: 10.18653/v1/D17-1191
Cite (ACL): Yi Yao Huang and William Yang Wang. 2017. Deep Residual Learning for Weakly-Supervised Relation Extraction. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 1803–1807, Copenhagen, Denmark. Association for Computational Linguistics.
Cite (Informal): Deep Residual Learning for Weakly-Supervised Relation Extraction (Huang & Wang, EMNLP 2017)
PDF: https://aclanthology.org/D17-1191.pdf