A Richer-but-Smarter Shortest Dependency Path with Attentive Augmentation for Relation Extraction

Duy-Cat Can, Hoang-Quynh Le, Quang-Thuy Ha, Nigel Collier


Abstract
To extract the relationship between two entities in a sentence, two common approaches are (1) using their shortest dependency path (SDP) and (2) using an attention model to capture a context-based representation of the sentence. Each approach has its own drawback: the former can miss relevant information, while the latter can include redundant information. In this work, we propose a novel model that combines the advantages of these two approaches. It builds on the core information in the SDP, enhanced with information selected by several attention mechanisms with kernel filters, yielding RbSP (Richer-but-Smarter SDP). To effectively exploit the representation behind the RbSP structure, we develop a combined deep neural model with an LSTM network on word sequences and a CNN on the RbSP. Experimental results on the SemEval-2010 dataset demonstrate improved performance over competitive baselines. The data and source code are available at https://github.com/catcd/RbSP.
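The core idea of combining an LSTM over the full word sequence with a CNN over the (augmented) shortest dependency path can be illustrated with a minimal sketch. The code below is an assumption-laden illustration in PyTorch, not the authors' implementation (the official code is linked above); it omits the attentive augmentation of the SDP, and all class names, layer sizes, and kernel widths are hypothetical choices for the sketch.

# Minimal sketch of an LSTM-over-sentence + CNN-over-SDP relation classifier.
# Not the authors' implementation (see https://github.com/catcd/RbSP); the
# attentive SDP augmentation is omitted and all dimensions are illustrative.
import torch
import torch.nn as nn

class RbSPSketch(nn.Module):
    def __init__(self, vocab_size=10_000, emb_dim=100, lstm_dim=100,
                 cnn_filters=100, kernel_sizes=(2, 3, 4), num_classes=19):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # BiLSTM over the whole sentence captures sequential context.
        self.lstm = nn.LSTM(emb_dim, lstm_dim, batch_first=True,
                            bidirectional=True)
        # Multi-width convolutions over the SDP token embeddings.
        self.convs = nn.ModuleList(
            [nn.Conv1d(emb_dim, cnn_filters, k) for k in kernel_sizes])
        self.classifier = nn.Linear(
            2 * lstm_dim + cnn_filters * len(kernel_sizes), num_classes)

    def forward(self, sentence_ids, sdp_ids):
        # sentence_ids: (batch, sent_len); sdp_ids: (batch, sdp_len)
        sent_emb = self.embed(sentence_ids)
        _, (h_n, _) = self.lstm(sent_emb)               # h_n: (2, batch, lstm_dim)
        sent_vec = torch.cat([h_n[0], h_n[1]], dim=-1)  # final fwd/bwd states

        sdp_emb = self.embed(sdp_ids).transpose(1, 2)   # (batch, emb_dim, sdp_len)
        pooled = [torch.relu(conv(sdp_emb)).max(dim=2).values
                  for conv in self.convs]               # max-pool each filter width
        sdp_vec = torch.cat(pooled, dim=-1)

        # Concatenate sentence-level and SDP-level features for classification.
        return self.classifier(torch.cat([sent_vec, sdp_vec], dim=-1))

# Example: batch of 2 sentences (length 12) with SDPs of length 5.
model = RbSPSketch()
logits = model(torch.randint(0, 10_000, (2, 12)),
               torch.randint(0, 10_000, (2, 5)))
print(logits.shape)  # torch.Size([2, 19])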
Anthology ID:
N19-1298
Volume:
Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month:
June
Year:
2019
Address:
Minneapolis, Minnesota
Editors:
Jill Burstein, Christy Doran, Thamar Solorio
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
2902–2912
URL:
https://aclanthology.org/N19-1298
DOI:
10.18653/v1/N19-1298
Cite (ACL):
Duy-Cat Can, Hoang-Quynh Le, Quang-Thuy Ha, and Nigel Collier. 2019. A Richer-but-Smarter Shortest Dependency Path with Attentive Augmentation for Relation Extraction. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 2902–2912, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal):
A Richer-but-Smarter Shortest Dependency Path with Attentive Augmentation for Relation Extraction (Can et al., NAACL 2019)
PDF:
https://aclanthology.org/N19-1298.pdf
Code
 catcd/RbSP
Data
SemEval-2010 Task 8