Relation Classification Using Segment-Level Attention-based CNN and Dependency-based RNN

Van-Hien Tran, Van-Thuy Phi, Hiroyuki Shindo, Yuji Matsumoto


Abstract
Recently, relation classification has gained much success by exploiting deep neural networks. In this paper, we propose a new model that effectively combines Segment-level Attention-based Convolutional Neural Networks (SACNNs) and Dependency-based Recurrent Neural Networks (DepRNNs). While SACNNs allow the model to selectively focus on important information segments in the raw word sequence, DepRNNs help handle long-distance relations along the shortest dependency path between the relation entities. Experiments on the SemEval-2010 Task 8 dataset show that our model is comparable to the state of the art without using any external lexical features.
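As a rough illustration of the architecture described in the abstract, the following is a minimal PyTorch sketch (not the authors' code; the module names, dimensions, and the way the two feature vectors are combined are assumptions): a small CNN attention-pools over a sentence segment, a bidirectional LSTM encodes the shortest dependency path, and their features are concatenated for relation classification.

# Hypothetical minimal sketch (not the authors' implementation): a CNN with
# attention over a word-embedding segment combined with a Bi-LSTM over the
# shortest dependency path (SDP). All names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SegmentAttentionCNN(nn.Module):
    """Convolve a word-embedding segment, then attention-pool over positions."""

    def __init__(self, emb_dim, num_filters=100, kernel_size=3):
        super().__init__()
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size, padding=kernel_size // 2)
        self.attn = nn.Linear(num_filters, 1)

    def forward(self, segment):                 # segment: (batch, seq_len, emb_dim)
        h = torch.tanh(self.conv(segment.transpose(1, 2)))    # (batch, filters, seq_len)
        h = h.transpose(1, 2)                                  # (batch, seq_len, filters)
        scores = F.softmax(self.attn(h), dim=1)                # attention weights per position
        return (scores * h).sum(dim=1)                         # (batch, filters)


class DepPathRNN(nn.Module):
    """Bi-LSTM over the embeddings of the shortest dependency path."""

    def __init__(self, emb_dim, hidden=100):
        super().__init__()
        self.lstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)

    def forward(self, sdp):                     # sdp: (batch, path_len, emb_dim)
        _, (h_n, _) = self.lstm(sdp)            # h_n: (2, batch, hidden)
        return torch.cat([h_n[0], h_n[1]], dim=-1)             # (batch, 2 * hidden)


class RelationClassifier(nn.Module):
    """Concatenate segment-level CNN features and SDP-RNN features."""

    def __init__(self, emb_dim, num_relations, num_filters=100, hidden=100):
        super().__init__()
        self.sacnn = SegmentAttentionCNN(emb_dim, num_filters)
        self.deprnn = DepPathRNN(emb_dim, hidden)
        self.out = nn.Linear(num_filters + 2 * hidden, num_relations)

    def forward(self, sentence, sdp):
        features = torch.cat([self.sacnn(sentence), self.deprnn(sdp)], dim=-1)
        return self.out(features)               # unnormalized relation scores


if __name__ == "__main__":
    # Toy shapes only; SemEval-2010 Task 8 has 19 relation classes.
    model = RelationClassifier(emb_dim=50, num_relations=19)
    sentence = torch.randn(2, 30, 50)           # 2 sentences of 30 tokens
    sdp = torch.randn(2, 6, 50)                 # 2 shortest dependency paths of 6 tokens
    print(model(sentence, sdp).shape)           # torch.Size([2, 19])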
Anthology ID: N19-1286
Volume: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)
Month: June
Year: 2019
Address: Minneapolis, Minnesota
Editors: Jill Burstein, Christy Doran, Thamar Solorio
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 2793–2798
URL: https://aclanthology.org/N19-1286
DOI: 10.18653/v1/N19-1286
Cite (ACL): Van-Hien Tran, Van-Thuy Phi, Hiroyuki Shindo, and Yuji Matsumoto. 2019. Relation Classification Using Segment-Level Attention-based CNN and Dependency-based RNN. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 2793–2798, Minneapolis, Minnesota. Association for Computational Linguistics.
Cite (Informal): Relation Classification Using Segment-Level Attention-based CNN and Dependency-based RNN (Tran et al., NAACL 2019)
PDF: https://aclanthology.org/N19-1286.pdf