Second-Order Neural Dependency Parsing with Message Passing and End-to-End Training

Xinyu Wang, Kewei Tu


Abstract
In this paper, we propose second-order graph-based neural dependency parsing using message passing and end-to-end neural networks. We empirically show that our approaches match the accuracy of recent state-of-the-art second-order graph-based neural dependency parsers while being significantly faster in both training and testing. We also empirically demonstrate the advantage of second-order parsing over first-order parsing, and observe that the benefit of the head-selection structured constraint vanishes when BERT embeddings are used.
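The core idea is to run mean-field variational inference (MFVI) over arc variables with second-order factors, unrolling a fixed number of update steps so the whole parser trains end-to-end. Below is a minimal PyTorch sketch of one such update using sibling factors only; the function name, tensor names, and shapes are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch, assuming precomputed first- and second-order scores:
#   arc_scores[i, j]    score for arc head=i -> dependent=j
#   sib_scores[i, j, k] score for the arc pair (i->j, i->k) sharing head i
import torch

def mfvi_sibling(arc_scores, sib_scores, iterations=3):
    """Mean-field updates for per-arc Bernoulli variables with sibling factors.
    arc_scores: [n, n], sib_scores: [n, n, n]; returns arc posteriors q: [n, n]."""
    q = torch.sigmoid(arc_scores)  # initialize from unary potentials
    for _ in range(iterations):
        # Message to arc (i, j): expected sibling score under the current q,
        # msg[i, j] = sum_k q[i, k] * sib_scores[i, j, k]
        msg = torch.einsum('ik,ijk->ij', q, sib_scores)
        q = torch.sigmoid(arc_scores + msg)  # mean-field update
    return q
```

Because each update is differentiable, the unrolled iterations can be trained end-to-end with an ordinary cross-entropy loss on the final q. A head-selection variant would replace the per-arc sigmoid with a softmax over candidate heads for each dependent, which enforces the one-head-per-word constraint discussed in the abstract.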
Anthology ID:
2020.aacl-main.12
Volume:
Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing
Month:
December
Year:
2020
Address:
Suzhou, China
Editors:
Kam-Fai Wong, Kevin Knight, Hua Wu
Venue:
AACL
Publisher:
Association for Computational Linguistics
Note:
Pages:
93–99
URL:
https://aclanthology.org/2020.aacl-main.12
Cite (ACL):
Xinyu Wang and Kewei Tu. 2020. Second-Order Neural Dependency Parsing with Message Passing and End-to-End Training. In Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing, pages 93–99, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Second-Order Neural Dependency Parsing with Message Passing and End-to-End Training (Wang & Tu, AACL 2020)
PDF:
https://aclanthology.org/2020.aacl-main.12.pdf
Code:
wangxinyu0922/Second_Order_Parsing
Data:
Chinese Treebank, Penn Treebank