Dependency parsing with structure preserving embeddings

Ákos Kádár, Lan Xiao, Mete Kemertas, Federico Fancellu, Allan Jepson, Afsaneh Fazly


Abstract
Modern neural approaches to dependency parsing are trained to predict a tree structure by jointly learning a contextual representation for tokens in a sentence, as well as a head–dependent scoring function. Whereas this strategy results in high performance, it is difficult to interpret these representations in relation to the geometry of the underlying tree structure. Our work seeks instead to learn interpretable representations by training a parser to explicitly preserve structural properties of a tree. We do so by casting dependency parsing as a tree embedding problem, incorporating geometric properties of dependency trees in the form of training losses within a graph-based parser. We provide a thorough evaluation of these geometric losses, showing that a majority of them yield strong tree distance preservation as well as parsing performance on par with a competitive graph-based parser (Qi et al., 2018). Finally, we show where parsing errors lie in terms of tree relationships in order to guide future work.
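The "tree distance preservation" idea from the abstract can be illustrated with a small sketch: compute pairwise path distances between tokens in a dependency tree, then penalize the mismatch between those distances and the pairwise distances of the learned token embeddings. The function names and the squared-error formulation below are illustrative assumptions, not the paper's exact losses:

```python
import numpy as np

def tree_distances(heads):
    """Pairwise path lengths between tokens in a dependency tree.

    `heads[i]` is the index of token i's head, or -1 for the root.
    """
    n = len(heads)
    # Build an undirected adjacency list from the head-dependent arcs.
    adj = [[] for _ in range(n)]
    for i, h in enumerate(heads):
        if h >= 0:
            adj[i].append(h)
            adj[h].append(i)
    dist = np.full((n, n), np.inf)
    for s in range(n):  # BFS from each token gives shortest path lengths
        dist[s, s] = 0
        queue = [s]
        while queue:
            u = queue.pop(0)
            for v in adj[u]:
                if dist[s, v] == np.inf:
                    dist[s, v] = dist[s, u] + 1
                    queue.append(v)
    return dist

def distance_preservation_loss(embeddings, heads):
    """Mean squared error between pairwise embedding distances
    and pairwise tree path distances (an assumed, simplified loss)."""
    target = tree_distances(heads)
    diff = embeddings[:, None, :] - embeddings[None, :, :]
    pred = np.sqrt((diff ** 2).sum(-1))
    return float(((pred - target) ** 2).mean())
```

For example, with `heads = [-1, 0, 0]` (token 0 is the root, tokens 1 and 2 its dependents), `tree_distances` gives a path length of 2 between the two siblings; embeddings that place them at Euclidean distance 2 from each other and 1 from the root would incur zero loss.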
Anthology ID:
2021.eacl-main.144
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1684–1697
URL:
https://aclanthology.org/2021.eacl-main.144
DOI:
10.18653/v1/2021.eacl-main.144
Cite (ACL):
Ákos Kádár, Lan Xiao, Mete Kemertas, Federico Fancellu, Allan Jepson, and Afsaneh Fazly. 2021. Dependency parsing with structure preserving embeddings. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 1684–1697, Online. Association for Computational Linguistics.
Cite (Informal):
Dependency parsing with structure preserving embeddings (Kádár et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.144.pdf