AMR Parsing with Latent Structural Information

Qiji Zhou, Yue Zhang, Donghong Ji, Hao Tang


Abstract
Abstract Meaning Representations (AMRs) capture sentence-level semantics as structural representations of broad-coverage natural sentences. We investigate parsing AMR with explicit dependency structures and interpretable latent structures. We generate the latent soft structure without additional annotations and fuse both dependency and latent structures via an extended graph neural network. The fused structural information helps our model achieve the best reported results on both AMR 2.0 (77.5% Smatch F1 on LDC2017T10) and AMR 1.0 (71.8% Smatch F1 on LDC2014T12).
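The abstract's idea of fusing an explicit dependency structure with an attention-induced latent structure inside a graph neural network can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes a scaled dot-product attention that yields a soft latent adjacency and a learned gate that mixes it with a row-normalized dependency adjacency before one message-passing step. All class and parameter names (LatentStructureFusion, fuse_gate) are hypothetical.

# Minimal sketch (assumed, not the paper's architecture) of fusing a hard
# dependency adjacency with a soft, attention-induced latent adjacency
# inside one graph-convolution-style layer.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LatentStructureFusion(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.value = nn.Linear(hidden_dim, hidden_dim)
        # Learnable scalar gate balancing dependency vs. latent structure.
        self.fuse_gate = nn.Parameter(torch.tensor(0.5))

    def forward(self, tokens: torch.Tensor, dep_adj: torch.Tensor) -> torch.Tensor:
        # tokens:  (batch, seq_len, hidden_dim) contextual token encodings
        # dep_adj: (batch, seq_len, seq_len) 0/1 dependency adjacency
        q, k, v = self.query(tokens), self.key(tokens), self.value(tokens)
        scores = q @ k.transpose(-2, -1) / (tokens.size(-1) ** 0.5)
        latent_adj = F.softmax(scores, dim=-1)  # soft latent structure, no extra annotation
        dep_norm = dep_adj / dep_adj.sum(-1, keepdim=True).clamp(min=1.0)
        gate = torch.sigmoid(self.fuse_gate)
        fused_adj = gate * dep_norm + (1.0 - gate) * latent_adj
        return fused_adj @ v  # one message-passing step over the fused graph


if __name__ == "__main__":
    batch, seq_len, dim = 2, 5, 16
    tokens = torch.randn(batch, seq_len, dim)
    dep_adj = torch.eye(seq_len).unsqueeze(0).repeat(batch, 1, 1)  # toy dependency graph
    layer = LatentStructureFusion(dim)
    print(layer(tokens, dep_adj).shape)  # torch.Size([2, 5, 16])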
Anthology ID:
2020.acl-main.397
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
4306–4319
URL:
https://aclanthology.org/2020.acl-main.397
DOI:
10.18653/v1/2020.acl-main.397
Cite (ACL):
Qiji Zhou, Yue Zhang, Donghong Ji, and Hao Tang. 2020. AMR Parsing with Latent Structural Information. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 4306–4319, Online. Association for Computational Linguistics.
Cite (Informal):
AMR Parsing with Latent Structural Information (Zhou et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.397.pdf
Video:
http://slideslive.com/38929103