Generalized Shortest-Paths Encoders for AMR-to-Text Generation

Lisa Jin, Daniel Gildea


Abstract
For text generation from semantic graphs, past neural models encoded input structure via gated convolutions along graph edges. Although these operations provide local context, the distance messages can travel is bounded by the number of encoder propagation steps. We build on recent efforts to apply Transformer self-attention to graphs, allowing global feature propagation. Instead of feeding shortest paths to the vertex self-attention module, we train a model to learn them using generalized shortest-paths algorithms. This approach widens the receptive field of a graph encoder by exposing it to all possible graph paths. We explore how this path diversity affects performance across levels of AMR connectivity, demonstrating gains on AMRs with higher reentrancy counts and diameters. Analysis of generated sentences also supports the high semantic coherence of our models on reentrant AMRs. Our best model achieves a 1.4 BLEU and 1.8 chrF++ margin over a baseline that encodes only pairwise-unique shortest paths.
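As a point of reference (not the authors' code), the baseline quantity the abstract mentions — pairwise shortest-path lengths between graph vertices, as fed to a vertex self-attention module — can be computed with the classic Floyd-Warshall algorithm. The sketch below assumes unit-weight edges treated as undirected (a common choice when encoding AMR structure); the function name and interface are illustrative, and the paper's generalized algorithms go further by considering all paths rather than only one shortest path per vertex pair.

```python
from itertools import product

INF = float("inf")

def all_pairs_shortest_paths(n, edges):
    """All-pairs shortest path lengths via Floyd-Warshall.

    n: number of vertices; edges: iterable of (u, v) index pairs,
    each treated as an undirected unit-weight edge. Returns an
    n x n matrix of path lengths (INF if disconnected), suitable
    as relative-position features for graph self-attention.
    """
    dist = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for u, v in edges:
        dist[u][v] = dist[v][u] = 1
    # Relax through each intermediate vertex k.
    for k, i, j in product(range(n), repeat=3):
        if dist[i][k] + dist[k][j] < dist[i][j]:
            dist[i][j] = dist[i][k] + dist[k][j]
    return dist

# Small example: a chain 0-1-2 with an extra branch 0-3.
d = all_pairs_shortest_paths(4, [(0, 1), (1, 2), (0, 3)])
```

On this example, `d[0][2]` is 2 (via vertex 1) and `d[3][2]` is 3 (via 0 and 1). A reentrant AMR vertex, one with multiple parents, shortens paths between otherwise distant parts of the graph, which is one intuition for why path features interact with reentrancy.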
Anthology ID:
2020.coling-main.181
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2004–2013
URL:
https://aclanthology.org/2020.coling-main.181
DOI:
10.18653/v1/2020.coling-main.181
Cite (ACL):
Lisa Jin and Daniel Gildea. 2020. Generalized Shortest-Paths Encoders for AMR-to-Text Generation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 2004–2013, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Generalized Shortest-Paths Encoders for AMR-to-Text Generation (Jin & Gildea, COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.181.pdf