Learning Latent Trees with Stochastic Perturbations and Differentiable Dynamic Programming

Caio Corro, Ivan Titov


Abstract
We treat projective dependency trees as latent variables in our probabilistic model and induce them in such a way as to be beneficial for a downstream task, without relying on any direct tree supervision. Our approach relies on Gumbel perturbations and differentiable dynamic programming. Unlike previous approaches to latent tree learning, we stochastically sample global structures and our parser is fully differentiable. We illustrate its effectiveness on sentiment analysis and natural language inference tasks. We also study its properties on a synthetic structure induction task. Ablation studies emphasize the importance of both stochasticity and constraining latent structures to be projective trees.
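The abstract combines two ingredients that are easy to illustrate in isolation: Gumbel perturbations of arc scores (perturb-and-parse sampling) and a smoothed maximum that makes the parsing dynamic program differentiable. The NumPy sketch below is illustrative only; the function names, the n-by-n arc-score matrix layout, and the single-temperature smoothing are assumptions of this sketch, not the released diffdp implementation.

import numpy as np

def gumbel_noise(shape, eps=1e-9, rng=np.random):
    # Standard Gumbel samples via the inverse CDF: G = -log(-log(U)), U ~ Uniform(0, 1).
    u = rng.uniform(low=eps, high=1.0 - eps, size=shape)
    return -np.log(-np.log(u))

def perturbed_arc_scores(arc_scores, rng=np.random):
    # Perturb-and-parse: add independent Gumbel noise to every head-modifier
    # arc score. Parsing the perturbed matrix with a projective parser
    # (e.g. Eisner's algorithm) yields an approximate sample of a dependency tree.
    return arc_scores + gumbel_noise(arc_scores.shape, rng=rng)

def smoothed_max(scores, temperature=1.0):
    # Differentiable surrogate for the max used inside the dynamic program:
    # temperature * logsumexp(scores / temperature). Its gradient is a softmax,
    # so backpropagating through the parser gives soft arc selections instead
    # of non-differentiable argmax decisions.
    z = np.asarray(scores) / temperature
    m = z.max()
    return temperature * (m + np.log(np.exp(z - m).sum()))

In the paper's setting, the perturbed scores are passed to a continuous relaxation of a projective parsing algorithm, and the resulting soft trees are consumed by the downstream sentiment analysis or natural language inference model.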
Anthology ID:
P19-1551
Volume:
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2019
Address:
Florence, Italy
Editors:
Anna Korhonen, David Traum, Lluís Màrquez
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
5508–5521
URL:
https://aclanthology.org/P19-1551
DOI:
10.18653/v1/P19-1551
Cite (ACL):
Caio Corro and Ivan Titov. 2019. Learning Latent Trees with Stochastic Perturbations and Differentiable Dynamic Programming. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 5508–5521, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Learning Latent Trees with Stochastic Perturbations and Differentiable Dynamic Programming (Corro & Titov, ACL 2019)
PDF:
https://aclanthology.org/P19-1551.pdf
Code:
FilippoC/diffdp
Data:
ListOps, MultiNLI, SNLI, SST