Learning from Non-Binary Constituency Trees via Tensor Decomposition

Daniele Castellana, Davide Bacciu


Abstract
Processing sentence constituency trees in binarised form is a common approach in the literature. However, constituency trees are non-binary by nature, and the binarisation procedure deeply alters their structure, pushing apart constituents that are otherwise close. In this work, we introduce a new approach for dealing with non-binary constituency trees which leverages tensor-based models. In particular, we show how a powerful composition function based on the canonical tensor decomposition can exploit such rich structure. A key point of our approach is the weight-sharing constraint imposed on the factor matrices, which limits the number of model parameters. Finally, we introduce a Tree-LSTM model that takes advantage of this composition function, and we experimentally assess its performance on different NLP tasks.
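As a rough illustration of the idea summarised in the abstract (a sketch, not the authors' implementation), the snippet below shows a composition function in the style of a canonical (CP) tensor decomposition where a single factor matrix is shared across all child positions, so the parameter count does not grow with the arity of a non-binary node. The class and argument names (SharedCPComposition, hidden_dim, rank) are hypothetical.

import torch
import torch.nn as nn

class SharedCPComposition(nn.Module):
    """CP-style composition over a variable number of children.

    One factor matrix is shared by all child positions (weight sharing),
    so the module handles nodes of any arity with a fixed parameter budget.
    """
    def __init__(self, hidden_dim: int, rank: int):
        super().__init__()
        self.child_factor = nn.Linear(hidden_dim, rank, bias=False)   # shared factor matrix
        self.output_factor = nn.Linear(rank, hidden_dim, bias=False)  # output factor matrix

    def forward(self, child_states: torch.Tensor) -> torch.Tensor:
        # child_states: (num_children, hidden_dim); num_children varies per node.
        projected = self.child_factor(child_states)   # (num_children, rank)
        interaction = projected.prod(dim=0)           # Hadamard product across children
        return self.output_factor(interaction)        # (hidden_dim,)

# Usage: compose three child representations of a non-binary constituent.
comp = SharedCPComposition(hidden_dim=8, rank=16)
children = torch.randn(3, 8)
parent = comp(children)  # shape: (8,)

In a full Tree-LSTM, a composition of this kind would feed the gate computations at each internal node; the sketch only shows the multilinear interaction itself.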
Anthology ID:
2020.coling-main.346
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3899–3910
URL:
https://aclanthology.org/2020.coling-main.346
DOI:
10.18653/v1/2020.coling-main.346
Cite (ACL):
Daniele Castellana and Davide Bacciu. 2020. Learning from Non-Binary Constituency Trees via Tensor Decomposition. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3899–3910, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Learning from Non-Binary Constituency Trees via Tensor Decomposition (Castellana & Bacciu, COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.346.pdf
Code
danielecastellana22/tensor-tree-nn
Data
SST, SST-2, SST-5