Zero-shot Dependency Parsing with Pre-trained Multilingual Sentence Representations

Ke Tran, Arianna Bisazza


Abstract
We investigate whether off-the-shelf deep bidirectional sentence representations (Devlin et al., 2019) trained on a massively multilingual corpus (multilingual BERT) enable the development of an unsupervised universal dependency parser. This approach leverages only a mix of monolingual corpora in many languages and does not require any translation data, making it applicable to low-resource languages. In our experiments, we outperform the best CoNLL 2018 language-specific systems in all of the shared task’s six truly low-resource languages while using a single system. However, we also find that (i) parsing accuracy still varies dramatically when changing the training languages and (ii) in some target languages zero-shot transfer fails under all tested conditions, raising concerns about the ‘universality’ of the whole approach.
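The approach described in the abstract has two parts: frozen contextual word representations from multilingual BERT, and a dependency parser trained on source-language treebanks that consumes them. Below is a minimal sketch of the feature-extraction step only, under stated assumptions: the HuggingFace transformers library, the bert-base-multilingual-cased checkpoint, and mean-pooling of subword vectors into one vector per word are illustrative choices, not details confirmed by the paper.

# Minimal sketch (not the paper's code): extracting frozen multilingual
# BERT features of the kind a zero-shot cross-lingual parser could use.
# Assumptions: HuggingFace transformers, the bert-base-multilingual-cased
# checkpoint, and averaging subword vectors to get one vector per word.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = AutoModel.from_pretrained("bert-base-multilingual-cased")
model.eval()  # features are kept frozen; no fine-tuning

def word_representations(words):
    """Return one frozen mBERT vector per input word."""
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (num_subwords, 768)
    vecs = []
    for i in range(len(words)):
        # Collect the subword positions belonging to word i and average them.
        idx = [j for j, w in enumerate(enc.word_ids()) if w == i]
        vecs.append(hidden[idx].mean(dim=0))
    return torch.stack(vecs)  # (num_words, 768)

print(word_representations(["Zero-shot", "parsing", "works"]).shape)

Because the same encoder embeds all of mBERT's pretraining languages in one space, a parser trained on such vectors for source-language treebanks can be applied unchanged to target-language sentences, which is what makes zero-shot transfer possible.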
Anthology ID: D19-6132
Volume: Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019)
Month: November
Year: 2019
Address: Hong Kong, China
Editors: Colin Cherry, Greg Durrett, George Foster, Reza Haffari, Shahram Khadivi, Nanyun Peng, Xiang Ren, Swabha Swayamdipta
Venue: WS
Publisher: Association for Computational Linguistics
Pages: 281–288
URL: https://aclanthology.org/D19-6132
DOI: 10.18653/v1/D19-6132
Cite (ACL): Ke Tran and Arianna Bisazza. 2019. Zero-shot Dependency Parsing with Pre-trained Multilingual Sentence Representations. In Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019), pages 281–288, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal): Zero-shot Dependency Parsing with Pre-trained Multilingual Sentence Representations (Tran & Bisazza, 2019)
PDF: https://aclanthology.org/D19-6132.pdf