Neural Unsupervised Parsing Beyond English

Katharina Kann, Anhad Mohananey, Samuel R. Bowman, Kyunghyun Cho

Abstract
Recently, neural network models which automatically infer syntactic structure from raw text have started to achieve promising results. However, earlier work on unsupervised parsing shows large performance differences between non-neural models trained on corpora in different languages, even for comparable amounts of data. With that in mind, we train instances of the PRPN architecture (Shen et al., 2018)—one of these unsupervised neural network parsers—for Arabic, Chinese, English, and German. We find that (i) the model strongly outperforms trivial baselines and, thus, acquires at least some parsing ability for all languages; (ii) good hyperparameter values seem to be universal; (iii) how the model benefits from larger training set sizes depends on the corpus, with the model achieving the largest performance gains when increasing the number of sentences from 2,500 to 12,500 for English. In addition, we show that, by sharing parameters between the related languages German and English, we can improve the model’s unsupervised parsing F1 score by up to 4% in the low-resource setting.
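The paper's headline metric is unlabeled parsing F1, and the trivial baselines it outperforms are typically degenerate tree shapes such as fully right-branching trees. As a rough illustration only (not the authors' evaluation code; all function names here are hypothetical, and bracketing-F1 conventions vary, e.g. in whether whole-sentence spans are counted), the sketch below computes span-based unlabeled F1 between a predicted and a gold tree and builds a right-branching baseline:

```python
# Minimal sketch of unlabeled (bracketing) F1 against a trivial
# right-branching baseline. Trees are nested lists whose leaves are
# word indices; internal nodes are lists of subtrees.

def tree_to_spans(tree):
    """Collect (start, end) spans of all internal constituents."""
    spans = set()

    def walk(node):
        if isinstance(node, int):        # leaf: a single word index
            return node, node
        starts, ends = [], []
        for child in node:
            s, e = walk(child)
            starts.append(s)
            ends.append(e)
        span = (min(starts), max(ends))
        spans.add(span)                  # record this constituent
        return span

    walk(tree)
    return spans

def right_branching(n):
    """Trivial baseline: fully right-branching tree over n words."""
    tree = n - 1
    for i in range(n - 2, -1, -1):
        tree = [i, tree]
    return tree

def unlabeled_f1(pred_tree, gold_tree):
    """Unlabeled bracketing F1 between predicted and gold spans.
    Note: some setups exclude the full-sentence span; this sketch
    keeps it for simplicity."""
    pred, gold = tree_to_spans(pred_tree), tree_to_spans(gold_tree)
    if not pred or not gold:
        return 0.0
    overlap = len(pred & gold)
    p, r = overlap / len(pred), overlap / len(gold)
    return 2 * p * r / (p + r) if p + r > 0 else 0.0

# Example: a 4-word sentence with gold structure ((0 1) (2 3)).
gold = [[0, 1], [2, 3]]
print(unlabeled_f1(right_branching(4), gold))  # ~0.667
```

In this toy example the right-branching baseline recovers two of the three gold constituents, giving an F1 of about 0.67; the paper's claim is that PRPN instances trained on raw text beat baselines of this kind in all four languages.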
Anthology ID:
D19-6123
Volume:
Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Colin Cherry, Greg Durrett, George Foster, Reza Haffari, Shahram Khadivi, Nanyun Peng, Xiang Ren, Swabha Swayamdipta
Venue:
WS
Publisher:
Association for Computational Linguistics
Pages:
209–218
URL:
https://aclanthology.org/D19-6123
DOI:
10.18653/v1/D19-6123
Bibkey:
Cite (ACL):
Katharina Kann, Anhad Mohananey, Samuel R. Bowman, and Kyunghyun Cho. 2019. Neural Unsupervised Parsing Beyond English. In Proceedings of the 2nd Workshop on Deep Learning Approaches for Low-Resource NLP (DeepLo 2019), pages 209–218, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Neural Unsupervised Parsing Beyond English (Kann et al., 2019)
PDF:
https://aclanthology.org/D19-6123.pdf