Unsupervised Cross-Lingual Adaptation of Dependency Parsers Using CRF Autoencoders

Zhao Li, Kewei Tu


Abstract
We consider the task of cross-lingual adaptation of dependency parsers without annotated target corpora and parallel corpora. Previous work either directly applies a discriminative source parser to the target language, ignoring unannotated target corpora, or employs an unsupervised generative parser that can leverage unannotated target data but has weaker representational power than discriminative parsers. In this paper, we propose to utilize unsupervised discriminative parsers based on the CRF autoencoder framework for this task. We train a source parser and use it to initialize and regularize a target parser that is trained on unannotated target data. We conduct experiments that transfer an English parser to 20 target languages. The results show that our method significantly outperforms previous methods.
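The adaptation recipe in the abstract — train a source parser, copy its weights into the target parser, then train on unannotated target data while penalizing drift from the source weights — can be sketched as follows. This is a minimal illustration, not the paper's implementation: `unsup_loss` is a hypothetical stand-in for the CRF-autoencoder reconstruction objective, and the quadratic form is chosen only so the example runs.

```python
import numpy as np

# Hypothetical sketch of source-regularized unsupervised adaptation.
# `unsup_loss` is a placeholder for the CRF-autoencoder objective on
# unannotated target sentences; here it is a simple quadratic in the
# weights so the gradient has a closed form.

def unsup_loss(w, X):
    # mean squared score over "sentences" (rows of X) -- placeholder only
    return float(np.sum((w @ X.T) ** 2)) / len(X)

def grad_unsup(w, X):
    # analytic gradient of the placeholder loss above
    S = X.T @ X / len(X)
    return 2 * w @ S

def adapt(w_src, X, lam=0.1, lr=0.01, steps=200):
    """Minimize unsup_loss(w) + lam * ||w - w_src||^2 by gradient descent."""
    w = w_src.copy()  # initialize the target parser with source weights
    for _ in range(steps):
        grad = grad_unsup(w, X)
        grad += 2 * lam * (w - w_src)  # L2 pull back toward the source parser
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
w_src = rng.normal(size=(2, 3))   # "source parser" weights
data = rng.normal(size=(5, 3))    # unannotated target "sentences"
w_tgt = adapt(w_src, data)
```

The regularizer weight `lam` trades off fitting the target data against staying close to the (supervised, hence more reliable) source parser; the paper's actual objective and parameterization differ.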
Anthology ID:
2020.findings-emnlp.193
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2127–2133
URL:
https://aclanthology.org/2020.findings-emnlp.193
DOI:
10.18653/v1/2020.findings-emnlp.193
Cite (ACL):
Zhao Li and Kewei Tu. 2020. Unsupervised Cross-Lingual Adaptation of Dependency Parsers Using CRF Autoencoders. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 2127–2133, Online. Association for Computational Linguistics.
Cite (Informal):
Unsupervised Cross-Lingual Adaptation of Dependency Parsers Using CRF Autoencoders (Li & Tu, Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.193.pdf
Code:
livc/cross-crfae