Multi-view Subword Regularization

Xinyi Wang, Sebastian Ruder, Graham Neubig


Abstract
Multilingual pretrained representations generally rely on subword segmentation algorithms to create a shared multilingual vocabulary. However, standard heuristic algorithms often lead to sub-optimal segmentation, especially for languages with limited amounts of data. In this paper, we take two major steps towards alleviating this problem. First, we demonstrate empirically that applying existing subword regularization methods (Kudo, 2018; Provilkov et al., 2020) during fine-tuning of pre-trained multilingual representations improves the effectiveness of cross-lingual transfer. Second, to take full advantage of different possible input segmentations, we propose Multi-view Subword Regularization (MVR), a method that enforces consistency between predictions on inputs tokenized by the standard segmentation and by probabilistic segmentations. Results on the XTREME multilingual benchmark (Hu et al., 2020) show that MVR brings consistent improvements of up to 2.5 points over using standard segmentation algorithms.
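The core idea in the abstract — encouraging a model to make consistent predictions on a deterministically tokenized view and a probabilistically sampled view of the same input — can be sketched in pure Python. This is an illustrative assumption about the objective, not the paper's exact formulation: the symmetric KL term, the toy logit values, and the helper functions below are all hypothetical.

```python
import math

def softmax(logits):
    """Convert raw scores into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Hypothetical model outputs for the SAME sentence under two views:
# one from the standard deterministic tokenizer, one from a sampled
# segmentation (subword regularization / BPE-dropout style).
logits_det = [2.0, 0.5, -1.0]    # predictions on the deterministic view
logits_samp = [1.6, 0.8, -0.7]   # predictions on the sampled view

p, q = softmax(logits_det), softmax(logits_samp)

# A symmetric KL consistency term that would be added to the task loss,
# scaled by a tunable weight, so the two views agree.
consistency = 0.5 * (kl(p, q) + kl(q, p))
```

In practice the sampled view would come from a tokenizer that supports stochastic segmentation (e.g. a unigram LM tokenizer with sampling enabled), and the consistency term would be computed per training example alongside the usual cross-entropy loss.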
Anthology ID:
2021.naacl-main.40
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
473–482
URL:
https://aclanthology.org/2021.naacl-main.40
DOI:
10.18653/v1/2021.naacl-main.40
Cite (ACL):
Xinyi Wang, Sebastian Ruder, and Graham Neubig. 2021. Multi-view Subword Regularization. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 473–482, Online. Association for Computational Linguistics.
Cite (Informal):
Multi-view Subword Regularization (Wang et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.40.pdf
Optional supplementary data:
2021.naacl-main.40.OptionalSupplementaryData.zip
Video:
https://aclanthology.org/2021.naacl-main.40.mp4
Code:
cindyxinyiwang/multiview-subword-regularization
Data:
MLQA, PAWS-X, XQuAD