Multiple Segmentations of Thai Sentences for Neural Machine Translation

Alberto Poncelas, Wichaya Pidchamook, Chao-Hong Liu, James Hadley, Andy Way


Abstract
Thai is a low-resource language, so data is often unavailable in sufficient quantities to train a Neural Machine Translation (NMT) model that performs to a high level of quality. In addition, the Thai script does not use white spaces to delimit word boundaries, which adds complexity when building sequence-to-sequence models. In this work, we explore how to augment a set of English–Thai parallel data by replicating sentence pairs with different word-segmentation methods applied to the Thai side, and using the result as training data for NMT models. By varying the number of merge operations in Byte Pair Encoding, different segmentations of Thai sentences can be obtained. The experiments show that combining these datasets improves the performance of NMT models over training on a dataset split with a single supervised segmentation tool.
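The core idea, varying the number of BPE merge operations to obtain different segmentations of the same sentence, can be illustrated with a minimal pure-Python sketch. This is not the authors' pipeline (which would typically use a toolkit such as subword-nmt or SentencePiece), and the toy English word-frequency corpus below is a stand-in assumption, since a realistic Thai example requires real training data.

```python
from collections import Counter

def learn_bpe(word_freqs, num_merges):
    """Learn a list of BPE merge operations from a word -> frequency dict."""
    vocab = {tuple(w): f for w, f in word_freqs.items()}
    merges = []
    for _ in range(num_merges):
        # Count all adjacent symbol pairs, weighted by word frequency.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent pair
        merges.append(best)
        # Merge that pair everywhere in the vocabulary.
        new_vocab = {}
        for symbols, freq in vocab.items():
            out, i = [], 0
            while i < len(symbols):
                if i < len(symbols) - 1 and (symbols[i], symbols[i + 1]) == best:
                    out.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    out.append(symbols[i])
                    i += 1
            new_vocab[tuple(out)] = freq
        vocab = new_vocab
    return merges

def apply_bpe(word, merges):
    """Segment a word by replaying the learned merges in order."""
    symbols = list(word)
    for a, b in merges:
        out, i = [], 0
        while i < len(symbols):
            if i < len(symbols) - 1 and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        symbols = out
    return symbols

# Toy corpus (hypothetical): the same word segments differently
# depending on how many merge operations were learned.
corpus = {"low": 5, "lower": 2, "newest": 6, "widest": 3}
for n in (2, 10):
    print(n, apply_bpe("lowest", learn_bpe(corpus, n)))
```

Each choice of `num_merges` yields a different segmentation of the same surface string; replicating a sentence pair once per segmentation is what augments the parallel data.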
Anthology ID:
2020.sltu-1.33
Volume:
Proceedings of the 1st Joint Workshop on Spoken Language Technologies for Under-resourced languages (SLTU) and Collaboration and Computing for Under-Resourced Languages (CCURL)
Month:
May
Year:
2020
Address:
Marseille, France
Editors:
Dorothee Beermann, Laurent Besacier, Sakriani Sakti, Claudia Soria
Venue:
SLTU
Publisher:
European Language Resources Association
Pages:
240–244
Language:
English
URL:
https://aclanthology.org/2020.sltu-1.33
Cite (ACL):
Alberto Poncelas, Wichaya Pidchamook, Chao-Hong Liu, James Hadley, and Andy Way. 2020. Multiple Segmentations of Thai Sentences for Neural Machine Translation. In Proceedings of the 1st Joint Workshop on Spoken Language Technologies for Under-resourced languages (SLTU) and Collaboration and Computing for Under-Resourced Languages (CCURL), pages 240–244, Marseille, France. European Language Resources Association.
Cite (Informal):
Multiple Segmentations of Thai Sentences for Neural Machine Translation (Poncelas et al., SLTU 2020)
PDF:
https://aclanthology.org/2020.sltu-1.33.pdf