Grapheme-to-Phoneme Conversion with a Multilingual Transformer Model

Omnia ElSaadany, Benjamin Suter


Abstract
In this paper, we describe our three submissions to the SIGMORPHON 2020 shared task 1 on grapheme-to-phoneme conversion for 15 languages. We experimented with a single multilingual transformer model. We observed that the multilingual model achieves results on par with our separately trained monolingual models and is even able to avoid a few of the errors made by the monolingual models.
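The core idea in the abstract is training one transformer on grapheme-to-phoneme data from all 15 languages at once. A common recipe for such multilingual seq2seq training is to prepend a language-ID token to the character-level source sequence; the sketch below illustrates that recipe under stated assumptions (the tag scheme, helper name, and example word are illustrative, not the authors' exact preprocessing):

```python
def make_example(lang, word, phonemes):
    """Build a (source, target) token pair for a multilingual seq2seq model.

    lang: language code used as a tag token, e.g. "hun" (assumed scheme)
    word: orthographic word, split into single-character graphemes here
    phonemes: list of IPA phoneme symbols (the target sequence)
    """
    source = [f"<{lang}>"] + list(word)  # language tag + grapheme tokens
    target = list(phonemes)              # phoneme tokens
    return source, target

# Hypothetical Hungarian example: "adat" -> /ɒdɒt/
src, tgt = make_example("hun", "adat", ["ɒ", "d", "ɒ", "t"])
# src == ["<hun>", "a", "d", "a", "t"]
```

Because the language tag is just another source token, the same model weights serve all languages, which is what allows a single multilingual model to match separately trained monolingual ones.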
Anthology ID: 2020.sigmorphon-1.7
Volume: Proceedings of the 17th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology
Month: July
Year: 2020
Address: Online
Editors: Garrett Nicolai, Kyle Gorman, Ryan Cotterell
Venue: SIGMORPHON
SIG: SIGMORPHON
Publisher: Association for Computational Linguistics
Pages: 85–89
URL: https://aclanthology.org/2020.sigmorphon-1.7
DOI: 10.18653/v1/2020.sigmorphon-1.7
Cite (ACL): Omnia ElSaadany and Benjamin Suter. 2020. Grapheme-to-Phoneme Conversion with a Multilingual Transformer Model. In Proceedings of the 17th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology, pages 85–89, Online. Association for Computational Linguistics.
Cite (Informal): Grapheme-to-Phoneme Conversion with a Multilingual Transformer Model (ElSaadany & Suter, SIGMORPHON 2020)
PDF: https://aclanthology.org/2020.sigmorphon-1.7.pdf