Lite Training Strategies for Portuguese-English and English-Portuguese Translation

Alexandre Lopes, Rodrigo Nogueira, Roberto Lotufo, Helio Pedrini


Abstract
Despite the widespread adoption of deep learning for machine translation, it is still expensive to develop high-quality translation models. In this work, we investigate the use of pre-trained models, such as T5, for Portuguese-English and English-Portuguese translation tasks using low-cost hardware. We explore the use of Portuguese and English pre-trained language models and propose an adaptation of the English tokenizer to represent Portuguese characters, such as those with diaeresis, acute and grave accents. We compare our models to the Google Translate API and MarianMT on a subset of the ParaCrawl dataset, as well as to the winning submission to the WMT19 Biomedical Translation Shared Task. We also describe our submission to the WMT20 Biomedical Translation Shared Task. Our results show that our models perform competitively with state-of-the-art models while being trained on modest hardware (a single 8GB gaming GPU for nine days). Our data, models and code are available in our GitHub repository.
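As a rough illustration of the tokenizer adaptation described in the abstract, the sketch below shows one way to extend a pre-trained English T5 tokenizer with Portuguese accented characters using the Hugging Face transformers library. The checkpoint name ("t5-base") and the character list are illustrative assumptions, not the authors' exact implementation; see the unicamp-dl/Lite-T5-Translation repository for the actual code.

from transformers import T5ForConditionalGeneration, T5Tokenizer

# Load an English-pretrained checkpoint ("t5-base" is an illustrative
# choice, not necessarily the one used in the paper).
tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# Portuguese accented characters that an English SentencePiece vocabulary
# may not cover; this list is illustrative, not taken from the paper.
portuguese_chars = list("áàâãéêíóôõúüçÁÀÂÃÉÊÍÓÔÕÚÜÇ")

# Keep only the characters the tokenizer currently maps to <unk>.
missing = [c for c in portuguese_chars
           if tokenizer.unk_token_id in tokenizer.encode(c, add_special_tokens=False)]

# Register the missing characters as new tokens and grow the model's
# embedding matrix to match the enlarged vocabulary.
tokenizer.add_tokens(missing)
model.resize_token_embeddings(len(tokenizer))

After this step, the extended model can be fine-tuned on a Portuguese-English parallel corpus, such as the ParaCrawl subset used in the paper.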
Anthology ID:
2020.wmt-1.90
Volume:
Proceedings of the Fifth Conference on Machine Translation
Month:
November
Year:
2020
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Yvette Graham, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
833–840
URL:
https://aclanthology.org/2020.wmt-1.90
Cite (ACL):
Alexandre Lopes, Rodrigo Nogueira, Roberto Lotufo, and Helio Pedrini. 2020. Lite Training Strategies for Portuguese-English and English-Portuguese Translation. In Proceedings of the Fifth Conference on Machine Translation, pages 833–840, Online. Association for Computational Linguistics.
Cite (Informal):
Lite Training Strategies for Portuguese-English and English-Portuguese Translation (Lopes et al., WMT 2020)
PDF:
https://aclanthology.org/2020.wmt-1.90.pdf
Video:
https://slideslive.com/38939645
Code:
unicamp-dl/Lite-T5-Translation
Data:
capes