The Lazy Encoder: A Fine-Grained Analysis of the Role of Morphology in Neural Machine Translation

Arianna Bisazza, Clara Tump


Abstract
Neural sequence-to-sequence models have proven very effective for machine translation, but at the expense of model interpretability. To shed more light on the role played by linguistic structure in the process of neural machine translation, we perform a fine-grained analysis of how various source-side morphological features are captured at different levels of the NMT encoder while varying the target language. Unlike previous work, we find no correlation between the accuracy of source morphology encoding and translation quality. We do find that morphological features are captured only in context and only to the extent that they are directly transferable to the target words.
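
The abstract describes a probing-style analysis of encoder representations. As a purely illustrative sketch of that kind of analysis (not the authors' actual setup), the Python snippet below trains a linear diagnostic classifier to predict a morphological feature from per-token encoder states; the data here is synthetic, and all names, shapes, and the choice of a scikit-learn logistic-regression probe are assumptions for illustration.

# Illustrative sketch: probing NMT encoder states for a morphological feature.
# Synthetic data stands in for real encoder states and morphological labels.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Hypothetical encoder states: one hidden vector per source token
# (in the real setting these would come from a trained NMT encoder layer).
n_tokens, hidden_size, n_feature_values = 5000, 512, 4
states = rng.normal(size=(n_tokens, hidden_size))
# Hypothetical morphological labels per token (e.g., four case values),
# which in practice would come from a morphological tagger or treebank.
labels = rng.integers(0, n_feature_values, size=n_tokens)

X_train, X_test, y_train, y_test = train_test_split(
    states, labels, test_size=0.2, random_state=0)

# A linear probe: high accuracy means the feature is linearly recoverable
# from that encoder layer's representation.
probe = LogisticRegression(max_iter=1000)
probe.fit(X_train, y_train)
print("probe accuracy:", accuracy_score(y_test, probe.predict(X_test)))

Repeating such a probe per encoder layer and per morphological feature, and comparing across target languages, is the general recipe for the layer-wise comparisons the abstract refers to.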
Anthology ID:
D18-1313
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
2871–2876
URL:
https://aclanthology.org/D18-1313
DOI:
10.18653/v1/D18-1313
Cite (ACL):
Arianna Bisazza and Clara Tump. 2018. The Lazy Encoder: A Fine-Grained Analysis of the Role of Morphology in Neural Machine Translation. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 2871–2876, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
The Lazy Encoder: A Fine-Grained Analysis of the Role of Morphology in Neural Machine Translation (Bisazza & Tump, EMNLP 2018)
PDF:
https://aclanthology.org/D18-1313.pdf
Video:
https://aclanthology.org/D18-1313.mp4