Encodings of Source Syntax: Similarities in NMT Representations Across Target Languages

Tyler A. Chang, Anna Rafferty


Abstract
We train neural machine translation (NMT) models from English to six target languages, using NMT encoder representations to predict ancestor constituent labels of source language words. We find that NMT encoders learn similar source syntax regardless of NMT target language, relying on explicit morphosyntactic cues to extract syntactic features from source sentences. Furthermore, the NMT encoders outperform directly trained RNNs on several of the constituent label prediction tasks, suggesting that NMT encoder representations can be used effectively for natural language tasks involving syntax. However, both the NMT encoders and the directly trained RNNs learn syntactic information that differs substantially from that of a probabilistic context-free grammar (PCFG) parser. Despite lower overall accuracy scores, the PCFG parser often performs well on sentences for which the RNN-based models perform poorly, suggesting that RNN architectures are constrained in the types of syntax they can learn.
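The probing setup described in the abstract, predicting ancestor constituent labels of source words from frozen NMT encoder states, can be illustrated with a minimal sketch. The label set size, hidden dimensionality, and the single linear classifier below are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal probing sketch, assuming NMT encoder representations have already
# been extracted and frozen. All hyperparameters here are hypothetical.
import torch
import torch.nn as nn

NUM_LABELS = 10   # e.g., constituent labels such as NP, VP, PP (assumed size)
HIDDEN_DIM = 512  # dimensionality of the frozen NMT encoder states (assumed)

class ConstituentProbe(nn.Module):
    """Linear probe mapping a word's encoder state to an ancestor constituent label."""
    def __init__(self, hidden_dim: int, num_labels: int):
        super().__init__()
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, encoder_states: torch.Tensor) -> torch.Tensor:
        # encoder_states: (num_words, hidden_dim), detached from the NMT encoder
        return self.classifier(encoder_states)

# Toy usage with random stand-ins for frozen encoder states and gold labels.
probe = ConstituentProbe(HIDDEN_DIM, NUM_LABELS)
optimizer = torch.optim.Adam(probe.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

encoder_states = torch.randn(32, HIDDEN_DIM)       # 32 source words
gold_labels = torch.randint(0, NUM_LABELS, (32,))  # ancestor constituent labels

optimizer.zero_grad()
logits = probe(encoder_states)
loss = loss_fn(logits, gold_labels)
loss.backward()
optimizer.step()
```

Only the probe's parameters are updated here; keeping the encoder frozen is what lets classification accuracy be read as a measure of the syntactic information already present in the NMT representations.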
Anthology ID: 2020.repl4nlp-1.2
Volume: Proceedings of the 5th Workshop on Representation Learning for NLP
Month: July
Year: 2020
Address: Online
Editors: Spandana Gella, Johannes Welbl, Marek Rei, Fabio Petroni, Patrick Lewis, Emma Strubell, Minjoon Seo, Hannaneh Hajishirzi
Venue: RepL4NLP
SIG: SIGREP
Publisher: Association for Computational Linguistics
Pages: 7–16
URL: https://aclanthology.org/2020.repl4nlp-1.2
DOI: 10.18653/v1/2020.repl4nlp-1.2
Cite (ACL): Tyler A. Chang and Anna Rafferty. 2020. Encodings of Source Syntax: Similarities in NMT Representations Across Target Languages. In Proceedings of the 5th Workshop on Representation Learning for NLP, pages 7–16, Online. Association for Computational Linguistics.
Cite (Informal): Encodings of Source Syntax: Similarities in NMT Representations Across Target Languages (Chang & Rafferty, RepL4NLP 2020)
PDF: https://aclanthology.org/2020.repl4nlp-1.2.pdf
Video: http://slideslive.com/38929768