Self-Attention with Structural Position Representations

Xing Wang, Zhaopeng Tu, Longyue Wang, Shuming Shi


Abstract
Although self-attention networks (SANs) have advanced the state-of-the-art on various NLP tasks, one criticism of SANs is their limited ability to encode the positions of input words (Shaw et al., 2018). In this work, we propose to augment SANs with structural position representations that model the latent structure of the input sentence, complementary to the standard sequential position representations. Specifically, we use dependency trees to represent the grammatical structure of a sentence, and propose two strategies to encode the positional relationships among words in the dependency tree. Experimental results on NIST Chinese-to-English and WMT14 English-to-German translation tasks show that the proposed approach consistently boosts performance over both absolute and relative sequential position representations.
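To make the idea concrete, the sketch below (not the authors' released code) derives an absolute structural position for each token as its depth in the dependency tree and adds a learned embedding for it on top of the word and sequential position embeddings. The head-index format, module names, and sizes are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch: combining sequential and dependency-tree-based ("structural")
# position embeddings, assuming heads[i] gives the head index of token i and the
# root points to itself. All names and dimensions here are illustrative.
import torch
import torch.nn as nn


def tree_depths(heads):
    """Depth of each token in the dependency tree (root has depth 0)."""
    depths = []
    for i in range(len(heads)):
        depth, node = 0, i
        while heads[node] != node:   # follow head links up to the root
            node = heads[node]
            depth += 1
        depths.append(depth)
    return depths


class StructuralPositionEmbedding(nn.Module):
    """Word embedding plus learned sequential and structural position embeddings."""

    def __init__(self, vocab_size, d_model, max_seq_pos=512, max_tree_depth=64):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d_model)
        self.seq_pos_emb = nn.Embedding(max_seq_pos, d_model)        # sequential order
        self.struct_pos_emb = nn.Embedding(max_tree_depth, d_model)  # tree depth

    def forward(self, token_ids, depths):
        seq_pos = torch.arange(token_ids.size(1), device=token_ids.device)
        return (self.word_emb(token_ids)
                + self.seq_pos_emb(seq_pos)[None, :, :]
                + self.struct_pos_emb(depths))


# Usage example: "Bush held a talk with Sharon", with head indices from a parser
heads = [1, 1, 3, 1, 3, 4]                    # each token's head; root -> itself
depths = torch.tensor([tree_depths(heads)])   # shape (1, seq_len)
token_ids = torch.randint(0, 100, (1, 6))
layer = StructuralPositionEmbedding(vocab_size=100, d_model=8)
out = layer(token_ids, depths)                # (1, 6, 8)
```

The paper also describes a relative variant of structural positions; the sketch above only illustrates the absolute case, since the section gives no further implementation detail.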
Anthology ID: D19-1145
Volume: Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP)
Month: November
Year: 2019
Address: Hong Kong, China
Editors: Kentaro Inui, Jing Jiang, Vincent Ng, Xiaojun Wan
Venues: EMNLP | IJCNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 1403–1409
URL: https://aclanthology.org/D19-1145
DOI: 10.18653/v1/D19-1145
Cite (ACL): Xing Wang, Zhaopeng Tu, Longyue Wang, and Shuming Shi. 2019. Self-Attention with Structural Position Representations. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 1403–1409, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal): Self-Attention with Structural Position Representations (Wang et al., EMNLP-IJCNLP 2019)
PDF: https://aclanthology.org/D19-1145.pdf