Explore Better Relative Position Embeddings from Encoding Perspective for Transformer Models

Anlin Qu, Jianwei Niu, Shasha Mo


Abstract
Relative position embedding (RPE) is a successful method for explicitly and effectively encoding position information into Transformer models. In this paper, we investigate potential problems in Shaw-RPE and XL-RPE, the two most representative and prevalent RPEs, and propose two novel RPEs: Low-level Fine-grained High-level Coarse-grained (LFHC) RPE and Gaussian Cumulative Distribution Function (GCDF) RPE. LFHC-RPE improves on Shaw-RPE by enhancing its perception of medium and long relative distances. GCDF-RPE exploits favorable properties of the Gaussian function to amend the prior encoding mechanism of XL-RPE. Experimental results on nine authoritative datasets empirically demonstrate the effectiveness of our methods, and GCDF-RPE achieves the best overall performance among five different RPEs.
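To make the GCDF idea concrete, the sketch below maps a signed relative position to an embedding-bucket index through the standard normal CDF, so that positions near zero get fine-grained buckets while distant positions are compressed into coarser ones. This is only an illustrative sketch of the general technique, not the paper's exact formulation; the function name `gcdf_bucket` and the parameters `num_buckets` and `sigma` are hypothetical.

```python
import math

def gcdf_bucket(rel_pos: int, num_buckets: int = 32, sigma: float = 8.0) -> int:
    """Map a signed relative position to a bucket index via the Gaussian CDF.

    Illustrative sketch only; `num_buckets` and `sigma` are assumed
    parameters, not values taken from the paper.
    """
    # Standard normal CDF: Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
    phi = 0.5 * (1.0 + math.erf((rel_pos / sigma) / math.sqrt(2.0)))
    # Phi maps (-inf, inf) monotonically into (0, 1); scale to bucket indices.
    return min(int(phi * num_buckets), num_buckets - 1)

# rel_pos = 0 lands in the middle bucket; very distant positions saturate
# at the first/last bucket, with resolution densest around zero.
```

Because the CDF's slope is steepest near zero, nearby relative positions receive distinct buckets while far-apart positions share coarse ones, which is the kind of fine-near/coarse-far behavior both proposed RPEs aim for.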
Anthology ID:
2021.emnlp-main.237
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2989–2997
URL:
https://aclanthology.org/2021.emnlp-main.237
DOI:
10.18653/v1/2021.emnlp-main.237
Cite (ACL):
Anlin Qu, Jianwei Niu, and Shasha Mo. 2021. Explore Better Relative Position Embeddings from Encoding Perspective for Transformer Models. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 2989–2997, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Explore Better Relative Position Embeddings from Encoding Perspective for Transformer Models (Qu et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.237.pdf
Video:
 https://aclanthology.org/2021.emnlp-main.237.mp4
Code
 menghuanlater/lfhc-gcdf-rpe
Data
CMRC, GLUE, IMDb Movie Reviews, QNLI, SNLI, WikiText-103, WikiText-2