Scale down Transformer by Grouping Features for a Lightweight Character-level Language Model

Sungrae Park, Geewook Kim, Junyeop Lee, Junbum Cha, Ji-Hoon Kim, Hwalsuk Lee


Abstract
This paper introduces a method that efficiently reduces the computational cost and parameter size of Transformer. The proposed model, referred to as Group-Transformer, splits the feature space into multiple groups, factorizes the calculation paths, and reduces the computation required for group interaction. Extensive experiments on two benchmark tasks, enwik8 and text8, demonstrate our model’s effectiveness and efficiency for small-scale Transformers. To the best of our knowledge, Group-Transformer is the first attempt to design a Transformer with the group strategy, which is widely used for efficient CNN architectures.
Anthology ID:
2020.coling-main.607
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
6883–6893
URL:
https://aclanthology.org/2020.coling-main.607
DOI:
10.18653/v1/2020.coling-main.607
Cite (ACL):
Sungrae Park, Geewook Kim, Junyeop Lee, Junbum Cha, Ji-Hoon Kim, and Hwalsuk Lee. 2020. Scale down Transformer by Grouping Features for a Lightweight Character-level Language Model. In Proceedings of the 28th International Conference on Computational Linguistics, pages 6883–6893, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Scale down Transformer by Grouping Features for a Lightweight Character-level Language Model (Park et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.607.pdf
Code:
clovaai/group-transformer
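
The official implementation is available in the repository above. As a rough, illustrative sketch of the feature-grouping idea described in the abstract (not the authors' Group-Transformer code), the snippet below splits the model dimension into groups, applies cheaper per-group projections, and adds a small inter-group mixing step; the module name, layer choices, and mixing scheme are assumptions made for illustration only.

```python
# Illustrative sketch of grouping features in a Transformer sublayer.
# NOT the authors' implementation (see clovaai/group-transformer);
# all names and design details here are assumptions.
import torch
import torch.nn as nn


class GroupedFeedForward(nn.Module):
    def __init__(self, d_model: int, d_ff: int, num_groups: int):
        super().__init__()
        assert d_model % num_groups == 0 and d_ff % num_groups == 0
        self.num_groups = num_groups
        d_in, d_hid = d_model // num_groups, d_ff // num_groups
        # One small projection per group instead of a single large
        # d_model x d_ff matrix: parameters and FLOPs shrink roughly
        # by a factor of num_groups.
        self.up = nn.ModuleList(nn.Linear(d_in, d_hid) for _ in range(num_groups))
        self.down = nn.ModuleList(nn.Linear(d_hid, d_in) for _ in range(num_groups))
        # Cheap inter-group interaction: a small G x G mixing matrix lets
        # information flow across groups at negligible extra cost.
        self.mix = nn.Linear(num_groups, num_groups, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        chunks = x.chunk(self.num_groups, dim=-1)
        outs = [down(torch.relu(up(c)))
                for up, down, c in zip(self.up, self.down, chunks)]
        y = torch.stack(outs, dim=-1)            # (batch, seq, d_in, G)
        y = self.mix(y)                          # mix along the group axis
        return torch.cat(y.unbind(dim=-1), dim=-1)  # (batch, seq, d_model)


if __name__ == "__main__":
    layer = GroupedFeedForward(d_model=256, d_ff=1024, num_groups=4)
    x = torch.randn(2, 16, 256)
    print(layer(x).shape)  # torch.Size([2, 16, 256])
```

With d_model=256, d_ff=1024, and four groups, the per-group projections use about a quarter of the parameters of a standard position-wise feed-forward layer, which is the kind of saving the grouping strategy targets; the paper itself should be consulted for how grouping and group interaction are actually applied inside Group-Transformer.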