Attention Is All You Need for Chinese Word Segmentation

Sufeng Duan, Hai Zhao


Abstract
Taking the greedy decoding algorithm as given, this work focuses on further strengthening the model itself for Chinese word segmentation (CWS), resulting in an even faster and more accurate CWS model. Our model consists of an attention-only stacked encoder and a lightweight decoder for greedy segmentation, plus two highway connections for smoother training. The encoder is composed of a newly proposed Transformer variant, the Gaussian-masked Directional (GD) Transformer, and a biaffine attention scorer. With this effective encoder design, our model needs only unigram features for scoring. Our model is evaluated on the SIGHAN Bakeoff benchmark datasets. The experimental results show that, with the highest segmentation speed, the proposed model achieves new state-of-the-art or comparable performance against strong baselines under the strict closed-test setting.
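The Gaussian-masked attention named in the abstract can be illustrated roughly as follows. This is a minimal sketch, not the paper's implementation: it assumes a symmetric Gaussian locality prior added to the attention logits, whereas the directional variant in the paper presumably treats forward and backward context separately. The function name and the `sigma` parameter are illustrative assumptions.

```python
# Minimal sketch (assumption, not the paper's exact formulation):
# scaled dot-product attention whose logits are biased by a Gaussian
# prior over the distance between positions, so each character attends
# most strongly to its nearby neighbors.
import numpy as np

def gaussian_masked_attention(Q, K, V, sigma=1.0):
    """Self-attention with a Gaussian locality mask over positions.

    Q, K, V: (n, d) arrays for a sequence of n characters.
    sigma:   width of the Gaussian locality prior (illustrative).
    """
    n, d = Q.shape
    logits = Q @ K.T / np.sqrt(d)                  # (n, n) attention logits
    pos = np.arange(n)
    dist = pos[:, None] - pos[None, :]             # signed relative distance
    gauss = np.exp(-dist**2 / (2 * sigma**2))      # Gaussian locality weight
    logits = logits + np.log(gauss + 1e-9)         # bias logits toward locality
    # Numerically stable softmax over each row.
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V
```

Adding the log of the Gaussian to the logits is equivalent to multiplying the post-softmax attention weights by the Gaussian before renormalization, which is one common way such locality masks are applied.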
Anthology ID:
2020.emnlp-main.317
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3862–3872
URL:
https://aclanthology.org/2020.emnlp-main.317
DOI:
10.18653/v1/2020.emnlp-main.317
Cite (ACL):
Sufeng Duan and Hai Zhao. 2020. Attention Is All You Need for Chinese Word Segmentation. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 3862–3872, Online. Association for Computational Linguistics.
Cite (Informal):
Attention Is All You Need for Chinese Word Segmentation (Duan & Zhao, EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.317.pdf
Video:
https://slideslive.com/38938730
Code:
akibcmi/SAMS