Do Word Embeddings Capture Spelling Variation?

Dong Nguyen, Jack Grieve


Abstract
Analyses of word embeddings have primarily focused on semantic and syntactic properties. However, word embeddings have the potential to encode other properties as well. In this paper, we propose a new perspective on the analysis of word embeddings by focusing on spelling variation. In social media, spelling variation is abundant and often socially meaningful. Here, we analyze word embeddings trained on Twitter and Reddit data. We present three analyses using pairs of word forms covering seven types of spelling variation in English. Taken together, our results show that word embeddings encode spelling variation patterns of various types to some extent, even embeddings trained using the skip-gram model, which does not take spelling into account. Our results also suggest a link between the intentionality of the variation and the distance of the non-conventional spellings to their conventional spellings.
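One of the comparisons the abstract describes measures how far a non-conventional spelling sits from its conventional counterpart in embedding space. A minimal sketch of that kind of measurement, using cosine similarity over toy vectors (the word pairs and vector values below are invented for illustration; the paper's analyses use embeddings trained on Twitter and Reddit data, not these numbers):

```python
from math import sqrt

# Toy embedding table (hypothetical values for illustration only; a real
# analysis would look these up in embeddings trained on social media text,
# e.g. with a skip-gram model).
embeddings = {
    "going":   [0.9, 0.1, 0.3],
    "goin":    [0.8, 0.2, 0.3],
    "nothing": [0.1, 0.9, 0.4],
    "nuttin":  [0.2, 0.8, 0.5],
}

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = sqrt(sum(a * a for a in u))
    norm_v = sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Compare each non-conventional spelling with its conventional form.
for conventional, variant in [("going", "goin"), ("nothing", "nuttin")]:
    sim = cosine_similarity(embeddings[conventional], embeddings[variant])
    print(f"{conventional} ~ {variant}: cosine similarity = {sim:.3f}")
```

Aggregating such similarities over many variant pairs, grouped by variation type, is one way to probe whether (and how strongly) the embeddings encode each spelling pattern.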
Anthology ID:
2020.coling-main.75
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
870–881
URL:
https://aclanthology.org/2020.coling-main.75
DOI:
10.18653/v1/2020.coling-main.75
Bibkey:
Cite (ACL):
Dong Nguyen and Jack Grieve. 2020. Do Word Embeddings Capture Spelling Variation?. In Proceedings of the 28th International Conference on Computational Linguistics, pages 870–881, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Do Word Embeddings Capture Spelling Variation? (Nguyen & Grieve, COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.75.pdf
Code
dongpng/coling2020