Incorporating Latent Meanings of Morphological Compositions to Enhance Word Embeddings

Yang Xu, Jiawei Liu, Wei Yang, Liusheng Huang


Abstract
Traditional word embedding approaches learn semantic information at the word level while ignoring the meaningful internal structures of words, such as morphemes. Furthermore, existing morphology-based models directly incorporate morphemes to train word embeddings but still neglect the latent meanings of morphemes. In this paper, we explore employing the latent meanings of the morphological compositions of words to train and enhance word embeddings. To this end, we propose three Latent Meaning Models (LMMs), named LMM-A, LMM-S and LMM-M, which adopt different strategies for incorporating the latent meanings of morphemes during training. Experiments on word similarity, syntactic analogy and text classification validate the feasibility of our models. The results demonstrate that our models outperform the baselines on five word similarity datasets. On the Wordsim-353 and RG-65 datasets, our models achieve nearly 5% and 7% gains over the classic CBOW model, respectively. On the syntactic analogy and text classification tasks, our models also surpass all the baselines, including a morphology-based model.
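
As a rough illustration of the averaging strategy named in the abstract, the sketch below shows one plausible reading of an LMM-A-style input composition under CBOW: a word's input vector is averaged with the embeddings of words expressing its morphemes' latent meanings. The segmentation, the morpheme-to-meaning lookup (MORPHEME_MEANINGS), and all helper names are hypothetical placeholders for illustration, not the authors' implementation.

import numpy as np

# Hypothetical morpheme -> latent-meaning-word lookup (placeholder entries;
# the paper derives such a mapping from a morpheme resource, not shown here).
MORPHEME_MEANINGS = {
    "un": ["not", "opposite"],
    "break": ["break"],
    "able": ["can", "ability"],
}

def segment(word):
    # Toy morphological segmentation; a real system would use a segmenter
    # or a hand-built affix list.
    return {"unbreakable": ["un", "break", "able"]}.get(word, [word])

def lmm_a_input_vector(word, word_vecs):
    # LMM-A-style composition (sketch): average the word's own vector with
    # the vectors of words expressing its morphemes' latent meanings, using
    # equal weights. LMM-S/LMM-M would instead reweight or select by similarity.
    vecs = [word_vecs[word]]
    for morpheme in segment(word):
        for meaning_word in MORPHEME_MEANINGS.get(morpheme, []):
            if meaning_word in word_vecs:
                vecs.append(word_vecs[meaning_word])
    return np.mean(vecs, axis=0)

# Usage: random vectors stand in for embeddings being trained by CBOW.
rng = np.random.default_rng(0)
vocab = ["unbreakable", "not", "opposite", "break", "can", "ability"]
word_vecs = {w: rng.standard_normal(50) for w in vocab}
print(lmm_a_input_vector("unbreakable", word_vecs).shape)  # (50,)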
Anthology ID:
P18-1114
Volume:
Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Iryna Gurevych, Yusuke Miyao
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1232–1242
URL:
https://aclanthology.org/P18-1114
DOI:
10.18653/v1/P18-1114
Cite (ACL):
Yang Xu, Jiawei Liu, Wei Yang, and Liusheng Huang. 2018. Incorporating Latent Meanings of Morphological Compositions to Enhance Word Embeddings. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1232–1242, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Incorporating Latent Meanings of Morphological Compositions to Enhance Word Embeddings (Xu et al., ACL 2018)
PDF:
https://aclanthology.org/P18-1114.pdf
Presentation:
P18-1114.Presentation.pdf
Video:
https://aclanthology.org/P18-1114.mp4
Code:
Y-Xu/lmm