Learning to Generate Word Representations using Subword Information

Yeachan Kim, Kang-Min Kim, Ji-Min Lee, SangKeun Lee


Abstract
Distributed representations of words play a major role in natural language processing by encoding the semantic and syntactic information of words. However, most existing works on learning word representations regard words as individual atomic units and are therefore blind to the subword information within them. This also makes it difficult to represent out-of-vocabulary (OOV) words. In this paper, we present a character-based word representation approach to address this limitation. The proposed model learns to generate word representations from characters, employing a convolutional neural network and a highway network over characters to extract salient features effectively. Unlike previous models that learn word representations from a large corpus, we take a set of pre-trained word embeddings and generalize it to word entries, including OOV words. To demonstrate the efficacy of the proposed model, we perform an intrinsic task (word similarity) and an extrinsic task (language modeling). Experimental results clearly show that the proposed model significantly outperforms strong baseline models that regard words or their subwords as atomic units. For example, in the language modeling task, we achieve as much as an 18.5% average improvement in perplexity on morphologically rich languages compared to strong baselines.
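For illustration only, below is a minimal PyTorch sketch of the kind of model the abstract describes: a character-level CNN followed by a highway layer that maps a word's characters to a word vector, trained to match pre-trained embeddings so that vectors can later be generated for OOV words. This is not the authors' code; the class name, hyperparameters, and training objective shown are assumptions.

# Hypothetical sketch (names and hyperparameters are illustrative, not from the paper).
import torch
import torch.nn as nn

class CharToWord(nn.Module):
    def __init__(self, n_chars, char_dim=16, word_dim=300,
                 kernel_sizes=(2, 3, 4), n_filters=100):
        super().__init__()
        self.char_emb = nn.Embedding(n_chars, char_dim, padding_idx=0)
        # Convolutions of several widths over the character sequence
        self.convs = nn.ModuleList(
            [nn.Conv1d(char_dim, n_filters, k) for k in kernel_sizes])
        feat_dim = n_filters * len(kernel_sizes)
        # One highway layer: gated mix of a nonlinear transform and the identity
        self.transform = nn.Linear(feat_dim, feat_dim)
        self.gate = nn.Linear(feat_dim, feat_dim)
        self.proj = nn.Linear(feat_dim, word_dim)

    def forward(self, char_ids):                      # char_ids: (batch, max_word_len)
        x = self.char_emb(char_ids).transpose(1, 2)   # (batch, char_dim, len)
        # Max-over-time pooling of each convolution's feature maps
        feats = torch.cat([conv(x).max(dim=2).values for conv in self.convs], dim=1)
        g = torch.sigmoid(self.gate(feats))
        h = torch.relu(self.transform(feats))
        hw = g * h + (1 - g) * feats                  # highway layer
        return self.proj(hw)                          # predicted word vector

# Assumed training objective: regress onto pre-trained embeddings of in-vocabulary
# words, e.g. loss = nn.functional.mse_loss(model(char_ids), pretrained_vectors).
# At test time, any word (including OOV) can be embedded from its characters.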
Anthology ID:
C18-1216
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
2551–2561
URL:
https://aclanthology.org/C18-1216
Cite (ACL):
Yeachan Kim, Kang-Min Kim, Ji-Min Lee, and SangKeun Lee. 2018. Learning to Generate Word Representations using Subword Information. In Proceedings of the 27th International Conference on Computational Linguistics, pages 2551–2561, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Learning to Generate Word Representations using Subword Information (Kim et al., COLING 2018)
PDF:
https://aclanthology.org/C18-1216.pdf