Combining Word-Level and Character-Level Representations for Relation Classification of Informal Text

Dongyun Liang, Weiran Xu, Yinge Zhao


Abstract
Word representation models have achieved great success in natural language processing tasks such as relation classification. However, they do not always work well on informal text, and the morphemes of misspelled words may carry important short-distance semantic information. We propose a hybrid model that combines the merits of word-level and character-level representations to learn better representations of informal text. Experiments on two relation classification datasets, SemEval-2010 Task 8 and a large-scale one we compile from informal text, show that our model achieves competitive results on the former and state-of-the-art performance on the latter.
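The core idea of the abstract — a word-level vector that fails on misspellings, backed up by a character-level vector that still captures morphemes — can be illustrated with a minimal sketch. This is not the paper's architecture (the paper's model and hyperparameters are not given here); the vocabularies, dimensions, and mean-pooling below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabularies; sizes and dimensions are illustrative.
word_vocab = {"<unk>": 0, "server": 1, "crashed": 2}
char_vocab = {c: i for i, c in enumerate("abcdefghijklmnopqrstuvwxyz")}

WORD_DIM, CHAR_DIM = 8, 4
word_emb = rng.normal(size=(len(word_vocab), WORD_DIM))
char_emb = rng.normal(size=(len(char_vocab), CHAR_DIM))

def hybrid_repr(token: str) -> np.ndarray:
    """Concatenate a word-level vector with a pooled character-level vector.

    A misspelled or out-of-vocabulary token falls back to <unk> at the
    word level, but its character-level half still reflects its spelling,
    so 'crashd' stays close to 'crashed' in that half of the vector.
    """
    w = word_emb[word_vocab.get(token.lower(), word_vocab["<unk>"])]
    chars = [char_emb[char_vocab[c]] for c in token.lower() if c in char_vocab]
    c = np.mean(chars, axis=0) if chars else np.zeros(CHAR_DIM)
    return np.concatenate([w, c])

v_correct = hybrid_repr("crashed")
v_typo = hybrid_repr("crashd")   # misspelling -> <unk> word vector
print(v_correct.shape)           # (12,): WORD_DIM + CHAR_DIM
```

In a full model, the mean pooling over characters would typically be replaced by a learned encoder (e.g. a character CNN or RNN), and the concatenated vector would feed a relation classifier.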
Anthology ID:
W17-2606
Volume:
Proceedings of the 2nd Workshop on Representation Learning for NLP
Month:
August
Year:
2017
Address:
Vancouver, Canada
Editors:
Phil Blunsom, Antoine Bordes, Kyunghyun Cho, Shay Cohen, Chris Dyer, Edward Grefenstette, Karl Moritz Hermann, Laura Rimell, Jason Weston, Scott Yih
Venue:
RepL4NLP
SIG:
SIGREP
Publisher:
Association for Computational Linguistics
Pages:
43–47
URL:
https://aclanthology.org/W17-2606
DOI:
10.18653/v1/W17-2606
Cite (ACL):
Dongyun Liang, Weiran Xu, and Yinge Zhao. 2017. Combining Word-Level and Character-Level Representations for Relation Classification of Informal Text. In Proceedings of the 2nd Workshop on Representation Learning for NLP, pages 43–47, Vancouver, Canada. Association for Computational Linguistics.
Cite (Informal):
Combining Word-Level and Character-Level Representations for Relation Classification of Informal Text (Liang et al., RepL4NLP 2017)
PDF:
https://aclanthology.org/W17-2606.pdf