Word Sense Induction with Neural biLM and Symmetric Patterns

Asaf Amrami, Yoav Goldberg


Abstract
An established method for Word Sense Induction (WSI) uses a language model to predict probable substitutes for target words, and induces senses by clustering the resulting substitute vectors. We replace the ngram-based language model (LM) with a recurrent one. Beyond being more accurate, the use of the recurrent LM allows us to effectively query it in a creative way, using what we call dynamic symmetric patterns. The combination of the RNN-LM and the dynamic symmetric patterns results in strong substitute vectors for WSI, allowing us to surpass the current state-of-the-art on the SemEval 2013 WSI shared task by a large margin.
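
The abstract describes the pipeline only at a high level; the sketch below illustrates one way the substitute-vector approach with a dynamic symmetric pattern could be realized. The biLM query function predict_substitutes is a hypothetical placeholder rather than the authors' actual API, and the TF-IDF representation with agglomerative clustering is an illustrative assumption; the authors' actual implementation is in the linked repository (asafamr/SymPatternWSI).

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import AgglomerativeClustering

def predict_substitutes(tokens, blank_idx, top_k=20):
    """Hypothetical biLM query: return the top_k most probable words for the
    blanked position, given its left and right context."""
    raise NotImplementedError("plug in a bidirectional LM (e.g. ELMo) here")

def substitutes_with_symmetric_pattern(tokens, target_idx, top_k=20):
    # Dynamic symmetric pattern: turn "... target ..." into
    # "... target and ___ ..." and ask the biLM to fill the blank.
    # Conjoining the blank with the target word itself biases the
    # predictions toward substitutes that share the target's sense.
    patterned = tokens[:target_idx + 1] + ["and", "___"] + tokens[target_idx + 1:]
    return predict_substitutes(patterned, target_idx + 2, top_k=top_k)

def induce_senses(instances, n_senses=7):
    # Each instance is a (tokens, target_idx) pair for the same lemma.
    # Represent every instance by the bag of its predicted substitutes,
    # then cluster the instances; each cluster is an induced sense.
    # n_senses is an illustrative default, not the paper's setting.
    docs = [" ".join(substitutes_with_symmetric_pattern(toks, idx))
            for toks, idx in instances]
    vectors = TfidfVectorizer().fit_transform(docs).toarray()
    return AgglomerativeClustering(n_clusters=n_senses).fit_predict(vectors)
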
Anthology ID:
D18-1523
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4860–4867
URL:
https://aclanthology.org/D18-1523
DOI:
10.18653/v1/D18-1523
Cite (ACL):
Asaf Amrami and Yoav Goldberg. 2018. Word Sense Induction with Neural biLM and Symmetric Patterns. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4860–4867, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Word Sense Induction with Neural biLM and Symmetric Patterns (Amrami & Goldberg, EMNLP 2018)
PDF:
https://aclanthology.org/D18-1523.pdf
Code:
asafamr/SymPatternWSI