Second-order Co-occurrence Sensitivity of Skip-Gram with Negative Sampling

Dominik Schlechtweg, Cennet Oguz, Sabine Schulte im Walde


Abstract
We simulate first- and second-order context overlap and show that Skip-Gram with Negative Sampling is similar to Singular Value Decomposition in capturing second-order co-occurrence information, while Pointwise Mutual Information is agnostic to it. We support the results with an empirical study finding that the models react differently when provided with additional second-order information. Our findings reveal a basic property of Skip-Gram with Negative Sampling and point towards an explanation of its success on a variety of tasks.
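The contrast the abstract draws, with explicit PMI vectors being agnostic to second-order co-occurrence while low-rank methods such as SVD (and, per the paper, SGNS) pick it up, can be illustrated with a minimal sketch. The snippet below is not the paper's simulation: it uses a hypothetical five-word co-occurrence matrix in which two target words share contexts only at second order, and compares their cosine similarity in the explicit PPMI space with the similarity of their truncated-SVD vectors (SGNS is omitted to keep the example short).

import numpy as np

# Toy symmetric co-occurrence counts over the vocabulary [w1, w2, x, y, z].
# w1 and w2 never share a direct (first-order) context, but their contexts
# (x and y) both co-occur with z, i.e. w1 and w2 overlap only at second order.
C = np.array([
    [0,  0, 10,  0,  0],   # w1 co-occurs only with x
    [0,  0,  0, 10,  0],   # w2 co-occurs only with y
    [10, 0,  0,  0, 10],   # x  co-occurs with w1 and z
    [0, 10,  0,  0, 10],   # y  co-occurs with w2 and z
    [0,  0, 10, 10,  0],   # z  co-occurs with x and y
], dtype=float)

# Positive PMI: max(0, log p(i,j) / (p(i) p(j))), computed on non-zero cells.
total = C.sum()
p_joint = C / total
p_marg = C.sum(axis=1) / total
ppmi = np.zeros_like(p_joint)
nz = p_joint > 0
ppmi[nz] = np.maximum(0.0, np.log(p_joint[nz] / np.outer(p_marg, p_marg)[nz]))

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Explicit PPMI vectors: w1 and w2 share no contexts, so their similarity is 0.
print("PPMI  cos(w1, w2) =", cosine(ppmi[0], ppmi[1]))

# Truncated SVD of the PPMI matrix (k = 2, rows represented as U_k * S_k):
# the low-rank vectors absorb the shared second-order structure, so the
# similarity becomes non-zero (close to 1 in this deliberately symmetric toy).
k = 2
U, S, Vt = np.linalg.svd(ppmi)
reduced = U[:, :k] * S[:k]
print("SVD_k cos(w1, w2) =", cosine(reduced[0], reduced[1]))

Running the sketch prints a PPMI similarity of 0 and a clearly positive similarity in the reduced space, mirroring the abstract's point that dimensionality-reducing models are sensitive to second-order co-occurrence while raw PMI is not.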
Anthology ID:
W19-4803
Volume:
Proceedings of the 2019 ACL Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Tal Linzen, Grzegorz Chrupała, Yonatan Belinkov, Dieuwke Hupkes
Venue:
BlackboxNLP
Publisher:
Association for Computational Linguistics
Pages:
24–30
URL:
https://aclanthology.org/W19-4803
DOI:
10.18653/v1/W19-4803
Cite (ACL):
Dominik Schlechtweg, Cennet Oguz, and Sabine Schulte im Walde. 2019. Second-order Co-occurrence Sensitivity of Skip-Gram with Negative Sampling. In Proceedings of the 2019 ACL Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, pages 24–30, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Second-order Co-occurrence Sensitivity of Skip-Gram with Negative Sampling (Schlechtweg et al., BlackboxNLP 2019)
PDF:
https://aclanthology.org/W19-4803.pdf
Code:
Garrafao/SecondOrder