An Unsupervised Method for Learning Representations of Multi-word Expressions for Semantic Classification

Robert Vacareanu, Marco A. Valenzuela-Escárcega, Rebecca Sharp, Mihai Surdeanu


Abstract
This paper explores an unsupervised approach to learning a compositional representation function for multi-word expressions (MWEs), and evaluates it on the Tratz dataset, which associates two-word expressions with the semantic relation between the compound constituents (e.g., the label employer is associated with the noun compound government agency) (Tratz, 2011). The composition function is based on recurrent neural networks, and is trained using the Skip-Gram objective to predict the words in the context of MWEs. Thus, our approach can naturally leverage large unlabeled text sources. Further, our method can make use of provided MWEs when available, but can also function as a completely unsupervised algorithm, using MWE boundaries predicted by a single, domain-agnostic part-of-speech pattern. With pre-defined MWE boundaries, our method outperforms the previous state of the art on the coarse-grained evaluation of the Tratz dataset (Tratz, 2011), with an F1 score of 50.4%. The unsupervised version of our method approaches the performance of the supervised one, and even outperforms it in some configurations.
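To make the training setup described in the abstract concrete, the sketch below illustrates the general idea: an RNN-based composition function over an MWE's constituent embeddings, trained with a Skip-Gram-style negative-sampling objective that predicts the words surrounding the MWE. This is a minimal, illustrative sketch only, not the authors' released implementation; all names and modeling choices here (MWEComposer, the GRU composer, the negative-sampling loss) are assumptions for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MWEComposer(nn.Module):
    """Composes the constituent embeddings of a multi-word expression into one vector
    and scores it against context words, Skip-Gram style (illustrative sketch)."""

    def __init__(self, vocab_size: int, dim: int = 100):
        super().__init__()
        self.in_embed = nn.Embedding(vocab_size, dim)    # constituent ("input") embeddings
        self.out_embed = nn.Embedding(vocab_size, dim)   # context ("output") embeddings
        self.rnn = nn.GRU(dim, dim, batch_first=True)    # recurrent composition function

    def compose(self, constituent_ids: torch.Tensor) -> torch.Tensor:
        # constituent_ids: (batch, mwe_len), e.g. ids for ["government", "agency"]
        _, h = self.rnn(self.in_embed(constituent_ids))
        return h.squeeze(0)                              # (batch, dim) composed MWE vector

    def loss(self, constituent_ids, context_ids, negative_ids):
        # Skip-Gram with negative sampling: the composed MWE vector should score its
        # true context words higher than randomly sampled words.
        mwe_vec = self.compose(constituent_ids)                         # (batch, dim)
        pos = self.out_embed(context_ids)                               # (batch, ctx, dim)
        neg = self.out_embed(negative_ids)                              # (batch, neg, dim)
        pos_score = torch.bmm(pos, mwe_vec.unsqueeze(2)).squeeze(2)     # (batch, ctx)
        neg_score = torch.bmm(neg, mwe_vec.unsqueeze(2)).squeeze(2)     # (batch, neg)
        return -(F.logsigmoid(pos_score).mean() + F.logsigmoid(-neg_score).mean())
```

In the fully unsupervised setting mentioned in the abstract, candidate MWE boundaries could come from a single part-of-speech pattern (for example, runs of consecutive nouns), with each MWE occurrence contributing its surrounding words as positive contexts for an objective of this form.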
Anthology ID: 2020.coling-main.297
Volume: Proceedings of the 28th International Conference on Computational Linguistics
Month: December
Year: 2020
Address: Barcelona, Spain (Online)
Editors: Donia Scott, Nuria Bel, Chengqing Zong
Venue: COLING
Publisher: International Committee on Computational Linguistics
Pages: 3346–3356
URL: https://aclanthology.org/2020.coling-main.297
DOI: 10.18653/v1/2020.coling-main.297
Cite (ACL): Robert Vacareanu, Marco A. Valenzuela-Escárcega, Rebecca Sharp, and Mihai Surdeanu. 2020. An Unsupervised Method for Learning Representations of Multi-word Expressions for Semantic Classification. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3346–3356, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal): An Unsupervised Method for Learning Representations of Multi-word Expressions for Semantic Classification (Vacareanu et al., COLING 2020)
PDF: https://aclanthology.org/2020.coling-main.297.pdf