Multidirectional Associative Optimization of Function-Specific Word Representations

Daniela Gerz, Ivan Vulić, Marek Rei, Roi Reichart, Anna Korhonen


Abstract
We present a neural framework for learning associations between interrelated groups of words, such as those found in Subject-Verb-Object (SVO) structures. Our model induces a joint function-specific word vector space, where, for example, vectors of plausible SVO compositions lie close together. The model retains information about word group membership even in the joint space, and can thereby be applied effectively to a number of tasks that reason over the SVO structure. We show the robustness and versatility of the proposed framework by reporting state-of-the-art results on the tasks of estimating selectional preference and event similarity. The results indicate that combinations of representations learned with our task-independent model outperform task-specific architectures from prior work, while reducing the number of parameters by up to 95%.
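As a rough illustration of the idea the abstract describes, the following is a minimal sketch in PyTorch, not the authors' released implementation (see cambridgeltl/fs-wrep for that). It assumes one embedding table per syntactic function (subject, verb, object) sharing a joint space, trained so that observed SVO triples score higher than corrupted ones; the class name, dimensionality, scoring function, and margin loss are all hypothetical choices.

# Hypothetical sketch of function-specific SVO embeddings; not the paper's code.
import torch
import torch.nn as nn

class FunctionSpecificSVO(nn.Module):
    def __init__(self, vocab_size: int, dim: int = 25):
        super().__init__()
        # One embedding table per syntactic function, all in one joint space.
        self.subj = nn.Embedding(vocab_size, dim)
        self.verb = nn.Embedding(vocab_size, dim)
        self.obj = nn.Embedding(vocab_size, dim)

    def score(self, s, v, o):
        # Plausibility of a triple: each element's vector is associated with
        # the sum of the other two, i.e. every direction of prediction
        # (s from v,o; v from s,o; o from s,v) contributes to the score.
        es, ev, eo = self.subj(s), self.verb(v), self.obj(o)
        return ((es * (ev + eo)).sum(-1)
                + (ev * (es + eo)).sum(-1)
                + (eo * (es + ev)).sum(-1))

# Toy usage: a margin loss prefers an observed triple over one with a
# corrupted (randomly substituted) object.
model = FunctionSpecificSVO(vocab_size=100)
s, v, o = torch.tensor([3]), torch.tensor([7]), torch.tensor([11])
o_neg = torch.tensor([42])
loss = torch.relu(1.0 - model.score(s, v, o) + model.score(s, v, o_neg)).mean()
loss.backward()  # gradients flow into all three function-specific tables

Keeping a separate table per function is what lets the joint space retain word group membership: the same word receives a different vector depending on its role in the triple.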
Anthology ID:
2020.acl-main.257
Volume:
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Month:
July
Year:
2020
Address:
Online
Editors:
Dan Jurafsky, Joyce Chai, Natalie Schluter, Joel Tetreault
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
2872–2882
URL:
https://aclanthology.org/2020.acl-main.257
DOI:
10.18653/v1/2020.acl-main.257
Cite (ACL):
Daniela Gerz, Ivan Vulić, Marek Rei, Roi Reichart, and Anna Korhonen. 2020. Multidirectional Associative Optimization of Function-Specific Word Representations. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 2872–2882, Online. Association for Computational Linguistics.
Cite (Informal):
Multidirectional Associative Optimization of Function-Specific Word Representations (Gerz et al., ACL 2020)
PDF:
https://aclanthology.org/2020.acl-main.257.pdf
Video:
http://slideslive.com/38928982
Code:
cambridgeltl/fs-wrep