Representation Learning for Type-Driven Composition

Gijs Wijnholds, Mehrnoosh Sadrzadeh, Stephen Clark


Abstract
This paper is about learning word representations using grammatical type information. We use the syntactic types of Combinatory Categorial Grammar to develop multilinear representations, i.e. maps with n arguments, for words with different functional types. The multilinear maps of words compose with each other to form sentence representations. We extend the skipgram algorithm from vectors to multilinear maps to learn these representations and instantiate it on unary and binary maps for transitive verbs. These are evaluated on verb and sentence similarity and disambiguation tasks and a subset of the SICK relatedness dataset. Our model performs better than previous type-driven models and is competitive with state-of-the-art representation learning methods such as BERT and neural sentence encoders.
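For intuition, here is a minimal NumPy sketch of the kind of composition the abstract describes: a transitive verb modeled as a binary multilinear map (an order-3 tensor) contracted with subject and object vectors to yield a sentence vector, plus a unary map (matrix) for comparison. The dimensionality, random values, and variable names below are illustrative assumptions, not taken from the paper or the tensorskipgram-torch code.

    import numpy as np

    d = 50  # embedding dimensionality (illustrative choice, not from the paper)
    rng = np.random.default_rng(0)

    # Hypothetical learned representations: nouns as vectors, a transitive verb
    # as a binary multilinear map, i.e. an order-3 tensor taking two arguments.
    subject = rng.standard_normal(d)            # e.g. "dog"
    obj = rng.standard_normal(d)                # e.g. "ball"
    verb_cube = rng.standard_normal((d, d, d))  # e.g. "chases" as a binary map

    # Composition: apply the verb's multilinear map to its two arguments
    # by contracting the tensor with the subject and object vectors.
    sentence = np.einsum('ijk,j,k->i', verb_cube, subject, obj)

    # A unary-map (matrix) verb applied to a single argument, for comparison.
    verb_matrix = rng.standard_normal((d, d))
    verb_phrase = verb_matrix @ obj

    print(sentence.shape, verb_phrase.shape)  # (50,) (50,)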
Anthology ID:
2020.conll-1.24
Volume:
Proceedings of the 24th Conference on Computational Natural Language Learning
Month:
November
Year:
2020
Address:
Online
Editors:
Raquel Fernández, Tal Linzen
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
313–324
URL:
https://aclanthology.org/2020.conll-1.24
DOI:
10.18653/v1/2020.conll-1.24
Cite (ACL):
Gijs Wijnholds, Mehrnoosh Sadrzadeh, and Stephen Clark. 2020. Representation Learning for Type-Driven Composition. In Proceedings of the 24th Conference on Computational Natural Language Learning, pages 313–324, Online. Association for Computational Linguistics.
Cite (Informal):
Representation Learning for Type-Driven Composition (Wijnholds et al., CoNLL 2020)
PDF:
https://aclanthology.org/2020.conll-1.24.pdf
Code
 gijswijnholds/tensorskipgram-torch
Data
SICK