Seeing Both the Forest and the Trees: Multi-head Attention for Joint Classification on Different Compositional Levels

Miruna Pislar, Marek Rei


Abstract
In natural languages, words combine to construct sentences. It is not words in isolation, but the appropriate use of hierarchical structures, that conveys the meaning of the whole sentence. Neural networks can capture expressive language features; however, insights into the link between words and sentences are difficult to acquire automatically. In this work, we design a deep neural network architecture that explicitly wires lower and higher linguistic components together; we then evaluate its ability to perform the same task at different hierarchical levels. Focusing on broad text classification tasks, we show that our model, MHAL, learns to solve them simultaneously at different levels of granularity by fluidly transferring knowledge between hierarchies. Using a multi-head attention mechanism to tie the representations of single words and full sentences, MHAL systematically outperforms equivalent models that are not incentivized towards developing compositional representations. Moreover, we demonstrate that, with the proposed architecture, sentence information flows naturally to individual words, allowing the model to behave like a sequence labeler (a lower, word-level task) even without any word-level supervision, in a zero-shot fashion.
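The core idea described above can be sketched in a few lines: dedicate one attention head per label class, so that each head's scores over tokens serve both as pooling weights for the sentence-level prediction and as word-level label evidence. This is a minimal NumPy illustration of that coupling, not the authors' implementation; all dimensions, weight names, and the pooling scheme are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical sizes: 5 tokens, hidden size 8, 3 label classes.
T, H, C = 5, 8, 3
tokens = rng.normal(size=(T, H))     # token representations from some encoder
W_query = rng.normal(size=(C, H))    # one attention query per label class

# Each class-specific head scores every token.
scores = W_query @ tokens.T          # shape (C, T)
attn = softmax(scores, axis=1)       # per-head distribution over tokens

# Sentence level: each head pools a context vector; its alignment with the
# head's query acts as the sentence logit for that class.
contexts = attn @ tokens             # shape (C, H)
sentence_logits = (contexts * W_query).sum(axis=1)
sentence_label = int(np.argmax(sentence_logits))

# Word level, zero-shot: the same attention scores, read column-wise,
# label each token with the class whose head attends to it most.
word_labels = np.argmax(scores, axis=0)   # shape (T,)
```

Because the sentence classifier and the word labeler share the same attention scores, supervising only the sentence-level objective still shapes the per-token evidence, which is what enables the zero-shot sequence-labeling behavior.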
Anthology ID:
2020.coling-main.335
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3761–3775
URL:
https://aclanthology.org/2020.coling-main.335
DOI:
10.18653/v1/2020.coling-main.335
Bibkey:
Cite (ACL):
Miruna Pislar and Marek Rei. 2020. Seeing Both the Forest and the Trees: Multi-head Attention for Joint Classification on Different Compositional Levels. In Proceedings of the 28th International Conference on Computational Linguistics, pages 3761–3775, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Seeing Both the Forest and the Trees: Multi-head Attention for Joint Classification on Different Compositional Levels (Pislar & Rei, COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.335.pdf
Code:
MirunaPislar/multi-head-attention-labeller
Data:
FCE, SST