Quasi-Multitask Learning: an Efficient Surrogate for Obtaining Model Ensembles

Norbert Kis-Szabó, Gábor Berend


Abstract
We propose the technique of quasi-multitask learning (Q-MTL), a simple, easy-to-implement modification of standard multitask learning in which the tasks to be modeled are identical. With this small modification of a standard neural classifier, we obtain benefits similar to those of an ensemble of classifiers at a fraction of the resources required. We illustrate, through a series of sequence-labeling experiments over a diverse set of languages, that applying Q-MTL consistently increases the generalization ability of the models. The proposed architecture can be regarded as a new regularization technique that encourages the model to develop an internal representation of the problem at hand that is beneficial to multiple output units of the classifier at the same time. Our experiments corroborate that, by relying on the proposed algorithm, we can approximate the quality of an ensemble of classifiers at a fraction of the computational resources required. Additionally, our results suggest that Q-MTL handles noisy training labels better than ensembles do.
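The core idea of the abstract — a shared network body feeding several output heads that are all trained on the same task, with their predictions averaged at inference as a cheap ensemble surrogate — can be sketched as follows. This is an illustrative NumPy sketch, not the authors' implementation; all class and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Numerically stable softmax over the last axis."""
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class QMTLClassifier:
    """Q-MTL sketch: one shared encoder, k output heads for the *same* task.

    Only the heads differ (random initialization); the shared parameters
    are regularized toward representations useful to every head at once.
    """
    def __init__(self, d_in, d_hid, n_classes, k=3):
        self.W = rng.normal(0.0, 0.1, (d_in, d_hid))    # shared encoder weights
        self.heads = [rng.normal(0.0, 0.1, (d_hid, n_classes))
                      for _ in range(k)]                # k identical-task heads

    def head_probs(self, x):
        h = np.tanh(x @ self.W)                         # shared representation
        return [softmax(h @ V) for V in self.heads]     # one distribution per head

    def loss(self, x, y):
        # training objective: sum of per-head cross-entropies on the SAME labels
        return sum(-np.log(p[np.arange(len(y)), y] + 1e-12).mean()
                   for p in self.head_probs(x))

    def predict_proba(self, x):
        # inference: average the heads' distributions (ensemble surrogate)
        return np.mean(self.head_probs(x), axis=0)
```

A usage sketch: `QMTLClassifier(d_in=50, d_hid=16, n_classes=5, k=3).predict_proba(batch)` returns one averaged class distribution per input row, so downstream decoding is unchanged relative to a single-head classifier.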
Anthology ID:
2020.sustainlp-1.13
Volume:
Proceedings of SustaiNLP: Workshop on Simple and Efficient Natural Language Processing
Month:
November
Year:
2020
Address:
Online
Editors:
Nafise Sadat Moosavi, Angela Fan, Vered Shwartz, Goran Glavaš, Shafiq Joty, Alex Wang, Thomas Wolf
Venue:
sustainlp
Publisher:
Association for Computational Linguistics
Pages:
97–106
URL:
https://aclanthology.org/2020.sustainlp-1.13
DOI:
10.18653/v1/2020.sustainlp-1.13
Cite (ACL):
Norbert Kis-Szabó and Gábor Berend. 2020. Quasi-Multitask Learning: an Efficient Surrogate for Obtaining Model Ensembles. In Proceedings of SustaiNLP: Workshop on Simple and Efficient Natural Language Processing, pages 97–106, Online. Association for Computational Linguistics.
Cite (Informal):
Quasi-Multitask Learning: an Efficient Surrogate for Obtaining Model Ensembles (Kis-Szabó & Berend, sustainlp 2020)
PDF:
https://aclanthology.org/2020.sustainlp-1.13.pdf
Video:
https://slideslive.com/38939435