Learning What to Share: Leaky Multi-Task Network for Text Classification

Liqiang Xiao, Honglun Zhang, Wenqing Chen, Yongkun Wang, Yaohui Jin


Abstract
Neural network based multi-task learning has achieved great success on many NLP problems by sharing knowledge among tasks through linked layers to enhance performance. However, most existing approaches suffer from interference between tasks because they lack a selection mechanism for feature sharing. As a result, a task's feature space can easily be contaminated by unhelpful features borrowed from other tasks, which confuses the model and degrades its predictions. In this paper, we propose a multi-task convolutional neural network with the Leaky Unit, which has memory and forgetting mechanisms to filter the feature flows between tasks. Experiments on five different text classification datasets validate the benefits of our approach.
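The abstract describes a gating mechanism that filters how much of another task's features may "leak" into a given task. The paper's exact formulation is not shown here; the following is a minimal sketch of that idea under assumed names and shapes (`leaky_unit`, weight `W`, bias `b` are all hypothetical), using a sigmoid forgetting gate applied element-wise to the foreign features:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def leaky_unit(f_own, f_other, W, b):
    """Gate how much of another task's features leak into this task.

    f_own, f_other : (d,) feature vectors from two task-specific networks
    W : (d, 2*d) gate weights, b : (d,) gate bias  (hypothetical parameters)
    """
    # Forgetting gate in (0, 1): decides, per dimension, how much of the
    # other task's feature to admit, conditioned on both feature vectors.
    g = sigmoid(W @ np.concatenate([f_own, f_other]) + b)
    # Memory: keep own features intact; add only the gated (filtered) leak.
    return f_own + g * f_other

rng = np.random.default_rng(0)
d = 8
f_a, f_b = rng.standard_normal(d), rng.standard_normal(d)
W, b = rng.standard_normal((d, 2 * d)) * 0.1, np.zeros(d)
out = leaky_unit(f_a, f_b, W, b)
```

Because the gate is bounded in (0, 1), the contribution of the foreign features can never exceed their raw magnitude, which is the filtering behavior the abstract attributes to the Leaky Unit.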
Anthology ID:
C18-1175
Volume:
Proceedings of the 27th International Conference on Computational Linguistics
Month:
August
Year:
2018
Address:
Santa Fe, New Mexico, USA
Editors:
Emily M. Bender, Leon Derczynski, Pierre Isabelle
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
2055–2065
URL:
https://aclanthology.org/C18-1175
Cite (ACL):
Liqiang Xiao, Honglun Zhang, Wenqing Chen, Yongkun Wang, and Yaohui Jin. 2018. Learning What to Share: Leaky Multi-Task Network for Text Classification. In Proceedings of the 27th International Conference on Computational Linguistics, pages 2055–2065, Santa Fe, New Mexico, USA. Association for Computational Linguistics.
Cite (Informal):
Learning What to Share: Leaky Multi-Task Network for Text Classification (Xiao et al., COLING 2018)
PDF:
https://aclanthology.org/C18-1175.pdf
Data
IMDb Movie Reviews, SST, SST-2