Deep Multi-Task Learning with Shared Memory for Text Classification

Pengfei Liu, Xipeng Qiu, Xuanjing Huang
Fudan University


Abstract

Neural network based models have achieved impressive results on various natural language processing (NLP) tasks. However, in most previous work, models are learned separately with single-task supervised objectives, which often suffer from insufficient training data. In this paper, we propose two deep architectures that can be trained jointly on multiple related tasks. More specifically, we augment the neural model with an external memory, which is shared by several tasks. Experiments on two groups of text classification tasks show that our proposed architectures can improve the performance of a task with the help of other related tasks.
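To make the shared-memory idea concrete, the following is a minimal sketch (not the authors' exact model): two text-classification tasks share one external memory matrix, while each task keeps its own encoder and classifier. The layer sizes, the trainable memory, and the content-based read mechanism are all illustrative assumptions introduced here for exposition.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedMemoryMultiTask(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim,
                 mem_slots, mem_dim, num_classes_per_task):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # External memory shared by all tasks (trainable here for simplicity).
        self.memory = nn.Parameter(torch.randn(mem_slots, mem_dim) * 0.01)
        # Task-specific encoders and classifiers.
        self.encoders = nn.ModuleList(
            [nn.LSTM(embed_dim, hidden_dim, batch_first=True)
             for _ in num_classes_per_task])
        self.classifiers = nn.ModuleList(
            [nn.Linear(hidden_dim + mem_dim, n)
             for n in num_classes_per_task])
        # Projection used to query the shared memory.
        self.query = nn.Linear(hidden_dim, mem_dim)

    def forward(self, token_ids, task_id):
        x = self.embed(token_ids)                      # (batch, seq, embed)
        _, (h, _) = self.encoders[task_id](x)          # h: (1, batch, hidden)
        h = h.squeeze(0)                               # (batch, hidden)
        # Content-based read from the shared memory.
        q = self.query(h)                              # (batch, mem_dim)
        attn = F.softmax(q @ self.memory.t(), dim=-1)  # (batch, mem_slots)
        read = attn @ self.memory                      # (batch, mem_dim)
        return self.classifiers[task_id](torch.cat([h, read], dim=-1))

# Joint training would alternate mini-batches from the tasks: gradients from
# every task update the shared memory, while encoders and classifiers stay
# task-specific. Sizes below are placeholders.
model = SharedMemoryMultiTask(vocab_size=10000, embed_dim=100, hidden_dim=100,
                              mem_slots=64, mem_dim=100,
                              num_classes_per_task=[2, 5])
logits_task0 = model(torch.randint(0, 10000, (8, 20)), task_id=0)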