Towards Semi-Supervised Learning for Deep Semantic Role Labeling

Sanket Vaibhav Mehta, Jay Yoon Lee, Jaime Carbonell


Abstract
Neural models have achieved state-of-the-art performance on Semantic Role Labeling (SRL). However, these models require immense amounts of semantic-role-annotated corpora and are thus not well suited for low-resource languages or domains. This paper proposes a semi-supervised semantic role labeling method that outperforms the state of the art when SRL training corpora are limited. The method explicitly enforces syntactic constraints by augmenting the training objective with a syntactic-inconsistency loss component, and uses SRL-unlabeled instances to train a joint-objective LSTM. On the English section of CoNLL-2012, the proposed semi-supervised training with 1% and 10% of the SRL-labeled data (and varying amounts of SRL-unlabeled data) achieves gains of +1.58 and +0.78 F1, respectively, over models pre-trained on the same SRL-labeled data with a state-of-the-art architecture using ELMo. Additionally, by applying the syntactic-inconsistency loss at inference time, the proposed model achieves +3.67 and +2.1 F1 over the pre-trained model with 1% and 10% SRL-labeled data, respectively.
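The joint objective sketched in the abstract can be illustrated in simplified form: a supervised loss on SRL-labeled tokens plus a weighted syntactic-inconsistency penalty on SRL-unlabeled tokens. The particular constraint encoded below (penalizing argument-label probability mass on tokens outside any syntactic constituent) and all function names are illustrative assumptions, not the authors' implementation:

```python
import math

def supervised_loss(token_probs, gold_labels):
    # Cross-entropy over SRL-labeled tokens; token_probs is a list of
    # per-token label distributions (dicts), gold_labels the gold tags.
    return -sum(math.log(p[g]) for p, g in zip(token_probs, gold_labels)) / len(gold_labels)

def syntactic_inconsistency_loss(token_probs, constituent_spans):
    # Hypothetical stand-in for the paper's syntactic constraint:
    # penalize probability mass on the argument label ("ARG") for tokens
    # that fall outside every syntactic constituent span.
    penalty = 0.0
    for i, p in enumerate(token_probs):
        inside = any(start <= i < end for start, end in constituent_spans)
        if not inside:
            penalty += p.get("ARG", 0.0)
    return penalty / len(token_probs)

def joint_loss(labeled, unlabeled, lam=0.5):
    # Joint objective: supervised loss on SRL-labeled data plus a
    # lambda-weighted inconsistency penalty on SRL-unlabeled data.
    probs_l, gold = labeled
    probs_u, spans = unlabeled
    return supervised_loss(probs_l, gold) + lam * syntactic_inconsistency_loss(probs_u, spans)

# Toy example: two labeled tokens, two unlabeled tokens, one constituent span.
labeled = ([{"ARG": 0.9, "O": 0.1}, {"ARG": 0.2, "O": 0.8}], ["ARG", "O"])
unlabeled = ([{"ARG": 0.7, "O": 0.3}, {"ARG": 0.6, "O": 0.4}], [(0, 1)])
loss = joint_loss(labeled, unlabeled)
```

Applying the same inconsistency term at inference time (as in the paper's second result) would amount to re-scoring candidate labelings with this penalty rather than using it as a training signal.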
Anthology ID:
D18-1538
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Note:
Pages:
4958–4963
Language:
URL:
https://aclanthology.org/D18-1538
DOI:
10.18653/v1/D18-1538
Cite (ACL):
Sanket Vaibhav Mehta, Jay Yoon Lee, and Jaime Carbonell. 2018. Towards Semi-Supervised Learning for Deep Semantic Role Labeling. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4958–4963, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
Towards Semi-Supervised Learning for Deep Semantic Role Labeling (Mehta et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1538.pdf
Attachment:
 D18-1538.Attachment.pdf