Training Structured Prediction Energy Networks with Indirect Supervision

Amirmohammad Rooshenas, Aishwarya Kamath, Andrew McCallum


Abstract
This paper introduces rank-based training of structured prediction energy networks (SPENs). Our method samples output structures using gradient descent and minimizes the ranking violations of the sampled structures with respect to a scalar scoring function defined with domain knowledge. We have successfully trained a SPEN for citation field extraction without any labeled data instances, where the only source of supervision is a simple human-written scoring function. Such scoring functions are often easy to provide; the SPEN then furnishes an efficient structured prediction inference procedure.
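The abstract's training loop — sample candidate structures by gradient-descent inference on the energy, then penalize ranking violations judged by a domain-knowledge scoring function — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the linear energy, the toy `reward` function, and all hyperparameters below are assumptions standing in for a deep energy network and a human-written citation-field scorer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: x is an input feature vector, y a relaxed structured output in [0,1]^D_Y.
# Energy E(x, y) = -y @ (W @ x): a minimal linear energy; real SPENs use deep nets.
D_X, D_Y = 4, 3
W = rng.normal(scale=0.1, size=(D_Y, D_X))

def energy(W, x, y):
    return -y @ (W @ x)

def reward(y):
    # Hypothetical domain-knowledge scoring function: prefer outputs whose first
    # coordinate is active and whose total mass is near 1. A stand-in for the
    # human-written citation-field rules described in the paper.
    return y[0] - abs(y.sum() - 1.0)

def sample_trajectory(W, x, steps=10, lr=0.5):
    # Gradient-descent inference on the energy, collecting iterates as samples.
    y = np.full(D_Y, 0.5)
    samples = [y.copy()]
    for _ in range(steps):
        grad = -(W @ x)                      # dE/dy for the linear energy
        y = np.clip(y - lr * grad, 0.0, 1.0)
        samples.append(y.copy())
    return samples

def rank_update(W, x, margin=0.1, lr=0.1):
    # Take two samples from the inference trajectory; the scoring function
    # decides which should get lower energy, and a hinge loss enforces it.
    samples = sample_trajectory(W, x)
    y_a, y_b = samples[0], samples[-1]
    if reward(y_a) == reward(y_b):
        return W
    y_hi, y_lo = (y_a, y_b) if reward(y_a) > reward(y_b) else (y_b, y_a)
    violation = energy(W, x, y_hi) - energy(W, x, y_lo) + margin
    if violation > 0:
        # dE/dW = -outer(y, x): descend on E(y_hi), ascend on E(y_lo).
        W = W + lr * (np.outer(y_hi, x) - np.outer(y_lo, x))
    return W

x = rng.normal(size=D_X)
for _ in range(50):
    W = rank_update(W, x)
```

Each `rank_update` step pushes the energy of the higher-scoring sample below that of the lower-scoring one by at least the margin, so the scoring function supervises the energy network indirectly, with no labeled outputs.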
Anthology ID:
N18-2021
Volume:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)
Month:
June
Year:
2018
Address:
New Orleans, Louisiana
Editors:
Marilyn Walker, Heng Ji, Amanda Stent
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
130–135
URL:
https://aclanthology.org/N18-2021
DOI:
10.18653/v1/N18-2021
Cite (ACL):
Amirmohammad Rooshenas, Aishwarya Kamath, and Andrew McCallum. 2018. Training Structured Prediction Energy Networks with Indirect Supervision. In Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), pages 130–135, New Orleans, Louisiana. Association for Computational Linguistics.
Cite (Informal):
Training Structured Prediction Energy Networks with Indirect Supervision (Rooshenas et al., NAACL 2018)
PDF:
https://aclanthology.org/N18-2021.pdf
Video:
https://aclanthology.org/N18-2021.mp4