Exploring Contextualized Neural Language Models for Temporal Dependency Parsing

Hayley Ross, Jonathon Cai, Bonan Min


Abstract
Extracting temporal relations between events and time expressions has many applications, such as constructing event timelines and time-related question answering. It is a challenging problem that requires syntactic and semantic information at the sentence or discourse level, which may be captured by deep contextualized language models (LMs) such as BERT (Devlin et al., 2019). In this paper, we develop several variants of a BERT-based temporal dependency parser and show that BERT significantly improves temporal dependency parsing (Zhang and Xue, 2018a). We also present a detailed analysis of why deep contextualized neural LMs help and where they may fall short. Source code and resources are made available at https://github.com/bnmin/tdp_ranking.
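For readers unfamiliar with the setup, the following is a minimal, illustrative sketch (not the authors' released implementation from the repository above) of how contextualized BERT representations can feed a ranking-style temporal dependency parser: each event or time expression ("child") is scored against candidate parent nodes, and the highest-scoring parent is attached. Model choice, pooling strategy, and scorer architecture below are assumptions for illustration only.

# Illustrative sketch: BERT span embeddings + a pairwise ranking scorer for
# temporal dependency parsing. Names and dimensions are assumptions.
import torch
import torch.nn as nn
from transformers import BertTokenizerFast, BertModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")

class PairScorer(nn.Module):
    """Scores a (child, candidate parent) pair from their BERT span representations."""
    def __init__(self, hidden=768):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(2 * hidden, 256), nn.ReLU(), nn.Linear(256, 1))

    def forward(self, child_vec, parent_vec):
        return self.mlp(torch.cat([child_vec, parent_vec], dim=-1))

def span_embedding(sentence, span):
    """Mean-pool BERT token embeddings over a character span (start, end)."""
    enc = tokenizer(sentence, return_tensors="pt", return_offsets_mapping=True)
    offsets = enc.pop("offset_mapping")[0]
    with torch.no_grad():
        hidden = bert(**enc).last_hidden_state[0]  # (seq_len, 768)
    mask = (offsets[:, 0] >= span[0]) & (offsets[:, 1] <= span[1]) & (offsets[:, 1] > 0)
    return hidden[mask].mean(dim=0)

# Toy usage: choose the best parent for "left" among two candidate time expressions.
sentence = "After the meeting on Monday, she left on Tuesday."
child = span_embedding(sentence, (33, 37))                   # "left"
candidates = {"Monday": (21, 27), "Tuesday": (41, 48)}
scorer = PairScorer()
scores = {name: scorer(child, span_embedding(sentence, s)).item()
          for name, s in candidates.items()}
print(scores)  # untrained scores; in practice the scorer is trained on TDP annotations

In this pairwise-ranking view, attaching each child to its best-scoring parent yields a temporal dependency tree; the contextualized embeddings are what allow the scorer to use sentence- and discourse-level cues.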
Anthology ID:
2020.emnlp-main.689
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8548–8553
URL:
https://aclanthology.org/2020.emnlp-main.689
DOI:
10.18653/v1/2020.emnlp-main.689
Cite (ACL):
Hayley Ross, Jonathon Cai, and Bonan Min. 2020. Exploring Contextualized Neural Language Models for Temporal Dependency Parsing. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 8548–8553, Online. Association for Computational Linguistics.
Cite (Informal):
Exploring Contextualized Neural Language Models for Temporal Dependency Parsing (Ross et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.689.pdf
Video:
https://slideslive.com/38938936
Code:
bnmin/tdp_ranking