Joint Incremental Disfluency Detection and Dependency Parsing

Matthew Honnibal, Mark Johnson


Abstract
We present an incremental dependency parsing model that jointly performs disfluency detection. The model handles speech repairs using a novel non-monotonic transition system, and includes several novel classes of features. For comparison, we evaluated two pipeline systems using state-of-the-art disfluency detectors. The joint model performed better on both tasks, with a parse accuracy of 90.5% and a disfluency detection accuracy of 84.0%. The model runs in expected linear time and processes over 550 tokens per second.
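To make the idea concrete, the sketch below shows a simplified transition-based parser with an extra Edit action that marks tokens as disfluent and discards arcs already built to them (the non-monotonic step). The action inventory, class names, and data structures are illustrative assumptions for exposition, not the paper's exact transition system or feature set.

```python
# Illustrative sketch: a transition-based dependency parser with an
# extra "edit" action for speech repairs. Simplified stand-in, not the
# paper's actual system.

class State:
    def __init__(self, words):
        self.stack = []                         # indices of partially processed words
        self.buffer = list(range(len(words)))   # indices not yet shifted
        self.heads = {}                          # child index -> head index
        self.disfluent = set()                   # indices marked as disfluent

    def finished(self):
        return not self.buffer and len(self.stack) <= 1


def shift(state):
    state.stack.append(state.buffer.pop(0))

def left_arc(state):
    # attach the top of the stack as a dependent of the front of the buffer
    child = state.stack.pop()
    state.heads[child] = state.buffer[0]

def right_arc(state):
    # attach the front of the buffer as a dependent of the top of the stack
    child = state.buffer.pop(0)
    state.heads[child] = state.stack[-1]
    state.stack.append(child)

def edit(state):
    # non-monotonic repair: mark the top of the stack as disfluent and
    # remove any arcs already attached to it
    idx = state.stack.pop()
    state.disfluent.add(idx)
    state.heads = {c: h for c, h in state.heads.items() if c != idx and h != idx}

ACTIONS = {"shift": shift, "left": left_arc, "right": right_arc, "edit": edit}


def parse(words, oracle):
    """Run the parser, asking oracle(state) for the next action name."""
    state = State(words)
    while not state.finished():
        ACTIONS[oracle(state)](state)
    return state.heads, state.disfluent
```

In the full system, a trained classifier plays the role of the oracle, scoring actions from features of the stack, buffer, and repair context; the paper's actual transition inventory and repair mechanism differ in detail.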
Anthology ID:
Q14-1011
Volume:
Transactions of the Association for Computational Linguistics, Volume 2
Year:
2014
Address:
Cambridge, MA
Editors:
Dekang Lin, Michael Collins, Lillian Lee
Venue:
TACL
Publisher:
MIT Press
Pages:
131–142
URL:
https://aclanthology.org/Q14-1011
DOI:
10.1162/tacl_a_00171
Cite (ACL):
Matthew Honnibal and Mark Johnson. 2014. Joint Incremental Disfluency Detection and Dependency Parsing. Transactions of the Association for Computational Linguistics, 2:131–142.
Cite (Informal):
Joint Incremental Disfluency Detection and Dependency Parsing (Honnibal & Johnson, TACL 2014)
PDF:
https://aclanthology.org/Q14-1011.pdf
Video:
https://aclanthology.org/Q14-1011.mp4
Data
Penn Treebank