Crowdsourcing Discourse Relation Annotations by a Two-Step Connective Insertion Task

Frances Yung, Vera Demberg, Merel Scholman


Abstract
The prospect of being able to crowdsource coherence relations bears the promise of acquiring annotations for new texts quickly, which could then increase the size and variety of discourse-annotated corpora. It would also open the avenue to answering new research questions: collecting annotations from a larger number of individuals per instance would make it possible to investigate the distribution of inferred relations, and to study individual differences in coherence relation interpretation. However, annotating coherence relations with untrained workers is not trivial. We here propose a novel two-step annotation procedure, which extends an earlier method by Scholman and Demberg (2017a). In our approach, coherence relation labels are inferred from connectives that workers insert into the text. We show that the proposed method leads to replicable coherence annotations, and analyse the agreement between the obtained relation labels and annotations from PDTB and RST-DT on the same texts.
Anthology ID:
W19-4003
Volume:
Proceedings of the 13th Linguistic Annotation Workshop
Month:
August
Year:
2019
Address:
Florence, Italy
Editors:
Annemarie Friedrich, Deniz Zeyrek, Jet Hoek
Venue:
LAW
SIG:
SIGANN
Publisher:
Association for Computational Linguistics
Pages:
16–25
URL:
https://aclanthology.org/W19-4003
DOI:
10.18653/v1/W19-4003
Cite (ACL):
Frances Yung, Vera Demberg, and Merel Scholman. 2019. Crowdsourcing Discourse Relation Annotations by a Two-Step Connective Insertion Task. In Proceedings of the 13th Linguistic Annotation Workshop, pages 16–25, Florence, Italy. Association for Computational Linguistics.
Cite (Informal):
Crowdsourcing Discourse Relation Annotations by a Two-Step Connective Insertion Task (Yung et al., LAW 2019)
PDF:
https://aclanthology.org/W19-4003.pdf