One Vector is Not Enough: Entity-Augmented Distributed Semantics for Discourse Relations

Yangfeng Ji, Jacob Eisenstein


Abstract
Discourse relations bind smaller linguistic units into coherent texts. Automatically identifying discourse relations is difficult, because it requires understanding the semantics of the linked arguments. A more subtle challenge is that it is not enough to represent the meaning of each argument of a discourse relation, because the relation may depend on links between lower-level components, such as entity mentions. Our solution computes distributed meaning representations for each discourse argument by composition up the syntactic parse tree. We also perform a downward compositional pass to capture the meaning of coreferent entity mentions. Implicit discourse relations are then predicted from these two representations, obtaining substantial improvements on the Penn Discourse Treebank.
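The upward compositional pass described in the abstract can be illustrated with a minimal sketch: word vectors at the leaves of a binary parse tree are recursively combined into a single argument vector. This is not the authors' exact model; the composition function (a tanh layer over concatenated children), the toy embeddings, and the weights below are all hypothetical choices for illustration.

```python
import math

def compose(left, right, W, b):
    """Combine two child vectors into a parent vector: tanh(W [l; r] + b)."""
    concat = left + right
    return [math.tanh(sum(w * x for w, x in zip(row, concat)) + bi)
            for row, bi in zip(W, b)]

def upward(tree, embeddings, W, b):
    """Recursively compute the vector at the root of a binary parse tree.

    A tree is either a word (str) or a (left_subtree, right_subtree) pair.
    """
    if isinstance(tree, str):
        return embeddings[tree]
    left, right = tree
    return compose(upward(left, embeddings, W, b),
                   upward(right, embeddings, W, b), W, b)

# Toy 2-dimensional embeddings and weights (hypothetical values).
emb = {"the": [0.1, 0.0], "cat": [0.0, 0.2], "sleeps": [0.3, -0.1]}
W = [[0.5, 0.0, 0.5, 0.0],   # 2 x 4 composition matrix
     [0.0, 0.5, 0.0, 0.5]]
b = [0.0, 0.0]

# Argument vector for the parse ((the cat) sleeps).
root = upward((("the", "cat"), "sleeps"), emb, W, b)
```

The paper's full model additionally runs a downward pass so that coreferent entity mentions in the two arguments contribute linked representations; the sketch above covers only the upward direction.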
Anthology ID:
Q15-1024
Volume:
Transactions of the Association for Computational Linguistics, Volume 3
Month:
Year:
2015
Address:
Cambridge, MA
Editors:
Michael Collins, Lillian Lee
Venue:
TACL
Publisher:
MIT Press
Pages:
329–344
URL:
https://aclanthology.org/Q15-1024
DOI:
10.1162/tacl_a_00142
Cite (ACL):
Yangfeng Ji and Jacob Eisenstein. 2015. One Vector is Not Enough: Entity-Augmented Distributed Semantics for Discourse Relations. Transactions of the Association for Computational Linguistics, 3:329–344.
Cite (Informal):
One Vector is Not Enough: Entity-Augmented Distributed Semantics for Discourse Relations (Ji & Eisenstein, TACL 2015)
PDF:
https://aclanthology.org/Q15-1024.pdf
Video:
https://aclanthology.org/Q15-1024.mp4