DIRT Paraphrase Collection - RTE Users

From ACL Wiki
Revision as of 08:52, 6 April 2009 by Celct (talk | contribs)


| Participants | Campaign | Version | Specific usage description | Evaluations / Comments |
|---|---|---|---|---|
| Boeing | RTE4 | Original DIRT db | Elaborate the T sentence with DIRT-implied entailments | Precision/recall in RTE4: boeing run1: 67%/6%; boeing run2: 54%/30% |
| BIU | RTE4 | Canonical DIRT rulebase version of Szpektor and Dagan (RANLP 2007) | Considered the top 25 rules | VerbNet + NomLex-Plus + PARC Polarity Lexicon + DIRT Paraphrase Collection = +0.9% on RTE-4 ablation tests |
| UAIC | RTE4 | | Data extracted from proceedings; participants are recommended to edit the fields | |
| Uoeltg | RTE4 | | Data extracted from proceedings; participants are recommended to edit the fields | |
| New user | | | Participants are recommended to edit the fields | |

Total: 4