VerbOcean - RTE Users

From ACL Wiki
Revision as of 08:05, 4 December 2009

When not otherwise specified, the information about the version, usage, and evaluation of the resource has been provided by the participants themselves.

{| border="1" cellpadding="3" cellspacing="0"
|- bgcolor="#ECECEC" align="left"
! Participants* !! Campaign !! Version !! Specific usage description !! Evaluations / Comments
|- bgcolor="#ECECEC" align="left"
| DFKI
| RTE5
|
| FIRST USE: VerbOcean relations are used to calculate relatedness between verbs in T and H.<br/>
SECOND USE: used to assign relatedness between nominal predicates in T and H, after using WordNet to change the verbal nouns into verbs.
| FIRST USE: Ablation test performed. Impact of the resource: null/+0.17% accuracy respectively on the two-way and three-way task for run1; +0.33%/+0.5% for run2; +0.17%/+0.17% for run3.<br/>
SECOND USE (WordNet+VerbOcean): null/+0.17% accuracy respectively on the two-way and three-way task for run1; +0.5%/+0.67% for run2; +0.17%/+0.17% for run3.
|- bgcolor="#ECECEC" align="left"
| DLSIUAES
| RTE5
|
| FIRST USE: Antonymy relation between verbs.<br/>
SECOND USE: VerbOcean relations used to find correspondences between verbs.
| FIRST USE: Ablation test performed together with WordNet and DLSIUAES_negation_list. Positive impact on the two-way run: +0.66% accuracy. Negative impact on the three-way run: -1% (-0.5% for two-way derived).<br/>
SECOND USE: No ablation test performed.
|- bgcolor="#ECECEC" align="left"
| FBKirst
| RTE5
|
| Extraction of 18232 entailment rules for all the English verbs connected by the "stronger-than" relation. For instance, if "kill [stronger-than] injure", then the rule "kill ENTAILS injure" is added to the rules repository.
| Ablation test performed. Negative impact of the resource: -0.16% accuracy on the two-way task.
|- bgcolor="#ECECEC" align="left"
| DFKI
| RTE4
| Unrefined
| Semantic relation between verbs.
| No separate evaluation.
|- bgcolor="#ECECEC" align="left"
| DLSIUAES
| RTE4
|
| Data taken from the RTE4 proceedings. Participants are recommended to add further information.
|
|- bgcolor="#ECECEC" align="left"
| UAIC
| RTE4
|
| Data taken from the RTE4 proceedings. Participants are recommended to add further information.
|
|- bgcolor="#ECECEC" align="left"
| UPC
| RTE4
|
| Data taken from the RTE4 proceedings. Participants are recommended to add further information.
|
|- bgcolor="#ECECEC" align="left"
| VENSES
| RTE3
|
| Semantic relation between words.
| No evaluation of the resource.
|- bgcolor="#ECECEC" align="left"
| New user
|
|
| Participants are encouraged to contribute.
|
|- bgcolor="#ECECEC" align="left"
| Total: 5
|
|
|
|
|}
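The FBKirst entry above describes turning VerbOcean's "stronger-than" pairs into directional entailment rules (e.g. "kill [stronger-than] injure" yields "kill ENTAILS injure"). A minimal sketch of that extraction, assuming the plain-text line format of the VerbOcean distribution, `verb1 [relation] verb2 :: score` (the sample lines and score threshold shown are illustrative, not taken from the participants' systems):

```python
# Sketch: derive entailment rules from VerbOcean "stronger-than" pairs,
# as described in the FBKirst entry. Assumes the distribution's line
# format "verb1 [relation] verb2 :: score"; sample data is illustrative.

def extract_entailment_rules(lines):
    """Map each 'v1 [stronger-than] v2' line to the rule (v1, 'ENTAILS', v2)."""
    rules = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comment lines
        pair, _, _score = line.partition("::")
        parts = pair.split()
        if len(parts) == 3 and parts[1] == "[stronger-than]":
            verb1, _, verb2 = parts
            rules.append((verb1, "ENTAILS", verb2))
    return rules

sample = [
    "kill [stronger-than] injure :: 12.1",    # becomes an entailment rule
    "outrage [opposite-of] placate :: 10.0",  # other relation types are skipped
]
print(extract_entailment_rules(sample))  # [('kill', 'ENTAILS', 'injure')]
```

Each extracted triple can then be stored in a rule repository and matched against T/H verb pairs at entailment time.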


[*] For further information about the participants, see: RTE Challenges - Data about participants

   Return to RTE Knowledge Resources