VerbOcean - RTE Users

| Participants* | Campaign | Version | Specific usage description | Evaluations / Comments |
|---|---|---|---|---|
| DFKI | RTE5 | | FIRST USE: VerbOcean relations are used to calculate relatedness between verbs in T and H.<br>SECOND USE: Used to assign relatedness between nominal predicates in T and H, after using WordNet to change the verbal nouns into verbs. | FIRST USE: Ablation test performed. Impact of the resource: null / +0.17% accuracy on the two-way and three-way tasks respectively for run 1; +0.33% / +0.5% for run 2; +0.17% / +0.17% for run 3.<br>SECOND USE (WordNet + VerbOcean): null / +0.17% accuracy on the two-way and three-way tasks respectively for run 1; +0.5% / +0.67% for run 2; +0.17% / +0.17% for run 3. |
| DLSIUAES | RTE5 | | FIRST USE: Antonymy relation between verbs.<br>SECOND USE: VerbOcean relations used to find correspondences between verbs. | FIRST USE: Ablation test performed together with WordNet and DLSIUAES_negation_list. Positive impact on the two-way run: +0.66% accuracy. Negative impact on the three-way run: -1% (-0.5% for the derived two-way run).<br>SECOND USE: No ablation test performed. |
| FBKirst | RTE5 | | Extraction of 18232 entailment rules for all the English verbs connected by the "stronger-than" relation. For instance, if "kill [stronger-than] injure", then the rule "kill ENTAILS injure" is added to the rule repository (see the sketch below the table). | Ablation test performed. Negative impact of the resource: -0.16% accuracy on the two-way task. |
| QUANTA | RTE5 | | The "opposite-of" relation in VerbOcean is used as a feature. | Ablation test performed. Null impact of the resource on the two-way task. |
| Siel_09 | RTE5 | | Similarity/antonymy/unrelatedness between verbs. | Ablation test performed. Null impact of the resource on both the two-way and the three-way task. |
| UAIC | RTE5 | | "Opposite-of" relation used to detect contradiction, in combination with WordNet. | Ablation test performed (WordNet + VerbOcean). Positive impact of the two resources together: +2% accuracy on the two-way task, +1.5% on the three-way task. |
| DFKI | RTE4 | Unrefined | Semantic relation between verbs. | No separate evaluation. |
| DLSIUAES | RTE4 | | | Data taken from the RTE4 proceedings. Participants are recommended to add further information. |
| UAIC | RTE4 | | | Data taken from the RTE4 proceedings. Participants are recommended to add further information. |
| UPC | RTE4 | | | Data taken from the RTE4 proceedings. Participants are recommended to add further information. |
| VENSES | RTE3 | | Semantic relation between words. | No evaluation of the resource. |
| New user | | | | Participants are encouraged to contribute. |
Total: 11
[*] For further information about participants, click here: RTE Challenges - Data about participants
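Several of the usages listed above (FBKirst's extraction of entailment rules from the "stronger-than" relation, and the use of "opposite-of" as a contradiction cue or feature by QUANTA and UAIC) amount to simple lookups over the VerbOcean relation list. The following is a minimal illustrative sketch of both steps, not any participant's actual system; it assumes a plain-text VerbOcean distribution whose lines look roughly like `verb1 [stronger-than] verb2 :: score` (the file name and exact line format are assumptions, not taken from this page).

```python
# Minimal sketch of VerbOcean-based lookups (assumptions noted in comments).
from collections import defaultdict

def load_verbocean(path):
    """Map each relation type to the set of (verb1, verb2) pairs it connects.

    Assumes lines of the form 'verb1 [relation] verb2 :: score'.
    """
    relations = defaultdict(set)
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            left = line.split("::")[0].strip()  # drop the confidence score, if any
            parts = left.split()
            if len(parts) == 3:
                verb1, relation, verb2 = parts
                relations[relation].add((verb1, verb2))
    return relations

def entailment_rules(relations):
    """FBKirst-style rules: 'kill [stronger-than] injure' -> 'kill ENTAILS injure'."""
    return set(relations.get("[stronger-than]", set()))

def opposite_verbs(relations, verb_t, verb_h):
    """True if the T and H verbs are linked by [opposite-of] (a contradiction cue)."""
    pairs = relations.get("[opposite-of]", set())
    return (verb_t, verb_h) in pairs or (verb_h, verb_t) in pairs

if __name__ == "__main__":
    rels = load_verbocean("verbocean.unrefined.txt")  # assumed file name
    print(len(entailment_rules(rels)), "stronger-than entailment rules extracted")
    print("kill / injure opposite-of?", opposite_verbs(rels, "kill", "injure"))
```
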
Return to RTE Knowledge Resources