Wikipedia - RTE Users
| Participants* | Campaign | Version | Specific usage description | Evaluations / Comments |
|---|---|---|---|---|
| BIU | RTE5 | | Used for the creation of WikiRules!, a resource of lexical reference rules | Cf. the evaluation of WikiRules! |
| Cswhu | RTE5 | | Named entity relation extraction | Ablation test performed. Positive impact of the resource: +1.33% on the two-way task, +3.34% on the three-way task |
| FBKirst | RTE5 | | Rules extracted from Wikipedia using Latent Semantic Analysis (LSA) | Ablation test performed. Positive impact of the resource: +1% on the two-way task |
| PeMoZa | RTE5 | 17 October 2009 | Similarity measure based on Wikipedia, using jLSA (java Latent Semantic Analysis) | No ablation test performed. It is worth mentioning that the similarity measure over Wikipedia not only covers all the existing pairs, but is also about 100 times faster to compute than the previously used WordNet-based tool |
| QUANTA | RTE5 | | The Wikipedia redirection set is employed as a set of synonymous entities to detect entity entailment | No ablation test performed. |
| UAIC | RTE5 | | FIRST USE: extraction of a new resource based on named entity relation mining using Wikipedia (Iftene and Balahur-Dobrescu, 2008). SECOND USE: extraction of a new resource aimed at identifying a "distance" between a named entity from the hypothesis and the named entities from the text | FIRST USE: ablation test performed. Positive impact of the resource: +0.17% on the two-way task, +0.50% on the three-way task. SECOND USE: ablation test performed (Wikipedia + NERs + Perl patterns). Positive impact of the whole NE module: +6.17% on the two-way task, +5% on the three-way task. |
| BIU | RTE4 | | Used for the creation of WikiRules!, a resource of lexical reference rules | Cf. the evaluation of WikiRules! |
| UAIC | RTE4 | | | Data taken from the RTE4 proceedings. Participants are recommended to add further information. |
| UPC | RTE4 | | | Data taken from the RTE4 proceedings. Participants are recommended to add further information. |
| New user | | | | Participants are encouraged to contribute. |
Total: 9
[*] For further information about participants, see RTE Challenges - Data about participants
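
The QUANTA and UAIC entries above use the Wikipedia redirection set as a collection of synonymous entity names. As a rough illustration only, and not the participants' actual code, the following Python sketch shows how a redirect table can be treated as a synonym set when matching an entity in the hypothesis against entities in the text; the redirect pairs and the helper names `canonical` and `same_entity` are invented for the example.

```python
# Illustrative sketch: Wikipedia redirects as a synonym set for entity matching.
# The redirect pairs below are invented examples; a real system would load the
# full redirect table from a Wikipedia dump.

# Map each redirect title to the canonical article it points to.
REDIRECTS = {
    "IBM": "International Business Machines",
    "Big Blue": "International Business Machines",
    "NYC": "New York City",
    "The Big Apple": "New York City",
}

def canonical(title: str) -> str:
    """Resolve a mention to its canonical article title (identity if no redirect)."""
    return REDIRECTS.get(title, title)

def same_entity(mention_a: str, mention_b: str) -> bool:
    """Treat two mentions as synonymous entities if they resolve to the same page."""
    return canonical(mention_a) == canonical(mention_b)

if __name__ == "__main__":
    print(same_entity("Big Blue", "IBM"))        # True: same canonical page
    print(same_entity("NYC", "New York City"))   # True
    print(same_entity("IBM", "New York City"))   # False
```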
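The FBKirst and PeMoZa entries describe similarity measures obtained by running Latent Semantic Analysis over Wikipedia text. The sketch below is a minimal, generic LSA pipeline built with scikit-learn on a tiny toy corpus standing in for Wikipedia articles; it is not the jLSA tool or either participant's pipeline, and the corpus, term pairs, and `lsa_similarity` helper are assumptions made for illustration.

```python
# Minimal LSA sketch: TF-IDF over a toy "Wikipedia-like" corpus, truncated SVD,
# then cosine similarity between term vectors in the reduced space. Toy data only.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer

corpus = [
    "the car is driven on the road by a driver",
    "the truck is driven on the highway",
    "the bird flies over the forest",
    "the eagle is a bird that flies high over mountains",
]

vectorizer = TfidfVectorizer()
doc_term = vectorizer.fit_transform(corpus)          # documents x terms

svd = TruncatedSVD(n_components=2, random_state=0)
svd.fit(doc_term)
# svd.components_ has shape (n_components, n_terms): each column is the
# latent-space representation of one term.
term_vectors = svd.components_.T
vocab = vectorizer.vocabulary_

def lsa_similarity(term_a: str, term_b: str) -> float:
    """Cosine similarity of two terms in the reduced LSA space."""
    va = term_vectors[vocab[term_a]]
    vb = term_vectors[vocab[term_b]]
    return float(np.dot(va, vb) / (np.linalg.norm(va) * np.linalg.norm(vb)))

print(lsa_similarity("car", "truck"))   # related terms -> higher score
print(lsa_similarity("car", "bird"))    # unrelated terms -> lower score
```

In a full-scale setting the term-document matrix would be built from the Wikipedia dump itself, which is what makes such a measure cover essentially all term pairs encountered in the entailment data.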
Return to RTE Knowledge Resources