RTE6 - Ablation Tests
Revision as of 16:20, 2 February 2011
The following table lists the results of the ablation tests (a mandatory track since the RTE5 campaign) submitted by participants to RTE6.
Participants are kindly invited to check if all the inserted information is correct and complete.
| Ablated Component | Ablation Run[1] | Resource Impact (F1) | Resource Usage Description |
|---|---|---|---|
| WordNet | BIU1_abl-1 | 0.9 | No WordNet. On the Dev set: 39.18% (compared to 40.73% when WordNet is used) |
| CatVar | BIU1_abl-2 | 0.63 | No CatVar. On the Dev set: about 40.20% (compared to 40.73% when CatVar is used) |
| Coreference resolver | BIU1_abl-3 | -0.88 | No coreference resolver. On the Dev set: 41.62% (compared to 40.73% when the coreference resolver is used). This is an unusual ablation test, since it shows that the coreference resolution component has a negative impact. |
| DIRT | Boeing1_abl-1 | 3.97 | DIRT removed |
| WordNet | Boeing1_abl-2 | 4.42 | No WordNet |
| Name Normalization | budapestcad2_abl-2 | 0.65 | No name normalization was performed (e.g. George W. Bush -> Bush). |
| Named Entities Recognition | budapestcad2_abl-3 | -1.23 | No NER |
| WordNet | budapestcad2_abl-4 | -1.11 | No WordNet. (In the original run, WordNet was used to find the synonyms of words in the triplets, and additional triplets were generated from all possible combinations.) |
| WordNet | deb_iitb1_abl-1 | 8.68 | WordNet is ablated in this test. No change of code was required; only the WordNet module is removed during matching. |
| VerbOcean | deb_iitb1_abl-2 | 1.87 | VerbOcean is ablated in this test. No change of code was required; only the VerbOcean module is removed during matching. |
| WordNet | deb_iitb2_abl-1 | 7.9 | WordNet is ablated in this test. No change of code was required; only the WordNet module is removed during matching. |
| VerbOcean | deb_iitb2_abl-2 | 0.94 | VerbOcean is ablated in this test. No change of code was required; only the VerbOcean module is removed during matching. |
| WordNet | deb_iitb3_abl-1 | 11.43 | WordNet is ablated in this test. No change of code was required; only the WordNet module is removed during matching. |
| VerbOcean | deb_iitb3_abl-2 | 2.54 | VerbOcean is ablated in this test. No change of code was required; only the VerbOcean module is removed during matching. |
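The "Resource Impact (F1)" column reports, in percentage points, the difference between the F1 score of a system's full run and the F1 score of the corresponding ablation run: a positive value means the ablated resource helped, a negative value (as in the BIU coreference and budapestcad2 NER rows) means it hurt. A minimal sketch of that calculation follows; the function and the counts are hypothetical illustrations, not the official RTE6 scorer:

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 from true-positive, false-positive, and false-negative
    counts over the retrieved sentence pairs."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for a full run and for the same system with one
# resource (e.g. WordNet) ablated.
f1_full = f1_score(tp=60, fp=40, fn=40)      # 0.60
f1_ablated = f1_score(tp=50, fp=50, fn=50)   # 0.50

# Resource impact in F1 percentage points, as reported in the table.
impact = (f1_full - f1_ablated) * 100
print(round(impact, 2))  # 10.0
```

With these made-up counts the resource would appear in the table with an impact of 10.0; an ablation run that scored *higher* than the full run would yield a negative impact.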
Footnotes
- ↑ For further information about participants, see: RTE Challenges - Data about participants
Return to RTE Knowledge Resources