Conference and Labs of the Evaluation Forum - CLEF 2013
CALL FOR LAB PROPOSALS
Valencia, Spain, 23-26 September 2013
The CLEF 2013 conference is next year's edition of the popular CLEF
campaign and workshop series, which has run since 2000, contributing
to the systematic evaluation of information access systems, primarily
through experimentation on shared tasks. In 2010 CLEF was launched
in a new format, as a conference with research presentations, panels,
poster and demo sessions, and laboratory evaluation workshops
interleaved during three and a half days of intense and stimulating work.
The CLEF Initiative is a self-organized body whose main mission is
to promote research, innovation, and development of information access
systems with an emphasis on multilingual and multimodal information
with various levels of structure. CLEF promotes research and
development by providing an infrastructure for:
- multilingual and multimodal system testing, tuning and evaluation;
- investigation of the use of unstructured, semi-structured,
highly-structured, and semantically enriched data in information access;
- creation of reusable test collections for benchmarking;
- exploration of new evaluation methodologies and innovative ways of
using experimental data;
- discussion of results, comparison of approaches, exchange of ideas,
and transfer of knowledge.
In 2013 CLEF will be organised in September in Valencia, and researchers
and practitioners from all segments of the information access and related
communities are invited to submit evaluation lab proposals for review.
The lab selection committee will select among the proposals and may
suggest modifications to proposed labs to best suit
the CLEF lab workflow.
Proposals are accepted for two different types of labs:
1) Evaluation labs that follow a "campaign-style" evaluation practice
for specific information access problems (during the year preceding
the conference) in the tradition of past CLEF campaign tracks. In 2012
there were 7 evaluation labs ( http://clef2012.org/index.php?page=Pages/labs.html ):
CHiC, CLEF-IP, ImageCLEF, INEX, PAN, QA4MRE, RepLab.
Topics covered by evaluation labs can be inspired by any information
access-related domain or task.
2) Lab workshops organised as speaking and discussion sessions to explore
issues of evaluation methodology, metrics, and processes in information
access and closely related fields, such as natural language processing,
machine translation, and human-computer interaction. Lab workshops can
be a first step towards an evaluation lab: in 2012 one lab workshop was
held: CLEFeHealth. One of the 2011 lab workshops (CHiC) evolved into a
2012 evaluation lab. This progression from a lab workshop to an
evaluation lab is a development track we wish to encourage - but lab
workshops do not need to be associated with an evaluation lab: theoretical
and methodological issues are numerous in our field and can well be
addressed in focused workshop sessions without shared tasks.
If the organisers of the proposal are new to CLEF or other shared task
evaluation campaigns, we highly recommend that a lab workshop first be
organised to discuss the format, the problem space, and the practicalities
of the shared task. In both cases, it is expected that lab sessions at the
conference will contain ample time for general discussion and engagement by
all participants - not just those presenting campaign results and papers.
Organisers should plan time for panels, demos, etc. where applicable.
The CLEF 2013 conference will reserve about half of the conference program
for lab sessions. The lab sessions will take place at the site of the
conference in Valencia. The labs will present their overall results in
"overview presentations" during the plenary scientific paper sessions to
allow non-participants to get a sense of where the research frontiers lie.
USAGE SCENARIOS - THE VALIDATION OF BENCHMARKING
The sustainability and impact of the effort put into CLEF benchmarking
evaluations hinges on their validity when applied to information seeking
and information access activities in practice. We wish to see all
labs - whether lab workshops or evaluation labs - address the issue
of validation through explicitly stated hypotheses of usage. An evaluation
lab should be concrete with respect to situation, context, platform and
user preferences for which the suggested evaluation benchmark is valid;
a lab workshop should discuss how participants with domain and usage
experience and expertise can be recruited to the workshop to provide a
grounding of evaluation methodology in application to real-world tasks.
Lab proposals should provide sufficient information for the lab organising
committee to be able to judge the importance, quality, impact and benefits
for the research community. Each lab should have one or more organisers
responsible for the execution of the lab. Proposals should be 2-4 pages
long and should provide the following information:
1) Title of the proposed lab.
2) The planned format of the lab, i.e. evaluation lab or lab workshop.
3) Planned length of the lab session at the conference:
half-day, one day, two days.
4) Names and full addresses, including contact details, of the lab
organiser(s), a brief description of the organisers' experience and
background in the topic and in previous evaluation campaigns, and
links to web pages of the lab organisers.
5) A brief description of the lab topic and goals, its relevance to
CLEF and significance for the research field.
6) A brief but clear statement of the usage scenarios or domains to
which the activity is intended to contribute. In connection with the
usage scenarios, task-relevant stakeholders should be identified and
ideally enlisted in an active role in the lab to validate the
scenarios. NO MORE THAN 3 TASKS SHOULD BE PROPOSED.
7) A statement on the intended development/growth path if the proposal is
for a continuation of activities previously undertaken at CLEF workshops.
8) A description of the target audience, areas from which the participants
are expected to come, an analysis of the potential for participants
(number, statements of intent to participate where applicable), potential
industry stakeholders, strategy for publicising the lab.
9) Arrangements for the organisation of the lab campaign, if applicable,
including a brief outline of the campaign milestones, test data to be
used, indications of the size of the data collections, issues of
scalability, tasks to be proposed to participants, and format of
presentation at the conference.
10) If the lab proposes to set up a steering committee to oversee its
activities, include names, addresses, and home page links of people who
have agreed to be part of the steering committee if the lab proposal is
accepted. This list should ideally include people from several different
academic sites and industrial stakeholders.
Each submitted proposal will be reviewed by the CLEF 2013 lab organising
committee. The decision will be sent by email to the responsible organiser
by October 28, 2012. The final length of the lab session will be determined
based on the overall organisation of the conference and the number of
submissions received by a lab. Due to space restrictions, only a limited
number of lab sessions can be conducted in parallel at the conference.
The reviewers may suggest modifications to the proposed lab in order to
better fit it to the overall organisation of CLEF 2013.
Reviewing criteria for labs include:
- The appropriateness of the lab to the overall information access agenda
pursued by CLEF and its fit to other labs considered for inclusion.
- Potential impact of the lab on current and future real-world information
access challenges, current commercial applications, and promising future
directions.
- Number of potential participants, critical mass.
- Innovation, uniqueness and amount of contribution to new knowledge
in the field.
- Focus of lab program, and specifically for evaluation labs: Practicability
and feasibility of task, soundness of methodology.
- For returning proposals: Movement beyond previous year's labs.
- Coverage of theory and practice, breadth of organising group, contact
surfaces to stakeholders and research efforts.
Lab Organiser's Tasks:
- Produce a "Call for Participation" for an evaluation lab or a
"Call for Papers" for a lab workshop and disseminate it through
all appropriate means.
- Provide a web page URL which can be linked into the CLEF 2013 home page.
- Provide a brief description of the lab for the conference program.
- Sign up campaign participants, and execute the campaign in the case
of evaluation labs.
- Review submitted papers and position papers in case of lab workshops.
- Schedule lab session activities in collaboration with the local organisers
and the CLEF Lab Organising Committee Chairs.
- Send the lab schedule and other lab material, all in PDF format, to the
CLEF Lab Organising Committee Chairs (deadline to be defined).
- Organise post-conference publication of lab results in an appropriate
form (special issue, lab proceedings, etc.).
The lab material (papers, presentations etc.) will be distributed by the
CLEF 2013 organisation to the conference participants in electronic format
(authors will not be asked to transfer copyright, only to grant
permission to publish and disseminate).
The working notes of the labs will be published online in time for the
conference. It is foreseen that this online publication will have an ISBN
number and be indexed in relevant services. It is the responsibility of lab
organisers to arrange for appropriate post-conference publication of the
lab results.
IMPORTANT DATES (please note the tight schedule)
Final lab proposals: 15 October 2012
Notification of lab acceptance: 28 October 2012
CLEF 2013 Conference: 23-26 September 2013
Lab proposals (or questions) should be submitted via e-mail (either
plain text or PDF format) to both CLEF Lab Chairs.
CLEF Lab Organising Committee (CLEF-LOC):
Roberto Navigli, Sapienza University of Rome (CLEF LOC 2013 co-chair)
Dan Tufis, RACAI (CLEF LOC 2013 co-chair)
Jussi Karlgren, Gavagai and SICS (CLEF LOC 2012 co-chair)
Christa Womser-Hacker, Universität Hildesheim (CLEF LOC 2012 co-chair)
Mark Sanderson, RMIT University
Hugo Zaragoza, Websays
(others to be confirmed)