[Apologies for cross-postings]
The deadline for the submission of lab proposals has been extended to 14 July. You can find all the information about the call at: https://clef2024.imag.fr/index.php?page=Pages/call_for_lab_proposals.html
Now in its 25th edition, the Conference and Labs of the Evaluation Forum (CLEF) continues the highly successful series of evaluation campaigns of the Cross Language Evaluation Forum (CLEF), which ran between 2000 and 2009 and established a framework for the systematic evaluation of information access systems, primarily through experimentation on shared tasks. As a leading annual international conference, CLEF uniquely combines evaluation laboratories and workshops with research presentations, panels, posters and demo sessions. In 2024, CLEF takes place on 9-12 September at the University of Grenoble Alpes, France.
Researchers and practitioners from all areas of information access and related communities are invited to submit proposals for running evaluation labs as part of CLEF 2024. Proposals will be reviewed by a lab selection committee, composed of researchers with extensive experience in evaluating information retrieval and extraction systems. Organisers of selected proposals will be invited to include their lab in the CLEF 2024 labs programme, possibly subject to suggested modifications to their proposal to better suit the CLEF lab workflow or timeline.
The CLEF Initiative (http://www.clef-initiative.eu/) is a self-organised body whose main mission is to promote research, innovation, and development of information access systems with an emphasis on multilingual information in different modalities - including text and multimedia - with various levels of structure. CLEF promotes research and development by providing an infrastructure for:
independent evaluation of information access systems;
investigation of the use of unstructured, semi-structured, highly-structured, and semantically enriched data in information access;
creation of reusable test collections for benchmarking;
exploration of new evaluation methodologies and innovative ways of using experimental data;
discussion of results, comparison of approaches, exchange of ideas, and transfer of knowledge.
Scope of CLEF Labs
We invite submission of proposals for two types of labs:
“Campaign-style” Evaluation Labs for specific information access problems (run during the twelve-month period preceding the conference), similar in nature to the traditional CLEF campaign “tracks”. Topics covered by campaign-style labs can be drawn from any information access-related domain or task.
Labs that follow a more classical “workshop” pattern, exploring evaluation methodology, metrics, processes, etc. in information access and closely related fields, such as natural language processing, machine translation, and human-computer interaction.
We strongly recommend that organisers new to the CLEF format of shared-task evaluation campaigns first consider organising a lab workshop to discuss the format of their proposed task, the problem space, and the practicalities of running the shared task. The CLEF 2024 programme will reserve about half of the conference schedule for lab sessions. During the plenary scientific paper sessions, lab organisers will present their overall results in overview presentations to give non-participants insights into where the research frontiers are moving. Lab organisers are also expected to organise separate sessions for their lab during the conference, with ample time for general discussion and engagement with all participants - not just those presenting campaign results and papers. Organisers should plan time in their sessions for activities such as panels, demos, and poster sessions, as appropriate. CLEF is always interested in receiving and facilitating innovative lab proposals.
Potential task proposers unsure of the suitability of their task proposal or its format for inclusion at CLEF are encouraged to contact the CLEF 2024 Lab Organizing Committee Chairs to discuss its suitability or design at an early stage.
Lab proposals must provide sufficient information to judge the relevance, timeliness, scientific quality, benefits for the research community, and the competence of the proposers to coordinate the lab. Each lab proposal should identify one or more organisers as responsible for ensuring the timely execution of the lab. Proposals should be 3 to 4 pages long and should provide the following information:
Title of the proposed lab.
A brief description of the lab topic and goals, its relevance to CLEF and the significance for the field.
A brief and clear statement on usage scenarios and domain to which the activity is intended to contribute, including the evaluation setup and metrics.
Details on the lab organiser(s), including identifying the task chair(s) responsible for ensuring the running of the task. This should include details of any previous involvement in organising or participating in evaluation tasks at CLEF or similar campaigns.
The planned format of the lab, i.e., campaign-style (“track”) or workshop.
A statement on whether the lab continues an activity from previous year(s) or is a new activity.
For activities continued from previous year(s): Statistics from previous years (number of participants/runs for each task), a clear statement on why another edition is needed, an explicit listing of the changes proposed, and a discussion of lessons to be learned or insights to be made.
For new activities: A statement on why a new evaluation campaign is needed and how the community would benefit from the activity.
Details of the expected target audience, i.e., who you expect to participate in the task(s) and how you propose to reach them.
Brief details of the tasks to be carried out in the lab. The proposal should clearly motivate the need for each proposed task and provide evidence of its capability to attract sufficient participation. The dataset to be used by the lab should be described and motivated with respect to the lab's goals; an indication of how the dataset will be shared is also useful. A lab may have a single task, but labs often contain multiple closely related tasks; proposals with more than three tasks require strong motivation to avoid needless fragmentation.
Expected length of the lab session at the conference: half-day, one day, or two days. This should include high-level details of the planned structure of the session, e.g. participant presentations, invited speaker(s), panels, etc., to justify the requested session length.
Arrangements for the organisation of the lab campaign: who will be responsible for activities within the task; how data will be acquired or created; what tools or methods will be used (e.g., how necessary queries will be created or relevance assessments carried out); and any other information relevant to the conduct of your lab.
If the lab proposes to set up a steering committee to oversee and advise its activities, include names, addresses, and homepage links of people you propose to be involved.
Lab proposals must be submitted at the following address:
choosing the “CLEF 2024 Lab Proposals” track.
Each submitted proposal will be reviewed by the CLEF 2024 Lab Organizing Committee. The acceptance decision will be sent by email to the responsible organiser by 28 July 2023. The final length of the lab session at the conference will be determined based on the overall organisation of the conference and the number of participant submissions received by a lab.
Advertising Labs at CLEF 2023 and ECIR 2024
Organisers of accepted labs are expected to advertise their labs at both CLEF 2023 (18-21 September 2023, Thessaloniki, Greece) and ECIR 2024 (24-28 March 2024, Glasgow, Scotland); at least one lab representative should therefore attend these events.
Advertising at CLEF 2023 will consist of displaying a poster describing the new lab, running a break-out session to discuss the lab with prospective participants, and advertising/announcing it during the closing session.
Advertising at ECIR 2024 will consist of submitting a lab description for inclusion in the ECIR 2024 proceedings (due 11 October 2023) and advertising the lab in a booster session during ECIR 2024.
Mentorship Program for Lab Proposals from newcomers
CLEF 2019 introduced a mentorship program to support the preparation of lab proposals for newcomers to CLEF. The program will be continued at CLEF 2024 and we encourage newcomers to refer to Friedberg et al. (2015) for initial guidance on preparing their proposal:
Friedberg I, Wass MN, Mooney SD, Radivojac P. Ten simple rules for a community computational challenge. PLoS Comput Biol. 2015 Apr 23;11(4):e1004150.
The CLEF newcomers mentoring program offers help, guidance, and feedback on the writing of your draft lab proposal by assigning you a mentor, who will help you prepare and mature the lab proposal for submission. If your lab proposal falls within the scope of an existing CLEF lab, the mentor will help you get in touch with those lab organisers and join forces.
Lab proposals for mentorship must be submitted at the following address:
choosing the “CLEF 2024 Lab Mentorship” track.
Important Dates
29 May 2023: Deadline for mentorship requests (newcomers only)
29 May 2023 - 16 June 2023: Mentorship period
14 July 2023 (extended from 7 July 2023): Lab proposal submission (newcomers and veterans)
28 July 2023: Notification of lab acceptance
18-21 Sep 2023: Advertising Accepted Labs at CLEF 2023, Thessaloniki, Greece
11 October 2023: Submission of short lab description for ECIR 2024
13 November 2023: Lab registration opens
24-28 March 2024: Advertising labs at ECIR 2024, Glasgow, UK
CLEF 2024 Lab Chairs
Petra Galuscakova, University of Stavanger, Norway
Alba García Seco de Herrera, University of Essex, UK
CLEF 2024 Lab Mentorship Chairs
Liana Ermakova, Université de Bretagne Occidentale, France
Florina Piroi, TU Wien, Austria