Shared Tasks – Workshop on Sign Language Processing

Event Notification Type: 
Call for Participation
Abbreviated Title: 
WSLP 2025
Location: 
IJCNLP–AACL 2025, Victor Menezes Convention Centre, IIT Bombay, Mumbai, India
Date: 
Wednesday, 24 December 2025
State: 
Maharashtra
Country: 
India
City: 
Mumbai
Contact: 
Ashutosh Modi
Abhinav Joshi
Sanjeet Singh
Submission Deadline: 
Wednesday, 15 October 2025

WSLP 2025 – Shared Tasks on Sign Language Processing
Victor Menezes Convention Centre, IIT Bombay, Mumbai, India
December 24, 2025 (Co-located with IJCNLP–AACL 2025)
https://exploration-lab.github.io/WSLP/task/
We are delighted to invite you to participate in the WSLP 2025 Shared Tasks on Indian Sign Language (ISL) Processing, a set of challenges aimed at advancing research on sign language translation and recognition.
Across the world, more than 430 million people experience disabling hearing loss, and for many, sign languages are their primary mode of communication. However, the technological support for sign language understanding lags far behind that for spoken and written languages.
With these shared tasks, we aim to bring together the computer vision, NLP, and multimodal learning communities to develop robust and inclusive solutions for ISL. These tasks will not only provide benchmark datasets but also encourage the development of systems with real-world applications such as accessible communication interfaces, sign-enabled chatbots, and video-based translation systems.
Shared Tasks Overview
Task 1 – ISL to English Translation (https://www.codabench.org/competitions/10118/)
Goal: Translate sentence-level ISL videos/poses into English text.
Challenges: Visual-linguistic grounding, gesture ambiguity, grammar differences.
Task 2 – Word/Gloss Recognition (https://www.codabench.org/competitions/10135/)
Goal: Recognize isolated ISL signs (words or glosses) from short video clips.
Challenges: Sign variability, subtle motion, similar gestures.
Task 3 – Word Presence Prediction (https://www.codabench.org/competitions/10066/)
Goal: Predict whether a given word is present in an ISL sentence video.
Challenges: Sign spotting, context alignment.
All tasks are hosted on Codabench, with training and evaluation datasets publicly available.
Timeline
Start Date: August 15, 2025
Training Phase: Aug 15 – Oct 5, 2025
Testing Phase: Oct 5 – Oct 15, 2025
Paper Submission Deadline: October 25, 2025
Notification of Acceptance: November 3, 2025
Camera-ready Papers Due: November 11, 2025
Workshop Date: December 24, 2025
All deadlines are 23:59 UTC-12 (“anywhere on Earth”).
Participation:
We invite researchers, students, and practitioners from computer vision, natural language processing, gesture recognition, and related fields to participate.
By joining these shared tasks, you will contribute towards building inclusive technologies for the Deaf community and advancing the state of sign language processing.
To stay updated and connect with other participants, join our Discord servers (Workshop: https://discord.gg/jP7j4NmUE4, Shared Task: https://discord.gg/su2rRxSjkY).
Submission Guideline:
Platform: Codabench
Team Size: Up to 4 members
Paper Submission:
Long papers (up to 8 pages + references)
Short papers (up to 4 pages + references)
All submissions must follow the ACL 2025 formatting guidelines and use the ACL template: https://github.com/acl-org/acl-style-files.
Submissions via OpenReview: https://openreview.net/group?id=aclweb.org/AACL-IJCNLP/2025/Workshop/WSLP
Double-blind review
Accepted papers: one additional page to incorporate reviewer feedback
Contact:
Abhinav Joshi (IIT Kanpur, India): ajoshi@cse.iitk.ac.in
Sanjeet Singh (IIT Kanpur, India): sanjeet@cse.iitk.ac.in
Dr. Ashutosh Modi (IIT Kanpur, India): ashutoshm@cse.iitk.ac.in
We look forward to your participation in the WSLP 2025 Shared Tasks.