Contents

  1. Overview
  2. Eligibility
  3. Registration & Participation
  4. Dataset
  5. Timeline
  6. Submissions
  7. Privacy
  8. Awards
  9. General
  10. Agreement

1. Overview

The Music-CRS Challenge 2026 is an academic research competition co-located with ACM RecSys 2026. It is organized by the challenge organizing committee ("Organizer") with the goal of advancing research in conversational music recommendation.

These guidelines describe the rules of participation. By registering, you agree to follow them. If you have any questions, please reach out to the organizers directly.

2. Eligibility

  1. The challenge is open to anyone — researchers, students, and practitioners — unless they are directly involved in the organization of the challenge.
  2. Participants are responsible for making sure their participation does not conflict with any obligations they may have (e.g., employer agreements). The organizers cannot be held responsible for any such conflicts.
  3. Participants must comply with the laws applicable in their jurisdiction.

3. Registration & Participation

a. Accounts

  1. One account per participant. Multiple accounts will result in disqualification.
  2. Each team registers one account and submits from that account only.
  3. Registrations are reviewed manually — please fill in your profile as completely as possible.

b. Teams

  1. Maximum team size is 10 members.
  2. Only one team member may make submissions. Team mergers must be reported to the organizers before the challenge ends.

c. Tracks

There are two participation tracks:

  • Academic Track — at least 50% of team members are currently enrolled students or academic staff.
  • Industry Track — all other teams.

Teams must indicate their track at registration. The two tracks are ranked separately based on the final Blind B leaderboard.

d. Conduct

  1. Please be respectful and honest. Cheating, plagiarism, or gaming the evaluation in bad faith will result in disqualification.
  2. Code and data must not be shared outside your team, unless shared openly with all participants on the forum.

4. Dataset

TalkPlayData-Challenge is a large-scale synthetic multi-turn dialogue dataset for conversational music recommendation, grounded in real music listening histories and provided with pre-extracted multimodal track and user embeddings. It was developed independently by the academic research team organizing this challenge.

Note: TalkPlayData-Challenge is a research dataset with no affiliation to Deezer or SiriusXM. It does not contain or represent any proprietary data from either company. Deezer and SiriusXM are involved solely as prize sponsors.

Four splits are provided:

  • Train & Development — released 25 March 2026, ground truth provided.
  • Blind A & Blind B — released 17 April 2026, ground truth withheld. Blind B determines final rankings.

The dataset is released under CC BY-NC 4.0 for non-commercial research use only. Please do not redistribute it outside the challenge.

5. Timeline

Dates are tentative and subject to change. Please check the website for the latest schedule.

  • 25 Mar 2026 — Challenge start; Train & Development data released
  • 10 Apr 2026 — Submission system opens; leaderboard live
  • 17 Apr 2026 — Blind A & B released; Phase 2 begins
  • 25 Jun 2026 — Final submission deadline
  • 26 Jun – 5 Jul 2026 — Code upload deadline for top teams
  • 30 Jun 2026 — Winners announced; paper submission opens
  • 7 – 12 Jul 2026 — Paper submission deadline
  • 24 – 28 Jul 2026 — Acceptance notifications
  • 5 – 7 Aug 2026 — Camera-ready deadline
  • Sep 2026 — RecSys 2026 Workshop

6. Submissions

  1. Submissions must follow the format and deadlines described on the website. Late or malformed submissions will not be counted.
  2. Top-ranked teams may be asked to share their code so results can be verified.
  3. By submitting, you confirm that your work is your own and does not violate anyone else's intellectual property.

7. Privacy

  1. Registration information is collected solely for the purpose of running this challenge. It will not be shared with third parties except where necessary to administer the awards.
  2. You may use a team pseudonym on the public leaderboard. However, winners must share their real identity with the organizers to receive their prize.
  3. You can contact the organizers at any time to ask about, update, or delete your personal information.

8. Awards

The total prize pool is $2,000 USD. Winners are determined by the final Blind B leaderboard:

  • Academic Track — 1st Place: $1,000
  • Industry Track — 1st Place: $1,000

Prize money is generously sponsored by SiriusXM and Deezer. Their sponsorship is limited to funding the prizes — they have no role in the dataset, evaluation criteria, or any other part of the challenge.

Winners are responsible for any taxes applicable in their jurisdiction. The organizers and sponsors cannot be held liable for tax obligations arising from prize receipt.

9. General

The organizers may update these guidelines as needed and will notify participants of any significant changes. If unforeseen circumstances prevent the challenge from running as planned, the organizers reserve the right to adjust, postpone, or cancel it.

This is an academic challenge run in good faith. We ask all participants to engage in the same spirit.

10. Agreement

By completing registration and checking the agreement box, you confirm that you have read these guidelines and agree to follow them.

Effective from 25 March 2026.