Feedback and suggestions

We are constantly trying to make this challenge better for you and would appreciate any feedback you might have :raised_hands:.

Please reply to this thread with your suggestions and feedback on making the challenge better for you!

  • What have been your major pain points so far?
  • What would you like to see improved?

SDX Challenge Team


Thanks for organizing this challenge. I have a couple of points I wanted to raise:

  • Firstly, the background of the discussions page is very distracting. The white musical instrument graphics overlap with the white text and make it difficult to read. Maybe move the graphic to the side, or change the background/text/graphic color combination or opacity.

  • Secondly, I would like to know whether GPUs will be available for inference in the future. A submission timed out for me; if there will be no GPUs, we would have to keep that in mind when designing models.


Hi @mohanty

I am a citizen of Russia (and one of the winners of the Music Demixing Challenge 2021). Am I eligible for a prize in the Sound Demixing Challenge 2023?

If I am eligible:
I don’t have a foreign bank card. If I win, will I be able to get the prize on a Russian bank card? In other words, do I have to get a foreign bank card?

If I am not eligible:
If possible, I would like my prize not to go to the next winner, but to go to some charity.


I also request GPU inference or a longer time limit before timeout. Teams have more data since MDX 21, and we would like to try bigger models.


I’ve been submitting a lot lately and it seems that identical submissions can have very different evaluation throughputs.
I submitted the same model (same architecture with different model weights) 5 times and got throughputs ranging from 0.007 ts/s to 0.019 ts/s.

Is this something that can be fixed? I’m pretty sure it wasn’t like this in MDX 2021.


@lyghter : Unfortunately, residents of Russia are not eligible for the prizes in the Challenge. And in case a leaderboard position is held by a team that is not eligible for the prizes, the prizes will indeed be rolled over to the next position (and at that point, it will not be logistically possible for us to allow non-eligible teams to pass their prizes on to a charity). The rules have been updated to reflect that categorically.


We are discussing this with the organizing committee, and will get back to you soon on this.

Got it. Another question, purely out of curiosity: will the prize be divided among the team members or will it be given to the team leader?

@lyghter : The team leader will be the main legal POC with AIcrowd, and the prizes will be transferred to the team leader. Any prize-sharing agreements between the team leader and the rest of the team members are internal to the team.


I am having the same problem with throughput. These two pictures show submissions using the same model, but the second one took three times longer and timed out.


What is the current timeout?

There will be no GPUs for inference?

120 minutes, I believe.

I do not think there are GPUs, as the organizers do not seem active and are not responding to any issues.

Apologies for the relative radio silence due to the holiday season.


We are still waiting for a response from the organizing team about the provision of GPUs, so we will have to hold off on answering the question until we hear back from the organizing team.


We are investigating the issue with differential throughputs across different evaluations. A new instance is instantiated for every evaluation on our cloud provider, and the instance type is exactly the same - and hence the resources available. We have confirmed that the exact same instance type is being made available to all the submissions as well. We will get back to you with more details on this soon as well.


The current timeout is 1 hour, i.e. 60 minutes.

Apologies for the slow response times due to the holiday season. We will be providing support at full capacity again starting 2nd of January, 2023.




Another suggestion/question:

  • Add a table of scores to the Issue template after the evaluation ends.

The link in the submission issue does not work.

The link (as in the screenshot) points to, which is just a blank page.

Is it because the competition does not have a submission page?


Can you clarify? If a person is a citizen of Russia, but lives in another country and has a bank account there, will they also not be eligible for the prizes?


@andrey1362010 : A person who is a citizen of Russia and is currently a resident of another eligible country, with an active bank account present in that eligible country, is eligible to receive prizes.


Another suggestion from my end,

It would be nice to have a sample from each submission on the leaderboard; it would give us an actual sense of how the best submissions perform.

For some reason, the evaluation sample audios don't load on mobile browsers. The page is just blank, while the same page works fine in a PC browser.

Hi @dipam,

Some feedback on the submission system and leaderboards:

  • None of the leaderboards for Phase I of the MDX track work, even if proper tags are used in aicrowd.json. We can still see the results on the Submissions page.
  • Same with Leaderboard A of the CDX track.
  • The CDX track does not have a Submissions page, so unless there is an improvement in our score it's difficult to see results.
  • The submission result URL in the evaluation issue for both the CDX and MDX tracks is wrong. It takes you to the default Submissions page and not to the particular submission. Also, since CDX does not have a Submissions page, we can't see the score there at all.
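For context, leaderboard tags in AIcrowd challenges are set in the aicrowd.json file at the root of the submission repository. A minimal sketch might look like the following; the exact field names and tag values here are assumptions based on prior AIcrowd challenges, not the official schema for this challenge:

```json
{
  "challenge_id": "sound-demixing-challenge-2023",
  "authors": ["your-aicrowd-username"],
  "tags": ["leaderboard-a"],
  "gpu": false
}
```

Even with tags set this way, the corresponding leaderboards do not update, which is why this looks like an issue on the platform side rather than a misconfigured submission.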

For the last point, as an example,

The URL in the issue is :

But the correct URL should be:


It would be more convenient if the submission counter reset at 00:00 UTC.