Some bugs found during evaluation and submission

  1. At the end of model evaluation, clicking "View the scores for the submission here." does not display the current model's metrics.

  2. Remaining submissions: the page says I have 5 submissions per day, but I can only submit once per day.


You only have 5 total submissions, and you can submit once every 24 hours.
To avoid using up your 5 submissions while debugging failed submissions, add "debug": true
to the end of your aicrowd.json like this:

{
    "challenge_id": "cinematic-sound-demixing-track-cdx-23",
    "authors": [
      "YourNameHere"
    ],
    "external_dataset_used": false,
    "gpu": true,
    "description": "(optional) description about your awesome model",
    "debug": true
}
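As a sanity check before you submit, you can set the flag and validate the file programmatically. This is a minimal sketch of my own (not part of the official tooling), using only the standard library; the field names match the aicrowd.json above:

```python
import json

# Submission config from the example above.
config = {
    "challenge_id": "cinematic-sound-demixing-track-cdx-23",
    "authors": ["YourNameHere"],
    "external_dataset_used": False,
    "gpu": True,
    "description": "(optional) description about your awesome model",
}

# Enable debug mode so failed runs don't consume the 5-submission quota.
config["debug"] = True

# Serialize and re-parse as a quick validity check before writing the file.
text = json.dumps(config, indent=4)
assert json.loads(text)["debug"] is True, "debug flag missing - this run would count!"
print(text)
```

Writing the dict out with `json.dumps` (rather than hand-editing) also guards against trailing-comma and quoting mistakes that would make the evaluator reject the file.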

Important Debugging Note
The runtime.md docs point you to a 2019 FAQ, "How to specify runtime environment for your submission", which is slightly out of date. If you try to debug locally using pip install aicrowd-repo2docker, you will get an old version of aicrowd-repo2docker whose NVIDIA GPG keys are out of date.
Instead, install the latest aicrowd-repo2docker with:

pip install -U aicrowd-repo2docker@git+https://github.com/AIcrowd/repo2docker

Hope that helps to answer your question/solve your problem!

I have updated that FAQ to replace the outdated install method.
