Dockerfile Doesn't Exist

The food recognition starter kit repo no longer includes a Dockerfile, and the documentation linked below has a broken link to one.

My model achieves AP > 45 on the validation set, so I'd definitely like to submit it, but I'm not sure how.

My submission needs to clone the repository containing my model configuration (it is not a fork of food-recognition-benchmark-starter) and download my model weights (a .pth file), but I'm not sure how to do that without a Dockerfile.

Hi, you don't need a Dockerfile; you can just follow the steps shared in the predict_*.py files.

The AIcrowd submission system auto-generates a Dockerfile [when not present] using configuration files like requirements.txt, etc., so learning Docker is not a prerequisite for participating. [full list of configuration files]
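For illustration, a minimal requirements.txt that the auto-generated build could pick up might look like this (the package names and version pins below are hypothetical examples, not taken from the starter kit):

```
# requirements.txt — installed automatically at build time
# (versions here are illustrative; pin whatever your model actually needs)
torch==1.5.0
torchvision==0.6.0
numpy
```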

Example for Detectron2:

Example for MMdetection:

In case you are using a completely different approach, you can use this example prediction boilerplate for the functions you need to implement.
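To give a feel for the shape of such a boilerplate, here is a minimal Python sketch; the class and method names below are illustrative, not the actual interface from the starter kit, and the inference body is a stub you would replace with your framework's code:

```python
# Hypothetical prediction boilerplate sketch.
# The real function names and signatures come from the starter kit's
# boilerplate; this only illustrates the general structure.

class FoodPredictor:
    def __init__(self, weights_path):
        # Load your model weights here (e.g. your .pth file).
        self.weights_path = weights_path
        self.model = None  # placeholder: load your framework's model

    def predict(self, image_path):
        # Run inference on a single image and return detections
        # (e.g. COCO-style dicts). This stub returns an empty list.
        return []

    def predict_batch(self, image_paths):
        # Map each image path to its detections.
        return {path: self.predict(path) for path in image_paths}
```

The key point is that the evaluator only needs the prediction entry points implemented; everything else (model loading, framework choice) is up to you.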

Let us know in case you need more inputs; looking forward to your submission! :rocket:


Thanks for the response.

In AIcrowd's repo2docker fork, there are no references to CUDA. So how is /usr/local/cuda-10.1 constructed in the build environment?

I believe my build (AIcrowd) is failing because of an incompatible CUDA version, but I'm not sure which part of the build should be modified to fix this.

I'll experiment locally with aicrowd-repo2docker for a bit to see if I can override the existing CUDA version, even though I'm not sure where CUDA is installed.


We use nvidia/cuda:10.1-cudnn7-runtime-ubuntu18.04 as the base image, which takes care of it.

Example from one of your submission's logs:

In case you want a different base image or CUDA version for any reason, you will have to use a Dockerfile to specify your environment. :slightly_frowning_face:
But the good part is that you seem to be comfortable with it. :smile:

Here is one which you can build on top of:
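A minimal sketch of such a Dockerfile, assuming the CUDA 10.1 base image mentioned above (the package choices and paths below are illustrative, not the official starter-kit Dockerfile):

```
# Sketch only: build on the evaluator's default base image,
# or swap in a different nvidia/cuda tag to change the CUDA version.
FROM nvidia/cuda:10.1-cudnn7-runtime-ubuntu18.04

# System dependencies (adjust to what your model needs)
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip git \
    && rm -rf /var/lib/apt/lists/*

# Python dependencies from your repo's requirements.txt
COPY requirements.txt .
RUN pip3 install --no-cache-dir -r requirements.txt

# Copy your code into the image
COPY . /home/aicrowd
WORKDIR /home/aicrowd
```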



Are you guys using nvidia-docker so that the GPUs are exposed during runtime AND the build?

Hi @lapp09 , GPUs are present only at runtime, not the build.
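Since GPUs appear only at runtime, one consequence is that any CUDA check should happen when predictions run, not when the image builds. A minimal Python sketch of that pattern (the function name is mine, and it assumes PyTorch may or may not be importable):

```python
# Hypothetical sketch: defer device selection to runtime, since the
# build environment has no GPU. Calling this inside your predict
# function (rather than at module import) avoids build-time failures.
def select_device():
    try:
        import torch  # may be absent or CPU-only during the build
        if torch.cuda.is_available():
            return "cuda"
    except ImportError:
        pass
    return "cpu"
```

At build time this would return "cpu"; during evaluation, with the GPU exposed, the same call would return "cuda".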
