How to view logs for your submission?

Background

When we write code and run it, we often make errors. Our best friend in such cases is the logs, which let us debug and fix our code quickly.

As much as we would like to, sharing logs by default is tricky because arbitrary code is executed as part of submissions in our competitions. Logs may unintentionally leak part or the whole of the datasets. As we all understand, these datasets are confidential, and knowing the test dataset can give an undue advantage in a running competition.

Due to this, our default policy is to hide the logs. BUT we do keep a close eye on submissions that fail, manually verify them, and share the relevant traceback with participants on a best-effort basis, which can take anywhere from a few minutes to multiple hours. This is clearly an issue, and the process can't scale. This is what led to an integration-testing phase in our competitions known as “debug” mode. I will explain below how to enable “debug” mode and what it does.


Debug Mode

When enabled, debug mode runs your code against an extremely small dataset (different from the test set, a subset of the test set, a subset of the training set, etc., depending on the competition), a different seed, and so on. In a nutshell, even when this data is visible to participants, it holds no value.

Each competition has its own policy for which logs are visible and whether debug mode should exist. But in the majority of cases, we enable debug mode by default and show the logs for the user's submitted code (not the infrastructure/evaluator logs).

How to use this?

When you submit a solution, the metadata for the competition, along with other information, is present in aicrowd.json. You can set debug: true in this file to enable debug mode, as shown in the example below.
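Here is a minimal sketch of what an aicrowd.json with debug mode enabled might look like. Only the debug key is the subject of this post; the other fields (challenge, authors, description) are illustrative placeholders, and the exact set of fields will differ based on your competition's starter kit.

```json
{
  "challenge": "your-challenge-id",
  "authors": ["your-aicrowd-username"],
  "description": "Sample submission with debug mode enabled",
  "debug": true
}
```

Remember to set debug back to false (or remove the key) before making your actual submissions, since debug runs are scored with the lowest possible score.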

When enabled, the following happens with your submission:

  1. It is run against a different, small dataset for quicker runs.
  2. When the submission fails, the logs are visible to you by default, under the heading “Logs for participants reference”.
  3. The submission is reflected on AIcrowd.com / the leaderboard with the lowest possible score for the given competition.

Still facing issues?

We keep the environment for debug mode and the actual submission exactly the same. But it is still possible that your code runs well in debug mode while failing in the actual submission. In this case, we will need to fall back to the traditional support method. The escalation path is: we will automatically post your logs -> you can tag the competition organisers in the GitLab issue -> let us know on the Discourse forum.

We wish you the best of luck in the competition!
