Round 1 is open for submissions πŸš€

Hello all!

Thank you for your participation and enthusiasm during the warm-up round! We are now accepting submissions for Round 1.

Changes for Round 1

Hardware available for evaluations

The evaluations will run on the following hardware:

Resources:

  • vCPUs: 8
  • RAM: 56 GB
  • GPU: Tesla V100 (16 GB)

Evaluation configuration

The configuration used during evaluations is available at FAQ: Round 1 evaluations configuration.

Environments

This round will run on three public environments (coinrun, bigfish, miner) and one private environment.

Scoring

The final score will be a weighted average of the mean normalized rewards across the public environments and the private environment:

Score = \frac{1}{6} R_{coinrun} + \frac{1}{6} R_{bigfish} + \frac{1}{6} R_{miner} + \frac{1}{2} R_{privateEnv}

where R_{env} is the mean normalized reward for environment env.
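As a quick sanity check of the weighting, here is a small Python sketch (the function name and the example reward values are illustrative, not part of the official evaluator):

```python
def round1_score(r_coinrun, r_bigfish, r_miner, r_private):
    """Weighted average: 1/6 for each public env, 1/2 for the private env."""
    return (r_coinrun + r_bigfish + r_miner) / 6 + r_private / 2

# Example: equal public rewards of 0.6 and a private reward of 0.8
print(round1_score(0.6, 0.6, 0.6, 0.8))  # 0.7
```

Note that the three public weights sum to 1/2, so the private environment counts as much as all public environments combined.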

Prizes

We are super excited to share that AWS is the official sponsor of this competition. Apart from sponsoring all the compute for this competition, AWS has generously extended the following prizes:

  • $10,000 in AWS credits for the TOP 50 participants of the warm-up round. (You will receive an email shortly!)
  • The TOP 3 teams of the final round will each get $1,000 in cash and $3,000 in AWS credits!

For the next round

Because significant computation is required to train and evaluate agents in the final round, only the top 50 submissions from Round 1 will be eligible to submit solutions for Round 2.


What should I put in env_name when submitting?

env_config:
    env_name: <- here?

Is just coinrun fine?

Hello @Paseul

You can put any value in env_name; it will be overridden by the relevant value during evaluation.
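So a minimal sketch of the config (field names taken from the question above; the value is just a placeholder) would be:

```
env_config:
    env_name: coinrun  # any value works; replaced during evaluation
```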

I’m getting this error: β€œSubmission failed: You have not qualified for this round. Please review the challenge rules at www.aicrowd.com”

Where can I accept these rules?

Hello @jurgisp

This was a config issue on our end. The issue is fixed and we re-queued your submission.
