It does not seem to work for us
yep, it does not work
"$ git push aicrowd submission-vXX" does not initiate a new evaluation
https://gitlab.aicrowd.com/ipv6/otc/tags
https://gitlab.aicrowd.com/ipv6/otc/issues
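For reference, the usual flow (a minimal sketch, assuming the aicrowd remote is already configured, the code has already been pushed to it, and the version number below is just a placeholder) is to create a submission tag and push it:

$ git tag -am "submission-v1.0" submission-v1.0
$ git push aicrowd submission-v1.0    # pushing the tag is what should trigger the evaluation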
same issue over here
Hi, our system has been healthy since the last update but was unfortunately not picking up newer submissions.
The issue has been addressed now, but pending submissions have lined up in the queue. The queue will be cleared shortly, and you will continue to receive updates under your GitLab issues list.
Is there any way to know which tag corresponds to a given submission from the issue tab?
There are several queued now, but it is hard to know which commit each one corresponds to.
Hi,
You can view the commit ID shown on the issue page and match it against the list of your tags.
For example:
https://gitlab.aicrowd.com/your-name/your-repo/tags
https://gitlab.aicrowd.com/your-name/your-repo/issues/1
At the same time, it sounds like a good idea to provide this information directly in the issue page content. We have added it as an enhancement and will make it available in the next release.
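One way to do the matching locally (a quick sketch, assuming you have the repository checked out with tags fetched; the commit SHA below is a made-up placeholder standing in for the one shown on the issue page) is to ask git which tag points at that commit:

$ git fetch aicrowd --tags              # make sure local tags are up to date
$ git tag --points-at 1a2b3c4d          # placeholder SHA from the issue page; prints the matching tag(s)
$ git rev-list -n 1 submission-v5.0     # or the other way around: the commit a given tag points to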
I think it’s stuck again.
+1 on that. None of our submissions have been going through for the past couple hours.
+1
my last version is not in QUEUE…
https://gitlab.aicrowd.com/wywarren/obstacle-tower-challenge/tags/v5.0
Uploaded at 14:30
Acknowledging that the Image Building service has been backed up. All the submissions have been received, however, and we will requeue them manually and ensure that all submissions which were submitted in time do get evaluated.
cc. @shivam
Update: Some of the evaluations which were broken due to the clog in the image build service have been requeued. Confirming that the rest of the submissions are still safe in the queue, and will slowly be evaluated. Thank you for your patience.
For any specific queries regarding your submissions, you can also reach out to us at mohanty@aicrowd.com (with devops@aicrowd.com in cc).
Will the requeued evaluations be reflected on the leaderboard? The leaderboard still shows the score from our fifth-to-last submission.
@youngalou: That's unusual o.O. We had checked that all the submissions during the outage were eventually evaluated (some of them failed because of unrelated errors in the code).
Can you point us to the relevant issues for the submissions which have not been evaluated?
@mohanty Here is the latest submission:
https://gitlab.aicrowd.com/hermitcoder/obstacle_tower_by_42robolab/issues/95
It has been evaluated but the leaderboard does not reflect this evaluation.
@youngalou: The submission you reference in your comment above was made at July 16, 2019 09:22 UTC, and according to the rules (and the configuration of the said challenge), the challenge officially ended at July 15, 2019 11:59 p.m. PST, which is July 16, 6:59 a.m. UTC.
Hence according to our records, your submission was made after the official end of the competition.
We will send across the actual submission scores and times to the organizers, and will let Unity Legal make the final determinations according to the Rules of the competition set by them.
Best of luck,
Cheers,
Mohanty
@mohanty That’s when the evaluation started, but I pushed these commits well before that. The commit for this particular evaluation is here: https://gitlab.aicrowd.com/hermitcoder/obstacle_tower_by_42robolab/tags/submission-v5.14
It was pushed at July 16, 5:25 AM UTC.
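In case it helps, the relevant timestamps can be checked locally (a sketch; git itself records the commit and tag creation times, while the actual push time only shows up in GitLab's project activity):

$ git log -1 --format='%cI' submission-v5.14                                     # committer date of the tagged commit
$ git for-each-ref --format='%(taggerdate:iso8601)' refs/tags/submission-v5.14   # creation time of the annotated tag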
@youngalou: Acknowledging the same. Will check in with the Unity team and decide accordingly.
Cheers,
Mohanty
There was a mention that the final standings for round 2 would be based on more than 5 seeds, to get a proper average performance. Is that going to happen? For that reason, I didn't repeatedly submit similar models to overfit the 5 seeds.
Are you going to fix the bug with the "show post-challenge submissions" button?
Pressing this button does not change the leaderboard.