@youngalou: That's unusual o.O. We had checked that all the submissions made during the outage were eventually evaluated (some of them failed because of unrelated errors in the code).
Can you point us to the relevant issues of the submissions which have not been evaluated?
@mohanty Here is the latest submission:
https://gitlab.aicrowd.com/hermitcoder/obstacle_tower_by_42robolab/issues/95
It has been evaluated but the leaderboard does not reflect this evaluation.
@youngalou: The submission you reference in your comment above was made at July 16, 2019, 09:22 UTC, and according to the rules (and the configuration of the said challenge), the challenge officially ended at July 15, 2019, 11:59 p.m. PST, which is July 16, 6:59 a.m. UTC.
Hence according to our records, your submission was made after the official end of the competition.
We will send across the actual submission scores and times to the organizers, and will let Unity Legal make the final determinations according to the Rules of the competition set by them.
Best of luck,
Cheers,
Mohanty
@mohanty That’s when the evaluation started, but I pushed these commits well before that. The commit for this particular evaluation is here: https://gitlab.aicrowd.com/hermitcoder/obstacle_tower_by_42robolab/tags/submission-v5.14
And was pushed at July 16, 5:25 AM UTC
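For concreteness, the timezone arithmetic being disputed above can be checked with a short script. The times are taken directly from the thread; note that in July the US Pacific zone is on daylight time (PDT, UTC−7), which is what makes 11:59 p.m. "PST" map to 6:59 a.m. UTC rather than 7:59 a.m.:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

PACIFIC = ZoneInfo("America/Los_Angeles")
UTC = ZoneInfo("UTC")

# Official deadline quoted in the thread: July 15, 2019, 11:59 p.m. Pacific.
# In July this is PDT (UTC-7), consistent with the 6:59 a.m. UTC figure.
deadline = datetime(2019, 7, 15, 23, 59, tzinfo=PACIFIC)
print(deadline.astimezone(UTC))  # 2019-07-16 06:59:00+00:00

push_time = datetime(2019, 7, 16, 5, 25, tzinfo=UTC)   # when the tag was pushed
eval_time = datetime(2019, 7, 16, 9, 22, tzinfo=UTC)   # when evaluation started

print(push_time <= deadline)  # True: pushed before the deadline
print(eval_time <= deadline)  # False: evaluated after the deadline
```

So whether the submission counts hinges entirely on which timestamp (push vs. evaluation start) the rules treat as the submission time.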
@youngalou: Acknowledging the same. Will check in with the Unity team and decide accordingly.
Cheers,
Mohanty
There was a mention that the final standings for round 2 would be based on more than 5 seeds to get a proper average of performance. Is that going to happen? I deliberately didn't try to repeatedly submit similar models to overfit the 5 seeds for that reason.
@mohanty
Are you going to fix the bug with the "show post-challenge submissions" button?
Pressing this button does not update the leaderboard.
Hey @leckofunny!
A fix for it is on the way, should be live tomorrow!
Cheers