Successful submissions do not appear on the leaderboard

#1

@mohanty,
The last few submissions of our team did not appear on the leaderboard, even though the evaluation process was successful.

https://gitlab.aicrowd.com/sova876/obstacle-tower-challenge/issues/32
https://gitlab.aicrowd.com/sova876/obstacle-tower-challenge/issues/33
https://gitlab.aicrowd.com/sova876/obstacle-tower-challenge/issues/34

#2

Is the debug option off?

#3

Yes, debug mode is off. I haven’t changed the .json file since the last successful submit/eval/leaderboard attempt.

#4

Did you score a mean floor that is lower than 5?

I’m wondering if there is a threshold.

#5

No, the mean floor was higher than 5.

#7

@sova876
Did you set the tag name to start with “submission-”?

#8

It doesn’t work for me either.
I pushed the tag “submission-v2.0i”.
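As far as I know, the only naming rule is the prefix itself. A trivial check (this helper is my own sketch, not part of the starter kit):

```python
def is_valid_submission_tag(tag: str) -> bool:
    """Round 2 submissions are only picked up for the leaderboard
    when the pushed git tag starts with the "submission-" prefix."""
    return tag.startswith("submission-")

print(is_valid_submission_tag("submission-v2.0i"))  # True
print(is_valid_submission_tag("v2.0i"))             # False
```

So “submission-v2.0i” should be a valid tag name, assuming the prefix is the only requirement.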

#9

I just tried it, and my submission did not appear on the leaderboard either…

#10

@mohanty
I have the same issue.
I set the tag name to start with “submission-”.
Are there any other requirements to submit to Round 2?
https://gitlab.aicrowd.com/kenshi_abe/obstacle-tower-challenge/issues/70

#11

Looping in @shivam to have a quick look.

#12

We have manually recomputed the leaderboard, but we are also investigating why it was not automatically computed in the first place.

#13

Having the same issue here.
Has the issue been resolved?
https://gitlab.aicrowd.com/hermitcoder/obstacle_tower_by_42robolab/issues/71

#14

Our submission appears to be graded, but it is not on the leaderboard either.
It seems the problem still persists and the leaderboard is not auto-recomputing.

https://gitlab.aicrowd.com/STAR.Lab/obstacle-tower-challenge/issues/28

#15

@mohanty could you please check these submissions? They did not appear on the leaderboard.

“State : Generating Video :movie_camera:” [5 days ago]
https://gitlab.aicrowd.com/ipv6/obstacle-tower-challenge/issues/22

“Unable to orchestrate submission, please contact Administrators.”
https://gitlab.aicrowd.com/ipv6/obstacle-tower-challenge/issues/23

#16

This has been a sneaky bug!
We recomputed the leaderboard manually again, and will keep an eye on this thread for more such reports until we find the exact source of the bug.
Thanks in advance for keeping us informed when this happens again.
Cheers,
Mohanty

#17

@mohanty My agent has been graded, but the issue hasn’t closed for 4 hours now after saying “generating video”. Could you look into this? (#6984)

#18

We have identified and deployed a fix for submissions getting stuck on video generation; from now on they will adhere to a 20-minute timeout for video generation.
We are still looking into why the video is not generated for some people.

#19

@devops @mohanty Now I’m getting this error message.

2019-06-19T15:35:13.556346001Z root
2019-06-19T15:35:16.228972099Z INFO:mlagents_envs:Start training by pressing the Play button in the Unity Editor.
2019-06-19T15:35:46.233275211Z Traceback (most recent call last):
2019-06-19T15:35:46.233307689Z   File "run.py", line 43, in <module>
2019-06-19T15:35:46.233312631Z     env = ObstacleTowerEnv(args.environment_filename, docker_training=args.docker_training, retro=True, greyscale=True)
2019-06-19T15:35:46.233318069Z   File "/srv/conda/envs/notebook/lib/python3.6/site-packages/obstacle_tower_env.py", line 45, in __init__
2019-06-19T15:35:46.233322404Z     timeout_wait=timeout_wait)
2019-06-19T15:35:46.233326328Z   File "/srv/conda/envs/notebook/lib/python3.6/site-packages/mlagents_envs/environment.py", line 69, in __init__
2019-06-19T15:35:46.233330448Z     aca_params = self.send_academy_parameters(rl_init_parameters_in)
2019-06-19T15:35:46.233334307Z   File "/srv/conda/envs/notebook/lib/python3.6/site-packages/mlagents_envs/environment.py", line 491, in send_academy_parameters
2019-06-19T15:35:46.233338144Z     return self.communicator.initialize(inputs).rl_initialization_output
2019-06-19T15:35:46.233341852Z   File "/srv/conda/envs/notebook/lib/python3.6/site-packages/mlagents_envs/rpc_communicator.py", line 80, in initialize
2019-06-19T15:35:46.233345601Z     "The Unity environment took too long to respond. Make sure that :\n"
2019-06-19T15:35:46.233349459Z mlagents_envs.exception.UnityTimeOutException: The Unity environment took too long to respond. Make sure that :
2019-06-19T15:35:46.233353065Z 	 The environment does not need user interaction to launch
2019-06-19T15:35:46.233356869Z 	 The Academy and the External Brain(s) are attached to objects in the Scene
2019-06-19T15:35:46.233360461Z 	 The environment and the Python interface have compatible versions.

I submitted the exact code that succeeded previously. Maybe the fix introduced a new issue? I’m suspecting the second line, which is something I’ve never seen before.
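In case the timeout is transient, I’m tempted to wrap environment creation in a retry. A rough sketch (the retry helper is my own, not from the starter kit; `timeout_wait` appears in the traceback, so passing a larger value may also help, though 120 is just a guess):

```python
import time

def create_env_with_retries(factory, retries=3, delay=5.0):
    """Call `factory` (which constructs the environment) up to `retries`
    times, sleeping between attempts, since the Unity binary can be slow
    to start and raise UnityTimeOutException on the first try."""
    last_exc = None
    for attempt in range(retries):
        try:
            return factory()
        except Exception as exc:  # e.g. UnityTimeOutException
            last_exc = exc
            time.sleep(delay)
    raise last_exc

# Hypothetical usage with the env from the traceback:
# env = create_env_with_retries(
#     lambda: ObstacleTowerEnv(args.environment_filename,
#                              docker_training=args.docker_training,
#                              retro=True, greyscale=True,
#                              timeout_wait=120))
```

This only papers over a flaky launch, of course; if the binary never comes up inside the container, every attempt will time out.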

#20

@devops Now I don’t get this error but the submission is stuck for 21 hours after evaluation finished successfully. (Issue #173, Submission #7068) Could you look into this please? Appreciate your help!

#21

@hanschoi86: On it !