Yes, debug mode is off. I haven’t changed the .json file since the last successful submit/eval/leaderboard attempt.
Did you score a mean floor that is lower than 5?
I’m wondering if there is a threshold.
No, my mean floor was higher than 5.
Did you set the tag name to start with “submission-”?
It doesn’t work for me either.
I pushed the tag “submission-v2.0i”.
I just tried it, and like you, my submission did not appear on the leaderboard…
I have the same issue.
I set the tag name to start with “submission-”.
Are there any other requirements to submit to Round 2?
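For reference, the tag convention discussed above can be sketched as follows. This is a demo in a throwaway local repo; the tag name comes from this thread, while the remote name (`origin`) and the exact push step are assumptions about the usual git submission workflow:

```shell
# Tags must start with "submission-" for the evaluator to pick them up
# (convention taken from this thread). Demo in a temporary repo; on your
# real challenge repo you would tag HEAD and then run:
#   git push origin submission-v2.0i
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "initial commit"

# Create the submission tag and list all submission tags
git tag submission-v2.0i
git tag -l 'submission-*'
```

Note that a tag push only triggers an evaluation if the tag is new; re-submitting the same code typically requires a fresh tag name (e.g. bumping the suffix).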
Looping in @shivam to have a quick look.
We have manually recomputed the leaderboard, and we are also investigating why it was not automatically recomputed in the first place.
Our submission appears to have been graded, but it is not on the leaderboard either.
It seems the problem still persists and the leaderboard is not auto-recomputing.
@mohanty could you please check these submissions? They did not appear on the leaderboard.
“State: Generating Video” [5 days ago]
“Unable to orchestrate submission, please contact Administrators.”
This has been a sneaky bug!!
We recomputed the leaderboard manually again, and will keep an eye on this thread for more such reports until we find the exact source of the bug.
Thanks in advance for keeping us informed when this happens again.
@mohanty My agent has been graded, but the issue hasn’t closed for 4 hours now after saying “generating video”. Could you look into this? (#6984)
We have identified and deployed a fix for submissions getting stuck on video generation; from now on they will adhere to a 20-minute timeout for video generation.
We are still looking into why the video is not generated for some users.
@devops @mohanty Now I’m getting this error message.
2019-06-19T15:35:16.228972099Z INFO:mlagents_envs:Start training by pressing the Play button in the Unity Editor.
2019-06-19T15:35:46.233275211Z Traceback (most recent call last):
2019-06-19T15:35:46.233307689Z File "run.py", line 43, in <module>
2019-06-19T15:35:46.233312631Z env = ObstacleTowerEnv(args.environment_filename, docker_training=args.docker_training, retro=True, greyscale=True)
2019-06-19T15:35:46.233318069Z File "/srv/conda/envs/notebook/lib/python3.6/site-packages/obstacle_tower_env.py", line 45, in __init__
2019-06-19T15:35:46.233326328Z File "/srv/conda/envs/notebook/lib/python3.6/site-packages/mlagents_envs/environment.py", line 69, in __init__
2019-06-19T15:35:46.233330448Z aca_params = self.send_academy_parameters(rl_init_parameters_in)
2019-06-19T15:35:46.233334307Z File "/srv/conda/envs/notebook/lib/python3.6/site-packages/mlagents_envs/environment.py", line 491, in send_academy_parameters
2019-06-19T15:35:46.233338144Z return self.communicator.initialize(inputs).rl_initialization_output
2019-06-19T15:35:46.233341852Z File "/srv/conda/envs/notebook/lib/python3.6/site-packages/mlagents_envs/rpc_communicator.py", line 80, in initialize
2019-06-19T15:35:46.233345601Z "The Unity environment took too long to respond. Make sure that :\n"
2019-06-19T15:35:46.233349459Z mlagents_envs.exception.UnityTimeOutException: The Unity environment took too long to respond. Make sure that :
2019-06-19T15:35:46.233353065Z The environment does not need user interaction to launch
2019-06-19T15:35:46.233356869Z The Academy and the External Brain(s) are attached to objects in the Scene
2019-06-19T15:35:46.233360461Z The environment and the Python interface have compatible versions.
I submitted the exact code that succeeded previously. Maybe the fix introduced a new issue? I’m suspecting the second line, which is something I’ve never seen before.
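The `UnityTimeOutException` in the traceback above can come from a transient startup failure on the evaluation infrastructure rather than your own code. A minimal retry wrapper, sketched here in plain Python (the function name, parameters, and defaults are my own assumptions, not part of the challenge starter kit), can help distinguish a flaky launch from a real version incompatibility:

```python
import time


def create_env_with_retry(factory, retries=3, delay=5.0,
                          transient=(Exception,)):
    """Call `factory` up to `retries` times, sleeping `delay` seconds
    between attempts.

    Intended for transient startup failures such as mlagents_envs'
    UnityTimeOutException: pass that exception class via `transient`
    when the package is installed, e.g.

        env = create_env_with_retry(
            lambda: ObstacleTowerEnv(args.environment_filename,
                                     docker_training=args.docker_training,
                                     retro=True, greyscale=True),
            transient=(UnityTimeOutException,))

    Re-raises the last exception if every attempt fails.
    """
    last_exc = None
    for attempt in range(1, retries + 1):
        try:
            return factory()
        except transient as exc:
            last_exc = exc
            if attempt < retries:
                time.sleep(delay)
    raise last_exc
```

If the timeout still fires on every attempt, the cause is more likely a genuine mismatch between the environment binary and the Python interface versions, as the traceback's final hint suggests.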
@devops Now I don’t get this error but the submission is stuck for 21 hours after evaluation finished successfully. (Issue #173, Submission #7068) Could you look into this please? Appreciate your help!