I tried submitting a new model, and even though it works on test data from /shared_data/data/test_data_sample, it fails while evaluating.
I can see the error, but without detailed information I have no idea what is going on, I only get from my Issue:
Submission failed : list index out of range
which is not helpful at all.
It also says that error logs are not accessible for users, and that “One of the admins will provide you access to the relevant section of the log that you might require for debugging.”
Can I get access to my log, then? I even resubmitted the model with the `"debug": true` flag in aicrowd.json, but it didn’t change anything.
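For reference, this is roughly what I mean — a minimal sketch of an aicrowd.json with the debug flag set (the exact schema depends on the challenge; every key here other than `debug` is a placeholder, not necessarily the real field name):

```json
{
  "challenge_id": "your-challenge-id-here",
  "authors": ["your-aicrowd-username"],
  "description": "My submission",
  "debug": true
}
```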
Also, is this the only way to debug evaluation errors? If so, I can’t imagine trying anything more advanced, then requesting log access and waiting for it every single time I need to debug.
Maybe there is a better way, but what I’ve been doing is running locally against the test file in the shared folder.
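Since the only error surfaced was “list index out of range”, one thing that helps locally is wrapping the per-row inference in a harness that reports *which* input triggered the IndexError before submitting. A hedged sketch, assuming a per-row `predict` function (the function name and data layout are my assumptions, not the challenge’s actual interface):

```python
import traceback


def predict(row):
    # Placeholder for your model's per-row inference.
    # This is an assumption for illustration, not the real interface.
    return row[0]


def run_with_context(rows):
    """Run predict over each row, surfacing IndexError with the
    offending row's index and contents instead of a bare
    'list index out of range' message."""
    results = []
    for i, row in enumerate(rows):
        try:
            results.append(predict(row))
        except IndexError:
            print(f"Row {i} raised IndexError (len={len(row)}): {row!r}")
            traceback.print_exc()
            raise
    return results
```

Running this over the sample in /shared_data/data/test_data_sample (loaded however the challenge expects) pinpoints malformed rows, e.g. empty lists, before the evaluator ever sees them.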
We share error logs on a best-effort basis directly in your failed GitLab issues as comments, and it looks like you did receive responses from our team in each of your GitLab issues.
The submission you made with debug mode actually did have the “agent-logs” open, i.e. the logs from your submitted code: http://gitlab.aicrowd.com/michal-pikusa/dsai-challenge/issues/5#note_30139
Unfortunately, the comment was labelled “Logs for Admin Reference” instead of “Logs for Participant Reference”, which understandably caused the confusion that the logs weren’t available to you directly. I have updated the GitLab issue comment content for this challenge, so this won’t cause any confusion going forward.
FAQ Section: Why Debug Mode