Submission Evaluation - Queued for 2 hours

@shivam @kelleni2

Please let me know why this evaluation is showing like this.

@mohanty Can you please look into this?

@shravankoninti Can you please send the issue link? Or just tag me in the issue.


AIcrowd Submission Received #28401

AIcrowd Submission Received #28354

Please reply ASAP.

Hi @shravankoninti,

The pipeline was blocked by failed submissions that didn’t terminate with a non-zero exit code. We have cleared the pipeline and are adding a fix now so it doesn’t happen again.
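For participants, a minimal sketch of the failure-signalling side (the function names here are hypothetical, not the challenge's actual entry point): wrap your run script so that any crash exits with a non-zero status, which is what an evaluation pipeline typically uses to detect a failed run.

```python
import sys

def run_inference():
    # Placeholder inference step; replace with your model's predictions.
    return [0.0, 1.0]

def main():
    predictions = run_inference()
    print(f"wrote {len(predictions)} predictions")

if __name__ == "__main__":
    try:
        main()
    except Exception as exc:
        print(f"Submission failed: {exc}", file=sys.stderr)
        sys.exit(1)  # a non-zero exit code lets the pipeline mark the run as failed
```

If the script instead swallows the exception and returns normally, the process exits with status 0 and the pipeline may treat a broken run as still in progress.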


Can you also check for

There are ~12 queued (plus 6 running) submissions due to the stuck pipeline, and they are being cleared right now. Your submission will be evaluated shortly.

Thanks very much @shivam


Can you please check this?

It worked after a long time @shivam


Can you please check why this error is occurring?

I have made close to 80 submissions and never got this kind of error about row_ids not matching.

Did it happen because you changed the test data path and test file? Please let me know.

As of now I have 6947 test rows. Am I correct about this? Please let me know why this error has come up ASAP.

Hi @shravankoninti,

Yes, this check was added in the newer version of the evaluator, which uses the updated split announced here. I am looking into this and will keep you updated here.

@shivam Thanks very much.


Will we be able to submit predictions during the holidays, i.e. from today to Jan 10th? Does Aridhia provide workspaces?

Yes, I think workspaces will be available to you. Please go through the announcement made by Nick here: UPDATE / EXTENSION: DSAI Challenge: Leaderboard & Presentation deadlines

Any updates on fixing this issue?

Hi @shravankoninti,

This issue is resolved now, and your submission above has the latest feedback, i.e. it was evaluated against the newer dataset. Meanwhile, other submissions by you and other participants are still in the queue and are being re-evaluated right now.


I am still getting the same error message for the new submission I made.

Any updates? @kelleni2 @satyakantipudi @mohanty

We are getting the same error. Any updates on this?

Hi everyone, please make sure that your submissions create a prediction file with the correct row_id. The row_id was not being matched strictly until the previous evaluator version, and we have now added an assert for it. As a result, submissions have failed with the error that the row_ids in the generated prediction file do not match those of the ground_truth.

Your solution needs to output the row_id values as shared in the test data, not hardcoded / sequential (0, 1, 2, …). Also note that row_id values can differ between the data used in evaluations and the workspace, to make sure people aren’t hardcoding from that file.

We are trying to apply an automatic patch wherever possible, but ultimately this needs to be fixed in the submitted solutions.
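To illustrate the row_id point with a minimal pandas sketch (the column names and the non-sequential ids below are assumptions for illustration, not the challenge's actual schema):

```python
import pandas as pd

# Simulated test data whose row_ids are NOT sequential from 0 --
# exactly the situation the new evaluator assert is designed to catch.
test_df = pd.DataFrame({"row_id": [17, 4, 902], "feature": [0.1, 0.2, 0.3]})

# Placeholder predictions, one per test row, in the same order.
predictions = [0.0] * len(test_df)

# Correct: carry row_id over from the test data.
submission = pd.DataFrame({"row_id": test_df["row_id"], "prediction": predictions})

# Incorrect: regenerating ids with range(len(...)) yields 0, 1, 2, ...
# which will no longer match the ground_truth and will fail the assert.
bad_submission = pd.DataFrame({"row_id": range(len(test_df)), "prediction": predictions})

print(list(submission["row_id"]))      # [17, 4, 902]
print(list(bad_submission["row_id"]))  # [0, 1, 2]
```

The same idea applies whatever format you write: read the row_id column from the test file you were given and copy it into the prediction file, rather than generating ids yourself.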