New tag does not create an issue or evaluation

Hello there!

I recently managed to shrink a model checkpoint enough to push it via git-lfs, but it seems the journey is not over. As mentioned here, I pushed my files and created a tag, which does appear on my GitLab repo. The tag was created 15 minutes ago, and I was told the corresponding issue should already be open. In my case, however, no issue was opened.

Any insight about this?


Hi @frgfm,

Welcome to the Food Recognition Challenge. :wave:

:white_check_mark: Solution
To get started immediately and make a submission, please create a new commit (editing any file) and submit again using a git tag with the submission- prefix.
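A minimal sketch of those two steps (the tag name submission-v2 is hypothetical; the first two lines just create a scratch repo for the demo and should be skipped in your real submission repo):

```shell
# Scratch repo for the demo only; in your real repo, skip these two lines
cd "$(mktemp -d)" && git init -q .

# 1. Create a fresh commit (an empty commit also produces a new commit hash)
git -c user.name=demo -c user.email=demo@example.com \
    commit --allow-empty -m "Trigger new submission" -q

# 2. Tag it with the required submission- prefix
git tag submission-v2
git tag -l 'submission-*'
# → submission-v2

# In the real repo, pushing the tag is what triggers the evaluation:
# git push origin submission-v2
```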

:nerd_face: Description
I went through your git history; this happened because you pushed v0.1.0 followed by submission-v1. We only accept submissions from git tags having the prefix submission-, which is why v0.1.0 failed to create a submission.

Then, when you retried using submission-v1, the system looked into the history, found the same commit hash it had already seen (under v0.1.0), and didn't trigger a submission. Ideally, it should cache/check history only for submission--prefixed tags, which didn't happen here; we will improve this on our side.

Sorry for the inconvenience caused.
Hoping to get exciting submissions from you in the challenge! :smiley:


Hi @shivam,

I forgot about the caching of commit hash! I just pushed a commit with a different tag and the issue opened itself.

Thanks a lot :slight_smile:

By the way, I had trouble pushing my checkpoint since it's more than 200 MB. I had to convert it to half precision; do you have any other solution in mind? (I also tried putting it in the release attachments, but GitLab seems to limit those to 30 MB, whereas GitHub allows far larger attachments.)

@frgfm: I suspect you are actually not using git-lfs to check in your models?

@shivam: Maybe we should reduce the size limit for file blobs that can be checked in directly via git. That would ensure that if checkpoints are accidentally checked into the repository directly via git, a meaningful error is raised much earlier.


I thought that was it, and followed all the steps again to track the file with LFS (and on my repo, the file is indeed tagged with "LFS"). Unfortunately, it did not help. But as I suspected, switching to half precision brought the checkpoint below 200 MB, which then allowed the push. So I really suspect the issue is with GitLab here rather than git or git-lfs.

But if you have any way to get around that, I’d be interested!

Weird. I am pretty sure I have checked in much larger models via LFS.

@shivam: do let us know if you figure out the reason.

My models (which are quite a lot bigger than 200 MB) get checked into LFS with no issues. No special commands: just have LFS set up, then git add the model checkpoint (.pth file).


Another hypothesis I have is that there are large models checked in directly in your git history.
Even if you use git-lfs from now on, your git client will try to push the large model blobs in your history to our GitLab endpoint, which would reject the request for breaching the file size limit.
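To check that hypothesis, plain git can list every blob in history above a size threshold. A sketch (the scratch repo and the 1 MB threshold are only so the demo runs quickly; use something like 100 MB in practice):

```shell
# Demo repo with one oversized blob committed directly
cd "$(mktemp -d)" && git init -q .
dd if=/dev/zero of=big.pth bs=1048576 count=2 2>/dev/null
git add big.pth
git -c user.name=demo -c user.email=demo@example.com \
    commit -qm "oops: raw checkpoint"

# List all blobs in history larger than 1 MB (1048576 bytes)
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" && $3 > 1048576 {printf "%.1f MB\t%s\n", $3/1048576, $4}'
# → 2.0 MB	big.pth
```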

A solution could be to scrub the large file blobs from your repository history.



Hi @frgfm,

I have the same hypothesis as Mohanty shared above.

Can you share the exact output/error you get when you do git push? I can help based on that.

Two hypotheses in advance, depending on the exact error:

  1. If it throws Fatal: maximum size, etc., the file was most likely already added as a regular git object, and you need to migrate it from non-LFS to LFS (this happens most of the time). Reference: How to upload large files (size) to your submission
  2. If the error is a failed LFS push, or an upload stuck midway, it can be due to an unstable or very slow internet connection on your side causing the upload to stop or time out (rare, but it happens). Reference: Cannot upload my model's weights to GitLab - filesize too large

I’ll dive into this further, thanks!