Question about how to free up Git LFS storage: I used git filter-repo to delete the files from every commit and the local .git/lfs space was freed, but the remote storage does not seem to be freed up

Hello, I am trying to clean up my Git LFS repo so that I can submit some new models and keep the LFS storage reasonable. I used git filter-repo according to [Removing files from Git Large File Storage - GitHub Docs], but it seems that even though I deleted every appearance of my model, the storage is not freed. I wonder if I am doing things the right way.
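In case it helps, this is roughly the workflow those docs describe (the path and remote below are only placeholders for my real ones): rewrite history to drop the directory, prune the local LFS store, then force-push the rewritten history.

# rewrite history so the directory never appears in any commit
$ git filter-repo --invert-paths --path models/my-large-model --force
# filter-repo removes the origin remote as a safety measure, so re-add it
$ git remote add origin git@gitlab.example.com:username/my-repo.git
# drop local LFS objects that are no longer referenced by any commit
$ git lfs prune
$ git push origin master --force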

You can try “git lfs prune”; it worked for us :slight_smile:
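If you want to see what it would delete before running it for real, a dry run should show that (these are standard git-lfs flags):

# preview which local LFS objects would be deleted, without deleting anything
$ git lfs prune --dry-run --verbose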

Hello, thank you for your reply! I tried pruning the LFS objects that are no longer referenced, using the commands below. My local .git/lfs indeed shrank from 55 GB to 30 GB, but the remote storage did not decrease. Could you please take a moment to help me figure out what exactly went wrong? Thank you.

$ git filter-repo --invert-paths --path models/llama-2-7b-chat-hf --force
Parsed 6 commits
HEAD is now at c04d548 111
Enumerating objects: 250, done.
Counting objects: 100% (250/250), done.
Delta compression using up to 16 threads
Compressing objects: 100% (215/215), done.
Writing objects: 100% (250/250), done.
Total 250 (delta 36), reused 235 (delta 30), pack-reused 0

New history written in 0.33 seconds; now repacking/cleaning...
Repacking your repo and cleaning out old unneeded objects
Completely finished after 1.04 seconds.

$ git lfs prune
prune: 38 local object(s), 35 retained, done.
prune: Deleting objects: 100% (4/4), done.

$ git commit -m "remove llama2"
[master adb7b57] remove llama2
 2 files changed, 175 insertions(+), 4 deletions(-)
 create mode 100644 .idea/deployment.xml

$ git remote add origin git@gitlab.aicrowd.com:username/amazon-kdd-cup-2024.git
$ git push origin master --force
Locking support detected on remote "origin". Consider enabling it with:
  $ git config lfs.https://gitlab.aicrowd.com/username/amazon-kdd-cup-2024.git/info/lfs.locksverify true
Enumerating objects: 255, done.
Counting objects: 100% (255/255), done.
Delta compression using up to 16 threads
Compressing objects: 100% (214/214), done.
Writing objects: 100% (255/255), 3.72 MiB | 1.98 MiB/s, done.
Total 255 (delta 39), reused 248 (delta 36), pack-reused 0
remote: Resolving deltas: 100% (39/39), done.
To gitlab.aicrowd.com:username/amazon-kdd-cup-2024.git
 + 2d05ca1...adb7b57 master -> master (forced update)
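As a sanity check after the force push, I think these standard commands can show whether any ref still references the model and how big the local LFS store is (not something the docs require, just to confirm the rewrite worked locally):

# list LFS objects reachable from any ref, with their sizes
$ git lfs ls-files --all --size
# size of the local LFS object store
$ du -sh .git/lfs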

Hi again, you can push this 30 GB to a new branch (git checkout -b develop) and then mark that branch as the default (in the GitLab web UI), so when you git clone after this, you only copy this 30 GB. If you do not understand me, tell me and I will explain in more detail :slight_smile:
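If I understood the suggestion correctly, it is something like this (the branch name is only an example, and the menu location may differ slightly between GitLab versions):

# push the rewritten 30 GB history to a new branch
$ git checkout -b develop
$ git push -u origin develop
# then in the GitLab web UI set the default branch to "develop":
# Settings > Repository > Default branch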

Hi, thanks for your reply again! I think I found the reason why the GitLab remote storage would not shrink. I ran a repository cleanup after I pruned and pushed my repo, and it worked! My repo storage decreased to 30 GB. Thank you for your patience and help~ :smiling_face_with_three_hearts: You are very helpful~
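For anyone who lands here later: I believe the key step was the server-side cleanup, since pruning and force-pushing only affect the local copy and the refs, not objects GitLab has already stored. A rough sketch of what I mean (the project ID, token, and menu names are placeholders and may differ by GitLab version):

# git filter-repo writes a commit map that GitLab's cleanup can use
$ ls .git/filter-repo/commit-map
# upload that file in the GitLab web UI under
# Settings > Repository > Repository cleanup, and/or trigger housekeeping via the API
$ curl --request POST --header "PRIVATE-TOKEN: <your_token>" \
    "https://gitlab.aicrowd.com/api/v4/projects/PROJECT_ID/housekeeping"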

You’re welcome :slight_smile: Good luck :crossed_fingers:

Hi chenwenwen, could you share how you cleaned up the repo?