
Git LFS skipping file but git pushes it to the repo

I have a big file in my GitHub repo which I normally uploaded with Git LFS. For a newer commit I had to change the file, but now on pushing, Git LFS skips the file and normal git tries to upload it instead. This, of course, fails because it exceeds GitHub's maximum file size. When I run

GIT_TRACE=1 git push

this is the output:

trace git-lfs: run_command: 'git' version
trace git-lfs: run_command: 'git' config -l
trace git-lfs: tq: running as batched queue, batch size of 100
trace git-lfs: run_command: ssh -- git@github.com git-lfs-authenticate myRepo.git upload
trace git-lfs: HTTP: POST https://lfs.github.com/myRepo/locks/verify
trace git-lfs: HTTP: 200
trace git-lfs: HTTP: {"ours":[],"theirs":[],"next_cursor":""}
trace git-lfs: pre-push: refs/heads/master d7b0e4138403023433894f756d63bdadfabac125 refs/heads/master 683a30586bc68758230da6686fa902d4621b358a
trace git-lfs: run_command: git rev-list --objects d7b0e4138403023433894f756d63bdadfabac125 --not --remotes=origin --
trace git-lfs: run_command: git cat-file --batch
trace git-lfs: tq: sending batch of size 1
trace git-lfs: ssh cache: git@github.com git-lfs-authenticate myRepo.git upload
trace git-lfs: api: batch 1 files
trace git-lfs: HTTP: POST https://lfs.github.com/myRepo/objects/batch
trace git-lfs: HTTP: 200
trace git-lfs: HTTP: {"objects":[{"oid":"1e24fed72634c9217ce7856d11ee204d38eb154fc90572a8ef047007f2211a6c","size":246116656}]}
trace git-lfs: tq: starting transfer adapter "basic"
Git LFS: (0 of 0 files, 1 skipped) 0 B / 0 B, 234.72 MB skipped

17:22:37.083227 run-command.c:343 trace: run_command: 'pack-objects' '--all-progress-implied' '--revs' '--stdout' '--thin' '--delta-base-offset' '--progress'
17:22:37.084316 exec_cmd.c:128 trace: exec: 'git' 'pack-objects' '--all-progress-implied' '--revs' '--stdout' '--thin' '--delta-base-offset' '--progress'
17:22:37.088704 git.c:348 trace: built-in: git 'pack-objects' '--all-progress-implied' '--revs' '--stdout' '--thin' '--delta-base-offset' '--progress'
Counting objects: 109, done.
Delta compression using up to 4 threads.
Compressing objects: 100% (106/106), done.
Writing objects: 100% (109/109), 73.55 MiB | 1.81 MiB/s, done.
Total 109 (delta 74), reused 0 (delta 0)
remote: Resolving deltas: 100% (74/74), completed with 53 local objects.
remote: error: GH001: Large files detected. You may want to try Git Large File Storage - https://git-lfs.github.com.
remote: error: Trace: e87aee9bcda79c0a788ae345112c9d37
remote: error: See http://git.io/iEPt8g for more information.
remote: error: File src/ios/sdk/myLib.framework/Framework is 234.72 MB; this exceeds GitHub's file size limit of 100.00 MB
To git@github.com:myRepo.git
 ! [remote rejected] master -> master (pre-receive hook declined)
error: failed to push some refs to 'git@github.com:myRepo.git'

It sounds like the LFS clean filter wasn't applied when the new version of the file was added to the index. If the filename didn't change, then that probably means there isn't a .gitattributes file associating that path with LFS. (Either there never was one and you somehow ran the clean filter manually when you first added the old version of the file; or there was one but it didn't get committed; or it has since been removed or modified in a way that no longer matches the file; etc.)

And to clarify - if the filename did change, it probably changed to something that doesn't match any path in the .gitattributes file. So you'd need to update .gitattributes to match the new filename.
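
Either way, you can check which attributes git currently applies to the path with git check-attr (the path below is taken from the error message in your trace; substitute whatever path your large file actually has):

git check-attr --all -- src/ios/sdk/myLib.framework/Framework

If the file is tracked by LFS, the output should include filter: lfs (along with diff: lfs and merge: lfs); if nothing LFS-related appears, no .gitattributes rule currently matches that path.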

Once a file is staged with LFS tracking, what git sees (in the index and in the object database) is the LFS pointer file. So even if the .gitattributes file were later removed, that by itself wouldn't cause the large file to end up in the repository database. But if you re-add the file (as you'd have to after modifying it) and the attributes aren't set up at that time, then the entire file gets staged instead of a pointer file.
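
For reference, an LFS pointer file is just a small piece of text; using the oid and size from the batch response in your trace, the pointer git should be storing for that path would look like this:

version https://git-lfs.github.com/spec/v1
oid sha256:1e24fed72634c9217ce7856d11ee204d38eb154fc90572a8ef047007f2211a6c
size 246116656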

I think what LFS is skipping in your above trace is the old version of the file - because the server already has that. This is normal.

But the commit you're trying to push is no good; it has the full file irrevocably embedded in it. You need to amend (or rebase, or otherwise rewrite) each commit that has the full file in it. Fortunately since this is preventing sharing of the commits you should be able to safely rewrite them without worrying about anyone else being put in an "upstream rebase" situation.

So to summarize:

Make sure you have a .gitattributes file that assigns the LFS attributes to a path matching the large file. You should add this .gitattributes file to the index.
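
The easiest way to do that is to let git lfs track write the rule for you (the path is again taken from the error message; adjust it to match your actual file, or use a wildcard pattern):

git lfs track "src/ios/sdk/myLib.framework/Framework"
git add .gitattributes

This appends a line like src/ios/sdk/myLib.framework/Framework filter=lfs diff=lfs merge=lfs -text to .gitattributes; you can also write that line by hand.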

Remove and re-index the new version of the large file.

git rm --cached path/to/big/file
git add path/to/big/file

If .gitattributes is set up correctly, the add will go through the LFS clean filter this time; a pointer file will be added to the index and a new LFS object will be created.
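
To double-check before committing, you can print the staged blob for that path and confirm it is a small pointer rather than the binary content (path from the error message again):

git cat-file -p :src/ios/sdk/myLib.framework/Framework
git lfs status

git lfs status should also list the file among the objects to be committed as an LFS object rather than a plain git object.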

git commit --amend

to replace the commit that has the large BLOB embedded in it with a new one that uses LFS.

Now try to push. If it still fails, that means there are probably other commits you need to fix, and then things could get more involved.
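
If it turns out that several unpushed commits contain the full file, then instead of amending each one by hand you could let git lfs migrate rewrite them. A minimal sketch, assuming a reasonably recent git-lfs and again using the path from the error message:

git lfs migrate import --include="src/ios/sdk/myLib.framework/Framework" --include-ref=refs/heads/master

This rewrites the matching commits on master so that the file is stored as an LFS pointer; since those commits were never accepted by the remote, rewriting them is safe here.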
