
How can I add large files to a Git repo?

I have a Microsoft Azure / Visual Studio Online repo managed with Git, which I manage using the Git GUI application.

I have a couple of files that are 535 MB and 620 MB in size. I would like to add these to the repo.

I have enabled Git large file support, and I have set the global post buffer with the command:

git config --global http.postBuffer 1048576000

No matter what I do, I cannot seem to add these files. The commit is fine, but when I push to the remote branch, I get:

POST git-receive-pack (547584390 bytes)
error: RPC failed; HTTP 503 curl 22 The requested URL returned error: 503
fatal: the remote end hung up unexpectedly
fatal: the remote end hung up unexpectedly
Everything up-to-date

As far as I know, adjusting the buffer like this should work in this case. What am I missing?
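Since `http.postBuffer` is specified in bytes, it is worth confirming that the value Git actually sees covers the pack being pushed; a quick sanity check:

```shell
# Print the effective value in bytes. 1048576000 bytes is roughly 1000 MB,
# which is larger than the 547584390-byte pack the failed push reports.
git config --get http.postBuffer
```

Note, though, that a 503 is an error returned by the server, so even a sufficiently large post buffer cannot fix a server-side rejection of a very large push.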

Activating LFS locally (git-lfs.github.com, as you mention) is a good first step.

Also check the prerequisites and limitations at Azure DevOps: Azure Repos / Use Git Large File Storage (LFS).

Finally, if you have just added/committed the large file, it is better to reset that commit (assuming you have no other work in progress), and then track the file through LFS before committing and pushing again (here `myLargeFile` stands in for your actual file name):

git reset @~
git lfs track myLargeFile
git add .gitattributes myLargeFile
git commit -m "Add large file through LFS"
git push
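Once the file is tracked, it can help to confirm that LFS has actually picked it up before pushing; a quick check, assuming `git lfs` is installed:

```shell
# Patterns now tracked via .gitattributes
git lfs track
# Committed files stored as LFS pointers (populated after the commit)
git lfs ls-files
```

If the large file does not appear in `git lfs ls-files` after committing, it was committed as a regular blob and the push will hit the same size problem.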

After fighting with git for weeks, I have finally solved this.

The answer for me was to clone the repo again using SSH instead of HTTP.
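Re-cloning is not strictly necessary: pointing the existing clone's remote at the SSH URL achieves the same thing. A sketch, where `MyOrg`, `MyProject`, and `MyRepo` are placeholders for your own Azure DevOps names:

```shell
# Switch origin from HTTPS to the Azure DevOps SSH URL form
# (MyOrg/MyProject/MyRepo are placeholders -- substitute your own).
git remote set-url origin git@ssh.dev.azure.com:v3/MyOrg/MyProject/MyRepo
# Confirm origin now points at the SSH URL
git remote -v
```

Pushing over SSH avoids the HTTP post-buffer path entirely, which is why it can succeed where the HTTPS push returned a 503.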
