In Git, there is the command git clone --depth=<depth> to retrieve only a limited depth of history, and there is also the command git fetch --depth=<depth> to gather more history later.
What about freeing up space in a large repository? I know we can use git gc or git prune, but is there another way, similar to --depth=<depth>, to reduce the number of commits stored in the local repository? It should also keep the SHA-1s intact, so we can keep working with the remote.
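For concreteness, a minimal sketch of the two commands mentioned above, using a throwaway local repository in place of a real remote (the file:// URL and the depths are placeholders for the demo):

```shell
set -e
tmp=$(mktemp -d)

# Build a small "remote" with three commits
git init -q "$tmp/src"
git -C "$tmp/src" -c user.email=a@b -c user.name=demo commit -q --allow-empty -m c1
git -C "$tmp/src" -c user.email=a@b -c user.name=demo commit -q --allow-empty -m c2
git -C "$tmp/src" -c user.email=a@b -c user.name=demo commit -q --allow-empty -m c3

# Shallow clone: only the most recent commit is retrieved
git clone -q --depth=1 "file://$tmp/src" "$tmp/shallow"
git -C "$tmp/shallow" rev-list --count HEAD   # prints 1

# Deepen the history afterwards with fetch --depth
git -C "$tmp/shallow" fetch -q --depth=2
git -C "$tmp/shallow" rev-list --count HEAD   # prints 2
```

Note that a plain local path would skip the shallow machinery; the file:// form forces the regular transport, which is why it is used here.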
The easiest way would be to:
git clone --depth=n /url/of/remote/repo
That would clone the last n commits, while allowing fetch/pull/push to still work with the remote repo.
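A quick way to check that behaviour locally (a sketch: the repositories and branch name are made up for the demo, and git rev-parse --is-shallow-repository needs Git 2.15+):

```shell
set -e
tmp=$(mktemp -d)

# A full "remote" with two commits, then a depth=1 clone of it
git init -q "$tmp/src"
git -C "$tmp/src" -c user.email=a@b -c user.name=demo commit -q --allow-empty -m c1
git -C "$tmp/src" -c user.email=a@b -c user.name=demo commit -q --allow-empty -m c2
git clone -q --depth=1 "file://$tmp/src" "$tmp/shallow"

# The clone is shallow...
git -C "$tmp/shallow" rev-parse --is-shallow-repository   # prints true

# ...but a new commit made in it can still be pushed to the remote
git -C "$tmp/shallow" -c user.email=a@b -c user.name=demo commit -q --allow-empty -m c3
git -C "$tmp/shallow" push -q origin HEAD:refs/heads/from-shallow
git -C "$tmp/src" rev-parse --verify -q refs/heads/from-shallow
```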
Since Git 2.5, you can fetch a single commit, but unless that commit is the latest one (which is like a git clone --depth=1), that would not allow for fetch/pull/push.
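A sketch of that single-commit fetch, assuming the server opts in via the uploadpack.allowReachableSHA1InWant setting (simulated here with a local repository standing in for the server):

```shell
set -e
tmp=$(mktemp -d)

# "Server" repo with two commits; allow fetching arbitrary reachable SHA-1s
git init -q "$tmp/src"
git -C "$tmp/src" -c user.email=a@b -c user.name=demo commit -q --allow-empty -m c1
git -C "$tmp/src" -c user.email=a@b -c user.name=demo commit -q --allow-empty -m c2
git -C "$tmp/src" config uploadpack.allowReachableSHA1InWant true
sha=$(git -C "$tmp/src" rev-parse HEAD~1)   # an older, non-tip commit

# Fetch exactly that commit into a fresh repository
git init -q "$tmp/dst"
git -C "$tmp/dst" remote add origin "file://$tmp/src"
git -C "$tmp/dst" fetch -q --depth=1 origin "$sha"
git -C "$tmp/dst" rev-list --count FETCH_HEAD   # prints 1: only that commit
```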
The other approach, to make sure a given local repo is as lean as possible, is to use a combination of gc/prune/repack:
git gc
git repack -Ad # kills in-pack garbage
git prune # kills loose garbage
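To see the effect, you can compare git count-objects before and after; a sketch with a deliberately created dangling blob (git gc alone keeps recent unreachable objects around for a grace period, which is why the final git prune still has work to do):

```shell
set -e
tmp=$(mktemp -d)
git init -q "$tmp/repo"
git -C "$tmp/repo" -c user.email=a@b -c user.name=demo commit -q --allow-empty -m c1

# Write a blob that nothing references: this is "loose garbage"
echo junk | git -C "$tmp/repo" hash-object -w --stdin
git -C "$tmp/repo" count-objects    # before: several loose objects

git -C "$tmp/repo" gc -q
git -C "$tmp/repo" repack -Adq      # kills in-pack garbage
git -C "$tmp/repo" prune            # kills loose garbage

git -C "$tmp/repo" count-objects    # after: 0 loose objects
```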