gitlab: git clone https with large repos fails
Git fails when trying to clone a large repository (~700 MB) via HTTPS:
c:\git-projects>git clone https://git.mycompany.de/fs.git
Cloning into 'fs'...
Username for 'https://git.mycompany.de': mwlo
Password for 'https://mwlo@git.mycompany.de':
error: RPC failed; result=22, HTTP code = 500
fatal: The remote end hung up unexpectedly
Cloning via SSH works:
c:\git-projects>git clone git@git.mycompany.de:fs.git
Cloning into 'fs'...
remote: Counting objects: 144564, done.
remote: Compressing objects: 100% (30842/30842), done.
remote: Total 144564 (delta 95360), reused 143746 (delta 94542)
Receiving objects: 100% (144564/144564), 601.34 MiB | 1.33 MiB/s, done.
Resolving deltas: 100% (95360/95360), done.
Checking out files: 100% (4649/4649), done.
Cloning smaller repositories via HTTPS also works:
c:\git-projects>git clone https://git.mycompany.de/git-test.git
Cloning into 'git-test'...
remote: Counting objects: 135, done.
remote: Compressing objects: 100% (129/129), done.
remote: Total 135 (delta 68), reused 0 (delta 0)
Receiving objects: 100% (135/135), 18.77 KiB | 0 bytes/s, done.
Resolving deltas: 100% (68/68), done.
I have already tweaked some parameters, without success:
/etc/nginx/nginx.conf
worker_processes 2; # have two cpu's
keepalive_timeout 120;
client_max_body_size 3072m;
/home/git/gitlab/config/gitlab.yml
## Git settings
# CAUTION!
# Use the default values unless you really know what you are doing
git:
  bin_path: /usr/bin/git
  # Max size of a git object (e.g. a commit), in bytes
  # This value can be increased if you have very large commits
  max_size: 3221225472 # 3072.megabytes
  # Git timeout to read a commit, in seconds
  timeout: 120
We would like to use git clone over HTTPS, because the Visual Studio Tools for Git still have no SSH support.
On the server there are two of these processes; the CPU load climbs to 100% after a while, and then the processes are terminated:
git pack-objects --revs --all --stdout --progress --delta-base-offset
Regards, Marco
System information
System: Debian 6.0.7
Current User: root
Using RVM: no
Ruby Version: 1.9.3p392
Gem Version: 1.8.23
Bundler Version: 1.3.5
Rake Version: 10.0.4
GitLab information
Version: 5.3.0
Revision: 148eade
Directory: /home/git/gitlab
DB Adapter: mysql2
URL: https://git.mycompany.de
HTTP Clone URL: https://git.mycompany.de/some-project.git
SSH Clone URL: git@git.mycompany.de:some-project.git
Using LDAP: yes
Using Omniauth: no
GitLab Shell
Version: 1.4.0
Repositories: /home/git/repositories/
Hooks: /home/git/gitlab-shell/hooks/
Git: /usr/bin/git
This was reported in issue 3079: HTTPS cloning requires a lot of resources on the GitLab server (CPU, but mostly memory), and currently (GitLab 5.x) fails for large repos.
Even GitLab 6.0 received commits such as 7ecebdd that mention timeouts when cloning large repositories.
I have not tested this with GitLab 6 (to be released tomorrow).
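Not part of the original report, but a common way to reduce exactly this server-side pack-objects load is a shallow clone, which asks the server to pack only the objects reachable from the tip commit. A self-contained sketch against a throwaway local repository (on the real server the same --depth flag applies to the HTTPS URL from the question):

```shell
# Build a small throwaway repository to clone from.
tmp=$(mktemp -d)
git init -q "$tmp/repo"
git -C "$tmp/repo" -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "first"
git -C "$tmp/repo" -c user.email=demo@example.com -c user.name=demo \
    commit -q --allow-empty -m "second"

# --depth 1 fetches only the tip commit; on a real server this is
# what keeps `git pack-objects` from walking the full history.
git clone -q --depth 1 "file://$tmp/repo" "$tmp/shallow"

# The shallow clone contains a single commit.
git -C "$tmp/shallow" rev-list --count HEAD   # → 1
```

Note that the file:// URL is required here: with a plain local path git ignores --depth, and deepening later needs git fetch --unshallow (Git 1.8.3+).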
Consider implementing a chunking plugin for nginx, e.g. HttpChunkinModule.
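For reference, that module is enabled per server block roughly like this; a sketch based on the chunkin-nginx-module README (the @my_411_error location name comes from its example), not a configuration tested against this GitLab setup:

```
server {
    listen 443 ssl;
    server_name git.mycompany.de;

    # Accept chunked request bodies (directive from HttpChunkinModule).
    chunkin on;

    # The module resumes requests that nginx would otherwise reject
    # with 411 "Length Required".
    error_page 411 = @my_411_error;
    location @my_411_error {
        chunkin_resume;
    }
}
```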
I have not personally deployed the above, but I had a customer with a similar problem.
In their case the problem was on the client side, and we had to instruct the developers to apply the following tweak to their local git configuration:
git config http.postBuffer 524288000 # set to 500 MB
The above merely allows larger git-related HTTP requests from the client (we had plenty of memory on the server side).
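The tweak above can be sketched as follows; the 524288000 value is the 500 MB buffer from this answer, and --global is an assumption (the setting can equally be made per repository by dropping the flag inside a working copy):

```shell
# Raise the buffer git uses when POSTing data to an HTTP(S) remote,
# so large transfers are sent in one buffered request (500 MB here).
git config --global http.postBuffer 524288000

# Verify the value git will actually use.
git config --global --get http.postBuffer   # → 524288000
```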