
Updating files bigger than 1GB?

Currently I am using xdelta to create the update files and send only the difference from the user's current application version, but I would like to know if there are better or other ways to approach this sort of situation, and what advice the community could give me.

Our application is written in C# and our web server is running on Linux with Apache httpd.

Using xdelta, new patches shrink to the size of the difference between the old and new file, so the updates vary between 10 MB and 500 MB, avoiding the need to re-upload the full 1 GB each time we change something.
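For illustration, here is a minimal, language-agnostic sketch in Python of what the client side of that flow might look like: apply the delta with the `xdelta3` command-line tool (`-d -s OLD PATCH NEW` is its decode syntax), then verify the result against a checksum published by the server. The function names and the idea of shipping an expected SHA-256 alongside the patch are assumptions for this sketch, not part of the original setup.

```python
import hashlib
import subprocess

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in 1 MiB blocks
    so files larger than memory are handled safely."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

def apply_patch(old_file: str, patch_file: str, new_file: str) -> None:
    """Apply an xdelta3 delta: xdelta3 -d -s OLD PATCH NEW.
    Assumes the xdelta3 binary is available on the client's PATH."""
    subprocess.run(
        ["xdelta3", "-d", "-s", old_file, patch_file, new_file],
        check=True,
    )

def update(old_file: str, patch_file: str, new_file: str,
           expected_sha256: str) -> None:
    """Apply the patch, then reject the result if its hash does not match
    the value the server published for the new version."""
    apply_patch(old_file, patch_file, new_file)
    if sha256_of(new_file) != expected_sha256:
        raise RuntimeError("patched file failed checksum verification")
```

Verifying the checksum after patching matters: a delta applied to a subtly corrupted or wrong-version base file produces garbage output, and the hash check is what catches that before the client starts using the file.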

  1. What other options could you recommend?
  2. What else should I consider in this case?
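One thing worth considering is that users may be several versions behind, so the updater needs to chain patches. A hedged sketch, assuming a hypothetical server manifest that maps each version to the version its patch upgrades it to (the manifest format and the patch file naming are both assumptions here):

```python
def patch_chain(manifest: dict[str, str], current: str, latest: str) -> list[str]:
    """Walk a version -> next-version manifest and return the ordered list
    of patch files needed to move from `current` to `latest`.

    Example manifest: {"1.0": "1.1", "1.1": "1.2"} means patches
    1.0 -> 1.1 and 1.1 -> 1.2 exist on the server.
    """
    steps = []
    version = current
    while version != latest:
        if version not in manifest:
            raise ValueError(f"no patch path from {version} to {latest}")
        nxt = manifest[version]
        steps.append(f"{version}-{nxt}.xdelta")  # hypothetical naming scheme
        version = nxt
    return steps
```

The alternative is generating a direct patch from every old version to the newest one, which means smaller downloads for out-of-date clients but more patch files to build and host on each release; which trade-off wins depends on how many versions you keep alive.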

PS: To be honest, I am really at a loss with this. This is the first updater app I have made and I really don't know if I am doing it right.


If possible, recommend open-source or non-commercial applications.

With the minimal amount of information provided, I don't see a problem with your approach. If you need to patch a 1 GB file, you need to patch a 1 GB file. I assume it is a data file and therefore needs to be that size? Is there any way you could split it into files that don't often change, to possibly reduce your patch size further?
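The splitting idea above can be taken further: instead of (or in addition to) splitting the data into separate files, keep a manifest of per-block hashes so the client only re-downloads the blocks that actually changed. This is a rough sketch of that idea, not something from the original answer; the 4 MiB block size is an arbitrary assumption you would tune to your data.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB blocks; an arbitrary starting point

def chunk_hashes(data: bytes, chunk_size: int = CHUNK_SIZE) -> list[str]:
    """Split the data into fixed-size blocks and hash each one.
    The resulting list is the per-version manifest the server publishes."""
    return [
        hashlib.sha256(data[i : i + chunk_size]).hexdigest()
        for i in range(0, len(data), chunk_size)
    ]

def changed_chunks(old_hashes: list[str], new_hashes: list[str]) -> list[int]:
    """Indices of the blocks a client on the old version must re-download.
    Blocks past the end of the old file always count as changed."""
    changed = []
    for i, h in enumerate(new_hashes):
        if i >= len(old_hashes) or old_hashes[i] != h:
            changed.append(i)
    return changed
```

Note the caveat of fixed-size blocks: an insertion near the start of the file shifts every block after it, invalidating all of them. That is exactly the case delta tools like xdelta and bsdiff handle well, which is why block manifests complement rather than replace them.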

You might give bsdiff a try. It does the same thing as xdelta, with a somewhat different algorithm.

Getting started under Windows was not all that straightforward when I first tried (that may be different now; I have not looked in a while), but once it worked, it worked fine.
In my experience, compression runs about 10 times slower than with xdelta (but, who cares...) and generates patches that are usually about half the size. Obviously your mileage will depend a lot on your data, but for me it's a big win. If you have some spare time, you could always give it a try.

The abysmal compression time may be a deterrent in some situations, but in my case the fact that the patches are half the size excuses everything; I wouldn't care if generating a patch ran over the entire weekend, as long as it halved the size again.

