With Rackspace cloud files using .Net SDK, is there a way to continue a failed upload where it left off?

I'm currently using Rackspace Cloud Files for backing up files, some of which can be rather large, and I would like to avoid having to start from the beginning every time there is a failure in the network. For example, some time ago my log showed a 503 error (server unavailable) that caused the upload to stop.

Is there any way the .Net SDK can handle this? If not, is there another possible solution that works around the SDK? I've been searching for a solution, but have not yet come across anything.

Thank you.

EDIT: In the meantime I've tried solving this by writing my own method to segment files as large as 2 GB, even though the SDK does that for you. Dealing with smaller pieces of the file helps, but it results in taking up a lot of room in the container (1000 object limit), so I'd still like to see if there is a better way to prevent this problem.

I can't really speak for the .Net SDK, but I can give you some tips as far as Cloud Files goes.

"is there another possible solution that works around the SDK?"

We usually recommend segmenting large objects yourself. This allows you to upload multiple segments in parallel, and if a segment fails while uploading, you can just re-upload that single segment. As a general rule we recommend ~100 MB segments.
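As a rough illustration of that approach, here is a minimal C# sketch that splits a local file into ~100 MB segments and uploads them with limited parallelism, retrying only the segment that failed. The `SegmentedUpload` class, the `uploadSegmentAsync` delegate, and the `prefix/00000000` naming scheme are all assumptions for the example; wire the delegate to whatever "create object from stream" call your SDK version actually exposes.

```csharp
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;

// Sketch only: client-side segmentation with per-segment retry.
public static class SegmentedUpload
{
    public const int SegmentSize = 100 * 1024 * 1024; // ~100 MB per segment

    public static async Task UploadAsync(
        string filePath,
        string objectPrefix,                           // e.g. "backups/db.bak"
        Func<string, Stream, Task> uploadSegmentAsync, // assumption: wraps your SDK's upload call
        int maxParallel = 4,
        int maxRetries = 3)
    {
        long fileLength = new FileInfo(filePath).Length;
        int segmentCount = (int)((fileLength + SegmentSize - 1) / SegmentSize);

        using (var throttle = new SemaphoreSlim(maxParallel))
        {
            var tasks = new Task[segmentCount];
            for (int i = 0; i < segmentCount; i++)
            {
                int index = i;
                tasks[i] = Task.Run(async () =>
                {
                    await throttle.WaitAsync();
                    try
                    {
                        long offset = (long)index * SegmentSize;
                        int length = (int)Math.Min(SegmentSize, fileLength - offset);
                        string segmentName = $"{objectPrefix}/{index:D8}";

                        for (int attempt = 1; ; attempt++)
                        {
                            try
                            {
                                // Each segment opens its own read-only view of the file,
                                // so segments can be uploaded (and retried) independently.
                                using (var fs = File.OpenRead(filePath))
                                {
                                    fs.Seek(offset, SeekOrigin.Begin);
                                    var buffer = new byte[length];
                                    int read = 0;
                                    while (read < length)
                                    {
                                        int n = fs.Read(buffer, read, length - read);
                                        if (n == 0) break;
                                        read += n;
                                    }
                                    using (var segment = new MemoryStream(buffer, writable: false))
                                        await uploadSegmentAsync(segmentName, segment);
                                }
                                break; // this segment is done
                            }
                            catch (Exception) when (attempt < maxRetries)
                            {
                                // Transient failure (e.g. a 503): retry only this segment.
                            }
                        }
                    }
                    finally
                    {
                        throttle.Release();
                    }
                });
            }
            await Task.WhenAll(tasks);
        }
    }
}
```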

If you need to be able to access your file as a single object, you can use the segments to create a Static Large Object (SLO).
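Creating the SLO comes down to PUTting a JSON manifest that lists the segments (path, etag, size) with the `?multipart-manifest=put` query parameter. Below is a minimal sketch against the Cloud Files (OpenStack Swift) REST API; the storage URL, auth token, and segment details are placeholders you would already have from authentication and from the segment uploads, and the `SloManifest` helper name is just for illustration.

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

// Sketch only: build and upload an SLO manifest.
public static class SloManifest
{
    public static async Task CreateAsync(
        HttpClient http,
        string storageUrl,   // e.g. https://storage101.../v1/MossoCloudFS_xxxx
        string authToken,
        string container,
        string objectName,
        (string Path, string Etag, long SizeBytes)[] segments)
    {
        // Build the manifest: one entry per segment, in order.
        // Path is "segment-container/segment-object-name".
        var sb = new StringBuilder("[");
        for (int i = 0; i < segments.Length; i++)
        {
            var s = segments[i];
            if (i > 0) sb.Append(",");
            sb.Append($"{{\"path\":\"{s.Path}\",\"etag\":\"{s.Etag}\",\"size_bytes\":{s.SizeBytes}}}");
        }
        sb.Append("]");

        // ?multipart-manifest=put tells Swift this object is an SLO manifest.
        var url = $"{storageUrl}/{container}/{objectName}?multipart-manifest=put";
        var request = new HttpRequestMessage(HttpMethod.Put, url)
        {
            Content = new StringContent(sb.ToString(), Encoding.UTF8, "application/json")
        };
        request.Headers.Add("X-Auth-Token", authToken);

        var response = await http.SendAsync(request);
        response.EnsureSuccessStatusCode();
    }
}
```

Once the manifest is in place, a GET on the object name streams the concatenated segments back as a single file.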

"will result in taking up a lot of room in the container (1000 object limit)"

Containers don't have a hard limit on the number of objects they can contain; however, if you expect to have a million objects you may consider spreading them across multiple containers. If you are talking about an SLO's 1000-segment limit, you could always create nested SLOs.
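To make the nested-SLO idea concrete: each sub-manifest is an ordinary SLO over up to 1000 real segments, and the top-level manifest simply lists those sub-manifests as its own segments by path. A rough sketch of such a top-level manifest is below (the container and object names are made up, and depending on the deployment each entry may also need `etag` and `size_bytes`):

```json
[
  {"path": "backups_segments/db.bak.slo-part-0001"},
  {"path": "backups_segments/db.bak.slo-part-0002"}
]
```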
