
Azure Blob storage and HDF file storage

I am in the middle of developing a cloud server, and I need to store HDF files ( http://www.hdfgroup.org/HDF5/ ) using blob storage.

Functions related to creating, reading, writing, and modifying data elements within the file come from the HDF APIs.

I need to get the file path in order to create the file or to read or write it.

Can anyone please tell me how to create a custom file on Azure Blob?

I need to be able to use the API as shown below, but passing the Azure storage path for the file: http://davis.lbl.gov/Manuals/HDF5-1.4.3/Tutor/examples/C/h5_crtfile.c

The files I am trying to create can get really huge, ~10-20 GB, so downloading them locally and modifying them is not an option for me.

Thanks

Shashi

One possible approach, admittedly fraught with challenges, would be to create the file in a temporary location using the code you included, and then use the Azure API to upload the file to Azure as a file input stream. I am still researching how size restrictions are handled in Azure storage, so I can't say whether an entire 10-20 GB file could be moved in a single upload operation; but since the Azure API reads from an input stream, you should be able to combine operations so that the information you need ends up residing in Azure storage.

Can anyone please tell me how to create a custom file on Azure Blob?

I need to be able to use the API as shown below, but passing the Azure storage path for the file: http://davis.lbl.gov/Manuals/HDF5-1.4.3/Tutor/examples/C/h5_crtfile.c

Windows Azure Blob storage is a service for storing large amounts of unstructured data that can be accessed via HTTP or HTTPS. So from the application's point of view, Azure Blob storage does not behave like a regular disk.

Microsoft provides quite good APIs (C#, Java) for working with blob storage. They also provide the Blob Service REST API for accessing blobs from any other language (such as C++, for which no dedicated blob storage client library is provided).

A single block blob can be up to 200 GB, so it should easily store files of ~10-20 GB.

I am afraid that the provided example will not work with Windows Azure Blob storage. However, I do not know HDF file storage well; maybe they provide some Azure Blob storage support.
