
How do I upload some file into Azure blob storage without writing my own program?

I created an Azure Storage account. I have a 400 MB .zip file that I want to put into blob storage for later use.

How can I do that without writing code? Is there some interface for that?

Free tools:

  1. Visual Studio 2010 -- install the Azure tools and you can find the blobs in the Server Explorer
  2. Cloud Berry Lab's CloudBerry Explorer for Azure Blob Storage
  3. ClumsyLeaf CloudXplorer
  4. Azure Storage Explorer from CodePlex (try version 4 beta)

There was an old program called Azure Blob Explorer or something similar that no longer works with the new Azure SDK.

Out of these, I personally like CloudBerry Explorer the best.

The easiest way is to use Azure Storage PowerShell. It provides many commands to manage your storage containers/blobs/tables/queues.

For your case, you could use Set-AzureStorageBlobContent, which uploads a local file into Azure storage as a block blob or page blob.

Set-AzureStorageBlobContent -Container containerName -File .\filename -Blob blobname

For details, please refer to http://msdn.microsoft.com/en-us/library/dn408487.aspx .

If you're looking for a tool to do so, may I suggest that you take a look at our tool Cloud Storage Studio ( http://www.cerebrata.com/Products/CloudStorageStudio ). It's a commercial tool for managing Windows Azure Storage and Hosted Services. You can also find a comprehensive list of Windows Azure Storage management tools here: http://blogs.msdn.com/b/windowsazurestorage/archive/2010/04/17/windows-azure-storage-explorers.aspx

Hope this helps.

The StorageClient has this built into it. No need to write really anything:

var account = new CloudStorageAccount(creds, false);

var client = account.CreateCloudBlobClient();

var blob = client.GetBlobReference("/somecontainer/hugefile.zip");

// 1 MB seems to be a pretty good all-purpose block size
client.WriteBlockSizeInBytes = 1 * 1024 * 1024;

// this sets the number of parallel uploads for blocks
client.ParallelOperationThreadCount = 4; // normally set to one per CPU core

// blobs above this size are broken up into blocks automatically
client.SingleBlobUploadThresholdInBytes = 4 * 1024 * 1024;

blob.UploadFile("somehugefile.zip");
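For a sense of scale, the block size above determines how many block uploads a large file turns into. A quick back-of-the-envelope sketch of the arithmetic (in Python, purely illustrative; the function name is not part of the StorageClient API):

```python
import math

def block_count(file_size_bytes, block_size_bytes):
    """Number of blocks a file is split into for a block-blob upload."""
    return math.ceil(file_size_bytes / block_size_bytes)

# A 400 MB file with a 1 MB block size uploads as 400 blocks;
# with 4 parallel threads, that is roughly 100 rounds of 4 blocks each.
print(block_count(400 * 1024 * 1024, 1024 * 1024))  # 400
```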

There is a new open-source tool provided by Microsoft:

  • Project Deco - Cross-platform Microsoft Azure Storage Account Explorer.

Please check these links:

I use Cyberduck to manage my blob storage.

It is free and very easy to use. It works with other cloud storage solutions as well.

I recently found this one as well: CloudXplorer

Hope it helps.

You can use Cloud Combine for reliable and quick file uploads to Azure blob storage.

A simple batch file using Microsoft's AzCopy utility will do the trick. You can drag and drop your files onto the following batch file to upload them into your blob storage container:

upload.bat

@ECHO OFF

SET BLOB_URL=https://<<<account name>>>.blob.core.windows.net/<<<container name>>>
SET BLOB_KEY=<<<your access key>>>

:AGAIN
IF "%~1" == "" GOTO DONE

AzCopy /Source:"%~d1%~p1" /Dest:%BLOB_URL% /DestKey:%BLOB_KEY% /Pattern:"%~n1%~x1" /destType:blob

SHIFT
GOTO AGAIN

:DONE
PAUSE

Note that the above technique only uploads one or more files individually (since the Pattern flag is specified) instead of uploading an entire directory.

You can upload files to an Azure Storage Account blob using the command prompt.

Install the Microsoft Azure Storage tools.

Then upload it to your account's blob storage with this CLI command:

AzCopy /Source:"filepath" /Dest:bloburl /DestKey:accesskey /destType:blob

Hope it helps. :)

The new Azure Portal has an 'Editor' menu option (in preview) in the container view. It allows you to upload a file directly to the container from the Portal UI.

You can upload large files directly to Azure Blob Storage using the HTTP PUT verb; the biggest file I have tried with the code below is 4.6 GB. You can do this in C# like this:

// write up to ChunkSize of data to the web request
void WriteToStreamCallback(IAsyncResult asynchronousResult)
{
    var webRequest = (HttpWebRequest)asynchronousResult.AsyncState;
    var requestStream = webRequest.EndGetRequestStream(asynchronousResult);
    var buffer = new Byte[4096];
    int bytesRead;
    var tempTotal = 0;

    File.FileStream.Position = DataSent;

    while ((bytesRead = File.FileStream.Read(buffer, 0, buffer.Length)) != 0
        && tempTotal + bytesRead < CHUNK_SIZE 
        && !File.IsDeleted 
        && File.State != Constants.FileStates.Error)
    {
        requestStream.Write(buffer, 0, bytesRead);
        requestStream.Flush();

        DataSent += bytesRead;
        tempTotal += bytesRead;

        File.UiDispatcher.BeginInvoke(OnProgressChanged);
    }

    requestStream.Close();

    if (!AbortRequested) webRequest.BeginGetResponse(ReadHttpResponseCallback, webRequest);
}

void StartUpload()
{
    var uriBuilder = new UriBuilder(UploadUrl);

    if (UseBlocks)
    {
        // encode the block name and add it to the query string
        CurrentBlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(Guid.NewGuid().ToString()));
        uriBuilder.Query = uriBuilder.Query.TrimStart('?') + string.Format("&comp=block&blockid={0}", CurrentBlockId);
    }

    // with or without using blocks, we'll make a PUT request with the data
    var webRequest = (HttpWebRequest)WebRequestCreator.ClientHttp.Create(uriBuilder.Uri);
    webRequest.Method = "PUT";
    webRequest.BeginGetRequestStream(WriteToStreamCallback, webRequest);
}
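The block-naming trick in StartUpload above can be sketched outside C# as well. A minimal Python equivalent of the block-ID encoding and Put Block query-string construction (illustrative only; it does not talk to Azure, and the sample URL is a placeholder):

```python
import base64
import uuid
from urllib.parse import quote

def make_block_id():
    # Azure block IDs must be Base64 strings, and all blocks of one blob
    # must use IDs of equal length; Base64-encoding a GUID string (always
    # 36 characters) gives both properties, as the C# code above relies on.
    return base64.b64encode(str(uuid.uuid4()).encode("utf-8")).decode("ascii")

def put_block_url(sas_url, block_id):
    # Append comp=block and the (percent-encoded) block ID to the SAS URL.
    sep = "&" if "?" in sas_url else "?"
    return f"{sas_url}{sep}comp=block&blockid={quote(block_id, safe='')}"

block_id = make_block_id()
url = put_block_url("https://myaccount.blob.core.windows.net/c/b.zip?sv=...&sig=...", block_id)
```

Note that, unlike the C# snippet, this percent-encodes the block ID, since Base64 output can contain `+` and `/`, which are not safe in a query string.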

The UploadUrl is generated by Azure itself and contains a Shared Access Signature; this SAS URL says where the blob is to be uploaded and for how long security access (write access in your case) is granted. You can generate a SAS URL like this:

readonly CloudBlobClient BlobClient;
readonly CloudBlobContainer BlobContainer;

public UploadService()
{
    // Setup the connection to Windows Azure Storage
    var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
    BlobClient = storageAccount.CreateCloudBlobClient();

    // Get and create the container
    BlobContainer = BlobClient.GetContainerReference("publicfiles");
}

string JsonSerializeData(string url)
{
    var serializer = new DataContractJsonSerializer(url.GetType());
    var memoryStream = new MemoryStream();

    serializer.WriteObject(memoryStream, url);

    return Encoding.Default.GetString(memoryStream.ToArray());
}

public string GetUploadUrl()
{
    var sasWithIdentifier = BlobContainer.GetSharedAccessSignature(new SharedAccessPolicy
    {
        Permissions = SharedAccessPermissions.Write,
        SharedAccessExpiryTime =
            DateTime.UtcNow.AddMinutes(60)
    });
    return JsonSerializeData(BlobContainer.Uri.AbsoluteUri + "/" + Guid.NewGuid() + sasWithIdentifier);
}
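To make the string concatenation in GetUploadUrl concrete, here is the same URL assembly sketched in Python (illustrative only; the account name and SAS token below are placeholders, not real credentials):

```python
import uuid

def build_upload_url(container_uri, sas_token):
    # Mirrors GetUploadUrl above: container URI + "/" + a fresh GUID blob
    # name + the SAS token. GetSharedAccessSignature returns the token with
    # a leading '?', which is why plain concatenation works in the C# code.
    return container_uri + "/" + str(uuid.uuid4()) + sas_token

url = build_upload_url(
    "https://myaccount.blob.core.windows.net/publicfiles",  # placeholder account
    "?sv=2012-02-12&sr=c&sp=w&sig=...",                     # placeholder SAS
)
```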

I also have a thread on the subject where you can find more information: How to upload huge files to the Azure blob from a web page

I've used all the tools mentioned in this post, and all work moderately well with block blobs. My favorite, however, is BlobTransferUtility.

By default, BlobTransferUtility only does block blobs. However, by changing just 2 lines of code you can upload page blobs as well. If you, like me, need to upload a virtual machine image, it needs to be a page blob.

(For the difference, please see this MSDN article.)

To upload page blobs, just change lines 53 and 62 of BlobTransferHelper.cs from

new Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob

to

new Microsoft.WindowsAzure.Storage.Blob.CloudPageBlob
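One caveat when switching to CloudPageBlob: a page blob's total length must be a multiple of 512 bytes, so an arbitrary file may need padding before upload. A small Python sketch of the size check (illustrative only; this helper is not part of BlobTransferUtility):

```python
PAGE_SIZE = 512  # page blobs are read and written in 512-byte pages

def padded_page_blob_size(file_size):
    # Round up to the next 512-byte boundary; Azure rejects page blobs
    # whose total length is not a multiple of 512.
    return (file_size + PAGE_SIZE - 1) // PAGE_SIZE * PAGE_SIZE

print(padded_page_blob_size(1000))  # 1024
print(padded_page_blob_size(1024))  # 1024
```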

The only other thing to know about this app is to uncheck HELP when you first run the program, in order to see the actual UI.

Check out this post, Uploading to Azure Storage, which explains how to easily upload any file to Azure Blob Storage via PowerShell.

You can use the AzCopy tool to upload the required files to Azure. The default storage type is block blob; you can change the pattern according to your requirements.

Syntax

AzCopy /Source:<source> /Dest:<destination> /S

Try the Blob Service API:

http://msdn.microsoft.com/en-us/library/dd135733.aspx

However, 400 MB is a large file, and I am not sure a single API call will deal with something of this size; you may need to split it and reconstruct it using custom code.
