How to upload a large file in chunks to Azure Media Services using only C#?
I am trying to upload files to Azure Media Services from a console application. If the file is small, I can use the normal upload method as follows:
static public IAsset CreateAssetAndUploadSingleFile(AssetCreationOptions assetCreationOptions, string singleFilePath)
{
    if (!File.Exists(singleFilePath))
    {
        Console.WriteLine("File does not exist.");
        return null;
    }

    var assetName = Path.GetFileNameWithoutExtension(singleFilePath);
    IAsset inputAsset = _context.Assets.Create(assetName, assetCreationOptions);
    var assetFile = inputAsset.AssetFiles.Create(Path.GetFileName(singleFilePath));

    Console.WriteLine("Upload {0}", assetFile.Name);
    assetFile.Upload(singleFilePath);
    Console.WriteLine("Done uploading {0}", assetFile.Name);

    return inputAsset;
}
This uploads the file to Azure Media Services, but if the file is too large I want to split it into chunks and upload them. What changes should I make to achieve this using only C#?
I have seen a link that explains this using JavaScript (which I don't know): gaurav's link. How can I implement something similar in C#?
Please help.
Have you taken a look at this article already? https://docs.microsoft.com/en-us/azure/media-services/media-services-dotnet-upload-files#upload-multiple-files-with-media-services-net-sdk
Look at the section on uploading multiple files. It allows you to adjust the number of threads used by the BlobTransferClient built into the SDK:
var blobTransferClient = new BlobTransferClient();
blobTransferClient.NumberOfConcurrentTransfers = 20;
blobTransferClient.ParallelTransferThreadCount = 20;
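As a rough sketch of how those settings plug into the question's upload method: in the Media Services v2 .NET SDK, `IAssetFile.UploadAsync` accepts a `BlobTransferClient` together with a write SAS locator on the asset. The `_context` field is assumed to exist as in the question, and the policy name and duration are arbitrary; error handling is omitted for brevity.

```csharp
static public IAsset CreateAssetAndUploadSingleFileChunked(
    AssetCreationOptions assetCreationOptions, string singleFilePath)
{
    var assetName = Path.GetFileNameWithoutExtension(singleFilePath);
    IAsset inputAsset = _context.Assets.Create(assetName, assetCreationOptions);
    var assetFile = inputAsset.AssetFiles.Create(Path.GetFileName(singleFilePath));

    // Tune the transfer client: more concurrent transfers and threads mean
    // more blocks in flight at once for a large file.
    var blobTransferClient = new BlobTransferClient
    {
        NumberOfConcurrentTransfers = 20,
        ParallelTransferThreadCount = 20
    };

    // UploadAsync needs a write SAS locator on the asset's storage container.
    IAccessPolicy policy = _context.AccessPolicies.Create(
        "UploadPolicy", TimeSpan.FromHours(12), AccessPermissions.Write);
    ILocator locator = _context.Locators.CreateSasLocator(inputAsset, policy);

    assetFile.UploadAsync(singleFilePath, blobTransferClient, locator,
        CancellationToken.None).Wait();

    // Clean up the temporary upload locator and policy.
    locator.Delete();
    policy.Delete();

    return inputAsset;
}
```

The SDK splits the file into blocks internally, so no manual chunking code is needed on your side.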
Another option would be to get a SAS URL to upload directly into the asset's container in Blob Storage, and then use the Azure Storage .NET Data Movement library, which is the underlying component of the AzCopy tool. This gives you complete control over parallel uploads, chunk sizes, concurrent threads, etc. All the power and the knobs!
https://github.com/Azure/azure-storage-net-data-movement
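A minimal sketch of that second option, assuming you already have a write SAS URL for the asset's container (the account, container, SAS token, and file paths below are placeholders):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;
using Microsoft.WindowsAzure.Storage.DataMovement;

class ChunkedUploader
{
    static async Task Main()
    {
        // Global knob: number of parallel block uploads across the process.
        TransferManager.Configurations.ParallelOperations = 32;

        // Container addressed by a write SAS URL (placeholder values).
        var container = new CloudBlobContainer(
            new Uri("https://<account>.blob.core.windows.net/<container>?<sas>"));
        CloudBlockBlob destBlob = container.GetBlockBlobReference("bigfile.mp4");

        var context = new SingleTransferContext
        {
            // Report progress as blocks complete.
            ProgressHandler = new Progress<TransferStatus>(
                s => Console.WriteLine($"Uploaded {s.BytesTransferred} bytes"))
        };

        // The library splits the file into blocks and uploads them in parallel.
        await TransferManager.UploadAsync(
            @"C:\media\bigfile.mp4", destBlob, null /* UploadOptions */, context);
        Console.WriteLine("Upload complete.");
    }
}
```

Once the blob is in the asset's container, you would still need to create the corresponding `IAssetFile` metadata in Media Services so the service knows about the uploaded file.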