
Maximum request length exceeded exception when UploadFromStream to Azure blobs

I'm writing an API that uploads the HTTP POST content (in my case a video) to Azure Blob Storage through a stream in C#. Something like this:

Stream videoStream = await Request.Content.ReadAsStreamAsync();
CloudBlockBlob blockBlob = container.GetBlockBlobReference(blobName);
blockBlob.UploadFromStream(videoStream);

It works fine in my local environment, but when I deploy the web service to Azure and call the API, I get a "Maximum request length exceeded" exception like this:

System.Web.HttpException (0x80004005): Maximum request length exceeded.           
at System.Web.HttpBufferlessInputStream.ValidateRequestEntityLength()
at System.Web.HttpBufferlessInputStream.GetPreloadedContent(Byte[] buffer, Int32& offset, Int32& count)
at System.Web.HttpBufferlessInputStream.Read(Byte[] buffer, Int32 offset, Int32 count)
at Microsoft.Owin.Host.SystemWeb.CallStreams.DelegatingStream.Read(Byte[] buffer, Int32 offset, Int32 count)
at System.Web.Http.NonOwnedStream.Read(Byte[] buffer, Int32 offset, Int32 count)
at System.Net.Http.DelegatingStream.Read(Byte[] buffer, Int32 offset, Int32 count)
at Microsoft.WindowsAzure.Storage.Core.Util.StreamExtensions.WriteToSync[T](Stream stream, Stream toStream, Nullable`1 copyLength, Nullable`1 maxLength, Boolean calculateMd5, Boolean syncRead, ExecutionState`1 executionState, StreamDescriptor streamCopyState)
at Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob.UploadFromStreamHelper(Stream source, Nullable`1 length, AccessCondition accessCondition, BlobRequestOptions options, OperationContext operationContext)
at Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob.UploadFromStream(Stream source, AccessCondition accessCondition, BlobRequestOptions options, OperationContext operationContext)

I've already changed my Web.config according to some tutorials on the web:

<system.web>
  <httpRuntime targetFramework="4.5" maxRequestLength="600000" executionTimeout="3600" />
  <compilation debug="true" targetFramework="4.5" />
</system.web>

<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="600000000" />
    </requestFiltering>
  </security>
</system.webServer>

And the API works correctly on my local machine, even with large videos (over 10 MB), but it stops working as soon as I deploy it to the Azure Web App. Why is that?

Update: I tried some other approaches to work around the problem. For example, I copied the HTTP request content to a FileStream first and then uploaded that FileStream to the Azure blob, like this:

using (Stream output = File.OpenWrite(TmpFile))
{
    await Request.Content.CopyToAsync(output);
}

CloudBlockBlob blockBlob = container.GetBlockBlobReference(blobName);

using (var fileStream = System.IO.File.OpenRead(TmpFile))
{
    blockBlob.UploadFromStream(fileStream);
}

But the same "Maximum request length exceeded" exception is thrown, this time from CopyToAsync:

System.Web.HttpException (0x80004005): Maximum request length exceeded.
at System.Web.HttpBufferlessInputStream.ValidateRequestEntityLength()
at System.Web.HttpBufferlessInputStream.GetPreloadedContent(Byte[] buffer, Int32& offset, Int32& count)
at System.Web.HttpBufferlessInputStream.BeginRead(Byte[] buffer, Int32 offset, Int32 count, AsyncCallback callback, Object state)
at Microsoft.Owin.Host.SystemWeb.CallStreams.DelegatingStream.BeginRead(Byte[] buffer, Int32 offset, Int32 count, AsyncCallback callback, Object state)
at System.Web.Http.NonOwnedStream.BeginRead(Byte[] buffer, Int32 offset, Int32 count, AsyncCallback callback, Object state)
at System.Net.Http.StreamToStreamCopy.StartRead()

The changes are to the Web.config file.

The above solution works. Rather than making you download the solution and search through it to find what changes are necessary, here are just the code changes that are needed. Add the following between the <system.web> tags, which should already exist with other settings inside them.

<system.web>
  <compilation debug="true" targetFramework="4.5.2" />
  <httpRuntime targetFramework="4.5.2" maxRequestLength="600000" executionTimeout="3600" />
</system.web>

Then add the following between the <system.webServer> tags, which should also already exist with other settings inside them.

<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="600000000" />
    </requestFiltering>
  </security>
</system.webServer>

After that I sent a 12.5 MB .zip file as a blob to Azure Blob Storage, using the sample code provided by Azure:

// Retrieve storage account from connection string.
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(
    CloudConfigurationManager.GetSetting("StorageConnectionString"));

// Create the blob client.
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Retrieve reference to a previously created container.
CloudBlobContainer container = blobClient.GetContainerReference("mycontainer");

// Retrieve reference to a blob named "myblob".
CloudBlockBlob blockBlob = container.GetBlockBlobReference("myblob");

// Create or overwrite the "myblob" blob with contents from a local file.
using (var fileStream = System.IO.File.OpenRead(@"path\myfile"))
{
    blockBlob.UploadFromStream(fileStream);
}  

I hope this helps someone.

Here's a sample that uploads a random 20MB byte array from a console client to a Web API controller, which streams the HTTP request content into Azure blob storage:

sample

Note that it buffers on the client side (builds the entire 20MB byte array in memory), but it does not buffer on the server.

I tested this successfully with the client running on my laptop, and the server running both locally and in an Azure Web App (no extra config, just published the Web API project into the Azure web app).
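If the sample link doesn't work for you, a minimal sketch of the same idea is below, assuming Web API 2 and the WindowsAzure.Storage client. The controller name, container name, blob naming, the NoInputBufferPolicy class, and the client URL are illustrative placeholders, not necessarily what the linked sample uses.

// Server side (Web API controller): stream the request body straight into a
// block blob instead of buffering it.
using System;
using System.IO;
using System.Threading.Tasks;
using System.Web.Http;
using Microsoft.Azure;                      // CloudConfigurationManager (Microsoft.WindowsAzure.ConfigurationManager package)
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class UploadController : ApiController
{
    public async Task<IHttpActionResult> Post()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(
            CloudConfigurationManager.GetSetting("StorageConnectionString"));
        CloudBlobContainer container = account.CreateCloudBlobClient()
            .GetContainerReference("uploads");          // placeholder container name
        await container.CreateIfNotExistsAsync();

        CloudBlockBlob blob = container.GetBlockBlobReference(Guid.NewGuid().ToString());

        // Hand the raw request stream to the storage client, which uploads it
        // in blocks rather than loading the whole body into memory.
        using (Stream body = await Request.Content.ReadAsStreamAsync())
        {
            await blob.UploadFromStreamAsync(body);
        }

        return Ok(blob.Uri.ToString());
    }
}

// One way to keep Web API from buffering the request on the server is to swap
// the host buffer policy at startup, e.g. in WebApiConfig.Register:
//   config.Services.Replace(typeof(System.Web.Http.Hosting.IHostBufferPolicySelector),
//                           new NoInputBufferPolicy());
public class NoInputBufferPolicy : System.Web.Http.WebHost.WebHostBufferPolicySelector
{
    public override bool UseBufferedInputStream(object hostContext)
    {
        return false;   // read the request with the bufferless input stream
    }
}

And a console client that builds the 20 MB payload in memory and posts it:

using System;
using System.Net.Http;
using System.Threading.Tasks;

class Program
{
    static void Main()
    {
        UploadAsync().GetAwaiter().GetResult();
    }

    // Build a random 20 MB payload (buffered on the client) and POST it to the
    // API. The URL is a placeholder for wherever the service is deployed.
    static async Task UploadAsync()
    {
        var payload = new byte[20 * 1024 * 1024];
        new Random().NextBytes(payload);

        using (var http = new HttpClient())
        using (var content = new ByteArrayContent(payload))
        {
            HttpResponseMessage response = await http.PostAsync(
                "https://myapp.azurewebsites.net/api/upload", content);
            response.EnsureSuccessStatusCode();
        }
    }
}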

Hope this helps... best of luck!
