.NET Web API Blueimp multiple file upload error “Unexpected end of MIME multipart stream. MIME multipart message is not complete.”

I am building a video management app that allows for uploading multiple videos to Azure Storage which is then encoded by Azure Media Services.

My issue is that if I upload just 1 file at a time with blueimp, everything works fine. When I add more than one file to the upload, I get the error on the second file.

Unexpected end of MIME multipart stream. MIME multipart message is not complete.

I have read that it could be due to the stream missing the end-of-file terminator, so I added the suggested tweak to append the line terminator (per this article: ASP.NET Web API, unexpected end of MIME multi-part stream when uploading from Flex FileReference), with no luck.

If I post the files individually (iterating over the files selected for upload and sending each one as its own post), it works. My issue is that I want to select several files, add additional metadata, and hit one submit button. When I do it this way, the first file uploads, the second one appears to start uploading, and then I get the error 500 "Unexpected end of MIME multipart stream. MIME multipart message is not complete" message.

Here is my upload code (Web API):

    [HttpPost]
    public async Task<HttpResponseMessage> UploadMedia()
    {
        HttpResponseMessage result = null;
        var httpRequest = HttpContext.Current.Request;
        if (httpRequest.Headers["content-type"] != null)
        {
            httpRequest.Headers.Remove("content-type");
        }
        httpRequest.Headers.Add("enctype", "multipart/form-data");
        if (httpRequest.Files.Count > 0)
        {
            var docfiles = new List<string>();
            foreach (string file in httpRequest.Files)
            {
                var postedFile = httpRequest.Files[file];
                var filePath = HttpContext.Current.Server.MapPath("~/" + postedFile.FileName);
                string assignedSectionList = string.Empty;
                postedFile.SaveAs(filePath);
                docfiles.Add(filePath);

                string random = Helpers.Helper.RandomDigits(10).ToString();

                string ext = System.IO.Path.GetExtension(filePath);

                string newFileName = (random + ext).ToLower();

                MediaType mediaType = MediaType.Video;
                if (newFileName.Contains(".mp3"))
                {
                    mediaType = MediaType.Audio;
                }

                if (httpRequest.Form["sectionList"] != null)
                {
                    assignedSectionList = httpRequest.Form["sectionList"];
                }

                MediaUploadQueue mediaUploadQueueItem = new MediaUploadQueue();
                mediaUploadQueueItem.OriginalFileName = postedFile.FileName;
                mediaUploadQueueItem.FileName = newFileName;
                mediaUploadQueueItem.UploadedDateTime = DateTime.UtcNow;
                mediaUploadQueueItem.LastUpdatedDateTime = DateTime.UtcNow;
                mediaUploadQueueItem.Status = "pending";
                mediaUploadQueueItem.Size = postedFile.ContentLength;
                mediaUploadQueueItem.Identifier = random;
                mediaUploadQueueItem.MediaType = mediaType;
                mediaUploadQueueItem.AssignedSectionList = assignedSectionList;
                db.MediaUploadQueue.Add(mediaUploadQueueItem);
                db.SaveChanges();

     

                byte[] chunk = new byte[httpRequest.ContentLength];
                httpRequest.InputStream.Read(chunk, 0, Convert.ToInt32(httpRequest.ContentLength));
                var provider = new AzureStorageMultipartFormDataStreamProviderNoMod(new AzureMediaServicesHelper().container);
                provider.fileNameOverride = newFileName;
                await Request.Content.ReadAsMultipartAsync(provider); //this uploads it to the storage account
                

                AzureMediaServicesHelper amsHelper = new AzureMediaServicesHelper();
                string assetId = amsHelper.CommitAsset(mediaUploadQueueItem); //begin the process of encoding the file

                mediaUploadQueueItem.AssetId = assetId;
                db.SaveChanges();

                ////start the encoding
                amsHelper.EncodeAsset(assetId);

            }
            result = Request.CreateResponse(HttpStatusCode.Created, docfiles);

        }
        else
        {
            result = Request.CreateResponse(HttpStatusCode.BadRequest);
        }
        return result;
    }

Here is the code for the upload handler (the custom stream provider) that sends the files to Azure Blob Storage:

    public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
    {
        if (parent == null) throw new ArgumentNullException(nameof(parent));
        if (headers == null) throw new ArgumentNullException(nameof(headers));

        if (!_supportedMimeTypes.Contains(headers.ContentType.ToString().ToLower()))
        {
            throw new NotSupportedException("Only jpeg and png are supported");
        }

        // Generate a new filename for every new blob
        var fileName = Guid.NewGuid().ToString();

        if (!String.IsNullOrEmpty(fileNameOverride))
            fileName = fileNameOverride;

        CloudBlockBlob blob = _blobContainer.GetBlockBlobReference(fileName);

        if (headers.ContentType != null)
        {
            // Set appropriate content type for your uploaded file
            blob.Properties.ContentType = headers.ContentType.MediaType;
        }

        this.FileData.Add(new MultipartFileData(headers, blob.Name));

        return blob.OpenWrite();
    }
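
For context, this GetStream override lives inside a custom MultipartFormDataStreamProvider subclass. A minimal sketch of what that enclosing class might look like is below; the field names, the MIME type list, and the constructor shape are assumptions filled in to make the override readable, not the question's actual code.

using System.IO;
using System.Net.Http;
using Microsoft.WindowsAzure.Storage.Blob;

// Sketch only: an assumed outline of the provider class around the GetStream override above.
public class AzureStorageMultipartFormDataStreamProviderNoMod : MultipartFormDataStreamProvider
{
    // Assumed fields referenced by GetStream; the real MIME type list is not shown in the question.
    private readonly string[] _supportedMimeTypes = { "video/mp4", "audio/mpeg" };
    private readonly CloudBlobContainer _blobContainer;

    // Set by the controller to force the blob name (provider.fileNameOverride above).
    public string fileNameOverride { get; set; }

    public AzureStorageMultipartFormDataStreamProviderNoMod(CloudBlobContainer blobContainer)
        : base(Path.GetTempPath())
    {
        _blobContainer = blobContainer;
    }

    // public override Stream GetStream(HttpContent parent, HttpContentHeaders headers) { ... } // shown above
}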

Here is the JavaScript code. This first version sends the files individually as separate posts, and it works.

$("#fileupload").fileupload({
    autoUpload: false,
    dataType: "json",
    add: function (e, data) {
        data.context = $('<p class="file">')
            .append($('<a target="_blank">').text(data.files[0].name))
            .appendTo(document.body);
        data.submit();
    },
    progress: function (e, data) {
        var progress = parseInt((data.loaded / data.total) * 100, 10);
        data.context.css("background-position-x", 100 - progress + "%");
    },
    done: function (e, data) {
        data.context
            .addClass("done")
            .find("a")
            .prop("href", data.result.files[0].url);
    }
});

The code below does not work. It pushes all the files into an array and sends them in a single post. It fails on the second file; if I upload just one file using this code, it works.

var filesList = new Array();
$(function () {
    $('#fileupload').fileupload({
        autoUpload: false,
        dropZone: $('#dropzone'),
        add: function (e, data) {
            filesList.push(data.files[0]);
            data.context = $('<div class="file"/>', { class: 'thumbnail pull-left' }).appendTo('#files');
            var node = $('<p />').append($('<span/>').text(data.files[0].name).data(data));
            node.appendTo(data.context);
        },
        progress: function (e, data) { //Still working on this part
            //var progress = parseInt((data.loaded / data.total) * 100, 10);
            //data.context.css("background-position-x", 100 - progress + "%");
        },
    }).on('fileuploadprocessalways', function (e, data) {
        var index = data.index,
            file = data.files[index],
            node = $(data.context.children()[index]);
        if (file.preview) {
            node.prepend('<br>').prepend(file.preview);
        }
        if (file.error) {
            node.append('<br>').append($('<span class="text-danger"/>').text(file.error));
        }
    }).prop('disabled', !$.support.fileInput)
        .parent().addClass($.support.fileInput ? undefined : 'disabled');
    $("#uploadform").submit(function (event) {
        if (filesList.length > 0) {
            console.log("multi file submit");
            event.preventDefault();
            $('#fileupload').fileupload('send', { files: filesList })
                .success(function (result, textStatus, jqXHR) { console.log('success'); })
                .error(function (jqXHR, textStatus, errorThrown) { console.log('error'); })
                .complete(function (result, textStatus, jqXHR) {
                    console.log('complete: ' + JSON.stringify(result)); //The error 500 is returned here. In fiddler, it shows and error 500. If I try to trap in Visual Studio, I can't seem to pinpoint the exception.
                    // window.location='back to view-page after submit?'
                });
        } else {
            console.log("plain default form submit");
        }
    });
});

Any thoughts on why this would be happening? I have tried every approach I can think about with no luck. Thank you in advance!

I want to point out that the architecture of your code might cause timeouts or errors.

I would first upload everything to Azure Storage and store the status in a cache or database.

Then I would fire a background job (Hangfire, Azure Functions, WebJobs) to handle the upload to Media Services and the rest of the processing.

I would suggest doing this asynchronously, decoupled from the user's request.
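
As a rough illustration of that split, here is a minimal sketch assuming Hangfire; the MediaEncodingJob class and the use of the queue row's Identifier as the job argument are assumptions, not code from the question.

using Hangfire;

// Hypothetical background job that runs the Media Services steps outside the upload request.
public class MediaEncodingJob
{
    public void Process(string identifier)
    {
        // Look up the pending MediaUploadQueue row by Identifier, call CommitAsset and
        // EncodeAsset here, then update the row's Status and AssetId.
    }
}

public class UploadFlowSketch
{
    // Called from the upload action once the blob is written and the queue row is saved,
    // instead of committing and encoding the asset inline.
    public void QueueEncoding(string identifier)
    {
        BackgroundJob.Enqueue<MediaEncodingJob>(job => job.Process(identifier));
    }
}

The upload request can then return as soon as the file is in storage, and the encoding status can be read back from the MediaUploadQueue table.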

As per the Dropzone documentation, make sure you add a name to the HTML tag:

<form action="/file-upload" class="dropzone">
  <div class="fallback">
    <input name="file" type="file" multiple />
  </div>
</form>

If you are doing it programmatically:

function param() {
    return "files";
}

Dropzone.options.myDropzone = {
    uploadMultiple: true,
    paramName: param
};

On the backend, you need to append \r\n after each stream:
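
A minimal sketch of that workaround, assuming ASP.NET Web API 2 (the helper name and its generic shape here are mine, not from the answer): buffer the request body, append a trailing CRLF so the final multipart boundary line is terminated, copy the original content headers across (most importantly Content-Type with its boundary), and then parse the repaired content.

using System;
using System.IO;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class MultipartTerminatorFix
{
    // Sketch: re-wrap the request content with a trailing CRLF so that
    // ReadAsMultipartAsync sees a properly terminated multipart message.
    public static async Task<T> ReadWithTerminatorAsync<T>(HttpContent original, T provider)
        where T : MultipartStreamProvider
    {
        byte[] body = await original.ReadAsByteArrayAsync();
        byte[] crlf = Encoding.ASCII.GetBytes("\r\n");

        var buffer = new byte[body.Length + crlf.Length];
        Buffer.BlockCopy(body, 0, buffer, 0, body.Length);
        Buffer.BlockCopy(crlf, 0, buffer, body.Length, crlf.Length);

        var content = new StreamContent(new MemoryStream(buffer));
        foreach (var header in original.Headers)
        {
            // Preserve the original headers, in particular Content-Type with its boundary.
            content.Headers.TryAddWithoutValidation(header.Key, header.Value);
        }

        return await content.ReadAsMultipartAsync(provider);
    }
}

In the controller above, the call would then look something like await MultipartTerminatorFix.ReadWithTerminatorAsync(Request.Content, provider) in place of the direct Request.Content.ReadAsMultipartAsync(provider).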
