
ASP.NET MVC/WEB API File Upload: Memory not being deallocated after upload

I'm investigating a possible memory leak in a project where users upload files. The files are usually .zip archives or zipped .exe files used by other software, and the average file size is 80MB.

There is an MVC app with a view that provides the interface to upload the files. This view sends a POST request to a controller action. The controller action gets the file and forwards it using MultipartFormDataContent, similar to this: Sending binary data along with a REST API request and this: WEB API FILE UPLOAD, SINGLE OR MULTIPLE FILES

Inside the action, I read the file and convert it to a byte array. After converting, I send a POST request to my API with the byte[].

Here is the MVC APP code that does that:

[HttpPost]
public async Task<ActionResult> Create(ReaderCreateViewModel model)
{
    HttpPostedFileBase file = Request.Files["Upload"];

    string fileName = file.FileName;

    using (var client = new HttpClient())
    {
        using (var content = new MultipartFormDataContent())
        {
            // Buffer the entire upload into a byte array on the model.
            using (var binaryReader = new BinaryReader(file.InputStream))
            {
                model.File = binaryReader.ReadBytes(file.ContentLength);
            }

            var fileContent = new ByteArrayContent(model.File);
            fileContent.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
            {
                FileName = file.FileName
            };
            content.Add(fileContent);

            // Forward the file to the separate Web API site. Awaiting instead
            // of calling .Result avoids blocking the request thread and a
            // potential deadlock on ASP.NET's synchronization context.
            var requestUri = "http://localhost:52970/api/upload";
            HttpResponseMessage response = await client.PostAsync(requestUri, content);

            if (response.IsSuccessStatusCode)
            {
                return RedirectToAction("Index");
            }
        }
    }

    return View("Index", model);
}

After investigating with several memory tools (such as those described in Best Practices No. 5: Detecting .NET application memory leaks), I have discovered that the memory spikes when the file is converted to a byte array at this line:

using (var binaryReader = new BinaryReader(file.InputStream))
{
    model.File = binaryReader.ReadBytes(file.ContentLength);
}

The memory usage increases from roughly 70MB to roughly 175MB, and even after the request has been sent and finished, the memory is never deallocated. If I keep uploading files, the memory just keeps increasing until the server is completely down.

We can't send the files directly from a multipart form to the API because we need to send and validate some data first (business requirements/rules). After researching, I came up with this approach, but the memory leak problem is concerning me.

Am I missing something? Should the garbage collector collect the memory right away? I wrap every disposable object in a using statement, but it doesn't help.

I'm also curious about this approach to uploading the files. Should I be doing it in a different way?

Just for clarification, the API is separate from the MVC application (each is hosted as its own web site in IIS), and everything is written in C#.

1. Should the garbage collector collect the memory right away?

The garbage collector does not release memory immediately because collection is a time-consuming operation: while it runs, all of your application's managed threads are paused, which introduces unwanted latency. So the garbage collector acts only occasionally, based on a sophisticated heuristic.
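
You can see this for yourself with a minimal console sketch (my own example, not from your code): an unreachable array is still reported as allocated until a collection actually runs.

using System;

class GcTimingDemo
{
    static void Main()
    {
        // Allocate roughly 80 MB, comparable to an average upload.
        var buffer = new byte[80 * 1024 * 1024];
        buffer[0] = 1;   // touch the array so the pages are committed
        buffer = null;   // the array is now unreachable...

        // ...but the runtime still reports it as allocated:
        Console.WriteLine("Before collection: {0:N0} bytes", GC.GetTotalMemory(false));

        // Passing true forces a full collection before measuring.
        Console.WriteLine("After collection:  {0:N0} bytes", GC.GetTotalMemory(true));
    }
}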

2. In all disposable objects I'm using the "using" syntax but it doesn't help.

The using statement only guarantees that Dispose is called, which releases unmanaged resources in limited supply (usually IO-related: file handles, database and network connections). It does not free managed memory, so it has no direct effect on garbage collection; your byte arrays are reclaimed only when the collector decides to run.
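
To make that concrete, here is roughly what the compiler turns a using statement into (a simplified sketch; the demo types are my own):

using System;
using System.IO;

class UsingLoweringDemo
{
    static void Main()
    {
        var stream = new MemoryStream(new byte[] { 1, 2, 3 });

        // using (var reader = new BinaryReader(stream)) { ... }
        // compiles down to approximately this try/finally:
        var reader = new BinaryReader(stream);
        try
        {
            Console.WriteLine(reader.ReadByte());
        }
        finally
        {
            // Dispose releases the underlying stream/handle. Any managed
            // byte[] created inside the block is NOT freed here; only the
            // garbage collector reclaims managed memory.
            ((IDisposable)reader).Dispose();
        }
    }
}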

3. Am I missing something?

It looks like you do not need the original byte array once it has been wrapped in a ByteArrayContent. However, you never clear model.File after wrapping it, and the array can even be passed back to the Index view, which keeps it reachable.

I would replace:

using (var binaryReader = new BinaryReader(file.InputStream))
{
    model.File = binaryReader.ReadBytes(file.ContentLength);
}
var fileContent = new ByteArrayContent(model.File);

with:

ByteArrayContent fileContent = null;
using (var binaryReader = new BinaryReader(file.InputStream))
{
    fileContent = new ByteArrayContent(binaryReader.ReadBytes(file.ContentLength));
}

to avoid the need to clean up model.File explicitly.

4. If I keep uploading files, the memory just keeps increasing until the server is completely down.

If your files are 80MB on average, the byte arrays end up on the large object heap (LOH): every allocation of 85,000 bytes or more goes there. The LOH is collected only during full (generation 2) collections and is not compacted by default, so it can fragment and, as in your case, appear to grow indefinitely.
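
You can observe the threshold directly: objects of 85,000 bytes or more are reported as generation 2 immediately, because the LOH is logically part of generation 2 (a small sketch of my own):

using System;

class LohDemo
{
    static void Main()
    {
        var small = new byte[80 * 1024];  // under the threshold: small object heap
        var large = new byte[85000];      // 85,000 bytes or more: large object heap

        Console.WriteLine(GC.GetGeneration(small)); // typically 0 (freshly allocated)
        Console.WriteLine(GC.GetGeneration(large)); // 2, the LOH's logical generation
    }
}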

Provided you are using (or can upgrade to) .NET 4.5.1 or newer, you can force the large object heap to be compacted by setting:

System.Runtime.GCSettings.LargeObjectHeapCompactionMode = System.Runtime.GCLargeObjectHeapCompactionMode.CompactOnce;

You will need to invoke this line of code each time you want to schedule a large object heap compaction at the next full garbage collection.

You can also force an immediate compaction by calling:

System.Runtime.GCSettings.LargeObjectHeapCompactionMode = System.Runtime.GCLargeObjectHeapCompactionMode.CompactOnce;
System.GC.Collect();

However, if you need to free a lot of memory, this will be a costly operation in terms of time.
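
As for whether you should upload differently: if your pre-upload validation does not need the raw bytes in memory, you could forward the upload as a stream with StreamContent instead of buffering it into a byte[]. That keeps the 80MB allocations off the large object heap entirely. A sketch reusing the names from your question (adapt it to your validation flow):

// Inside the same async Create action; no byte[] is ever materialized.
HttpPostedFileBase file = Request.Files["Upload"];

using (var client = new HttpClient())
using (var content = new MultipartFormDataContent())
{
    var fileContent = new StreamContent(file.InputStream);
    fileContent.Headers.ContentDisposition =
        new ContentDispositionHeaderValue("attachment") { FileName = file.FileName };
    content.Add(fileContent);

    HttpResponseMessage response =
        await client.PostAsync("http://localhost:52970/api/upload", content);
    // check response.IsSuccessStatusCode as before
}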
