
ASP.NET MVC / WEB API File Upload: Memory not being deallocated after upload

I'm investigating a possible memory leak problem on a project where users upload files. The files are usually .zip or .exe archives that are used by other software. The average file size is 80 MB.

There is an MVC app that has the interface to upload the files (the View). This view sends a POST request to an action within a controller. This controller action gets the file using MultipartFormDataContent, similar to this: Sending binary data along with a REST API request and this: WEB API FILE UPLOAD, SINGLE OR MULTIPLE FILES

Inside the action, I get the file and convert it to a byte array. After converting, I send a POST request to my API with the byte[] array.

Here is the MVC app code that does that:

    [HttpPost]
    public async Task<ActionResult> Create(ReaderCreateViewModel model)
    {
        HttpPostedFileBase file = Request.Files["Upload"];

        string fileName = file.FileName;

        using (var client = new HttpClient())
        {
            using (var content = new MultipartFormDataContent())
            {                   
                using (var binaryReader = new BinaryReader(file.InputStream))
                {
                    model.File = binaryReader.ReadBytes(file.ContentLength);
                }

                var fileContent = new ByteArrayContent(model.File);
                fileContent.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
                {
                    FileName = file.FileName
                };
                content.Add(fileContent);

                var requestUri = "http://localhost:52970/api/upload";
                HttpResponseMessage response = client.PostAsync(requestUri, content).Result;

                if (response.IsSuccessStatusCode)
                {                       
                    return RedirectToAction("Index");
                }
            }
        }

        return View("Index", model);
    }

After investigating with several memory tools, such as this: Best Practices No. 5: Detecting .NET application memory leaks, I have discovered that after converting the file to a byte array at this line:

using (var binaryReader = new BinaryReader(file.InputStream))
{
      model.File = binaryReader.ReadBytes(file.ContentLength);
}

The memory usage increases from roughly 70 MB to roughly 175 MB, and even after the request has been sent and completed, the memory is never deallocated. If I keep uploading files, the memory just keeps increasing until the server is completely down.

We can't send the files directly from a multipart form to the API because we need to send and validate some data first (business requirements/rules). After researching, I came up with this approach, but the memory leak problem is concerning me.

Am I missing something? Should the garbage collector collect the memory right away? I'm using the "using" statement on all disposable objects, but it doesn't help.

I'm also curious about this approach to uploading the files. Should I be doing it a different way?

Just for clarification, the API is separate from the MVC application (each one is hosted on a separate web site in IIS), and it is all in C#.

1. Should the garbage collector collect the memory right away?

The garbage collector does not release memory immediately, because doing so is a time-consuming operation. When a garbage collection occurs, all of your application's managed threads are paused, which introduces unwanted latency. So the garbage collector only acts occasionally, based on a sophisticated algorithm.
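A rough way to see this (an illustrative sketch, not part of the original answer) is to compare GC.GetTotalMemory before and after forcing a collection once a large buffer has become unreachable:

// Simulate dropping an ~80 MB upload buffer (size chosen to mirror the question).
var buffer = new byte[80 * 1024 * 1024];
buffer = null; // no longer reachable from this point on

// No collection has necessarily run yet, so the bytes are usually still counted...
System.Console.WriteLine(System.GC.GetTotalMemory(forceFullCollection: false));

// ...passing true forces a full collection first, after which the figure drops.
System.Console.WriteLine(System.GC.GetTotalMemory(forceFullCollection: true));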

2. In all disposable objects I'm using the "using" syntax but it doesn't help.

The using statement deals with unmanaged resources, which are in limited supply (usually IO-related: file handles, database connections, network connections). Thus, the using statement does not affect when managed memory, such as your byte[] buffers, is reclaimed.
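To illustrate the distinction (a minimal sketch with a hypothetical file path): disposing the reader releases the underlying stream handle right away, but the managed byte[] it produced stays on the heap until a garbage collection actually runs.

byte[] data;
using (var stream = System.IO.File.OpenRead(@"C:\temp\sample.zip")) // hypothetical path
using (var reader = new System.IO.BinaryReader(stream))
{
    data = reader.ReadBytes((int)stream.Length);
} // Dispose runs here: the file handle is released immediately...
// ...but 'data' (potentially tens of MB) remains allocated until the GC reclaims it.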

3. Am I missing something?

It looks like you do not need the original byte array after you have wrapped it in ByteArrayContent. You do not clean up model.File after wrapping it, and the array can end up being passed along to the Index view.

I would replace:

using(var binaryReader = new BinaryReader(file.InputStream)) {
    model.File = binaryReader.ReadBytes(file.ContentLength);
}
var fileContent = new ByteArrayContent(model.File);

with:

ByteArrayContent fileContent = null;
using(var binaryReader = new BinaryReader(file.InputStream)) {
    fileContent = new ByteArrayContent(binaryReader.ReadBytes(file.ContentLength));
}

to avoid the need to clean up model.File explicitly.
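Putting it together, here is a sketch of how the whole Create action could look with that change applied (same names and endpoint as in the question; I have also awaited PostAsync instead of blocking on .Result, since the action is already async — that part is my assumption and is unrelated to the memory issue):

[HttpPost]
public async Task<ActionResult> Create(ReaderCreateViewModel model)
{
    HttpPostedFileBase file = Request.Files["Upload"];

    using (var client = new HttpClient())
    using (var content = new MultipartFormDataContent())
    {
        // Wrap the bytes directly; no second reference is kept on the model,
        // so the array becomes unreachable as soon as the request has completed.
        ByteArrayContent fileContent;
        using (var binaryReader = new BinaryReader(file.InputStream))
        {
            fileContent = new ByteArrayContent(binaryReader.ReadBytes(file.ContentLength));
        }

        fileContent.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
        {
            FileName = file.FileName
        };
        content.Add(fileContent);

        var requestUri = "http://localhost:52970/api/upload";
        HttpResponseMessage response = await client.PostAsync(requestUri, content);

        if (response.IsSuccessStatusCode)
        {
            return RedirectToAction("Index");
        }
    }

    return View("Index", model);
}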

4. If I keep uploading files, the memory just keeps increasing until the server is completely down.

If your files are 80 MB on average, they end up on the large object heap (LOH). The LOH is not compacted automatically and is only collected during full (generation 2) garbage collections. It looks like in your case the large object heap grows indefinitely (which can happen).
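For reference, arrays of roughly 85,000 bytes or more are allocated straight on the LOH, and the runtime reports them as generation 2 objects. A quick illustrative check:

var small = new byte[80000]; // below the ~85,000-byte threshold: small object heap
var large = new byte[90000]; // above the threshold: large object heap

System.Console.WriteLine(System.GC.GetGeneration(small)); // typically 0 (just allocated)
System.Console.WriteLine(System.GC.GetGeneration(large)); // 2: LOH objects are treated as gen 2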

Provided you are using (or can upgrade to) .NET 4.5.1 or newer, you can force the large object heap to be compacted by setting:

System.Runtime.GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;

You will need to invoke this line of code each time you want to schedule a large object heap compaction at the next full garbage collection.

You can also force an immediate compaction by calling:

System.Runtime.GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
System.GC.Collect();

However, if you need to free a lot of memory, this will be a costly operation in terms of time.
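As an illustration only (the placement is my assumption, not part of the answer above), the compaction could be scheduled right after a successful upload in the controller action:

if (response.IsSuccessStatusCode)
{
    // Expensive: schedules LOH compaction for the next full GC and triggers one now.
    // Doing this on every single upload would add noticeable pauses.
    System.Runtime.GCSettings.LargeObjectHeapCompactionMode =
        System.Runtime.GCLargeObjectHeapCompactionMode.CompactOnce;
    System.GC.Collect();

    return RedirectToAction("Index");
}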
