DbContext OutOfMemoryException

I have a DbContext with a dataset of more than 20M records that has to be converted to a different data format. Therefore, I read the data into memory, perform some tasks, and then dispose of the DbContext. The code works fine, but after a while I get OutOfMemoryExceptions. I have been able to narrow it down to the following piece of code, where I retrieve 2M records, release them, and fetch them again. The first retrieval works just fine; the second one throws an exception.

// first call runs fine
using (var dbContext = new CustomDbContext())
{
    var list = dbContext.Items.Take(2000000).ToArray();
    foreach (var item in list)
    {
        // perform conversion tasks...
        item.Converted = true;
    }
}

// second call throws exception
using (var dbContext = new CustomDbContext())
{
    var list = dbContext.Items.Take(2000000).ToArray();
    foreach (var item in list)
    {
        // perform conversion tasks...
        item.Converted = true;
    }
}

Shouldn't the GC automatically release all memory allocated in the first using block, so that the second block runs as well as the first one?

In my actual code, I do not retrieve 2 million records at once, but somewhere between 0 and 30K in each iteration. However, after about 15 minutes I run out of memory, although all objects should have been released.

I suspect you have run into the large object heap (LOH). Your arrays are probably bigger than the threshold and end up there, and since the LOH is not compacted by default, the GC doesn't help.
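
To make the LOH point concrete, here is a minimal sketch of the threshold behavior (the ~85,000-byte cutoff is the documented .NET default; the array sizes are made up for illustration):

// Allocations of roughly 85,000 bytes or more go to the large object heap,
// which the GC sweeps but, by default, never compacts.
var smallArray = new byte[80000]; // small object heap: compacted as usual
var largeArray = new byte[90000]; // large object heap: can fragment over time

// The backing array behind ToArray()/ToList() for 2M entities holds
// 2,000,000 references * 8 bytes = ~16 MB, so it also lands on the LOH.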

Try this: https://www.simple-talk.com/dotnet/.net-framework/large-object-heap-compaction-should-you-use-it/

and see if your exception goes away.

i.e., add this between the first and second part:

// Requires .NET Framework 4.5.1 or later; GCSettings lives in System.Runtime
GCSettings.LargeObjectHeapCompactionMode = GCLargeObjectHeapCompactionMode.CompactOnce;
GC.Collect();

IEnumerable has GetEnumerator(), so you could try this to avoid .ToArray() or .ToList(), which aren't necessary if you just want to read:

// first call
using (var dbContext = new CustomDbContext())
{
    foreach (var item in dbContext.Items.Take(2000000))
    {
        // perform conversion tasks...
        item.Converted = true;
    }
}

// second call
using (var dbContext = new CustomDbContext())
{
    foreach (var item in dbContext.Items.Take(2000000))
    {
        // perform conversion tasks...
        item.Converted = true;
    }
}
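
One caveat worth adding here (this is default Entity Framework behavior, assumed to apply to the unseen CustomDbContext): even when you stream the query, the context's change tracker keeps a reference to every entity it materializes, so memory still grows over the loop. If the items only need to be read, AsNoTracking() avoids that; a minimal sketch:

using (var dbContext = new CustomDbContext())
{
    // AsNoTracking() stops the change tracker from rooting each entity,
    // so items become eligible for collection as soon as they are processed.
    foreach (var item in dbContext.Items.AsNoTracking().Take(2000000))
    {
        // perform conversion tasks...
        item.Converted = true; // untracked: SaveChanges() would not persist this
    }
}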

Running the GC will not help you; you have to run each iteration in a different context, and dispose of your context.

// ID is your primary key

long startID = 0;
while(true){
    using(var db = new CustomDbContext()){
        var slice = db.Items.Where(x=>x.ID > startID)
                            .OrderBy(x=>x.ID)
                            .Take(1000).ToList();

        // stop if there is nothing to process
        if(!slice.Any())
             break;

        foreach(var item in slice){
            // your logic...
            item.Converted = true;
        }

        startID = slice.Last().ID;
    }
}
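
Filtering on ID instead of using Skip() keeps every query equally cheap no matter how far the loop has progressed, and disposing the context after each slice discards its change tracker. Note that, like the question's code, this snippet never calls SaveChanges(); if the Converted flag has to be persisted, a db.SaveChanges() at the end of the using block would be needed.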

If you want to process these things faster, an alternative approach would be to run slices in parallel...

Alternate Approach

I would recommend dividing the work 100x100: fetch 10,000 IDs at a time, then process 100 slices of 100 items each in parallel.

You can always easily customize the slicing to meet your speed needs.

// Declared as an extension method in a static class so it can be
// called as src.Slice(100) below.
public static IEnumerable<IEnumerable<T>> Slice<T>(this IEnumerable<T> src, int size){
     while(src.Any()){
         var s = src.Take(size);
         src = src.Skip(size);
         yield return s;
     }
}
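
Note that Skip and Take are lazy, so later slices walk an increasingly long chain of Skip iterators when enumerated. That is affordable here because the source below is a materialized list of at most 10,000 IDs, but it would be expensive on a large lazy sequence.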

long startID = 0;
while(true){
    using(var db = new CustomDbContext()){
        var src = db.Items.Where(x=>x.ID > startID)
                            .OrderBy(x=>x.ID)
                            .Take(10000).Select(x=>x.ID).ToList();

        // stop if there is nothing to process
        if(!src.Any())
             break;

        Parallel.ForEach(src.Slice(100), slice => {

             using(var sdb = new CustomDbContext()){
                 foreach(var item in sdb.Items.Where(x => slice.Contains(x.ID))){
                     item.Converted = true;
                 }
             }

        } );

        startID = src.Last();
    }
}
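
Two details worth noting (standard Entity Framework behavior, not spelled out in the answer): slice.Contains(x.ID) translates into a SQL IN clause containing the 100 IDs of that slice, and each parallel branch must create its own context, as done here, because DbContext is not thread-safe.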

After refactoring, memory gets released. I don't know why, but it works.

private static void Debug()
{
    var iteration = 0;
    while(true)
    {
        Console.WriteLine("Iteration {0}", iteration++);
        Convert();
    }
}

private static void Convert()
{
    using (var dbContext = new CustomDbContext(args[0]))
    {
        var list = dbContext.Items.Take(2000000).ToList();
        foreach (var item in list)
        {
            item.Converted = true;
        }
    }
}

When I move the content of Convert() into the while loop in Debug(), an OutOfMemoryException is thrown in the second iteration. Presumably, in an unoptimized build, the JIT keeps locals declared in Debug() alive until the method returns, so the previous iteration's list is still rooted when the next one is allocated; inside Convert(), the local can be collected as soon as the method returns.

private static void Debug()
{
    var iteration = 0;
    while(true)
    {
        Console.WriteLine("Iteration {0}", iteration++);
        using (var dbContext = new CustomDbContext(args[0]))
        {
            // OutOfMemoryException in second iteration
            var list = dbContext.Items.Take(2000000).ToList(); 
            foreach (var item in list)
            {
                item.Converted = true;
            }
        }
    }
}
