Out of memory with Entity Framework in a migration process - C#

I am running a migration process that has to copy all information from one database to another. When it starts, everything seems fine; the problem appears when the process reaches a complex table.

This table is special because several entities depend on it. I'll explain it better with an example:

  • entityX is a person.
  • entityX can have an entityY (id account)
  • entityX can have a lot of entityZ (friends)
  • entityX can have a lot of entityV (addresses)
  • entityX can have a lot of entityW (cars)

The out-of-memory exception appears when the process has inserted roughly 1159 entityX records with their dependencies.

After each entityX we call a function named CleanMemory that uses the garbage collector to free memory.

Is there any solution to this issue?

public static void LoadExample(origin agmCtx, destinity.AgroPelayoEntities agpCtx)
    {
        // GET ALL THE INFO THAT WE NEED FROM THE SOURCE DATABASE
        List<EntityX> listOriginX = agmCtx.EntityX.AsNoTracking().ToList();

        foreach (EntityX ent in listOriginX)
        {
            // LISTS FOR INSERTS
            List<destinity.EntityX> listInsert = new List<destinity.EntityX>();
            List<destinity.EntityY> listInsertY = new List<destinity.EntityY>();
            List<destinity.EntityZ> listInsertZ = new List<destinity.EntityZ>();
            List<destinity.EntityV> listInsertV = new List<destinity.EntityV>();
            List<destinity.EntityW> listInsertW = new List<destinity.EntityW>();

            // LISTS FOR UPDATES
            List<destinity.EntityX> listUpdate = new List<destinity.EntityX>();

            // Check whether the entity already exists in the destination database
            bool exists = agpCtx.EntityX.AsNoTracking().Any(e => e.n_id == ent.n_id);
            if (!exists)
            {
                // HERE GOES CODE TO CREATE THE NEW ENTITY AND ITS CHILDREN
                // (EntityY, List<EntityZ>, List<EntityV>, ...)
                listInsertW.Add(newEntityW);
                listInsertV.Add(newEntityV);
                listInsertZ.Add(newEntityZ);
                listInsertY.Add(newEntityY);
                listInsert.Add(newEntityX);
            }
            else
            {
                // MODIFY TO HOLD THE NEW INFO
                listUpdate.Add(oldEntityX_Modified);
            }

            int batchSizeX = ClassCommonFuncts.GetNumBatchCount(listInsert.Count());
            int batchSizeY = ClassCommonFuncts.GetNumBatchCount(listInsertY.Count());
            int batchSizeZ = ClassCommonFuncts.GetNumBatchCount(listInsertZ.Count());
            int batchSizeV = ClassCommonFuncts.GetNumBatchCount(listInsertV.Count());
            int batchSizeW = ClassCommonFuncts.GetNumBatchCount(listInsertW.Count());

            int batchSizeUpdateX = ClassCommonFuncts.GetNumBatchCount(listUpdate.Count());

            agpCtx.BulkInsert<destinity.EntityW>(listInsertW, bulk => bulk.BatchSize = batchSizeW);
            agpCtx.BulkInsert<destinity.EntityV>(listInsertV, bulk => bulk.BatchSize = batchSizeV);
            agpCtx.BulkInsert<destinity.EntityZ>(listInsertZ, bulk => bulk.BatchSize = batchSizeZ);
            agpCtx.BulkInsert<destinity.EntityY>(listInsertY, bulk => bulk.BatchSize = batchSizeY);
            agpCtx.BulkInsert<destinity.EntityX>(listInsert, bulk => bulk.BatchSize = batchSizeX);

            agpCtx.BulkUpdate<destinity.EntityX>(listUpdate, bulk => bulk.BatchSize = batchSizeUpdateX);

            ClassCommonFuncts.CleanMemory();
        }
    }

Function CleanMemory

[DllImport("kernel32.dll", EntryPoint = "SetProcessWorkingSetSize", ExactSpelling = true, CharSet = CharSet.Ansi, SetLastError = true)]
private static extern int SetProcessWorkingSetSize(IntPtr process, int minimumWorkingSetSize, int maximumWorkingSetSize);

public static void CleanMemory()
{
    GC.Collect();
    GC.WaitForPendingFinalizers();
    SetProcessWorkingSetSize(System.Diagnostics.Process.GetCurrentProcess().Handle, -1, -1);
}

Function GetNumBatchCount

public static int GetNumBatchCount(decimal records)
{
    if (records > 1000000)
    {
        return (int)Math.Ceiling((double)records / 1000);
    }
    else if (records > 500000)
    {
        return (int)Math.Ceiling((double)records / 100);
    }
    else if (records > 100000)
    {
        return (int)Math.Ceiling((double)records / 50);
    }
    else if (records > 5000)
    {
        return (int)Math.Ceiling((double)records / 10);
    }
    else
    {
        return (int)records;
    }
}

This is not a long-term solution, but first, try setting an initial capacity for your Lists.

Also, it is bad practice to call the GC in the place where you did.

One more thing: creating new lists in every iteration is bad practice, and in code like this, runaway memory growth is to be expected.
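To illustrate, a sketch along these lines (reusing the question's entity and context names; the 500-row flush threshold and the periodic context recreation are assumptions, not the asker's code) keeps both the list allocations and the context's tracked state bounded:

```csharp
// Sketch only: hoist the lists out of the loop and reuse them, flushing
// in fixed-size batches, and periodically recreate the destination
// context so its internal state does not grow without bound.
var listInsert  = new List<destinity.EntityX>(500);  // pre-set capacity
var listInsertY = new List<destinity.EntityY>(500);

int processed = 0;
foreach (EntityX ent in listOriginX)
{
    // ... build newEntityX / newEntityY as in the original code ...

    if (++processed % 500 == 0)
    {
        agpCtx.BulkInsert<destinity.EntityX>(listInsert, bulk => bulk.BatchSize = 500);
        agpCtx.BulkInsert<destinity.EntityY>(listInsertY, bulk => bulk.BatchSize = 500);
        listInsert.Clear();   // reuse the same lists instead of allocating new ones
        listInsertY.Clear();

        agpCtx.Dispose();     // drop whatever the context is holding on to
        agpCtx = new destinity.AgroPelayoEntities();
    }
}
```

Clearing and reusing the lists avoids the per-iteration allocations, and disposing/recreating the context replaces the explicit GC call with something the runtime can actually act on.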

The final solution was "simple". The problem is that EntityY has a 1-to-n relation with another entity, so when the app processed more than about 1000 entityX records it crashed with an OutOfMemoryException.

I fixed it by loading the records in small groups, for example grouping by age and gender. (In my case I used a different grouping.)
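The "small groups" idea can equally be sketched as paging the source query instead of materializing the whole table at once (context and entity names are reused from the question; the page size and the ordering column are assumptions):

```csharp
// Sketch: migrate the source table in fixed-size pages. A stable
// ordering is required for Skip/Take paging to be deterministic.
const int pageSize = 500;
int page = 0;
List<EntityX> chunk;
do
{
    chunk = agmCtx.EntityX
                  .AsNoTracking()
                  .OrderBy(e => e.n_id)
                  .Skip(page * pageSize)
                  .Take(pageSize)
                  .ToList();

    // Use a fresh destination context per chunk so tracked
    // entities from earlier chunks cannot accumulate.
    using (var destCtx = new destinity.AgroPelayoEntities())
    {
        // ... insert/update this chunk as in LoadExample ...
    }
    page++;
} while (chunk.Count == pageSize);
```

Each chunk lives only for the duration of one loop iteration, so memory stays flat regardless of how many dependent EntityY rows each record drags in.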
