
Out of memory with Entity Framework in a migration process - C#

I am running a migration process that has to copy all the information from one database to another. When it starts, everything seems fine; the problem appears when the process reaches a complex table.

This table is special because several entities depend on it. Let me explain it better with an example:

  • entityX is a person.
  • entityX can have an entityY (account).
  • entityX can have a lot of entityZ (friends).
  • entityX can have a lot of entityV (addresses).
  • entityX can have a lot of entityW (cars).

The out of memory exception appears after the process has inserted roughly 1159 entityX records with their dependencies.

After each entityX we call a function named CleanMemory that uses the garbage collector to free memory.

Any solution to this issue?

public static void LoadExample(origin agmCtx, destinity.AgroPelayoEntities agpCtx)
{
    // GET ALL THE INFO THAT WE NEED
    List<EntityX> listOriginX = agmCtx.EntityX.AsNoTracking().ToList();

    foreach (EntityX ent in listOriginX)
    {
        // LISTS FOR INSERTS
        List<destinity.EntityX> listInsert = new List<destinity.EntityX>();
        List<destinity.EntityY> listInsertY = new List<destinity.EntityY>();
        List<destinity.EntityZ> listInsertZ = new List<destinity.EntityZ>();
        List<destinity.EntityV> listInsertV = new List<destinity.EntityV>();
        List<destinity.EntityW> listInsertW = new List<destinity.EntityW>();

        // LISTS FOR UPDATES
        List<destinity.EntityX> listUpdate = new List<destinity.EntityX>();

        // check whether the entity already exists in the destination
        Boolean exists = agpCtx.EntityX.AsNoTracking().Any(e => (e.n_id == ent.n_id));
        if (!exists)
        {
            // HERE GOES CODE TO CREATE THE NEW ENTITY AND ITS CHILDREN (EntityY, List<EntityZ>, List<EntityV>...)
            listInsertW.Add(newEntityW);
            listInsertV.Add(newEntityV);
            listInsertZ.Add(newEntityZ);
            listInsertY.Add(newEntityY);
            listInsert.Add(newEntityX);
        }
        else
        {
            // MODIFY TO HOLD THE NEW INFO
            listUpdate.Add(oldEntityX_Modified);
        }

        int batchSizeX = ClassCommonFuncts.GetNumBatchCount(listInsert.Count());
        int batchSizeY = ClassCommonFuncts.GetNumBatchCount(listInsertY.Count());
        int batchSizeZ = ClassCommonFuncts.GetNumBatchCount(listInsertZ.Count());
        int batchSizeV = ClassCommonFuncts.GetNumBatchCount(listInsertV.Count());
        int batchSizeW = ClassCommonFuncts.GetNumBatchCount(listInsertW.Count());

        int batchSizeUpdateX = ClassCommonFuncts.GetNumBatchCount(listUpdate.Count());

        agpCtx.BulkInsert<destinity.EntityW>(listInsertW, bulk => bulk.BatchSize = batchSizeW);
        agpCtx.BulkInsert<destinity.EntityV>(listInsertV, bulk => bulk.BatchSize = batchSizeV);
        agpCtx.BulkInsert<destinity.EntityZ>(listInsertZ, bulk => bulk.BatchSize = batchSizeZ);
        agpCtx.BulkInsert<destinity.EntityY>(listInsertY, bulk => bulk.BatchSize = batchSizeY);
        agpCtx.BulkInsert<destinity.EntityX>(listInsert, bulk => bulk.BatchSize = batchSizeX);

        agpCtx.BulkUpdate<destinity.EntityX>(listUpdate, bulk => bulk.BatchSize = batchSizeUpdateX);

        ClassCommonFuncts.CleanMemory();
    }
}

Function CleanMemory

// requires: using System.Runtime.InteropServices;
[DllImport("kernel32.dll", EntryPoint = "SetProcessWorkingSetSize", ExactSpelling = true, CharSet = CharSet.Ansi, SetLastError = true)]
private static extern int SetProcessWorkingSetSize(IntPtr process, int minimumWorkingSetSize, int maximumWorkingSetSize);

public static void CleanMemory()
{
    GC.Collect();
    GC.WaitForPendingFinalizers();
    SetProcessWorkingSetSize(System.Diagnostics.Process.GetCurrentProcess().Handle, -1, -1);
}

Function GetNumBatchCount

public static int GetNumBatchCount(decimal records)
{
    if (records > 1000000)
    {
        return (int)Math.Ceiling((double)records / 1000);
    }
    else if (records > 500000)
    {
        return (int)Math.Ceiling((double)records / 100);
    }
    else if (records > 100000)
    {
        return (int)Math.Ceiling((double)records / 50);
    }
    else if (records > 5000)
    {
        return (int)Math.Ceiling((double)records / 10);
    }
    else
    {
        return (int)Math.Ceiling((double)records / 1);
    }
}

Not a long-term solution, but first, try to set a capacity for your Lists.

Also, it's bad practice to call the GC in the place where you did.

One more thing: making new lists on every iteration is bad practice, and in code like this a memory leak is expected. See the sketch below.
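A minimal sketch of both points, reusing agpCtx, listOriginX, and the entity names from the question (the initial capacity of 1024 is an arbitrary example, not a recommendation):

// Create the lists once, with a capacity hint, instead of once per iteration.
List<destinity.EntityX> listInsert = new List<destinity.EntityX>(1024);
List<destinity.EntityY> listInsertY = new List<destinity.EntityY>(1024);

foreach (EntityX ent in listOriginX)
{
    // ... build newEntityX / newEntityY exactly as in the question ...
    listInsertY.Add(newEntityY);
    listInsert.Add(newEntityX);

    agpCtx.BulkInsert<destinity.EntityY>(listInsertY, bulk => bulk.BatchSize = listInsertY.Count);
    agpCtx.BulkInsert<destinity.EntityX>(listInsert, bulk => bulk.BatchSize = listInsert.Count);

    // Clear() empties the lists but keeps their allocated capacity,
    // so there is no per-iteration allocation and no manual GC.Collect() call.
    listInsertY.Clear();
    listInsert.Clear();
}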

The final solution was "simple". The problem was that EntityY has a 1-to-n relation with another entity, so once the app had processed over 1000 EntityX it crashed with an OutOfMemoryException.

I fixed it by loading the records in small groups, for example grouping by age and gender (in my case I used a different grouping); a sketch of this idea follows below.
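As an illustration, a minimal sketch of that idea using fixed-size pages over the source table instead of materialising it all at once (the pageSize value and the OrderBy key are assumptions; agmCtx and EntityX come from the question):

const int pageSize = 500; // assumed page size, tune to your data

int page = 0;
while (true)
{
    // Load only a small group of EntityX at a time so their
    // dependency graphs never accumulate in memory all at once.
    List<EntityX> batch = agmCtx.EntityX
        .AsNoTracking()
        .OrderBy(e => e.n_id) // a stable order is required for Skip/Take paging
        .Skip(page * pageSize)
        .Take(pageSize)
        .ToList();

    if (batch.Count == 0)
        break;

    foreach (EntityX ent in batch)
    {
        // ... the same insert/update logic as in the question ...
    }

    page++;
}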
