
Fastest Way of Inserting in Entity Framework

I'm looking for the fastest way of inserting into Entity Framework.

I'm asking this because of the scenario where you have an active TransactionScope and the insertion is huge (4000+ records). It can potentially last more than 10 minutes (the default timeout of transactions), and this will lead to an incomplete transaction.

To your remark in the comments to your question:

"...SavingChanges (for each record)..."

That's the worst thing you can do! Calling SaveChanges() for each record slows bulk inserts down dramatically. I would run a few simple tests which will very likely improve the performance:

  • Call SaveChanges() once after ALL records.
  • Call SaveChanges() after, for example, 100 records.
  • Call SaveChanges() after, for example, 100 records, then dispose the context and create a new one.
  • Disable change detection.

For bulk inserts I am working and experimenting with a pattern like this:

using (TransactionScope scope = new TransactionScope())
{
    MyDbContext context = null;
    try
    {
        context = new MyDbContext();
        context.Configuration.AutoDetectChangesEnabled = false;

        int count = 0;            
        foreach (var entityToInsert in someCollectionOfEntitiesToInsert)
        {
            ++count;
            context = AddToContext(context, entityToInsert, count, 100, true);
        }

        context.SaveChanges();
    }
    finally
    {
        if (context != null)
            context.Dispose();
    }

    scope.Complete();
}

private MyDbContext AddToContext(MyDbContext context,
    Entity entity, int count, int commitCount, bool recreateContext)
{
    context.Set<Entity>().Add(entity);

    if (count % commitCount == 0)
    {
        context.SaveChanges();
        if (recreateContext)
        {
            context.Dispose();
            context = new MyDbContext();
            context.Configuration.AutoDetectChangesEnabled = false;
        }
    }

    return context;
}

I have a test program which inserts 560,000 entities (9 scalar properties, no navigation properties) into the DB. With this code it works in less than 3 minutes.

For the performance it is important to call SaveChanges() after "many" records ("many" meaning around 100 or 1,000). It also improves performance to dispose the context after SaveChanges and create a new one. This clears the context of all entities; SaveChanges doesn't do that, the entities are still attached to the context in state Unchanged. It is the growing number of attached entities in the context that slows down the insertion step by step. So, it is helpful to clear the context after some time.

Here are a few measurements for my 560,000 entities:

  • commitCount = 1, recreateContext = false: many hours (that's your current procedure)
  • commitCount = 100, recreateContext = false: more than 20 minutes
  • commitCount = 1000, recreateContext = false: 242 sec
  • commitCount = 10000, recreateContext = false: 202 sec
  • commitCount = 100000, recreateContext = false: 199 sec
  • commitCount = 1000000, recreateContext = false: out of memory exception
  • commitCount = 1, recreateContext = true: more than 10 minutes
  • commitCount = 10, recreateContext = true: 241 sec
  • commitCount = 100, recreateContext = true: 164 sec
  • commitCount = 1000, recreateContext = true: 191 sec

The behaviour in the first test above is that the performance is very non-linear and decreases extremely over time. ("Many hours" is an estimate; I never finished this test and stopped at 50,000 entities after 20 minutes.) This non-linear behaviour is not as significant in the other tests.

This combination increases speed well enough:

context.Configuration.AutoDetectChangesEnabled = false;
context.Configuration.ValidateOnSaveEnabled = false;

The fastest way would be using the bulk insert extension, which I developed.

Note: this is a commercial product, not free of charge.

It uses SqlBulkCopy and a custom data reader to get maximum performance. As a result it is over 20 times faster than using a regular insert or AddRange. (Chart in the original answer: EntityFramework.BulkInsert vs EF AddRange.)

Usage is extremely simple:

context.BulkInsert(hugeAmountOfEntities);

You should look at using System.Data.SqlClient.SqlBulkCopy for this. Here's the documentation, and of course there are plenty of tutorials online.

Sorry, I know you were looking for a simple answer to get EF to do what you want, but bulk operations are not really what ORMs are meant for.

I agree with Adam Rackis. SqlBulkCopy is the fastest way of transferring bulk records from one data source to another. I used this to copy 20K records and it took less than 3 seconds. Have a look at the example below.

public static void InsertIntoMembers(DataTable dataTable)
{           
    using (var connection = new SqlConnection(@"data source=;persist security info=True;user id=;password=;initial catalog=;MultipleActiveResultSets=True;App=EntityFramework"))
    {
        SqlTransaction transaction = null;
        connection.Open();
        try
        {
            transaction = connection.BeginTransaction();
            using (var sqlBulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, transaction))
            {
                sqlBulkCopy.DestinationTableName = "Members";
                sqlBulkCopy.ColumnMappings.Add("Firstname", "Firstname");
                sqlBulkCopy.ColumnMappings.Add("Lastname", "Lastname");
                sqlBulkCopy.ColumnMappings.Add("DOB", "DOB");
                sqlBulkCopy.ColumnMappings.Add("Gender", "Gender");
                sqlBulkCopy.ColumnMappings.Add("Email", "Email");

                sqlBulkCopy.ColumnMappings.Add("Address1", "Address1");
                sqlBulkCopy.ColumnMappings.Add("Address2", "Address2");
                sqlBulkCopy.ColumnMappings.Add("Address3", "Address3");
                sqlBulkCopy.ColumnMappings.Add("Address4", "Address4");
                sqlBulkCopy.ColumnMappings.Add("Postcode", "Postcode");

                sqlBulkCopy.ColumnMappings.Add("MobileNumber", "MobileNumber");
                sqlBulkCopy.ColumnMappings.Add("TelephoneNumber", "TelephoneNumber");

                sqlBulkCopy.ColumnMappings.Add("Deleted", "Deleted");

                sqlBulkCopy.WriteToServer(dataTable);
            }
            transaction.Commit();
        }
        catch (Exception)
        {
            transaction?.Rollback();
            throw; // rethrow so the caller sees the failure instead of it being silently swallowed
        }

    }
}

As it was never mentioned here, I want to recommend EFCore.BulkExtensions:

context.BulkInsert(entitiesList);                 context.BulkInsertAsync(entitiesList);
context.BulkUpdate(entitiesList);                 context.BulkUpdateAsync(entitiesList);
context.BulkDelete(entitiesList);                 context.BulkDeleteAsync(entitiesList);
context.BulkInsertOrUpdate(entitiesList);         context.BulkInsertOrUpdateAsync(entitiesList);         // Upsert
context.BulkInsertOrUpdateOrDelete(entitiesList); context.BulkInsertOrUpdateOrDeleteAsync(entitiesList); // Sync
context.BulkRead(entitiesList);                   context.BulkReadAsync(entitiesList);

I would recommend this article on how to do bulk inserts using EF:

Entity Framework and slow bulk INSERTs

He explores these areas and compares performance:

  1. Default EF (57 minutes to complete adding 30,000 records)
  2. Replacing with ADO.NET code (25 seconds for those same 30,000)
  3. Context bloat - keep the active context graph small by using a new context for each unit of work (the same 30,000 inserts take 33 seconds)
  4. Large lists - turn off AutoDetectChangesEnabled (brings the time down to about 20 seconds)
  5. Batching (down to 16 seconds)
  6. DbTable.AddRange() - (performance is in the 12-second range; a rough sketch follows below)
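As a rough illustration of points 4-6 combined, here is a minimal sketch assuming an EF 6 style context (MyDbContext, MyEntity and entitiesToInsert are illustrative names, not taken from the article):

using (var context = new MyDbContext())
{
    // point 4: change detection off for the whole run
    context.Configuration.AutoDetectChangesEnabled = false;

    // point 6: one AddRange call instead of thousands of Add calls,
    // followed by a single SaveChanges
    context.Set<MyEntity>().AddRange(entitiesToInsert);
    context.SaveChanges();
}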

I've investigated Slauma's answer (which is awesome, thanks for the idea, man), and I've reduced the batch size until I hit optimal speed. Looking at Slauma's results:

  • commitCount = 1, recreateContext = true: more than 10 minutes
  • commitCount = 10, recreateContext = true: 241 sec
  • commitCount = 100, recreateContext = true: 164 sec
  • commitCount = 1000, recreateContext = true: 191 sec

It is visible that there is a speed increase when moving from 1 to 10 and from 10 to 100, but from 100 to 1000 the inserting speed falls again.

So I've focused on what happens when you reduce the batch size to a value somewhere between 10 and 100, and here are my results (I'm using different row contents, so my times are of different value):

Quantity | Batch size | Interval
---------+------------+---------
   1000  |     1      |      3
  10000  |     1      |     34
 100000  |     1      |    368
   1000  |     5      |      1
  10000  |     5      |     12
 100000  |     5      |    133
   1000  |    10      |      1
  10000  |    10      |     11
 100000  |    10      |    101
   1000  |    20      |      1
  10000  |    20      |      9
 100000  |    20      |     92
   1000  |    27      |      0
  10000  |    27      |      9
 100000  |    27      |     92
   1000  |    30      |      0
  10000  |    30      |      9
 100000  |    30      |     92
   1000  |    35      |      1
  10000  |    35      |      9
 100000  |    35      |     94
   1000  |    50      |      1
  10000  |    50      |     10
 100000  |    50      |    106
   1000  |   100      |      1
  10000  |   100      |     14
 100000  |   100      |    141

Based on my results, the actual optimum for batch size is around 30: the time is lower than with both 10 and 100. The problem is, I have no idea why 30 is optimal, nor could I find any logical explanation for it.

As other people have said, SqlBulkCopy is the way to do it if you want really good insert performance.

It's a bit cumbersome to implement, but there are libraries that can help you with it. There are a few out there, but I will shamelessly plug my own library this time: https://github.com/MikaelEliasson/EntityFramework.Utilities#batch-insert-entities

The only code you would need is:

 using (var db = new YourDbContext())
 {
     EFBatchOperation.For(db, db.BlogPosts).InsertAll(list);
 }

So how much faster is it? Very hard to say, because it depends on so many factors: computer performance, network, object size, etc. The performance tests I've made suggest 25k entities can be inserted in around 10 s the standard way on localhost IF you optimize your EF configuration as mentioned in the other answers. With EFUtilities that takes about 300 ms. Even more interesting is that I have saved around 3 million entities in under 15 seconds using this method, averaging around 200k entities per second.

The one problem, of course, is if you need to insert related data. This can be done efficiently into SQL Server using the method above, but it requires you to have an id generation strategy that lets you generate ids in the app code for the parent, so you can set the foreign keys. This can be done using GUIDs or something like HiLo id generation; a small GUID-based sketch follows.
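For example, a minimal sketch of the GUID approach (BlogPost, Comment and their properties are illustrative names, not part of the library):

// keys are generated client-side, so the child FKs are known before any insert
var parent = new BlogPost { Id = Guid.NewGuid(), Title = "Bulk insert demo" };

var comments = Enumerable.Range(0, 1000)
    .Select(i => new Comment
    {
        Id = Guid.NewGuid(),
        BlogPostId = parent.Id, // FK set in app code, no database round trip needed
        Text = "comment " + i
    })
    .ToList();

// parent and comments can now each be bulk inserted in one go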

Disposing the context creates problems if the entities you Add() rely on other preloaded entities (e.g. navigation properties) in the context.

I use a similar concept to keep my context small to achieve the same performance.

But instead of disposing the context and recreating it, I simply detach the entities that have already been saved by SaveChanges():

public void AddAndSave<TEntity>(List<TEntity> entities) where TEntity : class
{
    const int CommitCount = 1000; //set your own best performance number here
    int currentCount = 0;

    while (currentCount < entities.Count)
    {
        //make sure we don't commit more than the entities we have
        int commitCount = CommitCount;
        if ((entities.Count - currentCount) < commitCount)
            commitCount = entities.Count - currentCount;

        //e.g. add entities [ i = 0 to 999, 1000 to 1999, ..., n to n+999 ] to the context
        for (int i = currentCount; i < (currentCount + commitCount); i++)
            _context.Entry(entities[i]).State = System.Data.EntityState.Added;
            //same as calling _context.Set<TEntity>().Add(entities[i]);

        //commit entities [ n to n+999 ] to the database
        _context.SaveChanges();

        //detach all entities in the context that were committed to the database
        //so they won't overload the context
        for (int i = currentCount; i < (currentCount + commitCount); i++)
            _context.Entry(entities[i]).State = System.Data.EntityState.Detached;

        currentCount += commitCount;
    }
}

Wrap it with try/catch and TransactionScope() if you need to; they are not shown here to keep the code clean.

I know this is a very old question, but one guy here said that he developed an extension method to use bulk insert with EF, and when I checked, I discovered that the library costs $599 today (for one developer). Maybe it makes sense for the entire library, but for just the bulk insert this is too much.

Here is a very simple extension method I made. I use it with database-first (I have not tested it with code-first, but I think it works the same). Change YourEntities to the name of your context:

public partial class YourEntities : DbContext
{
    public async Task BulkInsertAllAsync<T>(IEnumerable<T> entities)
    {
        using (var conn = new SqlConnection(Database.Connection.ConnectionString))
        {
            await conn.OpenAsync();

            Type t = typeof(T);

            var bulkCopy = new SqlBulkCopy(conn)
            {
                DestinationTableName = GetTableName(t)
            };

            var table = new DataTable();

            var properties = t.GetProperties().Where(p => p.PropertyType.IsValueType || p.PropertyType == typeof(string));

            foreach (var property in properties)
            {
                Type propertyType = property.PropertyType;
                if (propertyType.IsGenericType &&
                    propertyType.GetGenericTypeDefinition() == typeof(Nullable<>))
                {
                    propertyType = Nullable.GetUnderlyingType(propertyType);
                }

                table.Columns.Add(new DataColumn(property.Name, propertyType));
            }

            foreach (var entity in entities)
            {
                table.Rows.Add(
                    properties.Select(property => property.GetValue(entity, null) ?? DBNull.Value).ToArray());
            }

            bulkCopy.BulkCopyTimeout = 0;
            await bulkCopy.WriteToServerAsync(table);
        }
    }

    public void BulkInsertAll<T>(IEnumerable<T> entities)
    {
        using (var conn = new SqlConnection(Database.Connection.ConnectionString))
        {
            conn.Open();

            Type t = typeof(T);

            var bulkCopy = new SqlBulkCopy(conn)
            {
                DestinationTableName = GetTableName(t)
            };

            var table = new DataTable();

            var properties = t.GetProperties().Where(p => p.PropertyType.IsValueType || p.PropertyType == typeof(string));

            foreach (var property in properties)
            {
                Type propertyType = property.PropertyType;
                if (propertyType.IsGenericType &&
                    propertyType.GetGenericTypeDefinition() == typeof(Nullable<>))
                {
                    propertyType = Nullable.GetUnderlyingType(propertyType);
                }

                table.Columns.Add(new DataColumn(property.Name, propertyType));
            }

            foreach (var entity in entities)
            {
                table.Rows.Add(
                    properties.Select(property => property.GetValue(entity, null) ?? DBNull.Value).ToArray());
            }

            bulkCopy.BulkCopyTimeout = 0;
            bulkCopy.WriteToServer(table);
        }
    }

    public string GetTableName(Type type)
    {
        var metadata = ((IObjectContextAdapter)this).ObjectContext.MetadataWorkspace;
        var objectItemCollection = ((ObjectItemCollection)metadata.GetItemCollection(DataSpace.OSpace));

        var entityType = metadata
                .GetItems<EntityType>(DataSpace.OSpace)
                .Single(e => objectItemCollection.GetClrType(e) == type);

        var entitySet = metadata
            .GetItems<EntityContainer>(DataSpace.CSpace)
            .Single()
            .EntitySets
            .Single(s => s.ElementType.Name == entityType.Name);

        var mapping = metadata.GetItems<EntityContainerMapping>(DataSpace.CSSpace)
                .Single()
                .EntitySetMappings
                .Single(s => s.EntitySet == entitySet);

        var table = mapping
            .EntityTypeMappings.Single()
            .Fragments.Single()
            .StoreEntitySet;

        return (string)table.MetadataProperties["Table"].Value ?? table.Name;
    }
}

You can use it against any collection that implements IEnumerable, like this:

await context.BulkInsertAllAsync(items);

[2019 Update] EF Core 3.1

Following what was said above, disabling AutoDetectChangesEnabled in EF Core worked perfectly: the insertion time was divided by 100 (from many minutes to a few seconds for 10k records with cross-table relationships).

The updated code is:

context.ChangeTracker.AutoDetectChangesEnabled = false;
foreach (IRecord record in records) {
    context.Add(record); // add each record to the context
}
context.ChangeTracker.DetectChanges();
context.SaveChanges();
context.ChangeTracker.AutoDetectChangesEnabled = true; //do not forget to re-enable

I have made a generic extension of @Slauma's example above:

public static class DataExtensions
{
    public static DbContext AddToContext<T>(this DbContext context, object entity, int count, int commitCount, bool recreateContext, Func<DbContext> contextCreator)
    {
        context.Set(typeof(T)).Add((T)entity);

        if (count % commitCount == 0)
        {
            context.SaveChanges();
            if (recreateContext)
            {
                context.Dispose();
                context = contextCreator.Invoke();
                context.Configuration.AutoDetectChangesEnabled = false;
            }
        }
        return context;
    }
}

Usage:

public void AddEntities(List<YourEntity> entities)
{
    using (var transactionScope = new TransactionScope())
    {
        DbContext context = new YourContext();
        int count = 0;
        foreach (var entity in entities)
        {
            ++count;
            context = context.AddToContext<YourEntity>(entity, count, 100, true,
                () => new YourContext());
        }
        context.SaveChanges();
        transactionScope.Complete();
    }
}

For one of the fastest ways to save a list, you must apply the following code:

context.Configuration.AutoDetectChangesEnabled = false;
context.Configuration.ValidateOnSaveEnabled = false;

AutoDetectChangesEnabled = false

Add, AddRange & SaveChanges won't detect changes.

ValidateOnSaveEnabled = false

Validation won't run when SaveChanges is called.

You must add the NuGet package:

Install-Package Z.EntityFramework.Extensions

Now you can use the following code:

var context = new MyContext();

context.Configuration.AutoDetectChangesEnabled = false;
context.Configuration.ValidateOnSaveEnabled = false;

context.BulkInsert(list);
context.BulkSaveChanges();

Try using a stored procedure that will take the data you want to insert in the form of XML.

I'm looking for the fastest way of inserting into Entity Framework

There are some third-party libraries supporting Bulk Insert available:

  • Z.EntityFramework.Extensions (Recommended)
  • EFUtilities
  • EntityFramework.BulkInsert

See: Entity Framework Bulk Insert library

Be careful when choosing a bulk insert library. Only Entity Framework Extensions supports all kinds of associations and inheritances, and it's the only one still supported.


Disclaimer: I'm the owner of Entity Framework Extensions.

This library allows you to perform all the bulk operations you need for your scenarios:

  • Bulk SaveChanges
  • Bulk Insert
  • Bulk Delete
  • Bulk Update
  • Bulk Merge

Example:

// Easy to use
context.BulkSaveChanges();

// Easy to customize
context.BulkSaveChanges(bulk => bulk.BatchSize = 100);

// Perform Bulk Operations
context.BulkDelete(customers);
context.BulkInsert(customers);
context.BulkUpdate(customers);

// Customize Primary Key
context.BulkMerge(customers, operation => {
   operation.ColumnPrimaryKeyExpression = 
        customer => customer.Code;
});

SqlBulkCopy is super quick.

This is my implementation:

// at some point in my calling code, I will call:
var myDataTable = CreateMyDataTable();
myDataTable.Rows.Add(Guid.NewGuid(), tableHeaderId, theName, theValue); // e.g. - need this call for each row to insert

var efConnectionString = ConfigurationManager.ConnectionStrings["MyWebConfigEfConnection"].ConnectionString;
var efConnectionStringBuilder = new EntityConnectionStringBuilder(efConnectionString);
var connectionString = efConnectionStringBuilder.ProviderConnectionString;
BulkInsert(connectionString, myDataTable);

private DataTable CreateMyDataTable()
{
    var myDataTable = new DataTable { TableName = "MyTable" };
    // this table has an identity column - don't need to specify that
    myDataTable.Columns.Add("MyTableRecordGuid", typeof(Guid));
    myDataTable.Columns.Add("MyTableHeaderId", typeof(int));
    myDataTable.Columns.Add("ColumnName", typeof(string));
    myDataTable.Columns.Add("ColumnValue", typeof(string));
    return myDataTable;
}

private void BulkInsert(string connectionString, DataTable dataTable)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        SqlTransaction transaction = null;
        try
        {
            transaction = connection.BeginTransaction();

            using (var sqlBulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, transaction))
            {
                sqlBulkCopy.DestinationTableName = dataTable.TableName;
                foreach (DataColumn column in dataTable.Columns) {
                    sqlBulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
                }

                sqlBulkCopy.WriteToServer(dataTable);
            }
            transaction.Commit();
        }
        catch (Exception)
        {
            transaction?.Rollback();
            throw;
        }
    }
}

Use SqlBulkCopy:

void BulkInsert(GpsReceiverTrack[] gpsReceiverTracks)
{
    if (gpsReceiverTracks == null)
    {
        throw new ArgumentNullException(nameof(gpsReceiverTracks));
    }

    DataTable dataTable = new DataTable("GpsReceiverTracks");
    dataTable.Columns.Add("ID", typeof(int));
    dataTable.Columns.Add("DownloadedTrackID", typeof(int));
    dataTable.Columns.Add("Time", typeof(TimeSpan));
    dataTable.Columns.Add("Latitude", typeof(double));
    dataTable.Columns.Add("Longitude", typeof(double));
    dataTable.Columns.Add("Altitude", typeof(double));

    for (int i = 0; i < gpsReceiverTracks.Length; i++)
    {
        dataTable.Rows.Add
        (
            new object[]
            {
                    gpsReceiverTracks[i].ID,
                    gpsReceiverTracks[i].DownloadedTrackID,
                    gpsReceiverTracks[i].Time,
                    gpsReceiverTracks[i].Latitude,
                    gpsReceiverTracks[i].Longitude,
                    gpsReceiverTracks[i].Altitude
            }
        );
    }

    string connectionString = (new TeamTrackerEntities()).Database.Connection.ConnectionString;
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var transaction = connection.BeginTransaction())
        {
            using (var sqlBulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, transaction))
            {
                sqlBulkCopy.DestinationTableName = dataTable.TableName;
                foreach (DataColumn column in dataTable.Columns)
                {
                    sqlBulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
                }

                sqlBulkCopy.WriteToServer(dataTable);
            }
            transaction.Commit();
        }
    }

    return;
}

Here is a performance comparison between using Entity Framework and using the SqlBulkCopy class on a realistic example: How to Bulk Insert Complex Objects into SQL Server Database

As others have already emphasized, ORMs are not meant to be used in bulk operations. They offer flexibility, separation of concerns, and other benefits, but bulk operations (except bulk reading) are not one of them.

Another option is to use SqlBulkTools, available from NuGet. It's very easy to use and has some powerful features.

Example:

var bulk = new BulkOperations();
var books = GetBooks();

using (TransactionScope trans = new TransactionScope())
{
    using (SqlConnection conn = new SqlConnection(ConfigurationManager
    .ConnectionStrings["SqlBulkToolsTest"].ConnectionString))
    {
        bulk.Setup<Book>()
            .ForCollection(books)
            .WithTable("Books") 
            .AddAllColumns()
            .BulkInsert()
            .Commit(conn);
    }

    trans.Complete();
}

See the documentation for more examples and advanced usage. Disclaimer: I am the author of this library and any views are my own opinion.

[NEW SOLUTION FOR POSTGRESQL] Hey, I know it's quite an old post, but I have recently run into a similar problem, except we were using PostgreSQL. I wanted to use effective bulk insert, which turned out to be pretty difficult. I haven't found any proper free library to do so for this DB. I have only found this helper: https://bytefish.de/blog/postgresql_bulk_insert/ which is also on NuGet. I have written a small mapper which auto-maps properties the way Entity Framework does:

public static PostgreSQLCopyHelper<T> CreateHelper<T>(string schemaName, string tableName)
        {
            var helper = new PostgreSQLCopyHelper<T>("dbo", "\"" + tableName + "\"");
            var properties = typeof(T).GetProperties();
            foreach(var prop in properties)
            {
                var type = prop.PropertyType;
                if (Attribute.IsDefined(prop, typeof(KeyAttribute)) || Attribute.IsDefined(prop, typeof(ForeignKeyAttribute)))
                    continue;
                switch (type)
                {
                    case Type intType when intType == typeof(int) || intType == typeof(int?):
                        {
                            helper = helper.MapInteger("\"" + prop.Name + "\"",  x => (int?)typeof(T).GetProperty(prop.Name).GetValue(x, null));
                            break;
                        }
                    case Type stringType when stringType == typeof(string):
                        {
                            helper = helper.MapText("\"" + prop.Name + "\"", x => (string)typeof(T).GetProperty(prop.Name).GetValue(x, null));
                            break;
                        }
                    case Type dateType when dateType == typeof(DateTime) || dateType == typeof(DateTime?):
                        {
                            helper = helper.MapTimeStamp("\"" + prop.Name + "\"", x => (DateTime?)typeof(T).GetProperty(prop.Name).GetValue(x, null));
                            break;
                        }
                    case Type decimalType when decimalType == typeof(decimal) || decimalType == typeof(decimal?):
                        {
                            helper = helper.MapMoney("\"" + prop.Name + "\"", x => (decimal?)typeof(T).GetProperty(prop.Name).GetValue(x, null));
                            break;
                        }
                    case Type doubleType when doubleType == typeof(double) || doubleType == typeof(double?):
                        {
                            helper = helper.MapDouble("\"" + prop.Name + "\"", x => (double?)typeof(T).GetProperty(prop.Name).GetValue(x, null));
                            break;
                        }
                    case Type floatType when floatType == typeof(float) || floatType == typeof(float?):
                        {
                            helper = helper.MapReal("\"" + prop.Name + "\"", x => (float?)typeof(T).GetProperty(prop.Name).GetValue(x, null));
                            break;
                        }
                    case Type guidType when guidType == typeof(Guid):
                        {
                            helper = helper.MapUUID("\"" + prop.Name + "\"", x => (Guid)typeof(T).GetProperty(prop.Name).GetValue(x, null));
                            break;
                        }
                }
            }
            return helper;
        }

I use it the following way (I had an entity named Undertaking):

var undertakingHelper = BulkMapper.CreateHelper<Model.Undertaking>("dbo", nameof(Model.Undertaking));
undertakingHelper.SaveAll(transaction.UnderlyingTransaction.Connection as Npgsql.NpgsqlConnection, undertakingsToAdd);

I showed an example with a transaction, but it can also be done with a normal connection retrieved from the context. undertakingsToAdd is an enumerable of normal entity records which I want to bulk insert into the DB.

This solution, which I arrived at after a few hours of research and trying, is, as you could expect, much faster and finally easy to use and free! I really advise you to use this solution, not only for the reasons mentioned above, but also because it's the only one with which I had no problems with PostgreSQL itself, while many other solutions only work flawlessly with, for example, SqlServer.

Yes, SqlBulkCopy is indeed the fastest tool for this type of task. I wanted to find a "least effort" generic way for me in .NET Core, so I ended up using the great library from Marc Gravell called FastMember and writing one tiny extension method for the Entity Framework DB context. It works lightning fast:

using System.Collections.Generic;
using System.Linq;
using FastMember;
using Microsoft.Data.SqlClient;
using Microsoft.EntityFrameworkCore;

namespace Services.Extensions
{
    public static class DbContextExtensions
    {
        public static void BulkCopyToServer<T>(this DbContext db, IEnumerable<T> collection)
        {
            var messageEntityType = db.Model.FindEntityType(typeof(T));

            var tableName = messageEntityType.GetSchema() + "." + messageEntityType.GetTableName();
            var tableColumnMappings = messageEntityType.GetProperties()
                .ToDictionary(p => p.PropertyInfo.Name, p => p.GetColumnName());

            using (var connection = new SqlConnection(db.Database.GetDbConnection().ConnectionString))
            using (var bulkCopy = new SqlBulkCopy(connection))
            {
                foreach (var (field, column) in tableColumnMappings)
                {
                    bulkCopy.ColumnMappings.Add(field, column);
                }

                using (var reader = ObjectReader.Create(collection, tableColumnMappings.Keys.ToArray()))
                {
                    bulkCopy.DestinationTableName = tableName;
                    connection.Open();
                    bulkCopy.WriteToServer(reader);
                    connection.Close();
                }
            }
        }
    }
}

Have you ever tried to insert through a background worker or task?

In my case, I'm inserting 7,760 records, distributed across 182 different tables with foreign key relationships (by navigation properties).

Without the task, it took 2 and a half minutes. Within a Task (Task.Factory.StartNew(...)), it took 15 seconds.

I'm only doing the SaveChanges() after adding all the entities to the context (to ensure data integrity).
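A minimal sketch of this approach, assuming an EF 6 style context (MyDbContext, MyEntity and entitiesToInsert are illustrative names, not from the original answer):

Task.Factory.StartNew(() =>
{
    // a fresh context inside the task: DbContext is not thread-safe
    using (var context = new MyDbContext())
    {
        context.Configuration.AutoDetectChangesEnabled = false;

        foreach (var entity in entitiesToInsert)
            context.Set<MyEntity>().Add(entity);

        // a single SaveChanges() after adding everything, to ensure data integrity
        context.SaveChanges();
    }
}).Wait(); // wait only if the caller needs the inserts finished before continuing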

All the solutions written here don't help, because when you do SaveChanges(), insert statements are sent to the database one by one; that's how Entity Framework works.

And if your trip to the database and back is 50 ms, for instance, then the time needed for the insert is the number of records x 50 ms: 4,000 records would already take 4,000 x 50 ms = 200 seconds in round trips alone.

You have to use BulkInsert; here is the link: https://efbulkinsert.codeplex.com/

I got the insert time reduced from 5-6 minutes to 10-12 seconds by using it.

As far as I know, there is no BulkInsert in EntityFramework to improve the performance of huge inserts.

In this scenario you can use SqlBulkCopy in ADO.NET to solve your problem.

You may use the Bulk package library. The Bulk Insert 1.0.0 version is used in projects having Entity Framework >= 6.0.0.

More description can be found here: Bulkoperation source code

The secret is to insert into an identical blank staging table. Inserts into it are lightning quick. Then run a single insert from that into your main large table. Then truncate the staging table, ready for the next batch.

i.e.:

-- insert into some_staging_table using Entity Framework

-- single insert into main table (this could be a tiny stored proc call)
insert into some_main_already_large_table (columns...)
   select (columns...) from some_staging_table

truncate table some_staging_table
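A minimal C# sketch of this pattern, assuming an EF 6 style context (MyDbContext, StagingRow and rows are illustrative names; "(columns...)" stands for the real column list, as elided above):

using (var context = new MyDbContext())
{
    context.Configuration.AutoDetectChangesEnabled = false;

    // fast inserts into the empty staging table via EF
    context.Set<StagingRow>().AddRange(rows);
    context.SaveChanges();

    // one set-based move into the main table, then clear staging for the next batch
    context.Database.ExecuteSqlCommand(
        @"insert into some_main_already_large_table (columns...)
          select (columns...) from some_staging_table;
          truncate table some_staging_table;");
}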

TL;DR: I know it is an old post, but I have implemented a solution starting from one of those proposed, extending it and solving some of its problems; moreover, I have also read the other solutions presented, and compared to these, it seems to me to propose a solution that is much better suited to the requests formulated in the original question.

In this solution I extend Slauma's approach, which I would say is perfect for the case proposed in the original question, and that is to use Entity Framework and TransactionScope for an expensive write operation on the db.

In Slauma's solution (which incidentally was a draft, and was only used to get an idea of the speed of EF with a strategy to implement bulk insert) there were problems due to:

  1. the timeout of the transaction (by default 1 minute, extendable via code to a maximum of 10 minutes);
  2. the duplication of the first block of data, with a width equal to the size of the commit used, at the end of the transaction (this problem is quite weird and is circumvented by means of a workaround).

I also extended the case study presented by Slauma by reporting an example that includes the contextual insertion of several dependent entities.

The performance that I have been able to verify has been 10K rec/min, inserting into the db a block of 200K records of approximately 1KB each. The speed was constant, there was no degradation in performance, and the test took about 20 minutes to run successfully.

The solution in detail

The method that presides over the bulk insert operation, inserted in an example repository class:

abstract class SomeRepository { 

    protected MyDbContext myDbContextRef;

    public void ImportData<TChild, TFather>(List<TChild> entities, TFather entityFather)
            where TChild : class, IEntityChild
            where TFather : class, IEntityFather
    {

        using (var scope = MyDbContext.CreateTransactionScope())
        {

            MyDbContext context = null;
            try
            {
                context = new MyDbContext(myDbContextRef.ConnectionString);

                context.Configuration.AutoDetectChangesEnabled = false;

                entityFather.BulkInsertResult = false;
                var fileEntity = context.Set<TFather>().Add(entityFather);
                context.SaveChanges();

                int count = 0;

                //avoids an issue with recreating context: EF duplicates the first commit block of data at the end of transaction!!
                context = MyDbContext.AddToContext<TChild>(context, null, 0, 1, true);

                foreach (var entityToInsert in entities)
                {
                    ++count;
                    entityToInsert.EntityFatherRefId = fileEntity.Id;
                    context = MyDbContext.AddToContext<TChild>(context, entityToInsert, count, 100, true);
                }

                entityFather.BulkInsertResult = true;
                context.Set<TFather>().Add(fileEntity);
                context.Entry<TFather>(fileEntity).State = EntityState.Modified;

                context.SaveChanges();
            }
            finally
            {
                if (context != null)
                    context.Dispose();
            }

            scope.Complete();
        }

    }

}

Interfaces used for example purposes only:

public interface IEntityChild {

    //some properties ...

    int EntityFatherRefId { get; set; }

}

public interface IEntityFather {

    int Id { get; set; }
    bool BulkInsertResult { get; set; }
}

The db context, where I implemented the various elements of the solution as static methods:

public class MyDbContext : DbContext
{

    public string ConnectionString { get; set; }


    public MyDbContext(string nameOrConnectionString)
    : base(nameOrConnectionString)
    {
        Database.SetInitializer<MyDbContext>(null);
        ConnectionString = Database.Connection.ConnectionString;
    }


    /// <summary>
    /// Creates a TransactionScope raising timeout transaction to 30 minutes
    /// </summary>
    /// <param name="_isolationLevel"></param>
    /// <param name="timeout"></param>
    /// <remarks>
    /// It is possible to set isolation-level and timeout to different values. Pay close attention managing these 2 transactions working parameters.
    /// <para>Default TransactionScope values for isolation-level and timeout are the following:</para>
    /// <para>Default isolation-level is "Serializable"</para>
    /// <para>Default timeout ranges between 1 minute (default value if not specified a timeout) to max 10 minute (if not changed by code or updating max-timeout machine.config value)</para>
    /// </remarks>
    public static TransactionScope CreateTransactionScope(IsolationLevel _isolationLevel = IsolationLevel.Serializable, TimeSpan? timeout = null)
    {
        SetTransactionManagerField("_cachedMaxTimeout", true);
        SetTransactionManagerField("_maximumTimeout", timeout ?? TimeSpan.FromMinutes(30));

        var transactionOptions = new TransactionOptions();
        transactionOptions.IsolationLevel = _isolationLevel;
        transactionOptions.Timeout = TransactionManager.MaximumTimeout;
        return new TransactionScope(TransactionScopeOption.Required, transactionOptions);
    }

    private static void SetTransactionManagerField(string fieldName, object value)
    {
        typeof(TransactionManager).GetField(fieldName, BindingFlags.NonPublic | BindingFlags.Static).SetValue(null, value);
    }


    /// <summary>
    /// Adds a generic entity to a given context allowing commit on large block of data and improving performance to support db bulk-insert operations based on Entity Framework
    /// </summary>
    /// <typeparam name="T"></typeparam>
    /// <param name="context"></param>
    /// <param name="entity"></param>
    /// <param name="count"></param>
    /// <param name="commitCount">defines the block of data size</param>
    /// <param name="recreateContext"></param>
    /// <returns></returns>
    public static MyDbContext AddToContext<T>(MyDbContext context, T entity, int count, int commitCount, bool recreateContext) where T : class
    {
        if (entity != null)
            context.Set<T>().Add(entity);

        if (count % commitCount == 0)
        {
            context.SaveChanges();
            if (recreateContext)
            {
                var contextConnectionString = context.ConnectionString;
                context.Dispose();
                context = new MyDbContext(contextConnectionString);
                context.Configuration.AutoDetectChangesEnabled = false;
            }
        }

        return context;
    }
}

Taking several notes, comments and answers into account, this is my implementation, getting the SQL connection string from my entity connection string (I'm using SqlBulkCopy just in some parts, Entity Framework for the rest) and using the same DataTable column names that the SQL database uses.

public void InsertBulkDatatable(DataTable dataTable)
{
    EntityConnectionStringBuilder entityBuilder = new EntityConnectionStringBuilder(ConfigurationManager.ConnectionStrings["MyDbContextConnectionName"].ConnectionString);
    string cs = entityBuilder.ProviderConnectionString;
    using (var connection = new SqlConnection(cs))
    {
        SqlTransaction transaction = null;
        connection.Open();
        try
        {
            transaction = connection.BeginTransaction();
            using (var sqlBulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.TableLock, transaction))
            {
                sqlBulkCopy.DestinationTableName = dataTable.TableName; //uses the DataTable name to target the SQL table
                //mapping columns
                foreach (DataColumn column in dataTable.Columns)
                {
                    sqlBulkCopy.ColumnMappings.Add(column.ColumnName, column.ColumnName);
                }

                sqlBulkCopy.WriteToServer(dataTable);
            }
            transaction.Commit();
        }
        catch (Exception)
        {
            transaction?.Rollback();
            throw; //surface the failure instead of swallowing it
        }
    }
}

Use a stored procedure that takes the input data in the form of XML to insert the data.

From your C# code, pass the insert data as XML.

E.g. in C#, the syntax would be like this:

object id_application = db.ExecuteScalar("procSaveApplication", xml);
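A hedged sketch of what building that XML might look like (the Applications/Application element names, the fields, and the applications collection are assumptions for illustration; db.ExecuteScalar is the same helper used above):

using System.Linq;
using System.Xml.Linq;

// build one XML document from all the records to insert
var xml = new XElement("Applications",
    applications.Select(a => new XElement("Application",
        new XElement("Name", a.Name),
        new XElement("Address", a.Address)))).ToString();

// the stored procedure is assumed to shred the XML (e.g. with @xml.nodes(...))
// into a single set-based INSERT, so the whole batch costs one round trip
object id_application = db.ExecuteScalar("procSaveApplication", xml);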

Configuration.LazyLoadingEnabled = false;
Configuration.ProxyCreationEnabled = false;

These also help speed, but only together with AutoDetectChangesEnabled = false; and I advise using a different table schema than dbo. Generally I use names like nop, sop, tbl, etc.

But for larger runs (4000+ inserts) I recommend using a stored procedure. I inserted 11,788 rows in 20 seconds. (The elapsed time was shown in a screenshot in the original answer.)

Here is the code:

public void InsertDataBase(MyEntity entity)
{
    repository.Database.ExecuteSqlCommand("sp_mystored @param1, @param2",
        new SqlParameter("@param1", entity.property1),
        new SqlParameter("@param2", entity.property2));
}

Use this technique to increase the speed of inserting records in Entity Framework. Here I use a simple stored procedure to insert the records. To execute this stored procedure, I use the .FromSql() method of Entity Framework, which executes raw SQL.

The stored procedure code:

CREATE PROCEDURE TestProc
@FirstParam VARCHAR(50),
@SecondParam VARCHAR(50)

AS
  Insert into SomeTable(Name, Address) values(@FirstParam, @SecondParam) 
GO

Next, loop through all your 4000 records and add the Entity Framework code which executes the stored procedure once every 100th loop.

For this I create a string query to execute this procedure, and keep appending every set of records to it.

Then check whether the loop is at a multiple of 100, and in that case execute it using .FromSql().

So for 4000 records I only have to execute the procedure 4000/100 = 40 times.

Check the code below:

string execQuery = "";
var context = new MyContext();
for (int i = 0; i < 4000; i++)
{
    execQuery += "EXEC TestProc @FirstParam = 'First'" + i + "'', @SecondParam = 'Second'" + i + "''";

    if (i % 100 == 0)
    {
        context.Student.FromSql(execQuery);
        execQuery = "";
    }
}
