
Bulk insert efficiency for inserting a list with one row

I'm attempting to use bulk insert in place of the ordinary row-by-row insertion currently used in my project. Some of the insertion requests (BillType.Booklet) are a list with one row, and others are a list with multiple rows.

    public async Task CreateBill(List<BillReceiverDto> receivers, BillType billType)
    {
        var bulkList = new List<BillReceiverDto>();

        // Booklet bills insert only the first receiver; every other
        // bill type inserts the whole list.
        if (billType == BillType.Booklet)
        {
            bulkList.Add(receivers.FirstOrDefault());
        }
        else
        {
            bulkList.AddRange(receivers);
        }

        await _dbContextProvider.GetDbContext().BulkInsertAsync(bulkList);
    }

Bulk insert has great performance for inserting large amounts of data, especially more than 100 rows; it inserts 5,000 entities in 75 milliseconds. But is it efficient to use bulk insert for a list with one row? Are there any drawbacks, such as overhead?

I believe the BulkInsertAsync extension uses SqlBulkCopy under the hood. In which case, I blogged some benchmarks not that long ago that might be useful.

While I didn't focus on single-row inserts, there did seem to be a cost when using SqlBulkCopy for lower numbers of rows (100) versus a Table Valued Parameter approach. As the volumes ramp up, SqlBulkCopy pulls ahead on performance, but there was a noticeable overhead at low volume. How much of an overhead? In the grand scheme of things, you're probably not going to notice tens of milliseconds.

If you're dealing with up to hundreds of rows, I'd actually recommend a Table Valued Parameter approach for performance; for larger volumes, SqlBulkCopy. A minimal sketch of the TVP approach is below.
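As a rough illustration, here is a minimal TVP sketch. It assumes System.Data.SqlClient, a hypothetical user-defined table type dbo.BillReceiverType, and a matching hypothetical target table dbo.BillReceivers; none of these names come from the question, so adapt them to your schema.

    // Hypothetical table type, created once in the database:
    //   CREATE TYPE dbo.BillReceiverType AS TABLE (ReceiverId INT, Amount DECIMAL(18,2));
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;
    using Microsoft.SqlServer.Server;   // SqlDataRecord, SqlMetaData

    public static class TvpInsert
    {
        public static void InsertReceivers(string connectionString, IReadOnlyList<(int Id, decimal Amount)> rows)
        {
            // One SqlDataRecord per row; the whole set streams to the server
            // as a single structured parameter. Note: rows must be non-empty,
            // because ADO.NET rejects an empty record set for a TVP.
            var meta = new[]
            {
                new SqlMetaData("ReceiverId", SqlDbType.Int),
                new SqlMetaData("Amount", SqlDbType.Decimal, 18, 2)
            };
            var records = new List<SqlDataRecord>();
            foreach (var (id, amount) in rows)
            {
                var rec = new SqlDataRecord(meta);
                rec.SetInt32(0, id);
                rec.SetDecimal(1, amount);
                records.Add(rec);
            }

            using var conn = new SqlConnection(connectionString);
            using var cmd = new SqlCommand(
                "INSERT INTO dbo.BillReceivers (ReceiverId, Amount) " +
                "SELECT ReceiverId, Amount FROM @rows", conn);
            var p = cmd.Parameters.AddWithValue("@rows", records);
            p.SqlDbType = SqlDbType.Structured;
            p.TypeName = "dbo.BillReceiverType";
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }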

Depending on your needs and how you view the overheads here, I'd be tempted to check how many rows you have to insert and use the mechanism that best fits the volume, as sketched below. Personally, I wouldn't use SqlBulkCopy for low numbers of rows if that is a very typical scenario, because of the overhead.
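Along those lines, a sketch of volume-based dispatch, reusing the shape of the question's method. The 100-row cut-over is an assumption to be tuned by benchmarking, and InsertWithTvpAsync is a hypothetical helper (an async variant of the TVP sketch above), not a library API.

    // The threshold is an assumed starting point; benchmark your own
    // schema and hardware to find the real break-even point.
    public async Task CreateBill(List<BillReceiverDto> receivers, BillType billType)
    {
        var rows = billType == BillType.Booklet
            ? receivers.Take(1).ToList()   // Booklet: only the first receiver
            : receivers;

        const int bulkCopyThreshold = 100; // assumption, not a measured value

        if (rows.Count <= bulkCopyThreshold)
            await InsertWithTvpAsync(rows);   // hypothetical TVP helper
        else
            await _dbContextProvider.GetDbContext().BulkInsertAsync(rows);
    }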

Blog: https://www.sentryone.com/blog/sqlbulkcopy-vs-table-valued-parameters-bulk-loading-data-into-sql-server

Disclaimer: I'm the owner of Entity Framework Extensions.

It depends on the library you are using, Aref Hemati.

In our library, if there are 10 entities or fewer to insert, we use a SQL statement directly, so the SqlBulkCopy overhead is avoided.

So using our library even with one row is fine, but it is obviously optimized for hundreds and thousands of rows.
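For illustration, a sketch of that idea: below some small threshold, build one parameterized multi-row INSERT and skip the SqlBulkCopy machinery entirely. Table and column names are hypothetical; this is not the library's actual internal code.

    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.Text;

    public static class SmallBatchInsert
    {
        // Emits: INSERT INTO ... VALUES (@id0, @amt0), (@id1, @amt1), ...
        // Assumes rows is non-empty and small (e.g. 10 or fewer entities).
        public static void Insert(SqlConnection conn, IReadOnlyList<(int Id, decimal Amount)> rows)
        {
            var sql = new StringBuilder("INSERT INTO dbo.BillReceivers (ReceiverId, Amount) VALUES ");
            using var cmd = new SqlCommand { Connection = conn };
            for (int i = 0; i < rows.Count; i++)
            {
                if (i > 0) sql.Append(", ");
                sql.Append($"(@id{i}, @amt{i})");
                cmd.Parameters.AddWithValue($"@id{i}", rows[i].Id);
                cmd.Parameters.AddWithValue($"@amt{i}", rows[i].Amount);
            }
            cmd.CommandText = sql.ToString();
            cmd.ExecuteNonQuery();
        }
    }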
