
SQL Server bulkcopy insert million records is slow

I have a table with 1 million records. I need to be able to move those records to another database and another table.

I'm using a stored procedure to get the data. It fills a data adapter, and then bcp's the data into the new table.

We're on SQL Server 2005 and C# 4. We will be moving to SQL Server 2012 or 2014 and Visual Studio 2015 with C# 4.6 or 5.0, so if there are any newer features that would make this work well, I'd like to hear about them.

  • For 10k records, the process takes less than 1 second.
  • For 500k records, the data adapter runs out of memory and the process fails. Batching down to 100k records, the SELECT statement becomes the issue: returning 100k records at a time takes 2 minutes per loop.

Is there a way (or what is wrong with my code below) to keep the data adapter from being filled, and instead map the columns and have BulkCopy stay server side, just pushing the records from the source db to the new table, like SSIS does?

It seems the bulk copy itself is lightning fast, but the adapter fill fails because it runs out of memory trying to populate the adapter with 1 million records. Without processing 1 row at a time, I'd just like to move the data between tables.

The source table has 27 columns, 5 of which are not in table 2; table 2 has 32 columns, and some columns are not named the same in both tables.
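For reference, a sketch of how the mismatched columns could be mapped: once any mapping is added, SqlBulkCopy copies only the mapped columns, so the 5 source-only columns can simply be left unmapped (destination-only columns then need to be nullable or have defaults). The "AdmitDate"/"AdmissionDate" names are hypothetical placeholders, and `targetConn_reportDB` and `table` come from the surrounding code:

```csharp
using System.Data.SqlClient;

// Sketch: explicit column mappings for tables whose columns differ.
// "AdmitDate" / "AdmissionDate" are hypothetical names; substitute the
// real source/destination column names.
using (var bulk = new SqlBulkCopy(targetConn_reportDB,
        SqlBulkCopyOptions.KeepIdentity, null)
       { DestinationTableName = "PatientEvent" })
{
    bulk.ColumnMappings.Add("PatientID", "PatientID");      // same name in both tables
    bulk.ColumnMappings.Add("AdmitDate", "AdmissionDate");  // renamed column
    // The 5 source-only columns are skipped automatically once any
    // mapping exists; destination-only columns need NULLs or defaults.
    bulk.WriteToServer(table);  // or an IDataReader
}
```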

This is a Proof of Concept (PoC).

sourceConn_transDB.Open();
SqlCommand sourceCommand = new SqlCommand(queryString, sourceConn_transDB);
DataTable table = new DataTable();

sourceCommand.CommandTimeout = 600;

using (var adapter = new SqlDataAdapter(sourceCommand))
{
    WriteLineWithTime("Adapter Fill");
    adapter.Fill(table);    // fails here with 1M rows: out of memory
}

if ((table == null) || (table.Rows.Count <= 0))
    break;    // inside the batching loop

using (SqlBulkCopy bulk = new SqlBulkCopy(targetConn_reportDB, SqlBulkCopyOptions.KeepIdentity, null) { DestinationTableName = "PatientEvent" })
{
    bulk.ColumnMappings.Add(new SqlBulkCopyColumnMapping("PatientID", "PatientID"));
    // ... remaining column mappings ...
    bulk.WriteToServer(table);
}

Have you tried using the WriteToServer overloads that take an IDataReader? Then you don't need to use a DataTable at all.

sourceConn_transDB.Open();
using(SqlCommand sourceCommand = new SqlCommand(queryString, sourceConn_transDB))
{
    sourceCommand.CommandTimeout = 600;

    using (SqlDataReader reader = sourceCommand.ExecuteReader())
    using (SqlBulkCopy bulk = new SqlBulkCopy(targetConn_reportDB, SqlBulkCopyOptions.KeepIdentity, null) { DestinationTableName = "PatientEvent" })
    {
        bulk.ColumnMappings.Add(new SqlBulkCopyColumnMapping("PatientID", "PatientID"));
        bulk.WriteToServer(reader);
    }
}
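On .NET 4.5 or later (the planned upgrade target), the reader-based approach can be tuned further so memory stays flat regardless of row count. EnableStreaming, BatchSize, BulkCopyTimeout, and NotifyAfter are standard SqlBulkCopy members; the specific values below are illustrative, not recommendations:

```csharp
using (SqlDataReader reader = sourceCommand.ExecuteReader())
using (SqlBulkCopy bulk = new SqlBulkCopy(targetConn_reportDB,
        SqlBulkCopyOptions.KeepIdentity, null)
       { DestinationTableName = "PatientEvent" })
{
    bulk.EnableStreaming = true;   // .NET 4.5+: stream rows from the reader
                                   // instead of buffering them in memory
    bulk.BatchSize = 10000;        // send rows in chunks rather than one batch
    bulk.BulkCopyTimeout = 0;      // no timeout for the full 1M-row copy
    bulk.NotifyAfter = 100000;     // raise SqlRowsCopied every 100k rows
    bulk.SqlRowsCopied += (s, e) =>
        Console.WriteLine(e.RowsCopied + " rows copied");
    bulk.ColumnMappings.Add("PatientID", "PatientID");
    bulk.WriteToServer(reader);
}
```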
