Reducing insert statement time when dealing with large amounts of data

I read about SqlBulkCopy and the way it can reduce the amount of time used when inserting a large number of rows. My scenario is: I have an Excel file which I convert into a DataTable, then I send this DataTable to a stored procedure (whose code I can't change) that inserts all the rows in the DataTable into a SQL table in the database.

The problem is that I have around 10,000 to 50,000 rows to insert. Is there any workaround to reduce the time taken by the stored procedure?

The best way to do this would be to use SqlBulkCopy to add the data to a staging table and then feed that data into the stored proc. You will need to write some SQL code to do the processing, but the performance benefits of doing it this way should be worth the effort.

If you create a new stored proc, then you have the added benefit of running all of this code inside the database engine, so you will not be switching back and forth between your application and the DB engine.
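
For reference, the staging table could be a plain table whose columns mirror the DataTable being imported. A minimal sketch, assuming the single COLUMN1 key that the proc below works with (any other columns depend on your import):

CREATE TABLE TEMPTABLE
(
  COLUMN1 INT NOT NULL
  -- ...one column per field in the incoming DataTable, names matching exactly
)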

Some Code:

    // Requires: using System.Data; using System.Data.SqlClient; (xmlData is a System.IO.Stream)
    var importData = new DataSet();
    xmlData.Position = 0;
    importData.ReadXml(xmlData);

    using (var connection = new SqlConnection(myConnectionString))
    {
      connection.Open();
      using (var trans = connection.BeginTransaction())
      {
        // The bulk copy and the stored proc call share one transaction,
        // so a failure in either rolls back both.
        using (var sbc = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, trans) { DestinationTableName = myTableName })
        {
          // Map each source column to the destination column of the same name.
          foreach (DataColumn col in importData.Tables[0].Columns)
          {
            sbc.ColumnMappings.Add(col.ColumnName, col.ColumnName);
          }

          sbc.WriteToServer(importData.Tables[0]); // table 0 is the main table in this dataset
        }

        // Now call the stored proc to process the bulk-copied rows.
        using (var cmd = new SqlCommand("ProcessDataImport", connection) { CommandType = CommandType.StoredProcedure })
        {
          cmd.Transaction = trans; // required: the connection has a pending transaction
          cmd.CommandTimeout = 1200;
          cmd.ExecuteNonQuery();
        }

        trans.Commit();
      }
    }

Where xmlData is a stream with the XML data matching your bulk import, and myTableName contains the name of the table you want to import into. Remember, when doing a bulk copy, the column names must match 100%. Case is important too.

The proc would look something like this:

CREATE PROCEDURE [ProcessDataImport]
AS
BEGIN
  DECLARE @IMPORTCOL INT

  -- Process the staged rows one at a time until the staging table is empty.
  WHILE EXISTS (SELECT 1 FROM TEMPTABLE)
  BEGIN
    SELECT @IMPORTCOL = (SELECT TOP 1 COLUMN1 FROM TEMPTABLE)
    EXEC DOTHEIMPORT @IMPORTCOL
    DELETE FROM TEMPTABLE WHERE COLUMN1 = @IMPORTCOL
  END
END
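
Note that the loop above calls DOTHEIMPORT once per row. If that per-row logic can be expressed against the whole set instead, a single set-based statement is usually much faster. A rough sketch, where TARGETTABLE is a hypothetical stand-in for wherever DOTHEIMPORT puts the data:

-- Process every staged row in one statement instead of looping
INSERT INTO TARGETTABLE (COLUMN1)
SELECT COLUMN1 FROM TEMPTABLE

-- Clear the staging table once the rows are processed
DELETE FROM TEMPTABLE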
