
Insert records into a database from a hashtable

I'm looking for any advice on what's the optimum way of inserting a large number of records into a database (SQL2000 onwards) based upon a collection of objects.

Currently my code looks similar to the snippet below, and each record is inserted using a single, simple SQL INSERT command (opening and closing the database connection each time the function is called — I'm sure this must slow things down?).

The routine needs to be able to cope with routinely inserting up to 100,000 records, and I was wondering if there is a faster way (I'm sure there must be). I've seen a few posts mentioning XML-based data and bulk copy routines — is this something I should consider, or can anyone provide any simple examples I could build upon?

foreach (DictionaryEntry de in objectList)
{
  eRecord record = (eRecord)de.Value;

  if (!record.Deleted)
  {
    createDBRecord(record.Id,               
                   record.Index,
                   record.Name,
                   record.Value);
  }
}

Thanks for any advice,

Paul.

Doing it that way will be relatively slow. You need to consider a bulk-insert technique: either BCP or BULK INSERT, or, if you are using .NET 2.0, the SqlBulkCopy class. Here's an example of using SqlBulkCopy: SqlBulkCopy - Copy Table Data Between SQL Servers at High Speeds
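For instance, reusing the eRecord collection from your question, a rough SqlBulkCopy sketch might look like the following (the table name dbo.eRecords, the column types, and connectionString are assumptions — adjust them to your actual schema):

```csharp
using System.Collections;
using System.Data;
using System.Data.SqlClient;

// Build one DataTable in memory instead of issuing 100,000 single INSERTs.
DataTable table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Index", typeof(int));
table.Columns.Add("Name", typeof(string));
table.Columns.Add("Value", typeof(string));

foreach (DictionaryEntry de in objectList)
{
    eRecord record = (eRecord)de.Value;
    if (!record.Deleted)
        table.Rows.Add(record.Id, record.Index, record.Name, record.Value);
}

// One connection and one bulk operation for the whole batch.
using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.eRecords";
        bulk.BatchSize = 5000; // commit in batches rather than all at once
        bulk.ColumnMappings.Add("Id", "Id");
        bulk.ColumnMappings.Add("Index", "Index");
        bulk.ColumnMappings.Add("Name", "Name");
        bulk.ColumnMappings.Add("Value", "Value");
        bulk.WriteToServer(table);
    }
}
```

The connection is opened once for the entire batch, which alone removes the per-record open/close overhead you mentioned.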

Here's a simple example for SQL Server 2000:

If you have a CSV file, csvtest.txt, in the following format:

1,John,Smith

2,Bob,Hope

3,Kate,Curie

4,Peter,Green

This SQL script will load the contents of the CSV file into a database table (if any row contains errors it will not be inserted, but the other rows will be):

USE myDB
GO

CREATE TABLE CSVTest
(
    ID INT,
    FirstName VARCHAR(60),
    LastName VARCHAR(60)
)
GO

BULK INSERT CSVTest
FROM 'c:\temp\csvtest.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
)
GO


SELECT *
FROM CSVTest
GO

You could write out the dictionary contents into a CSV file and then BULK INSERT that file.
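A rough sketch of that dump step, again reusing your eRecord collection (the file path is an assumption, and this naive formatting assumes the fields contain no commas or newlines — otherwise quote/escape them or use a proper CSV writer):

```csharp
using System.Collections;
using System.IO;

// Dump the non-deleted records to a CSV file that BULK INSERT can load.
using (StreamWriter writer = new StreamWriter(@"c:\temp\records.csv"))
{
    foreach (DictionaryEntry de in objectList)
    {
        eRecord record = (eRecord)de.Value;
        if (!record.Deleted)
            writer.WriteLine("{0},{1},{2},{3}",
                record.Id, record.Index, record.Name, record.Value);
    }
}
```

You would then point the BULK INSERT statement above at 'c:\temp\records.csv' instead of 'c:\temp\csvtest.txt'.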

See also: Using bcp and BULK INSERT

If you implement your own IDataReader over the collection, you can avoid writing to an intermediate file. See ADO.NET 2.0 Tutorial : SqlBulkCopy Revisited for Transferring Data at High Speeds
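To give a feel for that approach, here is a minimal, illustrative IDataReader sketch over your hashtable. It implements only the members SqlBulkCopy typically needs (Read, FieldCount, GetValue, GetOrdinal); the rest throw, so extend it if your scenario calls them:

```csharp
using System;
using System.Collections;
using System.Data;

public class RecordReader : IDataReader
{
    private static readonly string[] columns = { "Id", "Index", "Name", "Value" };
    private readonly IEnumerator values;
    private eRecord current;

    public RecordReader(Hashtable objectList)
    {
        values = objectList.Values.GetEnumerator();
    }

    // Advance to the next non-deleted record.
    public bool Read()
    {
        while (values.MoveNext())
        {
            eRecord record = (eRecord)values.Current;
            if (!record.Deleted) { current = record; return true; }
        }
        return false;
    }

    public int FieldCount { get { return columns.Length; } }

    public object GetValue(int i)
    {
        switch (i)
        {
            case 0: return current.Id;
            case 1: return current.Index;
            case 2: return current.Name;
            case 3: return current.Value;
            default: throw new IndexOutOfRangeException();
        }
    }

    public string GetName(int i) { return columns[i]; }
    public int GetOrdinal(string name) { return Array.IndexOf(columns, name); }
    public bool IsDBNull(int i) { return GetValue(i) == null; }

    public void Close() { }
    public void Dispose() { }
    public bool IsClosed { get { return false; } }
    public bool NextResult() { return false; }
    public int Depth { get { return 0; } }
    public int RecordsAffected { get { return -1; } }

    // Members below are not needed by SqlBulkCopy in this sketch.
    public DataTable GetSchemaTable() { throw new NotSupportedException(); }
    public object this[int i] { get { throw new NotSupportedException(); } }
    public object this[string name] { get { throw new NotSupportedException(); } }
    public bool GetBoolean(int i) { throw new NotSupportedException(); }
    public byte GetByte(int i) { throw new NotSupportedException(); }
    public long GetBytes(int i, long o, byte[] b, int x, int n) { throw new NotSupportedException(); }
    public char GetChar(int i) { throw new NotSupportedException(); }
    public long GetChars(int i, long o, char[] b, int x, int n) { throw new NotSupportedException(); }
    public IDataReader GetData(int i) { throw new NotSupportedException(); }
    public string GetDataTypeName(int i) { throw new NotSupportedException(); }
    public Type GetFieldType(int i) { throw new NotSupportedException(); }
    public DateTime GetDateTime(int i) { throw new NotSupportedException(); }
    public decimal GetDecimal(int i) { throw new NotSupportedException(); }
    public double GetDouble(int i) { throw new NotSupportedException(); }
    public float GetFloat(int i) { throw new NotSupportedException(); }
    public Guid GetGuid(int i) { throw new NotSupportedException(); }
    public short GetInt16(int i) { throw new NotSupportedException(); }
    public int GetInt32(int i) { throw new NotSupportedException(); }
    public long GetInt64(int i) { throw new NotSupportedException(); }
    public string GetString(int i) { throw new NotSupportedException(); }
    public int GetValues(object[] vals) { throw new NotSupportedException(); }
}
```

You can then pass `new RecordReader(objectList)` straight to `bulk.WriteToServer(...)` with the same column mappings, and no temporary file or in-memory DataTable is needed.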

Related SO question: how to pass variables like arrays / datatable to SQL server?

