Insert records into a database from a hashtable
I'm looking for any advice on the optimum way of inserting a large number of records into a database (SQL2000 onwards) based upon a collection of objects.
Currently my code looks something like the snippet below, and each record is inserted using a single simple SQL INSERT command (opening and closing the database connection each time the function is called! - I'm sure this must slow things down?).
The routine needs to be able to cope with routinely inserting up to 100,000 records, and I was wondering if there is a faster way (I'm sure there must be?). I've seen a few posts mentioning XML-based data and bulk copy routines - is this something I should consider, or can anyone provide any simple examples which I could build upon?
foreach (DictionaryEntry de in objectList)
{
    eRecord record = (eRecord)de.Value;
    if (!record.Deleted)
    {
        createDBRecord(record.Id,
                       record.Index,
                       record.Name,
                       record.Value);
    }
}
Thanks for any advice,
Paul.
Doing it that way will be relatively slow. You need to consider a bulk INSERT technique, either using BCP or BULK INSERT, or, if you are using .NET 2.0, the SqlBulkCopy class. Here's an example of using SqlBulkCopy: SqlBulkCopy - Copy Table Data Between SQL Servers at High Speeds
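As a rough sketch of the SqlBulkCopy approach (the destination table name `eRecords` and its column layout are hypothetical - they are assumed to match the fields of the question's `eRecord` class, which is stubbed out here to keep the snippet self-contained):

```csharp
using System.Collections;
using System.Data;
using System.Data.SqlClient;

// Minimal stand-in for the question's eRecord type (fields assumed).
class eRecord
{
    public int Id;
    public int Index;
    public string Name;
    public string Value;
    public bool Deleted;
}

class BulkLoader
{
    // Copy the non-deleted records from the hashtable into an in-memory DataTable.
    public static DataTable BuildTable(Hashtable objectList)
    {
        DataTable table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Index", typeof(int));
        table.Columns.Add("Name", typeof(string));
        table.Columns.Add("Value", typeof(string));

        foreach (DictionaryEntry de in objectList)
        {
            eRecord record = (eRecord)de.Value;
            if (!record.Deleted)
                table.Rows.Add(record.Id, record.Index, record.Name, record.Value);
        }
        return table;
    }

    // Push the whole table to the server in one bulk operation
    // instead of 100,000 individual INSERTs over 100,000 connections.
    public static void BulkInsert(Hashtable objectList, string connectionString)
    {
        using (SqlBulkCopy bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "eRecords"; // hypothetical table name
            bulk.BatchSize = 10000;                 // commit every 10,000 rows
            bulk.WriteToServer(BuildTable(objectList));
        }
    }
}
```

The connection is opened once and the rows are streamed to the server using the same wire-level mechanism BCP uses, which is typically far faster than per-row INSERTs.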
Here's a simple example for SQL Server 2000:
If you have a CSV file, csvtest.txt, in the following format:
1,John,Smith
2,Bob,Hope
3,Kate,Curie
4,Peter,Green
This SQL script will load the contents of the CSV file into a database table (any row that contains errors will not be inserted, but the other rows will be):
USE myDB
GO
CREATE TABLE CSVTest
(
ID INT,
FirstName VARCHAR(60),
LastName VARCHAR(60)
)
GO
BULK INSERT
CSVTest
FROM 'c:\temp\csvtest.txt'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)
GO
SELECT *
FROM CSVTest
GO
You could write out the dictionary contents into a CSV file and then BULK INSERT that file.
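A minimal sketch of that step (assuming none of the values contain commas or newlines, since BULK INSERT has no concept of quoted fields; `eRecord` here is a stand-in for the question's class):

```csharp
using System.Collections;
using System.IO;

// Minimal stand-in for the question's eRecord type (fields assumed).
class eRecord
{
    public int Id;
    public int Index;
    public string Name;
    public string Value;
    public bool Deleted;
}

class CsvDump
{
    // Write one comma-separated line per live record, matching the
    // FIELDTERMINATOR = ',' setting of the BULK INSERT script above.
    // Assumes no field contains a comma or newline.
    public static void WriteCsv(Hashtable objectList, string path)
    {
        using (StreamWriter writer = new StreamWriter(path))
        {
            foreach (DictionaryEntry de in objectList)
            {
                eRecord record = (eRecord)de.Value;
                if (!record.Deleted)
                    writer.WriteLine("{0},{1},{2},{3}",
                        record.Id, record.Index, record.Name, record.Value);
            }
        }
    }
}
```

The file can then be loaded with the same BULK INSERT statement shown earlier, pointed at this path.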
See also: Using bcp and BULK INSERT
If you implement your own IDataReader, you can avoid writing to an intermediate file. See ADO.NET 2.0 Tutorial: SqlBulkCopy Revisited for Transferring Data at High Speeds
Related SO question: how to pass variables like arrays / datatable to SQL server?