Speed up update of 185k rows in SQL Server 2008?
I have a binary file with about 185k rows in it. C# parses the file in seconds. What would be the best way to update an MSSQL table with that data?
What I've tried:

Any advice on how to speed up the update process?
Not sure you really want to do this via C#: you probably want to use BULK INSERT and give it a file with your data correctly formatted.
I would try a table-valued parameter:

http://www.codeproject.com/Articles/22392/SQL-Server-2008-Table-Valued-Parameters
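A table-valued-parameter call might look roughly like this sketch. The table type `dbo.RowType`, the procedure `dbo.UpdateRows`, and the column names are all assumptions — substitute your real schema.

```csharp
// On the server (once), something like:
//   CREATE TYPE dbo.RowType AS TABLE (Id int PRIMARY KEY, Value varchar(50));
//   CREATE PROCEDURE dbo.UpdateRows @rows dbo.RowType READONLY AS
//     UPDATE t SET t.Value = s.Value
//     FROM dbo.Target t JOIN @rows s ON t.Id = s.Id;
using System.Data;
using System.Data.SqlClient;

// Build an in-memory table matching the table type's columns.
var rows = new DataTable();
rows.Columns.Add("Id", typeof(int));
rows.Columns.Add("Value", typeof(string));
// ... fill 'rows' from the parsed binary file ...

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.UpdateRows", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;

    // Pass the whole DataTable as a single structured parameter.
    var p = cmd.Parameters.AddWithValue("@rows", rows);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.RowType";

    conn.Open();
    cmd.ExecuteNonQuery();
}
```

This sends all 185k rows to the server in one round trip, so the update runs as a single set-based statement instead of 185k individual commands.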
Use SqlBulkCopy (to a temporary table) followed by MERGE.
The bulk-copy method can efficiently transfer data using a "push" from a .NET client. Using BULK INSERT requires a "pull" from the server (along with the required local-file access).
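The bulk-copy step might be sketched as follows; the staging table name `dbo.Staging` and the `DataTable` source are assumptions for illustration.

```csharp
using System.Data;
using System.Data.SqlClient;

// 'rows' is a DataTable (or you could pass an IDataReader) holding the
// parsed rows; the staging table dbo.Staging must already exist with
// matching columns.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.Staging";
        bulk.BatchSize = 10000; // tune for your data/server
        bulk.WriteToServer(rows);
    }
}
```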
Then the MERGE command (in SQL Server 2008+) can be used to insert/update/upsert the data from the temporary table into the target table according to the desired rules. Since the data is entirely in the database at that point, this operation will be as fast as can be. Using MERGE may also yield performance advantages over many individual commands, even those within the same transaction.
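The MERGE step might look like this sketch; the table and column names (`dbo.Target`, `dbo.Staging`, `Id`, `Value`) are assumptions — adapt the match key and the WHEN clauses to your rules.

```sql
-- Upsert from the staging table into the target in one set-based statement.
MERGE dbo.Target AS t
USING dbo.Staging AS s
    ON t.Id = s.Id
WHEN MATCHED THEN
    UPDATE SET t.Value = s.Value
WHEN NOT MATCHED BY TARGET THEN
    INSERT (Id, Value) VALUES (s.Id, s.Value);
```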
See also: