
Speed up update of 185k rows in SQL Server 2008?

I have a binary file with about 185k rows in it. C# parses the file in seconds. What would be the best way to update a MSSQL table with that data?

What I've tried:

  1. Easiest way — read a binary row, parse it, update the table. The whole process takes around 2 days to update all the data.
  2. Combine 200 UPDATE queries and send them to MSSQL at once. In this case the update takes 8 to 10 hours.
  3. Combine 500+ queries into a single batch. This works faster, but throws timeout exceptions from time to time, so some updates do not go through.

Any advice on how to speed up the update process?

I'm not sure you really want to do this through C#: you might want to use BULK INSERT and give it a file with your data in the correct format.
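As a sketch of that approach, the parsed rows could be written out as a delimited text file and loaded server-side. The table name, file path, and delimiters below are assumptions for illustration:

```sql
-- Hypothetical example: load a pre-formatted data file server-side.
-- Assumes the C# parser has exported the rows as tab-delimited text
-- to a path the SQL Server service account can read.
BULK INSERT dbo.TargetTable
FROM 'C:\data\parsed_rows.txt'
WITH (
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR   = '\n',
    TABLOCK          -- allows minimal logging for a faster load
);
```

Note that BULK INSERT only inserts; if existing rows must be updated, load into a staging table first and merge from there.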

Use SqlBulkCopy (into a temporary table) followed by MERGE.

The bulk-copy method can efficiently transfer data using a "push" from a .NET client. Using BULK INSERT requires a "pull" from the server (along with the required local-file access).
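A minimal sketch of the push side, assuming a staging table `dbo.Staging` with columns `Id` and `Value` (both names are assumptions, not from the question):

```csharp
// Sketch only: push the parsed rows into a staging table with SqlBulkCopy.
using System.Data;
using System.Data.SqlClient;

var table = new DataTable();
table.Columns.Add("Id", typeof(int));
table.Columns.Add("Value", typeof(string));
// ... fill 'table' with the 185k rows parsed from the binary file ...

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.Staging";
        bulk.BatchSize = 10_000;          // stream in batches
        bulk.BulkCopyTimeout = 0;         // no timeout for the whole copy
        bulk.WriteToServer(table);
    }
}
```

Loading 185k rows this way typically takes seconds rather than hours, because the rows travel over the bulk-load protocol instead of as individual statements.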

Then the MERGE command (in SQL Server 2008+) can be used to insert/update/upsert the data from the temporary table to the target table according to the desired rules. Since the data is entirely in the database at this point, this operation will be as fast as can be. Using MERGE may also result in performance advantages over many individual commands, even those within the same transaction.
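The merge step might then look like the following, again assuming the hypothetical `Id`/`Value` columns and keying on `Id`:

```sql
-- Hypothetical upsert from the staging table into the target table.
MERGE dbo.Target AS t
USING dbo.Staging AS s
    ON t.Id = s.Id
WHEN MATCHED THEN
    UPDATE SET t.Value = s.Value
WHEN NOT MATCHED THEN
    INSERT (Id, Value) VALUES (s.Id, s.Value);
```

Since this runs as one set-based statement entirely inside the server, it avoids the per-row round trips that made the original approach take days.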

