
Speed up update of 185k rows in SQL Server 2008?

I have a binary file with about 185k rows in it. C# parses the file in seconds. What would be the best way to update an MSSQL table with that data?

What I've tried:

  1. The easiest way - read a binary row, parse it, update the table. The whole process takes around 2 days to update all the data.
  2. Combine 200 update queries and send them to MSSQL at once. In this case, the update takes 8 to 10 hrs.
  3. Combine 500+ queries into a single one. Works faster, but throws timeout exceptions from time to time, so some updates do not go through.

Any advice on how to speed up the update process?

Not sure you really want to do this via C#: you probably want to use BULK INSERT, giving it a file with your data correctly formatted.
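A minimal sketch of that approach, assuming a hypothetical target table `dbo.Prices` and a delimited file path (neither is from the question; the file must be readable by the SQL Server service account):

```sql
-- Hypothetical table and file path. BULK INSERT is a server-side "pull",
-- so the path is resolved on the server, not the client machine.
BULK INSERT dbo.Prices
FROM 'C:\data\parsed_rows.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    TABLOCK    -- allows a minimally logged load where the recovery model permits
);
```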

Use SqlBulkCopy (to a temporary table) followed by MERGE.

The bulk-copy method can efficiently transfer data using a "push" from a .NET client. Using BULK INSERT requires a "pull" from the server (along with the required local-file access).
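The push approach might look like this from C# - a sketch, assuming a staging table `#Staging` whose columns match the parsed data, an open `SqlConnection`, and a `DataTable` filled by the parser (all of these names are assumptions, not from the question):

```csharp
using System.Data;
using System.Data.SqlClient;

static class Loader
{
    // Sketch: bulk-push parsed rows into a staging table.
    // "#Staging" is a hypothetical temp table created on this connection.
    public static void BulkLoad(SqlConnection connection, DataTable rows)
    {
        using (var bulk = new SqlBulkCopy(connection))
        {
            bulk.DestinationTableName = "#Staging";
            bulk.BatchSize = 10000;     // stream in chunks rather than one huge batch
            bulk.BulkCopyTimeout = 0;   // disable the timeout for the load
            bulk.WriteToServer(rows);   // "push": the client streams rows to the server
        }
    }
}
```

Note the temp table must be created on the same connection (or use a regular staging table), since `#Staging` is scoped to the session.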

Then the MERGE command (in SQL Server 2008+) can be used to insert/update/upsert the data from the temporary table to the target table according to the desired rules. Since the data is entirely in the database at this point, this operation will be as fast as can be. Using MERGE may also result in performance advantages over many individual commands, even those within the same transaction.
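Assuming the staging table and a hypothetical target `dbo.Prices` keyed on `Id` (illustrative names only), the upsert step might look like:

```sql
-- Sketch: upsert the staged rows into the target in one set-based statement.
MERGE dbo.Prices AS target
USING #Staging AS source
    ON target.Id = source.Id
WHEN MATCHED THEN
    UPDATE SET target.Price = source.Price
WHEN NOT MATCHED THEN
    INSERT (Id, Price) VALUES (source.Id, source.Price);
```

One set-based MERGE over 185k staged rows avoids the per-row round trips that made the original approach take days.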


