
Huge amount of data to insert on file upload: best solution?

I found a solution to my problem, but my knowledge limits me from seeing other solutions.

Context: we upload a Perfmon file to the web application and store all of its values in a database. The file has this CSV structure:


time, counter1, counter2, counter3(process1), ...
0:00, x, y, z, ...
0:30, ...

(x, y and z are values, obviously)

As an example, three months' worth of Perfmon data generates more or less 2M data points.

My solution: on file upload, some logic reads the file and transforms it into the following structure:


time, counter, process, value
0:00, counter1, , x
0:00, counter2, , y
0:00, counter3, process1, z
0:30, ...

After saving this transformed file, the database can do a bulk insert given the file's path.
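The wide-to-long transform described above can be sketched as follows. This is a minimal illustration, not the poster's actual code: the function name, the in-memory sample data, and the assumption that a process name appears in parentheses after the counter name (e.g. `counter3(process1)`) are all taken from the example rows in the question.

```python
import csv
import io
import re

def wide_to_long(infile, outfile):
    """Turn a wide Perfmon CSV (one column per counter) into a long
    file with one (time, counter, process, value) row per cell."""
    reader = csv.reader(infile)
    header = [h.strip() for h in next(reader)]
    # "counter3(process1)" -> ("counter3", "process1"); plain names get an empty process
    parsed = []
    for name in header[1:]:
        m = re.match(r"^(.*?)\((.*)\)$", name)
        parsed.append((m.group(1), m.group(2)) if m else (name, ""))
    writer = csv.writer(outfile)
    writer.writerow(["time", "counter", "process", "value"])
    for row in reader:
        t = row[0].strip()
        for (counter, process), value in zip(parsed, row[1:]):
            writer.writerow([t, counter, process, value.strip()])

# Demo on the sample structure from the question
sample = "time, counter1, counter2, counter3(process1)\n0:00, x, y, z\n"
out = io.StringIO()
wide_to_long(io.StringIO(sample), out)
print(out.getvalue())
```

Streaming row by row like this keeps memory flat even for the ~2M-point files mentioned above, since only one input line is held at a time.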

I can't think of another solution that wouldn't require tons of transactions, or that would be better optimized. This solution does have a limitation, though: the database server needs the right permissions on the storage server.

Open to hearing your thoughts. Cheers!

Update: I found out about SqlBulkCopy and tried it; it actually improved performance time ~3x.
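SqlBulkCopy is a .NET API, but the idea it embodies, streaming many rows to the server in one batched call and one transaction instead of one round trip per INSERT, exists in most database libraries. A hedged sketch of that idea using Python's standard `sqlite3` module (in-memory database, hypothetical `perfmon` schema matching the long format above):

```python
import sqlite3

# Analogous to SqlBulkCopy in spirit: batch all rows into a single
# transaction with executemany, rather than committing per row.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE perfmon (time TEXT, counter TEXT, process TEXT, value TEXT)"
)

rows = [
    ("0:00", "counter1", "", "x"),
    ("0:00", "counter2", "", "y"),
    ("0:00", "counter3", "process1", "z"),
]
with conn:  # one transaction for the whole batch
    conn.executemany("INSERT INTO perfmon VALUES (?, ?, ?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM perfmon").fetchone()[0]
print(count)
```

A client-side batched API like this also avoids the permission limitation noted above, since the database server never needs filesystem access to the uploaded file.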
