
Huge amount of data to insert on file upload: best solution?

I found a solution to my problem, but my knowledge limits me from seeing other options.

Context: we upload a Perfmon file to the web application and store all of its values in the database. The file has this CSV structure:


time, counter1, counter2, counter3(process1), ...

0:00, x, y, z,...

0:30, ...

(x, y and z are the values, obviously)

As an example, three months' worth of Perfmon output comes to more or less 2 million data points.

My solution: on file upload, some logic reads the file and transforms it into the following structure:


time, counter, process, value

0:00, counter1,, x

0:00, counter2,, y

0:00, counter3,process1,z

0:30, ...

After saving this transformed file, the database can do a bulk insert, given the file's path.
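The wide-to-long transform described above can be sketched roughly as follows (a minimal illustration in Python; the function name and the `counter(process)` header convention are assumptions based on the sample above, not the actual implementation):

```python
import csv
import io


def wide_to_long(wide_csv: str) -> str:
    """Convert a wide Perfmon-style CSV (time, counter1, counter2, ...)
    into a long CSV (time, counter, process, value) suited to bulk insert."""
    reader = csv.reader(io.StringIO(wide_csv), skipinitialspace=True)
    header = next(reader)
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["time", "counter", "process", "value"])
    for row in reader:
        time = row[0]
        for name, value in zip(header[1:], row[1:]):
            # Split a header like "counter3(process1)" into counter and process.
            if "(" in name and name.endswith(")"):
                counter, process = name[:-1].split("(", 1)
            else:
                counter, process = name, ""
            writer.writerow([time, counter, process, value])
    return out.getvalue()


sample = "time, counter1, counter2, counter3(process1)\n0:00, x, y, z\n"
print(wide_to_long(sample))
```

Each wide row of N counters becomes N narrow rows, which is why three months of data lands around 2 million output rows.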

I can't think of another solution that wouldn't require tons of transactions, or of one that is more optimized. This solution does have a limitation, though: the database server needs the right permissions on the storage server.

Open to hearing your thoughts. Cheers!

Update: I found out about SqlBulkCopy and tried it; it actually improved performance by roughly 3x.
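SqlBulkCopy is a .NET API, but the general idea it relies on, sending rows to the server in large batches instead of one transaction per row, can be sketched driver-agnostically. A hypothetical batching helper (the name and batch size are illustrative, not from the original post):

```python
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")


def batches(rows: Iterable[T], size: int = 5000) -> Iterator[List[T]]:
    """Yield lists of at most `size` rows, so a bulk API call
    (e.g. a driver's executemany, or SqlBulkCopy.WriteToServer in .NET)
    can send chunks instead of issuing one round trip per row."""
    batch: List[T] = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:  # flush the final, possibly short, batch
        yield batch
```

With 2 million rows and a batch size of 5000, this means about 400 bulk calls rather than 2 million single-row inserts.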
