
200,000 rows of info with SqlBulkCopy C#

I have a database table with 3 columns, all strings. I must upload 200,000 rows of information. I can do this, but as you would expect, it takes far too long, 30+ minutes.

I am now trying to use SqlBulkCopy to do it faster, but I cannot figure out how to use it correctly. A sample row of data would be:

"test string","test string,"test string"

Should I write my data to a temp file so SqlBulkCopy can use it to upload the data? For example, have each line represent a row and delimit the values with commas?

Here is what I have so far, any help would be great!

//this method gets a list of the data objects passed in

List<string[]> listOfRows = new List<string[]>(); // holds all the rows to be bulk copied to the table in the database

foreach (DataUnit dataUnit in dataUnitList)
{
    string[] row = new string[3]; // three columns, so the array needs three slots
    row[0] = dataUnit.value1.ToString();
    row[1] = dataUnit.value2.ToString();
    row[2] = dataUnit.value3.ToString();
    listOfRows.Add(row);
}

// StreamWriter creates the file itself; calling File.Create first would leave an
// open handle that locks the file.
string filePath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Desktop), "Test200kFile.txt");

using (System.IO.StreamWriter file = new System.IO.StreamWriter(filePath))
{
    foreach (string[] array in listOfRows)
    {
        file.WriteLine(array[0] + "," + array[1] + "," + array[2]); // one row per line
    }
}

using (MyFileDataReader reader = new MyFileDataReader(filePath))
{
    SqlBulkCopy bulkCopy = new SqlBulkCopy("my connection string to database");
    bulkCopy.DestinationTableName = "my table name";
    bulkCopy.BatchSize = 20000; // what if it's less than 200,000?

    bulkCopy.WriteToServer(reader);

    bulkCopy.Close();
}

The BULK INSERT T-SQL statement requires the source data to be in a file that the database server can access. That is not the route you want if you are writing code.

Instead, you should create an adapter around whatever data source you have. The adapter should implement parts of the IDataReader interface; SqlBulkCopy only makes use of a few of the members of IDataReader, so only those need to be implemented:

Properties

  • FieldCount

Methods

  • GetName
  • GetOrdinal
  • GetValue
  • Read

(source: http://www.michaelbowersox.com/2011/12/22/using-a-custom-idatareader-to-stream-data-into-a-database/ )

So unless your data is already in a file, there is no need to write it to a file. Just make sure you can present one record at a time through the IDataReader interface.
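Here is a minimal sketch of such an adapter over the in-memory list of string arrays from the question. The class and column names are made up for illustration, and only the members SqlBulkCopy actually needs are implemented; the rest throw NotImplementedException or return harmless defaults:

using System;
using System.Collections.Generic;
using System.Data;

// Presents a List<string[]> (or any IEnumerable<string[]>) one row at a time.
public class DataUnitReader : IDataReader
{
    private readonly IEnumerator<string[]> _rows;
    private readonly string[] _columnNames = { "Value1", "Value2", "Value3" }; // assumed column names
    private string[] _current;

    public DataUnitReader(IEnumerable<string[]> rows)
    {
        _rows = rows.GetEnumerator();
    }

    // The members SqlBulkCopy uses:
    public int FieldCount { get { return _columnNames.Length; } }

    public bool Read()
    {
        if (!_rows.MoveNext()) return false;
        _current = _rows.Current;
        return true;
    }

    public object GetValue(int i) { return _current[i]; }
    public string GetName(int i) { return _columnNames[i]; }
    public int GetOrdinal(string name) { return Array.IndexOf(_columnNames, name); }

    public void Dispose() { _rows.Dispose(); }

    // Remaining IDataReader/IDataRecord members are not needed for this scenario.
    public void Close() { }
    public bool IsClosed { get { return false; } }
    public int Depth { get { return 0; } }
    public int RecordsAffected { get { return -1; } }
    public DataTable GetSchemaTable() { throw new NotImplementedException(); }
    public bool NextResult() { return false; }
    public object this[int i] { get { return GetValue(i); } }
    public object this[string name] { get { return GetValue(GetOrdinal(name)); } }
    public bool GetBoolean(int i) { throw new NotImplementedException(); }
    public byte GetByte(int i) { throw new NotImplementedException(); }
    public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferoffset, int length) { throw new NotImplementedException(); }
    public char GetChar(int i) { throw new NotImplementedException(); }
    public long GetChars(int i, long fieldoffset, char[] buffer, int bufferoffset, int length) { throw new NotImplementedException(); }
    public IDataReader GetData(int i) { throw new NotImplementedException(); }
    public string GetDataTypeName(int i) { throw new NotImplementedException(); }
    public DateTime GetDateTime(int i) { throw new NotImplementedException(); }
    public decimal GetDecimal(int i) { throw new NotImplementedException(); }
    public double GetDouble(int i) { throw new NotImplementedException(); }
    public Type GetFieldType(int i) { return typeof(string); }
    public float GetFloat(int i) { throw new NotImplementedException(); }
    public Guid GetGuid(int i) { throw new NotImplementedException(); }
    public short GetInt16(int i) { throw new NotImplementedException(); }
    public int GetInt32(int i) { throw new NotImplementedException(); }
    public long GetInt64(int i) { throw new NotImplementedException(); }
    public string GetString(int i) { return _current[i]; }
    public int GetValues(object[] values) { throw new NotImplementedException(); }
    public bool IsDBNull(int i) { return _current[i] == null; }
}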

Avoiding saving the data to a file saves a lot of resources and time. I once implemented an import job that read XML data from an FTP source through an XmlReader, which was then wrapped in a custom IDataReader. That way I could stream the data from the FTP server, through the app server, to the DB server without ever writing the entire data set to disk on the app server.
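For the in-memory list from the question, the wiring could look roughly like this. This is only a sketch: the connection string, table name, and the DataUnitReader type from the example above are placeholders, and EnableStreaming requires .NET 4.5 or later:

using System.Collections.Generic;
using System.Data.SqlClient;

public static class BulkLoader
{
    // Streams rows straight from the IDataReader adapter to SQL Server,
    // with no intermediate file.
    public static void Load(IEnumerable<string[]> listOfRows)
    {
        using (var bulkCopy = new SqlBulkCopy("my connection string to database"))
        using (var reader = new DataUnitReader(listOfRows))
        {
            bulkCopy.DestinationTableName = "my table name";
            bulkCopy.BatchSize = 20000;      // rows per batch; a smaller final batch is fine
            bulkCopy.EnableStreaming = true; // read rows on demand instead of buffering them
            bulkCopy.WriteToServer(reader);
        }
    }
}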
