
Bulk insert to SQL Server from Excel / CSV

I need to bulk insert data into SQL Server from an Excel or CSV file. The data is close to 100k rows. I am using C#/.NET. In the Microsoft documentation I found this:

BULK INSERT Sales.Invoices
FROM '\\share\invoices\inv-2016-07-25.csv'
WITH (FORMAT = 'CSV'
  , FIRSTROW=2
  , FIELDQUOTE = '"'
  , FIELDTERMINATOR = ';'
  , ROWTERMINATOR = '0x0a');

BULK INSERT (Transact-SQL)

If I have an Excel file I can convert it to CSV and use this. It works great and is very efficient, but my problem is that I do not have access to the file system on the SQL Server machine. Does anyone have an idea for the best way to work around that, or is there perhaps a totally different approach that is better?

I see this problem as: a client wants to load data into one of the database tables for some reason.

  • If they need to automate this process, perform it frequently, or handle more than one file at a time, I would streamline the process by asking the client to copy the CSV or Excel files to a server folder whenever they want to import them into the database.

I would then write a folder-watching .NET application that reads the CSV or Excel files and imports the data into the database table. If it fails to load the data, it should move the file to an "error files" folder along with an error log. If it succeeds, the application moves the file to a "completed files" folder.

Advantages of this approach:

  • Multiple users can use this functionality to upload data to the server.
  • The user's machine doesn't require any kind of installation.
  • Access to the folder can be completely managed through AD.

Disadvantages:

  • This is an old approach to this kind of use case.
  • Users need to manually check the log or the server folder to know the status of their files.
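The folder-watching service described above could be sketched as follows. This is a minimal sketch, not a production service: the folder paths are placeholders, and LoadCsvToDatabase stands in for the actual SqlBulkCopy import.

```csharp
using System;
using System.IO;

static class ImportWatcher
{
    // Run the import for one file and move it to "completed files" or
    // "error files", writing an error log next to failed files.
    public static void ProcessFile(string path, Action<string> bulkLoad,
                                   string completedDir, string errorDir)
    {
        Directory.CreateDirectory(completedDir);
        Directory.CreateDirectory(errorDir);
        var name = Path.GetFileName(path);
        try
        {
            bulkLoad(path);  // e.g. the SqlBulkCopy import
            File.Move(path, Path.Combine(completedDir, name));
        }
        catch (Exception ex)
        {
            File.Move(path, Path.Combine(errorDir, name));
            File.WriteAllText(Path.Combine(errorDir, name + ".log"), ex.ToString());
        }
    }

    public static void RunWatcher()
    {
        var watchDir = @"C:\import\incoming";  // assumed drop folder
        Directory.CreateDirectory(watchDir);
        using var watcher = new FileSystemWatcher(watchDir, "*.csv");
        watcher.Created += (s, e) =>
            ProcessFile(e.FullPath, LoadCsvToDatabase,
                        @"C:\import\completed", @"C:\import\error");
        watcher.EnableRaisingEvents = true;
        Console.ReadLine();  // keep the watcher alive
    }

    // Placeholder for the actual database import.
    static void LoadCsvToDatabase(string path) { /* ... */ }
}
```

Note that FileSystemWatcher's Created event can fire before the writer has finished copying the file, so a real implementation would retry opening the file before importing it.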


You can try SqlBulkCopy, which allows you to insert your data directly into the database using a DataTable.

More information can be found here: SqlBulkCopy.WriteToServerAsync.
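A minimal sketch of what that looks like, assuming the destination table dbo.Table1 already exists and the connection string is a placeholder:

```csharp
using System.Data;
using System.Data.SqlClient;  // Microsoft.Data.SqlClient on newer projects
using System.Threading.Tasks;

static class BulkCopyExample
{
    // Build a small in-memory table whose columns match the target table.
    public static DataTable BuildTable()
    {
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        table.Rows.Add(1, "Alice");
        table.Rows.Add(2, "Bob");
        return table;
    }

    public static async Task ImportAsync(string connectionString, DataTable table)
    {
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.Table1";  // assumed target table
            bulk.BatchSize = 5000;                     // commit in chunks
            await bulk.WriteToServerAsync(table);
        }
    }
}
```

The upload runs on the client's connection, so no file ever has to reach the server's file system, which is exactly the constraint in the question.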

Using SqlBulkCopy seems to be a very good solution, as explained

here

The suggestion there is to do this:

// Read the CSV into a DataTable, using the first line as column names.
var lines = System.IO.File.ReadAllLines(@"d:\data.txt");
if (lines.Length == 0) return;

var columns = lines[0].Split(',');
var table = new DataTable();
foreach (var c in columns)
    table.Columns.Add(c);

// Start at 1 to skip the header row. Note the bound: the original
// "lines.Count() - 1" silently dropped the last row of the file.
for (int i = 1; i < lines.Length; i++)
    table.Rows.Add(lines[i].Split(','));

var connection = @"your connection string";
using (var sqlBulk = new SqlBulkCopy(connection))
{
    sqlBulk.DestinationTableName = "Table1";
    sqlBulk.WriteToServer(table);
}

This is pretty efficient: using this code I can insert 100,000 records in less than 5 seconds, which is perfectly acceptable performance. A direct BULK INSERT on the SQL Server itself is of course faster and loads the data almost instantly, but when, as in my case, that is not possible, this is a decent alternative.
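One caveat: the Split(',') above breaks as soon as a field contains a comma inside quotes (e.g. "Smith, John"). A minimal quote-aware splitter, sketched here as an illustration (a CSV library would handle edge cases like embedded newlines better), could replace those Split calls:

```csharp
using System.Collections.Generic;
using System.Text;

static class Csv
{
    // Split one CSV line, honoring double quotes ("" escapes a quote).
    public static List<string> SplitLine(string line)
    {
        var fields = new List<string>();
        var sb = new StringBuilder();
        bool inQuotes = false;
        for (int i = 0; i < line.Length; i++)
        {
            char c = line[i];
            if (inQuotes)
            {
                if (c == '"')
                {
                    if (i + 1 < line.Length && line[i + 1] == '"')
                    { sb.Append('"'); i++; }   // "" -> literal quote
                    else inQuotes = false;     // closing quote
                }
                else sb.Append(c);
            }
            else if (c == '"') inQuotes = true;
            else if (c == ',') { fields.Add(sb.ToString()); sb.Clear(); }
            else sb.Append(c);
        }
        fields.Add(sb.ToString());  // last field has no trailing comma
        return fields;
    }
}
```

For example, SplitLine("a,\"b,c\",d") yields three fields: a, b,c, and d.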
