
Bulk data insertion into a SQL Server table from a delimited text file using C#

I have a tab-delimited text file. The file is around 100 MB. I want to store the data from this file in a SQL Server table. The file amounts to about 1 million records when stored in SQL Server. What is the best way to achieve this?

I could create an in-memory DataTable in C# and then upload it to SQL Server, but in that case the entire 100 MB file would be loaded into memory. What if the file gets bigger?

No problem; CsvReader will handle most delimited text formats, and it implements IDataReader, so it can be used to feed a SqlBulkCopy. For example:

// CsvReader here is assumed to be the LumenWorks "Fast CSV Reader"
// (LumenWorks.Framework.IO.Csv); SqlBulkCopy is System.Data.SqlClient.
using (var file = new StreamReader(path))
using (var csv = new CsvReader(file, true)) // true = first row is headers
using (var bcp = new SqlBulkCopy(connectionString))
{
    bcp.DestinationTableName = "Foo"; // target table must already exist
    bcp.WriteToServer(csv);           // streams rows straight into the table
}

Note that CsvReader has lots of options for more subtle file handling (specifying the delimiter rules, etc.). SqlBulkCopy is the high-performance bulk-load API, and it is very efficient. This is a streaming reader/writer API; it does not load all the data into memory at once.
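As a hedged illustration (not from the original answer): the tab-delimiter constructor overload below is from the LumenWorks CsvReader, and the BatchSize/timeout values are arbitrary examples to tune for your own load, not recommendations:

// Same load as above, with SqlBulkCopy's tuning knobs set explicitly.
// '\t' matches the tab-delimited file from the question.
using (var file = new StreamReader(path))
using (var csv = new CsvReader(file, true, '\t')) // headers + tab delimiter
using (var bcp = new SqlBulkCopy(connectionString))
{
    bcp.DestinationTableName = "Foo";
    bcp.BatchSize = 10000;      // commit every 10k rows rather than all at once
    bcp.BulkCopyTimeout = 0;    // no timeout for a long-running load
    bcp.EnableStreaming = true; // stream rows from the IDataReader (.NET 4.5+)
    bcp.WriteToServer(csv);
}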

You should read the file line by line, so you don't have to load the whole file into memory:

using (var file = System.IO.File.OpenText(filename))
{
    while (!file.EndOfStream)
    {
        string line = file.ReadLine();

        // TODO: Do your INSERT here
    }
}

*Update*

" This will make 1 million separate insert commands to sql server. Is there any way to make it in bulk "

You could use parameterised queries. That would still issue 1M individual INSERTs, but it would be quite fast (a rough sketch follows below).
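For what it's worth, a parameterised version might look like this sketch (the Foo table, its A/B columns, and the connectionString/filename variables are assumptions for illustration, not part of the original answer):

// Sketch only: table, columns and variables are placeholders.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    // One transaction for the whole load avoids a log flush per row.
    using (var tran = conn.BeginTransaction())
    using (var cmd = new SqlCommand(
        "INSERT INTO Foo (A, B) VALUES (@a, @b)", conn, tran))
    {
        cmd.Parameters.Add("@a", SqlDbType.NVarChar, 100);
        cmd.Parameters.Add("@b", SqlDbType.NVarChar, 100);
        cmd.Prepare(); // the plan is compiled once and reused for every row

        using (var file = System.IO.File.OpenText(filename))
        {
            while (!file.EndOfStream)
            {
                var parts = file.ReadLine().Split('\t');
                cmd.Parameters["@a"].Value = parts[0];
                cmd.Parameters["@b"].Value = parts[1];
                cmd.ExecuteNonQuery();
            }
        }

        tran.Commit();
    }
}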

Alternatively, you can use SqlBulkCopy, but that's going to be rather difficult if you don't want to use third-party libraries. If you don't mind the Microsoft license, you could use the LINQ Entity Data Reader (distributed under the Ms-PL license), which provides the AsDataReader extension method:

void MyInsertMethod()
{
    using (var bulk = new SqlBulkCopy("MyConnectionString"))
    {
        bulk.DestinationTableName = "MyTableName";
        // AsDataReader() wraps the IEnumerable<MyType> as an IDataReader,
        // so rows stream through without being buffered in memory
        bulk.WriteToServer(GetRows().AsDataReader());
    }
}

class MyType
{
    public string A { get; set; }
    public string B { get; set; }
}

IEnumerable<MyType> GetRows()
{
    using (var file = System.IO.File.OpenText("MyTextFile"))
    {
        while (!file.EndOfStream)
        {
            // NB: the question's file is tab-delimited, so Split('\t') would
            // be the separator there; ',' here is just illustrative
            var splitLine = file.ReadLine().Split(',');

            yield return new MyType() { A = splitLine[0], B = splitLine[1] };
        }
    }
}

If you didn't want to use the MS-licensed code either, you could implement IDataReader yourself, but that is going to be a PITA (a rough sketch of what's involved follows below). Note that the CSV handling above (Split(',')) is not at all robust, and also that the column names in the table must match the property names on MyType. TBH, I'd recommend you go with Marc's answer on this one.
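If you do go down the roll-your-own road, here is a rough idea of the shape. This is a hedged, minimal sketch: TabFileDataReader and its two hard-coded columns are invented for illustration, only the members SqlBulkCopy actually calls do real work, and the rest just throw:

using System;
using System.Data;
using System.IO;

class TabFileDataReader : IDataReader
{
    static readonly string[] Columns = { "A", "B" }; // matches MyType above
    readonly StreamReader _file;
    string[] _current;

    public TabFileDataReader(string path) { _file = new StreamReader(path); }

    // --- The members SqlBulkCopy actually uses ---
    public int FieldCount => Columns.Length;

    public bool Read()
    {
        var line = _file.ReadLine();
        if (line == null) return false;
        _current = line.Split('\t');
        return true;
    }

    public object GetValue(int i) => _current[i];
    public int GetOrdinal(string name) => Array.IndexOf(Columns, name);
    public string GetName(int i) => Columns[i];

    // --- Housekeeping ---
    public bool IsClosed { get; private set; }
    public void Close() { _file.Dispose(); IsClosed = true; }
    public void Dispose() => Close();
    public int Depth => 0;
    public int RecordsAffected => -1;
    public bool NextResult() => false;
    public DataTable GetSchemaTable() => null;

    // --- The PITA part: the rest of IDataRecord, unused by this load ---
    public object this[int i] => GetValue(i);
    public object this[string name] => GetValue(GetOrdinal(name));
    public string GetString(int i) => _current[i];
    public Type GetFieldType(int i) => typeof(string);
    public string GetDataTypeName(int i) => "nvarchar";
    public int GetValues(object[] values) { _current.CopyTo(values, 0); return _current.Length; }
    public bool IsDBNull(int i) => _current[i].Length == 0;
    public bool GetBoolean(int i) => throw new NotSupportedException();
    public byte GetByte(int i) => throw new NotSupportedException();
    public long GetBytes(int i, long fieldOffset, byte[] buffer, int bufferOffset, int length) => throw new NotSupportedException();
    public char GetChar(int i) => throw new NotSupportedException();
    public long GetChars(int i, long fieldOffset, char[] buffer, int bufferOffset, int length) => throw new NotSupportedException();
    public IDataReader GetData(int i) => throw new NotSupportedException();
    public DateTime GetDateTime(int i) => throw new NotSupportedException();
    public decimal GetDecimal(int i) => throw new NotSupportedException();
    public double GetDouble(int i) => throw new NotSupportedException();
    public float GetFloat(int i) => throw new NotSupportedException();
    public Guid GetGuid(int i) => throw new NotSupportedException();
    public short GetInt16(int i) => throw new NotSupportedException();
    public int GetInt32(int i) => throw new NotSupportedException();
    public long GetInt64(int i) => throw new NotSupportedException();
}

With that in place, bulk.WriteToServer(new TabFileDataReader("MyTextFile")) would replace the AsDataReader() call above, but as said, Marc's answer is the simpler route.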
