
How to export more than 1 million rows from SQL Server table to CSV in C# web app?

I am trying to export a SQL Server table with 1 million rows and 45 columns to a .csv file for the user to download via the web interface, but it takes so long that I eventually have to stop the process manually.

I use a SqlDataReader and write to the file as the reader reads, to avoid memory problems. The code works for small tables (fewer than 3k rows), but for the large one it keeps running and the destination file stays at 0 KB.

using (spContentConn)
using (var sdr = sqlcmd.ExecuteReader())
using (CsvfileWriter)
{
    // Collect the column names from the reader so the header row can be written.
    DataTable Tablecolumns = new DataTable();

    for (int i = 0; i < sdr.FieldCount; i++)
    {
        Tablecolumns.Columns.Add(sdr.GetName(i));
    }

    // Header row, column names separated by "~".
    CsvfileWriter.WriteLine(string.Join("~", Tablecolumns.Columns.Cast<DataColumn>().Select(csvfile => csvfile.ColumnName)));

    // Write each row column by column as the reader advances; the last column ends the line.
    while (sdr.Read())
    {
        for (int j = Tablecolumns.Columns.Count; j > 0; j--)
        {
            if (j == 1)
                CsvfileWriter.WriteLine(sdr[Tablecolumns.Columns.Count - j].ToString());
            else
                CsvfileWriter.Write(sdr[Tablecolumns.Columns.Count - j].ToString() + "~");
        }
    }
}

I used the approach recommended in this thread, but it still doesn't work: export large datatable data to .csv file in c# windows applications. Please help.

It is not clear from the .NET documentation whether FileWriter has efficient buffering, so I always use a BufferedStream instead when I need to read or write large volumes of data. With a stream you have to write byte data instead of strings, but that requires only a minor adaptation of your code, as sketched below.
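For illustration, here is a minimal sketch of that adaptation. The output path, the 1 MB buffer size, the UTF-8 encoding and the dummy row content are all assumptions for the example, not values from the question:

using System;
using System.IO;
using System.Text;

class BufferedCsvWrite
{
    static void Main()
    {
        var encoding = Encoding.UTF8;

        // Assumed output path and buffer size; adjust to your environment.
        using (var fileStream = new FileStream(@"C:\temp\export.csv", FileMode.Create, FileAccess.Write))
        using (var buffered = new BufferedStream(fileStream, 1 << 20))
        {
            for (int row = 0; row < 1000; row++)
            {
                // Convert each line to bytes and push it through the buffered stream,
                // so writes reach the disk in large chunks rather than one small write per call.
                string line = $"value{row}~more{row}" + Environment.NewLine;
                byte[] bytes = encoding.GetBytes(line);
                buffered.Write(bytes, 0, bytes.Length);
            }
        } // disposing the streams flushes the remaining buffer to the file
    }
}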

It also looks like you are reading and writing the columns of a DataTable in a loop, which would affect performance. Since the number and order of the columns would not change during an export operation, consider using the positional index to access the column values instead. It would also be better to write one row at a time instead of one column at a time.
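A minimal sketch of that row-at-a-time, positional-index idea follows; the method name, the parameter types and the "~" separator are my assumptions for the example, not the poster's code:

using System.Data;
using System.IO;
using System.Text;

static class CsvExport
{
    public static void WriteRows(IDataReader reader, TextWriter writer)
    {
        var values = new object[reader.FieldCount];
        var line = new StringBuilder();

        while (reader.Read())
        {
            reader.GetValues(values);      // fetch the whole row by position in one call
            line.Clear();

            for (int i = 0; i < values.Length; i++)
            {
                if (i > 0) line.Append('~');
                line.Append(values[i]);
            }

            writer.WriteLine(line.ToString());   // one write per row instead of one per column
        }
    }
}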

Finally, you are using a data reader, so that should give you the best throughput of data from your SQL Server (limited by your server and bandwidth, obviously). This also suggests that the performance bottleneck is in the way your data is being written to the file.

For comparison, I just wrote 1,000,000 rows of 45 columns to a text file in under 60 seconds. Granted, my code does not read from a database, but it should still give you a good enough baseline.
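A minimal sketch of a comparable baseline test (not the answerer's actual code): it generates 1,000,000 rows of 45 dummy columns and times how long the "~"-separated lines take to reach disk through a BufferedStream. The file name, buffer size and dummy values are assumptions for the example:

using System;
using System.Diagnostics;
using System.IO;
using System.Text;

class Baseline
{
    static void Main()
    {
        const int rows = 1_000_000;
        const int cols = 45;
        var stopwatch = Stopwatch.StartNew();

        using (var fileStream = new FileStream("baseline.csv", FileMode.Create, FileAccess.Write))
        using (var buffered = new BufferedStream(fileStream, 1 << 20))
        {
            var line = new StringBuilder();
            for (int r = 0; r < rows; r++)
            {
                // Build one "~"-separated line of dummy values, then write it as bytes.
                line.Clear();
                for (int c = 0; c < cols; c++)
                {
                    if (c > 0) line.Append('~');
                    line.Append("column").Append(c).Append('_').Append(r);
                }
                line.AppendLine();

                byte[] bytes = Encoding.UTF8.GetBytes(line.ToString());
                buffered.Write(bytes, 0, bytes.Length);
            }
        }

        Console.WriteLine($"Elapsed: {stopwatch.Elapsed}");
    }
}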
