
Error in Bulk Upload in SQL Server from Excel 2007 using C#

I am trying to insert into SQL Server using a bulk upload. It works well when I run it locally, and even against the test live URL. But when I upload on the live server, it breaks after uploading 10,000 rows.

Below is my code:

public bool ExportExcelToSql(DataTable DT)
    {
        bool result = false;
        tableColumns = "*";
        try
        {
            using (var connection = new OleDbConnection(_excelConnectionString))
            {
                connection.Open();
                // Read the name of the first worksheet from the workbook schema
                DataTable dt = connection.GetOleDbSchemaTable(OleDbSchemaGuid.Tables, null);
                _excelSheetName = dt.Rows[0]["TABLE_NAME"].ToString();
                var command = new OleDbCommand("SELECT " + tableColumns + " FROM [" + _excelSheetName + "]", connection);
                command.CommandTimeout = 6000;
                using (DbDataReader dr = command.ExecuteReader())
                using (var sqlConn = new SqlConnection(_sqlConnectionString))
                {
                    sqlConn.Open();
                    using (var bulkCopy = new SqlBulkCopy(sqlConn))
                    {
                        bulkCopy.BulkCopyTimeout = 6000;
                        bulkCopy.DestinationTableName = tableName.ToString();
                        // DT holds (sourceColumn, destinationColumn) pairs
                        for (int i = 0; i < DT.Rows.Count; i++)
                        {
                            bulkCopy.ColumnMappings.Add(new SqlBulkCopyColumnMapping(DT.Rows[i][0].ToString(), DT.Rows[i][1].ToString()));
                        }
                        bulkCopy.WriteToServer(dr);
                    }
                    result = true;
                }
            }
        }
        catch (Exception)
        {
            throw; // rethrow without resetting the stack trace
        }
        return result;
    }

I would split the file into chunks and import them one at a time. No, there is no such limit after 100,000 rows. I would suspect that the problem is either that the file does not match the format file, or that you are using something like SQL Server Express 2008 R2, which is limited to a 4 GB database size.
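The chunking suggestion above can be sketched with `SqlBulkCopy`'s `BatchSize` and `NotifyAfter` properties, which commit and report progress per batch instead of sending everything in one huge operation. This is a minimal sketch, not the poster's code; the method name, parameters, and batch size of 5,000 are assumptions for illustration:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

public static class ChunkedBulkUpload
{
    // Sketch: stream rows from any IDataReader into SQL Server,
    // committing in batches so one giant transaction does not
    // time out or exhaust resources on the live server.
    public static void Upload(IDataReader source, string sqlConnectionString, string destinationTable)
    {
        using (var sqlConn = new SqlConnection(sqlConnectionString))
        {
            sqlConn.Open();
            using (var bulkCopy = new SqlBulkCopy(sqlConn))
            {
                bulkCopy.DestinationTableName = destinationTable;
                bulkCopy.BulkCopyTimeout = 0;   // no overall timeout; rely on batching
                bulkCopy.BatchSize = 5000;      // send rows to the server in batches of 5,000
                bulkCopy.NotifyAfter = 5000;    // raise SqlRowsCopied every 5,000 rows
                bulkCopy.SqlRowsCopied += (s, e) =>
                    Console.WriteLine(e.RowsCopied + " rows copied so far");
                bulkCopy.WriteToServer(source);
            }
        }
    }
}
```

If a batch fails, only rows after the last committed batch need to be retried, which also helps pinpoint which chunk of the file does not match the destination schema.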
