How to bulk insert records into SQL Server
I'm trying to read a text file and split each line according to its significance, with each line forming a particular record entry in my database table. I'm storing these records in a list and then bulk inserting the data from the list into the database. The file I'm reading is about 18 MB in size and has around 15,000 to 18,000 lines. Below is the code:
StringBuilder logInsertCommand = new StringBuilder();
List<string> bulkLogInsert = new List<string>();
string line;
using (FileStream fs = File.Open(FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (BufferedStream bs = new BufferedStream(fs))
using (StreamReader sr = new StreamReader(bs, Encoding.GetEncoding("iso-8859-1")))
{
    while ((line = sr.ReadLine()) != null)
    {
        // Perform some logic with `line` to get all the column values required for
        // inserting a new record in the database table. Values like FirstColumnValue
        // and SecondColumnValue are obtained from the logic performed on `line`.
        logInsertCommand.Clear(); // reset the builder so previous rows don't accumulate
        logInsertCommand.Append(FirstColumnValue).Append(';').Append(SecondColumnValue).Append(';').Append(ThirdColumnValue).Append(';').Append(FourthColumnValue).Append(';').Append(FifthColumnValue);
        bulkLogInsert.Add(logInsertCommand.ToString());
    }
}
public void InsertBulkLog(List<string> records)
{
    try
    {
        String connectionString = ConfigurationManager.AppSettings["DBConString"];
        DataTable table = new DataTable("TORNADO_LOGS");
        table.Columns.Add(new DataColumn("FILENAME", typeof(string)));
        table.Columns.Add(new DataColumn("PROJ_CODE", typeof(string)));
        table.Columns.Add(new DataColumn("IS_RECORD_PROCESSED", typeof(string)));
        table.Columns.Add(new DataColumn("FILE_LAST_MODIFIED_DATE", typeof(string)));
        table.Columns.Add(new DataColumn("MP3_FILE", typeof(string)));
        foreach (string record in records)
        {
            string[] rowParameters = record.Split(new char[] { ';' }, StringSplitOptions.RemoveEmptyEntries);
            table.Rows.Add(rowParameters);
        }
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.BulkCopyTimeout = 600;
            bulkCopy.DestinationTableName = table.TableName;
            bulkCopy.WriteToServer(table);
        }
    }
    catch (Exception ex)
    {
        //Write to log
    }
}
My question is this: I'm storing the records (15k to 17k) in a container like a list and then trying to bulk insert the data into SQL Server. I suspect this is not a good approach, so how can I insert this data into the database more efficiently? Any approach would be helpful.
To fully stream the data from the file into SQL, you need to create an IDataReader.
There are many ways to do this, but the easiest is to use the NuGet FastMember library, which has ObjectReader.Create. This accepts an IEnumerable<SomeType> and returns an IDataReader, which you can pass directly to WriteToServer. This means that each line is streamed into the bulk copy, and the whole file is never held in memory at once.
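For reference, GetRecords below yields RecordLine objects. The class itself isn't shown in the original code, so here is a minimal sketch of what it might look like; the property names are assumptions based on the TORNADO_LOGS table definition in the question:

// Hypothetical POCO for one parsed line. FastMember's ObjectReader exposes
// these public properties as the reader's fields (names assumed to match
// the TORNADO_LOGS columns defined in the question).
public class RecordLine
{
    public string FILENAME { get; set; }
    public string PROJ_CODE { get; set; }
    public string IS_RECORD_PROCESSED { get; set; }
    public string FILE_LAST_MODIFIED_DATE { get; set; }
    public string MP3_FILE { get; set; }
}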
private IEnumerable<RecordLine> GetRecords()
{
    using (FileStream fs = File.Open(FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    using (StreamReader sr = new StreamReader(fs, Encoding.GetEncoding("iso-8859-1")))
    {
        string line;
        while ((line = sr.ReadLine()) != null)
        {
            var record = new RecordLine();
            // use logic to parse `line` into the RecordLine object here
            yield return record;
        }
    }
}
public void InsertBulkLog()
{
    try
    {
        var connectionString = ConfigurationManager.AppSettings["DBConString"];
        using (var reader = ObjectReader.Create(GetRecords()))
        using (var bulkCopy = new SqlBulkCopy(connectionString))
        {
            bulkCopy.BulkCopyTimeout = 600;
            bulkCopy.DestinationTableName = "TORNADO_LOGS";
            bulkCopy.WriteToServer(reader);
        }
    }
    catch (Exception ex)
    {
        //Write to log
    }
}
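By default, ObjectReader.Create exposes every public member of RecordLine, and SqlBulkCopy maps columns by ordinal position. If you want the mapping to be explicit, FastMember lets you name the members, and SqlBulkCopy.ColumnMappings maps by column name instead. A sketch of that variant follows; the method name InsertBulkLogWithMappings and the column names are assumptions taken from the table definition in the question:

public void InsertBulkLogWithMappings()
{
    var connectionString = ConfigurationManager.AppSettings["DBConString"];
    // Naming the members pins the order in which FastMember exposes them.
    using (var reader = ObjectReader.Create(GetRecords(),
        "FILENAME", "PROJ_CODE", "IS_RECORD_PROCESSED", "FILE_LAST_MODIFIED_DATE", "MP3_FILE"))
    using (var bulkCopy = new SqlBulkCopy(connectionString))
    {
        bulkCopy.BulkCopyTimeout = 600;
        bulkCopy.DestinationTableName = "TORNADO_LOGS";
        // Map by name so a column reorder in the destination table doesn't
        // silently shift values into the wrong columns.
        bulkCopy.ColumnMappings.Add("FILENAME", "FILENAME");
        bulkCopy.ColumnMappings.Add("PROJ_CODE", "PROJ_CODE");
        bulkCopy.ColumnMappings.Add("IS_RECORD_PROCESSED", "IS_RECORD_PROCESSED");
        bulkCopy.ColumnMappings.Add("FILE_LAST_MODIFIED_DATE", "FILE_LAST_MODIFIED_DATE");
        bulkCopy.ColumnMappings.Add("MP3_FILE", "MP3_FILE");
        bulkCopy.WriteToServer(reader);
    }
}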