
Bulk Insert With Auto Increment - No Identity column

I am trying to implement a bulk insert of data from a DataTable. In my MS SQL table (the destination table) I have a primary key column that is not an identity column, so I have to increment it manually. But that is not possible in code, because there will be multiple threads writing to the same table. Please give me a suggestion if you have any.

public void BulkInsert(DataTable dtTable)
{
    DataTable dtProductSold = dtTable;

    // Create the SqlBulkCopy object and dispose of it when finished
    using (SqlBulkCopy objbulk = new SqlBulkCopy(ConStr.ToString()))
    {
        // Destination table name
        objbulk.DestinationTableName = "BatchData_InvReportMapping";

        // Map DataTable columns to destination table columns
        objbulk.ColumnMappings.Add("InvPK", "InvPK");
        objbulk.ColumnMappings.Add("DateValue", "DateValue");
        objbulk.ColumnMappings.Add("TextValue", "TextValue");
        objbulk.ColumnMappings.Add("NumericValue", "NumericValue");
        objbulk.ColumnMappings.Add("ErrorValue", "ErrorValue");

        // Bulk insert the records into the database
        objbulk.WriteToServer(dtProductSold);
    }
}

Thanks in advance,

This is too long for a comment.

If you have a primary key column, then you need to take responsibility for its being unique and non-NULL when you insert rows. SQL Server offers a very handy mechanism to help with this, which is the identity column.

If you do not have an identity, then you basically have two options:

  • Load data that has a valid primary key column.
  • Create a trigger that assigns the value when rows are loaded in.

Oh, wait. Bulk insert does not fire triggers by default (with SqlBulkCopy you would have to opt in via SqlBulkCopyOptions.FireTriggers), so the second choice really isn't a good option.

Instead, modify the table to have an identity primary key column. Then define a view on the table without the primary key and do the bulk insert into the view. The primary key will then be assigned automatically.
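A minimal sketch of that setup, assuming SQL Server: the table and column names are taken from the question, the column types are guesses, and the view name is made up.

create table BatchData_InvReportMapping (
    InvPK int identity(1,1) primary key,
    DateValue datetime,
    TextValue varchar(255),
    NumericValue decimal(18, 4),
    ErrorValue varchar(255)
);

-- View that exposes every column except the identity primary key
create view BatchData_InvReportMapping_Load as
    select DateValue, TextValue, NumericValue, ErrorValue
    from BatchData_InvReportMapping;

In the C# code above, point objbulk.DestinationTableName at the view and drop the InvPK column mapping; SQL Server then assigns InvPK from the identity as each row is loaded.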

EDIT:

There is a third option, which might be feasible. Load the data into a staging table. Then insert from the staging table into the final table, calculating the primary key value. Something like this:

insert into finaltable (pk, . . .)
    select m.maxpk + seqnum, . . . .
    from (select row_number() over (order by (select null)) as seqnum,
                 . . .
          from stagingtable
         ) s cross join
         (select max(pk) as maxpk
          from finaltable
         ) m;

I had one idea.

Generally we use tables to store records; even if you insert the data through the front end, it ultimately ends up in a table. So I suggest using a sequence together with an insert trigger on the table. That means when you insert data into the table, the trigger fires first, the sequence is incremented, and the incremented value is stored along with the other values in the row. Just try this. In Oracle 11g we don't have identity(), so we use a sequence and an insert trigger in place of an identity column.
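The question is about SQL Server rather than Oracle, but the same idea carries over: SQL Server 2012 and later supports sequences, and binding the sequence to the primary key column as a default avoids the trigger problem mentioned above entirely. A rough sketch, with a made-up sequence name and the table name from the question:

-- Sequence that hands out primary key values
create sequence dbo.Seq_InvPK
    start with 1
    increment by 1;

-- Use the next sequence value as the default for the primary key column,
-- so rows inserted without an explicit InvPK get one automatically
alter table BatchData_InvReportMapping
    add constraint DF_BatchData_InvPK
    default (next value for dbo.Seq_InvPK) for InvPK;

With SqlBulkCopy, leave InvPK out of the column mappings so the default is applied to each incoming row.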

Create a table called Ids with the columns TableName VARCHAR(50) and Id INT. When you want to generate your ids, read the relevant row and increment it by the number of rows you want to insert, within the same transaction. You can then bulk insert those rows whenever you want without worrying about other threads inserting at the same time. This is similar to how NHibernate's HiLo generator works: http://weblogs.asp.net/ricardoperes/making-better-use-of-the-nhibernate-hilo-generator
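A minimal sketch of the reservation step, assuming an Ids table as described above; the UPDATE ... OUTPUT makes the read-and-increment atomic, so concurrent threads get non-overlapping ranges:

-- @RowCount is the number of rows about to be bulk inserted
declare @RowCount int = 5000;
declare @Reserved table (Id int);

-- Atomically move the counter forward and capture the new value
update Ids
    set Id = Id + @RowCount
    output inserted.Id into @Reserved
    where TableName = 'BatchData_InvReportMapping';

-- The reserved ids run from (new value - @RowCount + 1) to the new value;
-- number the DataTable rows with these before calling SqlBulkCopy
select Id - @RowCount + 1 as FirstReservedId, Id as LastReservedId
from @Reserved;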
