
SSIS Script task enlist in the current transaction

I have a package in SSIS with multiple tasks. I am loading files, and if the File System task at the end fails I want to be able to roll back the transaction. My package looks like this: general_package. I would like to be able to roll back all the operations the SSIS script has done. To do that, I need the script to enlist in the transaction created by the BEGIN_TRANSACTION SQL task. How can I do that?

In the SSIS script, to acquire the connection within the transaction, I do:

 object rawConnection = Dts.Connections["destination_ado"].AcquireConnection(Dts.Transaction);
 myADONETConnection = (SqlConnection)rawConnection;

Then I do a BulkCopy:

 using (SqlBulkCopy sbc = new SqlBulkCopy(myADONETConnection))
 {
     sbc.DestinationTableName = "[" + SCHEMA_DESTINATION + "].[" + TABLE_DESTINATION + "]";

     // Number of records to be processed in one go
     sbc.BatchSize = 10000;

     // Finally write to server
     sbc.WriteToServer(destination);
 }
 myADONETConnection.Close();

How do I tell the SqlBulkCopy to use the existing transaction? In the connection's properties in SSIS I have set RetainSameConnection to true.

Thanks for all your thoughts,

Vincent

Looking at your package, I see that you are iterating through a bunch of files and, in each iteration, loading the file's content into your destination tables. You want all your data loads to be atomic, i.e. fully loaded or not at all.

With this in mind, I would suggest the following approaches; in none of them is there any need for a Script Task or explicit Begin/End Transaction blocks:

  1. Use a Data Flow Task and, in its properties, set TransactionOption to Required. This will enable the transaction on that block.
  2. Have error redirection at the destination to an error table in a batch-wise manner, so as to reduce the errors to the lowest number possible (see http://agilebi.com/jwelch/2008/09/05/error-redirection-with-the-ole-db-destination/ ). We successfully used 100k, 50k, and 1 as three batch sizes when doing data loads of over a million rows per day. You can then deal with those errors separately.
  3. If the use case requires the whole load to fail, then just redirect the failed records: move the file to a 'failed' folder using a File System Task (FST), and have a Data Flow Task following the FST perform a lookup on the destination and delete all those records.

So I found a solution. In the first Script Task (extract and load) I create a transaction with this code:

SqlTransaction tran = myADONETConnection.BeginTransaction(IsolationLevel.ReadCommitted);

Then I use this transaction in the SqlBulkCopy this way:

using (SqlBulkCopy sbc = new SqlBulkCopy(myADONETConnection, SqlBulkCopyOptions.Default, tran))

Then I pass the transaction object to an SSIS variable:

Dts.Variables["User::transaction_object"].Value = tran;
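
Putting the three snippets together, the load script might look like this. This is a sketch, not the exact original: `destination`, `SCHEMA_DESTINATION`, and `TABLE_DESTINATION` are assumed to exist as in the earlier snippets, and `User::transaction_object` must be listed in the task's ReadWriteVariables:

```csharp
// Sketch of the extract-and-load Script Task body, under the assumptions above.
object rawConnection = Dts.Connections["destination_ado"].AcquireConnection(Dts.Transaction);
SqlConnection myADONETConnection = (SqlConnection)rawConnection;

// Start a transaction on the retained connection (RetainSameConnection = true).
SqlTransaction tran = myADONETConnection.BeginTransaction(IsolationLevel.ReadCommitted);

using (SqlBulkCopy sbc = new SqlBulkCopy(myADONETConnection, SqlBulkCopyOptions.Default, tran))
{
    sbc.DestinationTableName = "[" + SCHEMA_DESTINATION + "].[" + TABLE_DESTINATION + "]";
    sbc.BatchSize = 10000;
    sbc.WriteToServer(destination); // 'destination' is the source DataTable/IDataReader
}

// Hand the open transaction to the later Commit/Rollback Script Tasks.
// Note: do not close the connection here, or the pending transaction dies with it.
Dts.Variables["User::transaction_object"].Value = tran;
```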

Then in my two blocks at the end, Commit transaction and Rollback transaction, I use an SSIS Script Task that reads the variable and either commits or rolls back the transaction:

SqlTransaction tran = (SqlTransaction)Dts.Variables["User::transaction_object"].Value;
tran.Commit();
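
The Rollback transaction block is the mirror image; a minimal sketch of its Script Task body:

```csharp
// Rollback transaction Script Task (sketch): fetch the same transaction
// from the package variable and undo everything done since BeginTransaction.
SqlTransaction tran = (SqlTransaction)Dts.Variables["User::transaction_object"].Value;
tran.Rollback();
```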

As a result, if a file cannot be moved to the Archive folder, its data doesn't get loaded twice. A transaction is started for each file, so if a file can't be moved, only the data for that file gets rolled back and the enumerator keeps going to the next one.
