
Insert/Update rows in a DataTable, VB.NET

I'm having issues with a piece of code that loads a database table into a TypedTable and inserts each row (or updates it if the key is already present). The update part, though, runs extremely slowly.

Most of the tables I handle require a full refresh, so I wipe the data and re-add everything from another table into the TypedTable with a simple AddTableRow(row) call, and that works just fine. But when I need to update the data I use the LoadDataRow(row, fAcceptChanges) function, and even wrapped in .BeginLoadData() / .EndLoadData() it is extremely slow (2-3 updates per second) on a table of around 500k rows (each row has about 15 columns).

I'm pretty new to VB.NET, so I don't know much about the alternatives for updating the DataTable, but if anyone knows a way to speed this up, I'd be really glad to hear about it.

Some more info:

The main reason I'm inserting the data row by row is that I need to check the constraints on my table so I can handle exceptions raised by the insert part. Besides, the automatic constraint checking of the TypedDataTable is pretty good, considering I have to handle more than 10 DB tables.

My code for the update currently runs like this:

Table = Parser.GetData()
TypedTable = TableAdapter.GetData()

For Each row In Table
    Try
        Dim TypedRow = TypedTable.NewRow()
        LoadNotTypedIntoTyped(row, TypedRow)
        'Load mode is entered and left again on every iteration,
        'so its optimizations never get a chance to apply.
        TypedTable.BeginLoadData()
        TypedTable.LoadDataRow(TypedRow.ItemArray, True) 'TODO speed up this
        TypedTable.EndLoadData()
    Catch ex As Exception
        'Generic exception handling here
    End Try
Next

SqlBulkCopyLoadProcedure()
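
For comparison, here is the same loop with BeginLoadData()/EndLoadData() hoisted out of it, so the table actually stays in load mode for the whole pass. This is only a minimal sketch reusing the names from the snippet above; note that load mode defers constraint checking, so violations no longer raise per row but as a ConstraintException from EndLoadData():

TypedTable.BeginLoadData() 'suspend notifications, index maintenance and constraints once
For Each row In Table
    Dim TypedRow = TypedTable.NewRow()
    LoadNotTypedIntoTyped(row, TypedRow)
    TypedTable.LoadDataRow(TypedRow.ItemArray, True)
Next
Try
    'Deferred constraints are re-checked here in one pass.
    TypedTable.EndLoadData()
Catch ex As ConstraintException
    'Handle the accumulated constraint violations
End Try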

I found a good solution to my particular problem. Using a TypedTable means I have more control over the table constraints, because my data source mirrors the DB table. So I created a new, empty typed table to load the new data into, then loaded the current data from the DB and used Table1.Merge(Table2) to merge the two.

In my case this is possible because the amount of data I handle is not too big (around 500k records). If memory becomes a problem, I think a viable solution would be to create a support table and merge directly in SQL, but I'm a DB newbie, so correct me if I'm wrong here.
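
A rough sketch of that SQL-side merge, assuming a SQL Server backend (connectionString and the names StagingTable, TargetTable, Id and Col1 are hypothetical placeholders, and the staging table's columns are assumed to line up with TableToLoad):

Imports System.Data.SqlClient

Using conn As New SqlConnection(connectionString)
    conn.Open()

    'Push the new data into the staging table first.
    Using bulk As New SqlBulkCopy(conn)
        bulk.DestinationTableName = "dbo.StagingTable"
        bulk.WriteToServer(TableToLoad)
    End Using

    'Then let SQL Server upsert the staging rows into the real table.
    Dim mergeSql =
        "MERGE dbo.TargetTable AS t " &
        "USING dbo.StagingTable AS s ON t.Id = s.Id " &
        "WHEN MATCHED THEN UPDATE SET t.Col1 = s.Col1 " &
        "WHEN NOT MATCHED THEN INSERT (Id, Col1) VALUES (s.Id, s.Col1);"
    Using cmd As New SqlCommand(mergeSql, conn)
        cmd.ExecuteNonQuery()
    End Using
End Using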

Code of what I did:

Dim SupportTable As TypedTable = MyTypedTable.Clone()
For Each row In TableToLoad
    Dim NewTypedRow = SupportTable.NewRow()
    For Each col In Columns
        'Load every column of row into NewTypedRow
    Next
    SupportTable.AddTypedRow(NewTypedRow)
Next
'Merge matches rows on the primary key: existing rows are
'updated in place, new rows are added.
TypedTable.Merge(SupportTable)
TypedTable.AcceptChanges()
'Load to database
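
One caveat on that final "Load to database" step: AcceptChanges() resets every RowState to Unchanged, so a TableAdapter.Update(TypedTable) issued afterwards would find nothing to send. A bulk copy reads the current values regardless of row state, so a sketch like the following still works (connectionString and the destination table name are hypothetical):

Imports System.Data.SqlClient

'Assumes the destination table was emptied first, since every row is written.
Using bulk As New SqlBulkCopy(connectionString)
    bulk.DestinationTableName = "dbo.MyDbTable"
    bulk.WriteToServer(TypedTable)
End Using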
