
Persist variables in SQL commands using C# and one connection

So maybe someone can point me in the right direction.

I have this massive SQL query that a .NET application is running (I'm going to try my best to explain, since I cannot post the code).

What I decided to do is break the query down so the database does not get deadlocks.

So, long story short: is there a way to keep all the variables like

Declare @sometable as Table

available across commands, since each SqlCommand needs the information and IDs from the queries before it?

This all runs on the same connection. I've been at this for 4 days now and my head is spinning.

Here is some of my code; I tried reusing the same command and changing the query text:

foreach (var query in querys)
{
    command.CommandText = query;
    DataSet ds = new DataSet();
    try
    {
        using (var reader = command.ExecuteReader())
        {
            // load every result set returned by this batch into its own DataTable
            while (!reader.IsClosed)
                ds.Tables.Add().Load(reader);
        }
        mergedResults.Add(ds);
    }
    catch (SqlException ex)
    {
        // log and move on to the next batch
        Console.WriteLine(ex.Message);
    }
}

EDIT: here is a sample of the top of the massive query that needs to be used everywhere:

DECLARE @dbID int;
DECLARE @idDomain int;
DECLARE @DomainName varchar(255);

DECLARE @IdDrivers TABLE
(
 idDriver int,
 startTime datetime,
 endTime dateTime
);

Anything starting with @ (locals, parameters, table-valued parameters, etc.) is scoped to a single SQL batch, so no: they cannot be persisted between batches. There are # temporary tables, which are scoped to the connection, but this doesn't sound like a good solution. Slowing things down doesn't change deadlock behaviour, as locking semantics are per operation, not per batch - unless you have a spanning transaction, in which case that has an impact. Either way: the number of batches is not a determining factor.

If you're getting deadlocks, you need to investigate the how and why of that. Performing some reads (those followed by updates to the same data) with UPDLOCK might help, by taking the more restrictive lock at read time instead of escalating from a shared lock later.
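To illustrate the idea, here is a minimal, hedged sketch (assuming System.Data.SqlClient, an already-open SqlConnection named conn, a hypothetical Drivers table, and a driverId variable - none of these names come from the original post); the point is only the WITH (UPDLOCK) hint on the SELECT:

    using (var tx = conn.BeginTransaction())
    using (var cmd = conn.CreateCommand())
    {
        cmd.Transaction = tx;

        // Take the update lock at read time, so a competing writer cannot
        // get in between the SELECT and the UPDATE and cause a deadlock.
        cmd.CommandText = @"SELECT endTime FROM Drivers WITH (UPDLOCK)
                            WHERE idDriver = @id;";
        cmd.Parameters.AddWithValue("@id", driverId);
        var currentEnd = cmd.ExecuteScalar();

        // The same @id parameter is reused for the follow-up update.
        cmd.CommandText = "UPDATE Drivers SET endTime = GETDATE() WHERE idDriver = @id;";
        cmd.ExecuteNonQuery();

        tx.Commit();
    }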

If you want to persist your table variable across batches:

Instead of the table variable @IdDrivers, you could use a temporary table (#IdDrivers). Create this temp table in your first batch.

eg:

if object_id('tempdb..#IdDrivers') is not null drop table #IdDrivers

create table #IdDrivers (your required columns here)

Then your second batch can populate the temp table, and subsequent batches can read from it.

This should work, as long as you open the Sql connection before the first batch, and close it after the last batch has been executed.
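As a rough sketch of that flow (assuming System.Data.SqlClient; connectionString, driverId, startTime and endTime are placeholder names, and the #IdDrivers columns are copied from the table variable in the question):

    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();   // keep this single connection open for every batch

        // Batch 1: create the temp table. No parameters here (see the note
        // about sp_executesql below), so #IdDrivers lives for the whole connection.
        using (var create = new SqlCommand(
            @"IF OBJECT_ID('tempdb..#IdDrivers') IS NOT NULL DROP TABLE #IdDrivers;
              CREATE TABLE #IdDrivers (idDriver int, startTime datetime, endTime datetime);", conn))
        {
            create.ExecuteNonQuery();
        }

        // Batch 2: populate it; parameters are fine now, because the temp
        // table already exists at connection scope.
        using (var insert = new SqlCommand(
            "INSERT INTO #IdDrivers (idDriver, startTime, endTime) VALUES (@id, @start, @end);", conn))
        {
            insert.Parameters.AddWithValue("@id", driverId);
            insert.Parameters.AddWithValue("@start", startTime);
            insert.Parameters.AddWithValue("@end", endTime);
            insert.ExecuteNonQuery();
        }

        // Batch 3 (and later): read back the IDs gathered by the earlier batches.
        using (var select = new SqlCommand("SELECT idDriver FROM #IdDrivers;", conn))
        using (var reader = select.ExecuteReader())
        {
            while (reader.Read())
                Console.WriteLine(reader.GetInt32(0));
        }
    }   // #IdDrivers is dropped automatically when the connection closes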

You may want to create another #temp table to store your scalar variables, such as @idDomain.

It is important that the batch which creates the temporary table NOT execute via sp_executesql behind the scenes - otherwise, your subsequent batches will not be able to see the temp table you created, due to scoping. For example, if you attempt to pass parameters, ADO.NET will use sp_executesql behind the scenes, so don't pass parameters to the SQL which creates the temporary table.
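To make that caveat concrete, here is a hedged sketch of the difference (conn is assumed to be an open SqlConnection; #Vars and @Name are purely illustrative, with the column borrowed from the question's @DomainName declaration):

    // Pitfall: because a parameter is attached, ADO.NET sends this batch via
    // sp_executesql, so #Vars is scoped to that inner call and is dropped as
    // soon as it returns; later batches fail with "Invalid object name '#Vars'".
    var bad = new SqlCommand("CREATE TABLE #Vars (DomainName varchar(255)); "
                           + "INSERT INTO #Vars VALUES (@Name);", conn);
    bad.Parameters.AddWithValue("@Name", domainName);
    bad.ExecuteNonQuery();   // runs, but #Vars no longer exists afterwards

    // Better: create the table in a parameterless batch first...
    new SqlCommand("CREATE TABLE #Vars (DomainName varchar(255));", conn)
        .ExecuteNonQuery();

    // ...then use parameters freely in the batches that follow.
    var insert = new SqlCommand("INSERT INTO #Vars VALUES (@Name);", conn);
    insert.Parameters.AddWithValue("@Name", domainName);
    insert.ExecuteNonQuery();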
