
Select N rows at a time and run a stored procedure

I have a table with, let's say, 10K records.

What I use is

ROW_NUMBER() OVER(BATCH_ID) as ROWLIST  

to get row numbers
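To make that concrete, what I have in mind is bucketing the rows into groups, roughly like this sketch (1000 is just my example batch size, and ordering by BATCH_ID is an assumption):

-- Sketch: number the rows, then turn the row number into a batch number 1, 2, 3, ...
SELECT CIN,
       (ROW_NUMBER() OVER (ORDER BY BATCH_ID) - 1) / 1000 + 1 AS BatchNo
FROM tbl_Main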

What I need is: for the first 1K rows, run something like the following (I will have a stored procedure that runs, and for specific reasons I need it to run on X rows at a time because of CPU/RAM requirements and limitations):

INSERT INTO dbo.[tbl_sub] 
SELECT CIN
FROM  tbl_Main

and for the next 1K rows run the same thing, until the end of the recordset (or row count).

So basically: split a table with some number of records into numbered rows, then run a specific piece of code for every X rows.
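Something like this rough sketch is what I'm after (dbo.usp_DoWork is only a placeholder name for the stored procedure; what actually happens per batch isn't decided yet):

-- Sketch: walk tbl_Main in 1000-row batches and do some work per batch
DECLARE @BatchSize int = 1000,
        @Batch     int = 1,
        @MaxBatch  int;

SELECT @MaxBatch = (COUNT(*) + @BatchSize - 1) / @BatchSize FROM tbl_Main;

WHILE @Batch <= @MaxBatch
BEGIN
    ;WITH numbered AS
    (
        SELECT CIN,
               (ROW_NUMBER() OVER (ORDER BY BATCH_ID) - 1) / @BatchSize + 1 AS BatchNo
        FROM tbl_Main
    )
    INSERT INTO dbo.[tbl_sub]
    SELECT CIN
    FROM numbered
    WHERE BatchNo = @Batch;

    -- EXEC dbo.usp_DoWork @Batch;   -- placeholder for the real per-batch stored procedure

    SET @Batch = @Batch + 1;
END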

Here is my code. Some of the answers are awesome solutions! I'm having a hard time implementing them in this code:

CREATE TABLE [LTAC_TEST_1](
[CLAIM_ID] [nvarchar](15) NULL,
[CIN] [nvarchar](10) NULL,
[SVC_DATE] [datetime] NULL,
[SVC_DATE_TO] [datetime] NULL,
[TOTAL_DAYS] [int] NULL,
[CHAIN_COUNT] [int] NULL
) ON [PRIMARY]

;WITH chain_builder AS
(
SELECT ROW_NUMBER() OVER(ORDER BY s.CIN, s.CLAIM_ID) as chain_ID,
  s.CIN,
  s.SVC_DATE, s.SVC_DATE_TO, s.CLAIM_ID, 1 as chain_count
FROM [LTAC_FINBASE_BASE2] s
WHERE s.SVC_DATE <> ALL 
  (
  SELECT DATEADD(d, 1, s2.SVC_DATE_TO)
  FROM [LTAC_FINBASE_BASE2] s2
  WHERE s.CIN = s2.CIN
  )
UNION ALL
SELECT chain_ID, s.CIN, s.SVC_DATE, s.SVC_DATE_TO,
  s.CLAIM_ID, chain_count + 1
  FROM [LTAC_FINBASE_BASE2] s
JOIN chain_builder as c
  ON s.CIN = c.CIN AND
  s.SVC_DATE = DATEADD(d, 1, c.SVC_DATE_TO)
),
chains AS
(
SELECT chain_ID, CIN, SVC_DATE, SVC_DATE_TO,
  CLAIM_ID, chain_count, ROW_NUMBER() OVER(PARTITION BY chain_ID, chain_count ORDER BY SVC_DATE_TO DESC) as link_row
FROM chain_builder
),
link_picker AS
(
SELECT chain_ID, CIN, SVC_DATE, SVC_DATE_TO,
  CLAIM_ID, chain_count
FROM chains
WHERE link_row = 1
),
diff AS
(
SELECT c.chain_ID, c.CIN, c.SVC_DATE, c.SVC_DATE_TO,
  c.CLAIM_ID, c.chain_count,
  datediff(day,c.SVC_DATE,c.SVC_DATE_TO)+1 daysdiff
FROM link_picker c
),
diff_sum AS
(
SELECT chain_ID, CIN, SVC_DATE, SVC_DATE_TO,
  CLAIM_ID, chain_count,
  SUM(daysdiff) OVER (PARTITION BY chain_ID) as total_diff
FROM diff
),
diff_comp AS
(
SELECT chain_ID, CIN,
  MAX(total_diff) OVER (PARTITION BY CIN) as total_diff
FROM diff_sum
)
INSERT INTO [LTAC_TEST_1]
SELECT DISTINCT ds.CLAIM_ID, ds.CIN, ds.SVC_DATE,
  ds.SVC_DATE_TO, ds.total_diff as TOTAL_DAYS, ds.chain_count as CHAIN_COUNT
FROM diff_sum ds
JOIN diff_comp dc
ON ds.chain_ID = dc.chain_ID AND ds.CIN = dc.CIN
  AND ds.total_diff = dc.total_diff
OPTION (maxrecursion 0)

There are multiple ways to do this; here is one solution, though I'm not saying it is the most efficient:

Provided you want to move all rows from tbl_Main to tbl_sub, the WHILE EXISTS statement says to keep looping as long as there are still values in tbl_Main that do not yet exist (NOT EXISTS) in tbl_sub.

1000 rows will be inserted in order of BATCH_ID on each iteration until complete. In your example above it isn't clear why you are using the ROW_NUMBER() window function, as you haven't specified whether that column belongs in a PARTITION BY or an ORDER BY, and you may not need it at all.

WHILE EXISTS
(
    SELECT *
    FROM tbl_Main
    WHERE NOT EXISTS
    (
        SELECT *
        FROM dbo.tbl_sub
        WHERE CIN = tbl_Main.CIN
    )
)
BEGIN
    INSERT INTO dbo.tbl_sub (CIN)
    SELECT TOP 1000
        CIN
    FROM tbl_Main
    WHERE NOT EXISTS
    (
        SELECT *
        FROM dbo.tbl_sub
        WHERE CIN = tbl_Main.CIN
    )
    ORDER BY BATCH_ID
END

One approach is to create a temporary table that contains all the key values, with an identity number (starting at 0) and two computed columns that convert that identity number into a BatchID and a batch number (between 0 and 999):

CREATE TABLE #t
(
    id int identity (0,1),
    BatchID as ([id]/1000),
    BatchNumber as ([id]%1000),
    KeyColumn varchar(50)
)

INSERT INTO #t(KeyColumn) 
SELECT KeyColumn  FROM [DataTable_t]

You can then cycle through all the values of BatchID and do whatever you need with that set of 1000 records:

--Subquery to return a batch of 1000 records
(SELECT d.* 
FROM #t t 
JOIN [DataTable_t] d 
ON d.KeyColumn = t.KeyColumn
WHERE t.BatchID = @BatchID ) AS Batch
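For completeness, here is a minimal sketch of that cycle; the loop body just selects the batch, and in practice you would replace it with an INSERT or a stored procedure call:

-- Sketch: loop over every BatchID in #t and process that batch of rows
DECLARE @BatchID int = 0,
        @MaxBatchID int;

SELECT @MaxBatchID = MAX(BatchID) FROM #t;

WHILE @BatchID <= @MaxBatchID
BEGIN
    SELECT d.*
    FROM #t t
    JOIN [DataTable_t] d
      ON d.KeyColumn = t.KeyColumn
    WHERE t.BatchID = @BatchID;

    SET @BatchID = @BatchID + 1;
END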

I'm not sure if that's any use in your case.

I would make a stored procedure that runs this:

DECLARE @RowsAffected INT = 1

WHILE (@RowsAffected <> 0)
BEGIN
    INSERT INTO dbo.[tbl_sub]
    SELECT TOP 1000
        tbl_Main.CIN
    FROM tbl_Main
    LEFT JOIN dbo.[tbl_sub] t2 ON t2.CIN = tbl_Main.CIN
    WHERE t2.CIN IS NULL

    SET @RowsAffected = @@ROWCOUNT
END
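Wrapped up as a procedure, that could look roughly like this sketch (the procedure name and the @BatchSize parameter are illustrative, not required by the question):

-- Sketch: the same loop packaged as a stored procedure with a configurable batch size
CREATE PROCEDURE dbo.usp_CopyCinInBatches
    @BatchSize int = 1000
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @RowsAffected int = 1;

    WHILE @RowsAffected <> 0
    BEGIN
        INSERT INTO dbo.[tbl_sub]
        SELECT TOP (@BatchSize)
            tbl_Main.CIN
        FROM tbl_Main
        LEFT JOIN dbo.[tbl_sub] t2 ON t2.CIN = tbl_Main.CIN
        WHERE t2.CIN IS NULL;

        SET @RowsAffected = @@ROWCOUNT;
    END
END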
