I have a student table. I want to assign a fee to each student, or update it if one is already present, so I am looping over the students. This worked fine until I deleted a student; now, whenever I run the stored procedure, it shows an error.
Here is my code:
ALTER PROC [dbo].[sp_AutoAssignFeeUpdate]
(
    @FeeID int,
    @FeeAmount int,
    @Fine int,
    @DueDate date,
    @AppliedON date,
    @FeeMonth varchar(30)
)
AS
--- Variables used in the loop
DECLARE @LoopCounter INT, @MaxStudentID INT, @StdID INT, @FID INT
-- Setting the counter from the students in the student table
SELECT @LoopCounter = MIN(AdmissionNumber), @MaxStudentID = MAX(AdmissionNumber)
FROM StudentTable
-- WHILE loop condition
WHILE (@LoopCounter IS NOT NULL AND @LoopCounter <= @MaxStudentID)
BEGIN
    --- SELECT the ID of the Active student matching the counter
    SELECT @StdID = AdmissionNumber
    FROM StudentTable WHERE AdmissionNumber = @LoopCounter AND Active = 'True'
    --- CHECK IF THE ROW EXISTS
    SELECT @StdID = AdmissionNumber
    FROM FeeAssociationTable
    IF EXISTS (SELECT FeeMonth FROM FeeAssociationTable
               WHERE @LoopCounter = AdmissionNumber AND FeeID = @FeeID AND FeeMonth = @FeeMonth)
    BEGIN
        UPDATE FeeAssociationTable
        SET FeeAmount = @FeeAmount, Fine = @Fine, DueDate = @DueDate
        WHERE @LoopCounter = AdmissionNumber AND FeeID = @FeeID
              AND FeeMonth = @FeeMonth
    END
    ELSE
    BEGIN
        INSERT FeeAssociationTable
            (FeeID, AdmissionNumber, FeeAmount, FeeMonth, DueDate, Fine, AppliedOn, [Status])
        VALUES
            (@FeeID, @LoopCounter, @FeeAmount, @FeeMonth, @DueDate, @Fine, @AppliedON, 'Pending')
    END
    SET @LoopCounter = @LoopCounter + 1
END
This works if the IDs are continuous. What should I do if an ID is missing, or how can I skip a specific number that is not present in the StudentTable?
Explanation:
The loop takes the initial value of min(id) from StudentTable as the counter, and the final value of max(id).
The loop compares both values: the id in StudentTable and the counter of the loop.
Then, for each counter, the fee is assigned to the student in the table.
INSERT FeeAssociationTable
    (FeeID, AdmissionNumber, FeeAmount, FeeMonth, DueDate, Fine, AppliedOn, [Status])
VALUES
    (@FeeID, @LoopCounter, @FeeAmount, @FeeMonth, @DueDate, @Fine, @AppliedON, 'Pending')
The problem is here: while inserting I am using @LoopCounter. Let's say @LoopCounter = 100, but StudentTable skips 100 and has 101 instead. The conflict arises because SQL can't find the **100** id in the StudentTable.
Thanks in Advance.
As I said in a comment, this whole thing looks like it can be replaced by a MERGE. Don't do things one step at a time when you can tell the server what to do with the entire set of rows.
Something like:
MERGE INTO FeeAssociationTable t
USING (SELECT AdmissionNumber, @FeeID AS FeeID, @FeeMonth AS FeeMonth
       FROM StudentTable
       WHERE Active = 'True') s
ON t.AdmissionNumber = s.AdmissionNumber AND
   t.FeeID = s.FeeID AND
   t.FeeMonth = s.FeeMonth
WHEN MATCHED THEN UPDATE SET FeeAmount = @FeeAmount, Fine = @Fine, DueDate = @DueDate
WHEN NOT MATCHED THEN INSERT
    (FeeID, AdmissionNumber, FeeAmount, FeeMonth, DueDate, Fine, AppliedOn, [Status])
    VALUES
    (@FeeID, s.AdmissionNumber, @FeeAmount, @FeeMonth, @DueDate, @Fine, @AppliedON, 'Pending');
Not sure I've got all of the conditions quite right, but you should be able to see what I'm driving at, I hope.
Your actual issue could have been "solved" by replacing:
SET @LoopCounter = @LoopCounter + 1
with:
SELECT @LoopCounter = MIN(AdmissionNumber) FROM StudentTable
WHERE Active = 'True' AND AdmissionNumber > @LoopCounter
but don't do that, please.
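If MERGE is not an option, the same set-based idea can be expressed as an UPDATE followed by an INSERT. This is only a sketch along the lines of the question's tables, not tested against the real schema:
-- Update existing fee rows for all active students in one statement
UPDATE f
SET f.FeeAmount = @FeeAmount, f.Fine = @Fine, f.DueDate = @DueDate
FROM FeeAssociationTable f
JOIN StudentTable st ON st.AdmissionNumber = f.AdmissionNumber
WHERE st.Active = 'True' AND f.FeeID = @FeeID AND f.FeeMonth = @FeeMonth;

-- Insert the missing rows in one statement
INSERT FeeAssociationTable
    (FeeID, AdmissionNumber, FeeAmount, FeeMonth, DueDate, Fine, AppliedOn, [Status])
SELECT @FeeID, st.AdmissionNumber, @FeeAmount, @FeeMonth, @DueDate, @Fine, @AppliedON, 'Pending'
FROM StudentTable st
WHERE st.Active = 'True'
  AND NOT EXISTS (SELECT 1 FROM FeeAssociationTable f
                  WHERE f.AdmissionNumber = st.AdmissionNumber
                    AND f.FeeID = @FeeID AND f.FeeMonth = @FeeMonth);
Either way, the server works on the whole set of students at once, so missing AdmissionNumbers simply never appear.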
You should iterate over the rows that actually exist, for example with a cursor:
DECLARE yourCursor CURSOR LOCAL STATIC
FOR SELECT AdmissionNumber
    FROM StudentTable

OPEN yourCursor
FETCH NEXT FROM yourCursor INTO @StdID

WHILE @@FETCH_STATUS = 0
BEGIN
    /*
       CHECK IF EXIST FOR UPDATE OR INSERT
    */
    FETCH NEXT FROM yourCursor INTO @StdID
END

CLOSE yourCursor
DEALLOCATE yourCursor
GO
This replaces your WHILE loop.
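The body of the loop would then contain the same IF EXISTS / UPDATE / INSERT logic as the question, keyed on @StdID instead of @LoopCounter. A sketch based on the question's own code:
-- Inside the WHILE @@FETCH_STATUS = 0 loop, before the next FETCH:
IF EXISTS (SELECT FeeMonth FROM FeeAssociationTable
           WHERE AdmissionNumber = @StdID AND FeeID = @FeeID AND FeeMonth = @FeeMonth)
BEGIN
    UPDATE FeeAssociationTable
    SET FeeAmount = @FeeAmount, Fine = @Fine, DueDate = @DueDate
    WHERE AdmissionNumber = @StdID AND FeeID = @FeeID AND FeeMonth = @FeeMonth
END
ELSE
BEGIN
    INSERT FeeAssociationTable
        (FeeID, AdmissionNumber, FeeAmount, FeeMonth, DueDate, Fine, AppliedOn, [Status])
    VALUES
        (@FeeID, @StdID, @FeeAmount, @FeeMonth, @DueDate, @Fine, @AppliedON, 'Pending')
END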
We have a DVD rental company. In this particular scenario we consider only the Member, Rental and Membership tables.
The task is to write a trigger that prevents a customer from being shipped a DVD
if they have reached their monthly limit for DVD rentals as per their membership contract, using the function below.
My trigger leads to an infinite loop. It works without the WHILE loop, but then it does not work properly if I consider multiple updates to the Rental table. Where am I wrong?
-- do not run, infinite loop
CREATE OR ALTER TRIGGER trg_Rental_StopDvdShip
ON RENTAL
FOR UPDATE
AS
BEGIN
    DECLARE @MemberId INT
    DECLARE @RentalId INT

    SELECT * INTO #TempTable FROM inserted

    WHILE (EXISTS (SELECT RentalId FROM #TempTable))
    BEGIN
        IF UPDATE(RentalShippedDate)
        BEGIN
            IF (SELECT TotalDvdLeft FROM dvd_numb_left(@MemberId)) <= 0
            BEGIN
                ROLLBACK
                RAISERROR ('YOU HAVE REACHED MONTHLY LIMIT FOR DVD RENTALS', 16, 1)
            END;
        END;
        DELETE FROM #TempTable WHERE RentalId = @RentalId
    END;
END;
My function looks as follows:
CREATE OR ALTER FUNCTION dvd_numb_left(@member_id INT)
RETURNS @tab_dvd_numb_left TABLE(MemberId INT, Name VARCHAR(50), TotalDvdLeft INT, AtTimeDvdLeft INT)
AS
BEGIN
    DECLARE @name VARCHAR(50)
    DECLARE @dvd_total_left INT
    DECLARE @dvd_at_time_left INT
    DECLARE @dvd_limit INT
    DECLARE @dvd_rented INT
    DECLARE @dvd_at_time INT
    DECLARE @dvd_on_rent INT

    SET @dvd_limit = (SELECT Membership.MembershipLimitPerMonth FROM Membership
                      WHERE Membership.MembershipId = (SELECT Member.MembershipId FROM Member WHERE Member.MemberId = @member_id))
    SET @dvd_rented = (SELECT COUNT(Rental.MemberId) FROM Rental
                       WHERE CONCAT(month(Rental.RentalShippedDate), '.', year(Rental.RentalShippedDate)) = CONCAT(month(GETDATE()), '.', year(GETDATE())) AND Rental.MemberId = @member_id)
    SET @dvd_at_time = (SELECT Membership.DVDAtTime FROM Membership
                        WHERE Membership.MembershipId = (SELECT Member.MembershipId FROM Member WHERE Member.MemberId = @member_id))
    SET @dvd_on_rent = (SELECT COUNT(Rental.MemberId) FROM Rental
                        WHERE Rental.MemberId = @member_id AND Rental.RentalReturnedDate IS NULL)
    SET @name = (SELECT CONCAT(Member.MemberFirstName, ' ', Member.MemberLastName) FROM Member WHERE Member.MemberId = @member_id)
    SET @dvd_total_left = @dvd_limit - @dvd_rented
    SET @dvd_at_time_left = @dvd_at_time - @dvd_on_rent

    IF @dvd_total_left < 0
    BEGIN
        SET @dvd_total_left = 0
        SET @dvd_at_time_left = 0
        INSERT INTO @tab_dvd_numb_left(MemberId, Name, TotalDvdLeft, AtTimeDvdLeft)
        VALUES(@member_id, @name, @dvd_total_left, @dvd_at_time_left)
        RETURN;
    END

    INSERT INTO @tab_dvd_numb_left(MemberId, Name, TotalDvdLeft, AtTimeDvdLeft)
    VALUES(@member_id, @name, @dvd_total_left, @dvd_at_time_left)
    RETURN;
END;
Will be glad for any advice.
Your main issue is that even though you populate #TempTable you never pull any values from it.
CREATE OR ALTER TRIGGER trg_Rental_StopDvdShip
ON RENTAL
FOR UPDATE
AS
BEGIN
    DECLARE @MemberId INT, @RentalId INT;
    -- Move the test for the column update to the first test, as it applies to the entire update, not per row.
    IF UPDATE(RentalShippedDate)
    BEGIN
        SELECT * INTO #TempTable FROM inserted;
        WHILE (EXISTS (SELECT RentalId FROM #TempTable))
        BEGIN
            -- Actually pull some information from #TempTable - this wasn't happening before
            SELECT TOP 1 @RentalId = RentalId, @MemberId = MemberId FROM #TempTable;
            -- Select our values to check it's working
            -- SELECT @RentalId, @MemberId;
            IF (SELECT TotalDvdLeft FROM dvd_numb_left(@MemberId)) <= 0
            BEGIN
                ROLLBACK
                RAISERROR ('YOU HAVE REACHED MONTHLY LIMIT FOR DVD RENTALS', 16, 1)
            END;
            -- Delete the row we have just handled
            DELETE FROM #TempTable WHERE RentalId = @RentalId
        END;
        -- For neatness I always drop temp tables; it makes testing easier as well
        DROP TABLE #TempTable;
    END;
END;
An easy way to debug simple triggers like this is to copy the T-SQL out and then create an @Inserted table variable, e.g.
DECLARE @Inserted TABLE (RentalId INT, MemberId INT);

INSERT INTO @Inserted (RentalId, MemberId)
VALUES (1, 1), (2, 2);

DECLARE @MemberId INT, @RentalId INT;
-- Move the test for the column update to the first test, as it applies to the entire update, not per row.
-- IF UPDATE(RentalShippedDate)
BEGIN
    SELECT * INTO #TempTable FROM @Inserted;
    WHILE (EXISTS (SELECT RentalId FROM #TempTable))
    BEGIN
        -- Actually pull some information from #TempTable - this wasn't happening before
        SELECT TOP 1 @RentalId = RentalId, @MemberId = MemberId FROM #TempTable;
        -- Select our values to check it's working
        SELECT @RentalId, @MemberId;
        -- IF (SELECT TotalDvdLeft FROM dvd_numb_left(@MemberId)) <= 0
        -- BEGIN
        --     ROLLBACK
        --     RAISERROR ('YOU HAVE REACHED MONTHLY LIMIT FOR DVD RENTALS', 16, 1)
        -- END;
        -- Delete the row we have just handled
        DELETE FROM #TempTable WHERE RentalId = @RentalId
    END;
    -- For neatness I always drop temp tables; it makes testing easier as well
    DROP TABLE #TempTable;
END;
Note: THROW is the recommended way to raise an error instead of RAISERROR.
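For example, the error block above could be written with THROW (SQL Server 2012+; the error number 50001 is arbitrary):
ROLLBACK;
THROW 50001, 'YOU HAVE REACHED MONTHLY LIMIT FOR DVD RENTALS', 1;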
Another thing to consider is transforming your multi-statement UDF into an inline TVF, because multi-statement functions have unpleasant side effects on performance.
Like this one:
CREATE OR ALTER FUNCTION dvd_numb_left(@member_id INT)
RETURNS TABLE
AS
RETURN
(
    WITH
    TM AS
    (SELECT M.MemberId,
            MS.MembershipLimitPerMonth AS dvd_limit,
            MS.DVDAtTime AS dvd_at_time,
            CONCAT(M.MemberFirstName, ' ', M.MemberLastName) AS [Name]
     FROM Membership AS MS
     JOIN Member AS M
       ON MS.MembershipId = M.MembershipId
     WHERE M.MemberId = @member_id
    ),
    TR AS
    (SELECT COUNT(Rental.MemberId) AS dvd_rented
     FROM Rental
     WHERE YEAR(Rental.RentalShippedDate) = YEAR(GETDATE())
       AND MONTH(Rental.RentalShippedDate) = MONTH(GETDATE())
       AND Rental.MemberId = @member_id
    ),
    TN AS
    (SELECT COUNT(Rental.MemberId) AS dvd_on_rent
     FROM Rental
     WHERE Rental.MemberId = @member_id
       AND Rental.RentalReturnedDate IS NULL
    )
    SELECT MemberId, [Name],
           CASE WHEN dvd_limit - dvd_rented < 0 THEN 0 ELSE dvd_limit - dvd_rented END AS TotalDvdLeft,
           CASE WHEN dvd_limit - dvd_rented < 0 THEN 0 ELSE dvd_at_time - dvd_on_rent END AS AtTimeDvdLeft
    FROM TM CROSS JOIN TR CROSS JOIN TN
);
GO
Which will be much more efficient.
The absolute rule for performance is: try to stay with SET-BASED code instead of iterative code.
The above function can be optimized by the optimizer, while yours cannot and needs four accesses to the same tables.
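Applying the same rule to the trigger itself, the WHILE loop can be replaced by one set-based check against all affected rows. A sketch reusing the question's objects, untested against the real schema:
CREATE OR ALTER TRIGGER trg_Rental_StopDvdShip
ON RENTAL
FOR UPDATE
AS
BEGIN
    IF UPDATE(RentalShippedDate)
    BEGIN
        -- Roll back the whole statement if any affected member is over the limit
        IF EXISTS (SELECT 1
                   FROM inserted AS i
                   CROSS APPLY dvd_numb_left(i.MemberId) AS d
                   WHERE d.TotalDvdLeft <= 0)
        BEGIN
            ROLLBACK;
            THROW 50001, 'YOU HAVE REACHED MONTHLY LIMIT FOR DVD RENTALS', 1;
        END;
    END;
END;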
I'm trying to generate dummy data from the existing data I have in the tables. All I want is to increase the number of records in Table1 to a specified amount N. The other tables should grow based on the foreign key references.
The tables have a one-to-many relationship. For one record in table 1, I can have multiple entries in table 2, and in table 3 I can have many records based on the IDs of the second table.
Since the IDs are primary keys, I either capture them with
SET @NEWLY_INSERTED_ID = SCOPE_IDENTITY()
after inserting into table 1 and use that in the insert for table 2, or I insert them into a temp table and join on them to achieve the same result for table 3.
Here's the approach I'm taking with the CURSOR.
DECLARE @MyId AS INT;
DECLARE @myCursor AS CURSOR;
DECLARE @DESIRED_ROW_COUNT INT = 70000
DECLARE @ROWS_INSERTED INT = 0
DECLARE @CURRENT_ROW_COUNT INT = 0
DECLARE @NEWLY_INSERTED_ID INT
DECLARE @LANGUAGE_PAIR_IDS TABLE ( LangugePairId INT, NewId INT, SourceLanguage varchar(100), TargetLangauge varchar(100) )

WHILE (@ROWS_INSERTED < @DESIRED_ROW_COUNT)
BEGIN
    SET @myCursor = CURSOR FOR
        SELECT Id FROM MyTable
    SET @CURRENT_ROW_COUNT = (SELECT COUNT(Id) FROM MyTable)

    OPEN @myCursor;
    FETCH NEXT FROM @myCursor INTO @MyId;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        IF ((@CURRENT_ROW_COUNT < @DESIRED_ROW_COUNT) AND (@ROWS_INSERTED < @DESIRED_ROW_COUNT))
        BEGIN
            INSERT INTO [dbo].[MyTable]
                ([Column1]
                ,[Column2]
                ,[Column3]
                )
            SELECT
                convert(numeric(9,0), rand() * 899999999) + 100000000
                ,Column2
                ,Column3
            FROM MyTable
            WHERE Id = @MyId

            SET @NEWLY_INSERTED_ID = SCOPE_IDENTITY()

            INSERT INTO [dbo].[Language]
                ([MyTable1Id]
                ,[Target]
                ,[Source])
            OUTPUT inserted.Id, inserted.MyTable1Id, inserted.Source, inserted.[Target] INTO @LANGUAGE_PAIR_IDS (LangugePairId, NewId, SourceLanguage, TargetLangauge)
            SELECT
                @NEWLY_INSERTED_ID
                ,[Target]
                ,[Source]
            FROM [dbo].[Language]
            WHERE MyTable1Id = @MyId
            ORDER BY Id

            DECLARE @tbl AS TABLE (newLanguageId INT, oldLanguageId INT, sourceLanguage VARCHAR(100), targetLanguage VARCHAR(100))
            INSERT INTO @tbl (newLanguageId, oldLanguageId, sourceLanguage, targetLanguage)
            SELECT 0, Id, [Source], [Target] FROM [Language] WHERE MyTable1Id = @MyId ORDER BY Id

            UPDATE t
            SET t.newLanguageId = lp.LangugePairId
            FROM @tbl t
            JOIN @LANGUAGE_PAIR_IDS lp
              ON t.sourceLanguage = lp.SourceLanguage
             AND t.targetLanguage = lp.TargetLangauge

            INSERT INTO [dbo].[Manager]
                ([LanguagePairId]
                ,[UserId]
                ,[MyDate])
            SELECT
                tbl.newLanguageId
                ,m.[UserId]
                ,m.[MyDate]
            FROM Manager m
            INNER JOIN @tbl tbl
                ON m.LanguagePairId = tbl.oldLanguageId
            WHERE m.LanguagePairId IN (SELECT Id FROM [Language] WHERE MyTable1Id = @MyId) -- returns the old language pair ids

            SET @ROWS_INSERTED += 1
            SET @CURRENT_ROW_COUNT += 1
        END
        ELSE
        BEGIN
            PRINT 'REACHED EXIT'
            SET @ROWS_INSERTED = @DESIRED_ROW_COUNT
            BREAK
        END
        FETCH NEXT FROM @myCursor INTO @MyId;
    END
    CLOSE @myCursor
    DEALLOCATE @myCursor
END
The above code works! It generates the data I need. However, it is very, very slow. Just to give some comparison: the initial load of data was ~60,000 records for table 1, ~74,000 for table 2 and ~3,400 for table 3.
I tried to insert 9,000 rows into Table1. With the above code, it took 17:05:01 (hours:minutes:seconds) to complete.
Any suggestion on how I can optimize the query to run a little faster? My goal is to insert 1-2 million records into Table1 without having to wait for days. I'm not tied to the CURSOR; I'm OK with achieving the same result in any other way possible.
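This is not from the original thread, but as a sketch of one set-based direction (table and column names assumed from the code above): all Table1 copies can be inserted in a single statement while capturing an old-to-new ID map with MERGE ... OUTPUT (a plain INSERT ... OUTPUT cannot reference source columns), and the child rows can then be copied with one join against that map.
-- Sketch only: assumes the MyTable / Language names and columns used above
DECLARE @IdMap TABLE (OldId INT, NewId INT);

-- 1. Duplicate every row of MyTable once and record old ID -> new ID
MERGE [dbo].[MyTable] AS tgt
USING (SELECT Id, Column2, Column3 FROM [dbo].[MyTable]) AS src
    ON 1 = 0                         -- never matches, so every source row is inserted
WHEN NOT MATCHED THEN
    INSERT (Column1, Column2, Column3)
    VALUES (CONVERT(numeric(9,0), RAND(CHECKSUM(NEWID())) * 899999999) + 100000000,
            src.Column2, src.Column3)
OUTPUT src.Id, inserted.Id INTO @IdMap (OldId, NewId);

-- 2. Copy the child rows in a single set-based insert using the map
INSERT INTO [dbo].[Language] (MyTable1Id, [Target], [Source])
SELECT m.NewId, l.[Target], l.[Source]
FROM [dbo].[Language] l
JOIN @IdMap m ON m.OldId = l.MyTable1Id;
The Manager rows could be handled the same way with a second MERGE ... OUTPUT map for the Language IDs, and the whole thing can run once per desired multiple instead of once per source row.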
I have to calculate an after-tax salary amount based on a gross salary present in one table and on different other parameters present in another table. This is the situation:
I have a salary table that contains the gross salary of the employees.
To compute the net amount, I have to either subtract or add other parameters (contributions, insurance, ...) based on whether the corresponding value has to be considered as either gross or relative (a percentage). Here is the table:
Logic:
Relativite = 1 means that the value (valeur in the table) is a percentage, 0 means it is gross.
Sens = 1 means the value has to be subtracted from the salary, 0 means it has to be added.
With this example, what I want to achieve in order to get the net salary is something like this:
1st line: Net_Salary = 700 - (700 * 13.4) / 100
2nd line: Net_Salary = value of the 1st line - 13
3rd line: Net_Salary = value of the 2nd line - 13000, and so forth...
To achieve this, I have used a cursor that loops through the table and fetches each value to compute the net salary. I end up with something like this:
The problem with this result is that the amount is not decremented while looping through the table. It always computes based on the original value.
Here is the code I have used:
declare @registration_nr varchar(20),
        @entity_id varchar(10)
DECLARE @gross_salary float, @net_salary float, @cursid int, @category varchar(50), @value float, @relative numeric(1), @sens numeric(1)

set @registration_nr = '19820506-0-2';
set @entity_id = 'edu7';
SET @gross_salary = (select pay_amount from dbo.EMPLOYEES_PAY where registration_nr = @registration_nr and entity_id = @entity_id and active = 1)
--set @rowcnt = (select count(1) from dbo.PARAMETRES_SALAIRES where code_institution = @entity_id and actif = 1)

CREATE TABLE #temp
    (registration_nr varchar(20),
     category varchar(50),
     valeur float,
     relativite numeric(1),
     sens numeric(1),
     salaire_net float);

DECLARE curs_rowid CURSOR FAST_FORWARD FOR
    SELECT nom_categorie,
           relativite,
           valeur,
           sens
    FROM dbo.SALARY_SETTINGS --This is the table that contains the parameters (insurance, ...)
    WHERE code_institution = @entity_id and actif = 1;

OPEN curs_rowid
FETCH NEXT FROM curs_rowid INTO @category, @relative, @value, @sens
WHILE @@fetch_status = 0
BEGIN
    if @relative = 0
    BEGIN
        if @sens = 0
        BEGIN
            set @net_salary = @gross_salary + (@gross_salary*@value)/100
            INSERT INTO #temp (category, valeur, relativite, sens, salaire_net)
            values(@category, @value, @relative, @sens, @net_salary);
        END;
        else if @sens = 1
        BEGIN
            set @net_salary = @gross_salary - (@gross_salary*@value)/100
            INSERT INTO #temp (category, valeur, relativite, sens, salaire_net)
            values(@category, @value, @relative, @sens, @net_salary);
        END;
    END;
    else if @relative = 1
    BEGIN
        if @sens = 0
        BEGIN
            set @net_salary = @gross_salary + @value
            INSERT INTO #temp (category, valeur, relativite, sens, salaire_net)
            values(@category, @value, @relative, @sens, @net_salary);
        END;
        else if @sens = 1
        BEGIN
            set @net_salary = @gross_salary - @value
            INSERT INTO #temp (category, valeur, relativite, sens, salaire_net)
            values(@category, @value, @relative, @sens, @net_salary);
        END;
    END;
    FETCH NEXT FROM curs_rowid INTO @category, @relative, @value, @sens
END;
CLOSE curs_rowid;
DEALLOCATE curs_rowid;
Any idea how I can solve this so that the last row holds the final value based on all the previous calculations?
After the line:
SET @gross_salary = (select pay_amount from dbo.EMPLOYEES_PAY where registration_nr = @registration_nr and entity_id = @entity_id and active = 1)
Add
SET @net_salary = @gross_salary;
And in the cursor part, replace all @gross_salary with @net_salary.
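In other words, after the change, each branch accumulates into @net_salary so the running value carries across iterations. A sketch of just the modified lines, for the percentage/subtraction case:
-- @net_salary starts at @gross_salary and is carried from one row to the next
if @relative = 0 and @sens = 1
BEGIN
    set @net_salary = @net_salary - (@net_salary * @value) / 100
    INSERT INTO #temp (category, valeur, relativite, sens, salaire_net)
    values(@category, @value, @relative, @sens, @net_salary);
END;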
I have the below insert query, which selects records from the OriginalData table, where everything is nvarchar(max), and inserts them into the Temp table, which has specific field definitions, i.e. MainAccount is INT.
I am doing a row-by-row insert because if there is a record in the OriginalData table where the MainAccount value is 'Test', then it will obviously cause a conversion error and the insert will fail.
I want to be able to capture this error. There is a field on the OriginalData table called "error" which I want to populate. However, I want this to run through the entire table as opposed to failing on the first error and stopping.
DECLARE @RowId INT
      , @MaxRowId INT

SET @RowId = 1
SELECT @MaxRowId = 60

WHILE (@RowId <= @MaxRowId)
BEGIN
    INSERT INTO [Temp] (ExtractSource, MainAccount, RecordLevel1Code, RecordLevel2Code, RecordTypeNo, TransDate, Amount, PeriodCode, CompanyCode)
    SELECT ExtractSource, MainAccount, RecordLevel1Code, RecordLevel2Code, RecordTypeNo, TransDate, Amount, PeriodCode, DataAreaId
    FROM [OriginalData]
    WHERE RowId = @RowId

    PRINT @RowId
    SET @RowId = @RowId + 1
END

SELECT * FROM [Temp]
You should use a TRY...CATCH block:
WHILE (@RowId <= @MaxRowId)
BEGIN
    BEGIN TRY
        INSERT INTO [Temp] (ExtractSource, MainAccount, RecordLevel1Code,
            RecordLevel2Code, RecordTypeNo, TransDate, Amount, PeriodCode, CompanyCode)
        SELECT ExtractSource, MainAccount, RecordLevel1Code, RecordLevel2Code,
            RecordTypeNo, TransDate, Amount, PeriodCode, DataAreaId
        FROM [OriginalData]
        WHERE RowId = @RowId;

        PRINT @RowId;
    END TRY
    BEGIN CATCH
        -- error handling
    END CATCH

    SET @RowId += 1;
END
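As a sketch (not part of the original answer), the CATCH block could write the failure into the "error" column the question mentions, using ERROR_MESSAGE(), so the loop records the reason and carries on with the next row:
BEGIN CATCH
    -- Record why this row failed, then continue with the next iteration
    UPDATE [OriginalData]
    SET [error] = ERROR_MESSAGE()
    WHERE RowId = @RowId;
END CATCH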
I have a simple query to update a table (30 columns and about 150,000 rows).
For example:
UPDATE tblSomeTable SET F3 = @F3 WHERE F1 = @F1
This query will affect about 2,500 rows.
The tblSomeTable has a trigger:
ALTER TRIGGER [dbo].[trg_tblSomeTable]
ON [dbo].[tblSomeTable]
AFTER INSERT, DELETE, UPDATE
AS
BEGIN
    declare @operationType nvarchar(1)
    declare @createDate datetime
    declare @UpdatedColumnsMask varbinary(500) = COLUMNS_UPDATED()

    -- detect operation type
    if not exists(select top 1 * from inserted)
    begin
        -- delete
        SET @operationType = 'D'
        SELECT @createDate = dbo.uf_DateWithCompTimeZone(CompanyId) FROM deleted
    end
    else if not exists(select top 1 * from deleted)
    begin
        -- insert
        SET @operationType = 'I'
        SELECT @createDate = dbo.uf_DateWithCompTimeZone(CompanyId) FROM inserted
    end
    else
    begin
        -- update
        SET @operationType = 'U'
        SELECT @createDate = dbo.uf_DateWithCompTimeZone(CompanyId) FROM inserted
    end

    -- log data to tmp table
    INSERT INTO tbl1
    SELECT
        @createDate,
        @operationType,
        @status,
        @updatedColumnsMask,
        d.F1,
        i.F1,
        d.F2,
        i.F2,
        d.F3,
        i.F3,
        d.F4,
        i.F4,
        d.F5,
        i.F5,
        ...
    FROM (Select 1 as temp) t
    LEFT JOIN inserted i on 1=1
    LEFT JOIN deleted d on 1=1
END
And if I execute the update query, I get a timeout.
How can I optimize the logic to avoid the timeout?
Thank you.
This query:
SELECT *
FROM (
SELECT 1 AS temp
) t
LEFT JOIN
INSERTED i
ON 1 = 1
LEFT JOIN
DELETED d
ON 1 = 1
will yield 2500^2 = 6,250,000 records from a Cartesian product of INSERTED and DELETED (that is, all possible combinations of all records in both tables), which will be inserted into tbl1.
Is that what you wanted to do?
Most probably, you want to join the tables on their PRIMARY KEY:
INSERT
INTO tbl1
SELECT @createDate,
       @operationType,
       @status,
       @updatedColumnsMask,
       d.F1,
       i.F1,
       d.F2,
       i.F2,
       d.F3,
       i.F3,
       d.F4,
       i.F4,
       d.F5,
       i.F5,
       ...
FROM   INSERTED i
FULL JOIN
       DELETED d
ON     i.id = d.id
This will treat an update to the PK as deleting a record and inserting another one with a new PK.
Thanks Quassnoi, the "FULL JOIN" is a good idea. It helped me.
I also try to update the table in portions (1,000 items at a time) to make my code work faster, because for some companyId values I need to update more than 160,000 rows.
Instead of the old code:
UPDATE tblSomeTable SET someVal = @someVal WHERE companyId = @companyId
I use the one below:
declare @rc integer = 0
declare @parts integer = 0
declare @index integer = 0
declare @portionSize int = 1000

-- select Ids for update
declare @tempIds table (id int)
insert into @tempIds
select id from tblSomeTable where companyId = @companyId

-- calculate amount of iterations
set @rc = @@ROWCOUNT
set @parts = @rc / @portionSize + 1

-- update table in portions
WHILE (@parts > @index)
begin
    UPDATE TOP (@portionSize) t
    SET someVal = @someVal
    FROM tblSomeTable t
    JOIN @tempIds t1 on t1.id = t.id
    WHERE companyId = @companyId

    delete top (@portionSize) from @tempIds

    set @index += 1
end
What do you think about this? Does it make sense? If yes, how do I choose the correct portion size?
Or is a simple update also a good solution? I just want to avoid locks in the future.
Thanks