AFTER Trigger with JOIN Tables - SQL

TASK: Create an AFTER trigger to check a condition that requires a JOIN. The trigger sits on table_1 and fires when a record is created there; table_2 shares a common column and holds the parameters the condition needs.
Every time Result <> 1 AND Status <> 3 in table_2, an ALERT should be sent.
-- QUERY WITH JOIN TABLE_1 ON TABLE_2
-- MOCK TABLE
-- Table_1 as A | Table_2 as B
A.LotCode | A.LineNumber | B.Result | B.Status
00000 | xxxx | 1 | 3
00001 | xxxx | 2 | 4
-- LotCode 00001 should trigger the email because it satisfies the condition
CREATE TRIGGER FullfillOrderQCResult
ON Table_1
AFTER INSERT
AS
BEGIN
    -----DECLARE VARIABLES-----
    DECLARE @LOTNUMBER VARCHAR(50)
    DECLARE @ACTIONPEFORMED VARCHAR(MAX)
    DECLARE @ITEM VARCHAR(50) -- LineNumber is alphanumeric in the mock data, so varchar rather than int
    DECLARE @RESULT TINYINT
    DECLARE @STATUS TINYINT

    SELECT @LOTNUMBER = A.LotCode, @ITEM = A.LineNumber, @RESULT = B.Result, @STATUS = B.Status
    FROM inserted AS A
    JOIN Table_2 AS B
        ON A.LotCode = B.DocumentID2

    -----CONDITION WHEN I INSERT A VALUE-----
    IF (@RESULT <> 1 AND @STATUS <> 3)
    BEGIN
        SET @ACTIONPEFORMED =
            N'Hello, ' + '<br>' + '<br>'
            + N' The following LOT NUMBER: ' + @LOTNUMBER + ' has not been approved for this Item: ' + @ITEM

        EXEC MSDB.DBO.SP_SEND_DBMAIL
            @PROFILE_NAME = 'SQLMail',
            @RECIPIENTS = 'TEST@gmail.com',
            @SUBJECT = 'LOT NON-Approved',
            @BODY = @ACTIONPEFORMED,
            @IMPORTANCE = 'HIGH',
            @BODY_FORMAT = 'HTML'
    END
    ELSE
        PRINT 'ALL GOOD MY FRIEND'
END
Testing the trigger:
--------INSERT VALUES------------------
INSERT INTO Table_1 (LotCode,LineNumber)
values ('00000','xxxx')
-----EXISTING VALUES-----
INSERT INTO Table_2 (CreationUser,DocumentID1,DocumentID2,DocumentID3,Result,Status)
values ('JL','00000','00000','00000',2,3)
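Note (an added caveat, with made-up values): a single statement that inserts several rows fires the trigger only once, so the variable-based trigger above would email at most one arbitrary row:
-- Multi-row insert; the trigger fires once for both rows
INSERT INTO Table_1 (LotCode, LineNumber)
VALUES ('00002', 'xxxx'), ('00003', 'xxxx')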

The following shows how to handle the fact that inserted might contain multiple rows. This is really not ideal behaviour for a trigger, because you have to process the results RBAR (row by agonising row), which is slow by itself, let alone the fact that you are sending an email.
CREATE TRIGGER FullfillOrderQCResult
ON Table_1
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    -----DECLARE VARIABLES-----
    DECLARE @ACTIONPEFORMED varchar(max), @Id int;

    SELECT A.LotCode, A.LineNumber, CONVERT(bit, 0) Done, IDENTITY(int) id -- Use your own id if you have one, just need to uniquely identify each row.
    INTO #FullfillOrderQCResult_temp
    FROM Inserted AS A
    INNER JOIN Table_2 AS B ON A.LotCode = B.DocumentID2
    WHERE B.Result <> 1 and B.[Status] <> 3;

    WHILE EXISTS (SELECT 1 FROM #FullfillOrderQCResult_temp WHERE Done = 0) BEGIN
        SELECT TOP 1 @Id = id, @ACTIONPEFORMED =
            N'Hello, ' + '<br>' + '<br>'
            + N'The following LOT NUMBER: ' + LotCode + ' has not been approved for this Item: ' + LineNumber
        FROM #FullfillOrderQCResult_temp
        WHERE Done = 0;

        EXEC MSDB.DBO.SP_SEND_DBMAIL
            @PROFILE_NAME = 'SQLMail',
            @RECIPIENTS = 'TEST@gmail.com',
            @SUBJECT = 'LOT NON-Approved',
            @BODY = @ACTIONPEFORMED,
            @IMPORTANCE = 'HIGH',
            @BODY_FORMAT = 'HTML';

        UPDATE #FullfillOrderQCResult_temp SET Done = 1 WHERE id = @Id;
    END;
END;
I don't know whether you would still want the concept of 'ALL GOOD MY FRIEND', because you could have none, some, or all rows with issues. Anyway, I assume the PRINT is only for debugging.
That said, you would be much better off pushing an event into a queue and having a service process said event, because triggers really should be as fast as possible. And adding an event to a queue can be handled in a set-based manner, e.g.:
CREATE TRIGGER FullfillOrderQCResult
ON Table_1
AFTER INSERT
AS
BEGIN
SET NOCOUNT ON;
INSERT INTO MyEventQueue (LotCode, LineNumber) -- Any other information required to identify the records etc; note the column list takes no table aliases
SELECT A.LotCode, A.LineNumber
FROM Inserted AS A
INNER JOIN Table_2 AS B ON A.LotCode = B.DocumentID2
WHERE B.Result <> 1 and B.[Status] <> 3;
END;
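The answer never defines MyEventQueue; a minimal sketch of what it might look like (every column beyond LotCode/LineNumber is an assumption):
CREATE TABLE MyEventQueue (
    EventId    int IDENTITY(1,1) PRIMARY KEY, -- assumed surrogate key
    LotCode    varchar(50) NOT NULL,
    LineNumber varchar(50) NOT NULL,
    CreatedAt  datetime NOT NULL DEFAULT GETDATE(), -- assumed audit column
    Processed  bit NOT NULL DEFAULT 0 -- the external service flips this after sending the mail
);
A polling job or Service Broker activation procedure would then read unprocessed rows, call sp_send_dbmail outside the original transaction, and mark them Processed.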

Related

SQL Loop through 8 million records and update them

I have an audit table that has about 8 million records. I recently added two new columns which I need to populate from existing columns with some rules/conditions. Basically, initially, whenever an FK was updated in a table, the audit table stored the old and new FK ids. For example:
Table A
ID Name
1 First A
2 Second A
3 Third A
Table B
ID AID Name
1 1 First B
2 1 Second B
3 2 Third B
Audit
ID TableName FieldName OldValue NewValue
Now if I update the first record of Table B from (1, 1, First B) to (1, 3, First B), then the audit table will store the change as
Audit
ID TableName FieldName OldValue NewValue
1 Table B AID 1 3
Now I have updated the Audit table to store the actual text value of the FK, i.e. the above change will be stored as
Audit
ID TableName FieldName OldValue NewValue OldText NewText
1 Table B AID 1 3 First A Third A
The problem is I already have about 8 million records that I need to populate the new columns for. I have written the query below to do that:
declare @sql nvarchar(max);
declare @start int = 1
while @start <= 8000000
begin
    select top 10000 @sql = COALESCE(@sql+'Update Audit set ','Update Audit set') +
        isnull(' OldText = ('+ dbo.GetFKText(i.TableName, i.FieldName)+case when len(isnull(i.OldValue,'')) < 1 then null else i.OldValue end +'),',' OldText = OldValue, ') +
        isnull(' NewText = ('+ dbo.GetFKText(i.TableName, i.FieldName)+case when len(isnull(i.NewValue,'')) < 1 then null else i.NewValue end +')',' NewText = NewValue ') +
        ' where AuditID = '+cast(i.AuditID as nvarchar(200))+' and lower(ltrim(rtrim(TableName))) <> ''audit'';'
    from Audit i where i.AuditID >= @start
    exec sp_executesql @sql
    set @start = @start+10000;
end
The get-text function (basically I am getting the column named (TableName)+'Name' or (TableName)+(SomeText)+'Name'; this is just a convention I have followed in all the tables):
declare @res nvarchar(max)='';
declare @fn nvarchar(200);
declare @ttn nvarchar(200);
declare @tcn nvarchar(200);
SELECT top 1
    @ttn = kcu.table_name
    ,@tcn = kcu.column_name
FROM INFORMATION_SCHEMA.CONSTRAINT_COLUMN_USAGE ccu
INNER JOIN INFORMATION_SCHEMA.REFERENTIAL_CONSTRAINTS rc
    ON ccu.CONSTRAINT_NAME = rc.CONSTRAINT_NAME
INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE kcu
    ON kcu.CONSTRAINT_NAME = rc.UNIQUE_CONSTRAINT_NAME
WHERE ccu.TABLE_NAME = @TableName and ccu.COLUMN_NAME = @FieldName
if isnull(@ttn,'') != '' and ISNULL(@tcn,'') != ''
begin
    select @fn = COLUMN_NAME
    from (SELECT top 1 COLUMN_NAME,
                 case when COLUMN_NAME like (@ttn+'Name') then 0
                      when COLUMN_NAME like (@ttn+'%Name') then 1
                      when COLUMN_NAME like (@ttn+'Code') then 2
                      when COLUMN_NAME like (@ttn+'%Code') then 3 else 4 end as CPriority
          FROM JVO.INFORMATION_SCHEMA.COLUMNS
          WHERE TABLE_NAME = @ttn and (COLUMN_NAME like '%Name' or COLUMN_NAME like '%Code')
          order by CPriority) as aa;
    RETURN 'select '+@fn+' from '+@ttn+' where '+@tcn+' = ';
end
return null;
It's working, but really slow: it updates about 1 million records in 13 hours. Can anyone help improve this query or suggest an alternative way to update it?
Thanks
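One direction that usually helps here, sketched under assumptions (TRY_CAST needs SQL Server 2012+, and 'Table B'/TableA/AID stand in for the real pairs): run one set-based UPDATE per (TableName, FieldName) pair, joining the referenced table directly instead of executing dynamic SQL per audit row.
-- Illustrative only: one UPDATE covering every audit row for one FK pair
UPDATE a
SET OldText = oldRef.Name,
    NewText = newRef.Name
FROM Audit a
LEFT JOIN TableA oldRef ON oldRef.ID = TRY_CAST(a.OldValue AS int) -- NULL-safe if OldValue is blank
LEFT JOIN TableA newRef ON newRef.ID = TRY_CAST(a.NewValue AS int)
WHERE a.TableName = 'Table B' AND a.FieldName = 'AID';
The (TableName, FieldName) pairs and the matching name column could be enumerated once via the INFORMATION_SCHEMA lookup inside GetFKText, so the whole job becomes a handful of UPDATEs instead of 8 million.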

How to fire a trigger after the SPROC has completed execution?

I have written a trigger that sends an email once a row INSERT is performed.
ALTER TRIGGER TR_SendMailOnDataRequest
ON DataRequest
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE
        @DR_Id INT,
        @DR_FullName VARCHAR(200),
        @DR_Email VARCHAR(200),
        @DR_Phone VARCHAR(20),
        @UT_Name VARCHAR(50),
        @DR_UserTypeOther VARCHAR(50) = NULL,
        @D_Name VARCHAR(200),
        @DR_RequestDate DATETIME,
        @UF_LinkedFiles VARCHAR(MAX),
        @DRN_Names VARCHAR(200),
        @DR_Description VARCHAR(1200),
        @DR_CreatedOn DATETIME,
        @analystMailList VARCHAR(MAX),
        @tableHtml NVARCHAR(MAX),
        @downloadLink VARCHAR(MAX) = N'NONE'

    SELECT @DR_Id = MAX(DR_Id) FROM dbo.DataRequest

    SELECT
        @DR_FullName = DR_FullName,
        @DR_Email = DR_Email,
        @DR_Phone = DR_Phone,
        @UT_Name = UT_Name,
        @DR_UserTypeOther = DR_UserTypeOther,
        @D_Name = D_Name,
        @DR_RequestDate = DR_RequestDate,
        @UF_LinkedFiles = UF_LinkedFiles,
        @DRN_Names = DRN_Names,
        @DR_Description = DR_Description,
        @DR_CreatedOn = DR_CreatedOn
    FROM
        dbo.FN_GetDataRequest(@DR_Id)

    SELECT @analystMailList = dbo.FN_GetAnalystsMailList()

    IF (LEN(@UF_LinkedFiles) > 0)
    BEGIN
        SET @downloadLink = N'Downloads'
    END

    SET @tableHTML =
        N'<H1>Data Request</H1>' +
        N'<UL>' +
        N'<LI>Full Name: ' + @UF_LinkedFiles + N'</LI>' +
        N'<LI>Email: ' + @DR_Email + N'</LI>' +
        N'<LI>Phone: ' + CAST(@DR_Phone AS VARCHAR(20)) + N'</LI>' +
        N'<LI>User Type: ' + @UT_Name + N'</LI>' +
        N'<LI>User Type Other: ' + COALESCE(@DR_UserTypeOther, N'NONE') + N'</LI>' +
        N'<LI>Request Date: ' + CONVERT(VARCHAR(20), @DR_RequestDate, 107) + N'</LI>' +
        N'<LI>Downloads: ' + @downloadLink + N'</LI>' +
        N'</UL>';

    BEGIN
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'Example',
            @recipients = 'John Doe<jdoe@example>',
            --@recipients = @analystMailList,
            @reply_to = @DR_Email,
            @subject = 'Email Test',
            @body_format = 'HTML',
            @body = @tableHtml
    END
END
GO
The above trigger fires when there is a row INSERT on table DataRequest. After that insert, I take the IDENTITY value generated, use it as the foreign key, and INSERT other values into a different table. Finally, I use the values from both tables to build the email to be sent.
I wasn't getting the values from the other tables (e.g. @UF_LinkedFiles), so I realized that the TRIGGER fires just after the INSERT into the FIRST table but before the INSERT into the SECOND table; thus no values are available when sending the email.
So how do I make sure the TRIGGER fires only after the SPROC that does all the INSERT activity in multiple tables has completed its transaction?
Instead of using a trigger, I included the email-sending code in the SPROC where the rows are being inserted.
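A minimal sketch of that approach; the procedure name and parameter list are hypothetical, while the table, column, and profile names come from the trigger above:
CREATE PROCEDURE dbo.usp_CreateDataRequest -- hypothetical name and parameters
    @DR_FullName VARCHAR(200),
    @DR_Email VARCHAR(200)
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @DR_Id INT;

    BEGIN TRANSACTION;
        INSERT INTO DataRequest (DR_FullName, DR_Email) VALUES (@DR_FullName, @DR_Email);
        SET @DR_Id = SCOPE_IDENTITY();
        -- ... insert the dependent rows (the SECOND table) here, keyed on @DR_Id ...
    COMMIT TRANSACTION;

    -- Every row is committed at this point, so lookups like dbo.FN_GetDataRequest(@DR_Id)
    -- now see the complete request before the mail is built.
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'Example',
        @recipients = @DR_Email,
        @subject = 'Email Test',
        @body_format = 'HTML',
        @body = N'<H1>Data Request received</H1>';
END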
Not sure if that is your case, because you don't explain the behaviour between the tables. But I had a scenario where I tried to execute a SELECT during a series of inserts and couldn't find the row because the transaction wasn't finished yet.
What I did was create an additional table:
tblProgress
id integer,
fieldA integer,
fieldB integer,
fieldC integer
So if you have 3 tables TableA, TableB and TableC, each table will have one INSERT trigger that does some work and then touches tblProgress:
TableA creates a row; TableB and TableC update it.
tblProgress then also has an AFTER UPDATE trigger, where you validate that all 3 fields have NOT NULL values.
When you have all 3 values you can send the email, as in the sketch below.
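A hedged sketch of that final trigger (the profile and recipient are placeholders, and it assumes one progress row completes at a time):
CREATE TRIGGER trg_tblProgress_Complete
ON tblProgress
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Fire only once all three tables have reported in on the updated row
    IF EXISTS (SELECT 1 FROM inserted
               WHERE fieldA IS NOT NULL AND fieldB IS NOT NULL AND fieldC IS NOT NULL)
    BEGIN
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'Example',             -- placeholder profile
            @recipients   = 'analyst@example.com', -- placeholder recipient
            @subject      = 'Data request complete',
            @body         = 'All three tables have been populated.';
    END
END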

Efficient SQL Server stored procedure

I am using SQL Server 2008 and running the following stored procedure, which needs to "clean" a 70 million row table by moving about 50 million rows to another table; id_col is an integer (primary identity key).
According to the last run I made, it works fine, but it is expected to last about 200 days:
SET NOCOUNT ON
-- define the last ID handled
DECLARE @LastID integer
SET @LastID = 0
declare @tempDate datetime
set @tempDate = dateadd(dd,-20,getdate())
-- define the ID to be handled now
DECLARE @IDToHandle integer
DECLARE @iCounter integer
DECLARE @watch1 nvarchar(50)
DECLARE @watch2 nvarchar(50)
set @iCounter = 0
-- select the next to handle
SELECT TOP 1 @IDToHandle = id_col
FROM MAIN_TABLE
WHERE id_col > @LastID and DATEDIFF(DD,someDateCol,otherDateCol) < 1
    and datediff(dd,someDateCol,@tempDate) > 0 and (some_other_int_col = 1745 or some_other_int_col = 1548 or some_other_int_col = 4785)
ORDER BY id_col
-- as long as we have s......
WHILE @IDToHandle IS NOT NULL
BEGIN
    IF ((select count(1) from SOME_OTHER_TABLE_THAT_CONTAINS_20k_ROWS where some_int_col = @IDToHandle) = 0 and (select count(1) from A_70k_rows_table where some_int_col = @IDToHandle) = 0)
    BEGIN
        INSERT INTO SECONDERY_TABLE
        SELECT col1,col2,col3.....
        FROM MAIN_TABLE WHERE id_col = @IDToHandle
        EXEC [dbo].[DeleteByID] @ID = @IDToHandle --deletes the row from 2 other tables that are related to the MAIN_TABLE and then from the MAIN_TABLE
        set @iCounter = @iCounter + 1
    END
    IF (@iCounter % 1000 = 0)
    begin
        set @watch1 = 'iCounter - ' + CAST(@iCounter AS VARCHAR)
        set @watch2 = 'IDToHandle - ' + CAST(@IDToHandle AS VARCHAR)
        raiserror (@watch1, 10,1) with nowait
        raiserror (@watch2, 10,1) with nowait
    end
    -- set the last handled to the one we just handled
    SET @LastID = @IDToHandle
    SET @IDToHandle = NULL
    -- select the next to handle
    SELECT TOP 1 @IDToHandle = id_col
    FROM MAIN_TABLE
    WHERE id_col > @LastID and DATEDIFF(DD,someDateCol,otherDateCol) < 1
        and datediff(dd,someDateCol,@tempDate) > 0 and (some_other_int_col = 1745 or some_other_int_col = 1548 or some_other_int_col = 4785)
    ORDER BY id_col
END
Any ideas or directions to improve this procedure run-time will be welcomed
Yes, try this:
Declare @Ids Table (id int Primary Key not Null)

Insert @Ids(id)
Select id_col
From MAIN_TABLE m
Where someDateCol >= otherDateCol
  And someDateCol < @tempDate -- If there are times in these datetime fields,
                              -- then you may need to modify this condition.
  And some_other_int_col In (1745, 1548, 4785)
  And Not Exists (Select * From SOME_OTHER_TABLE_THAT_CONTAINS_20k_ROWS
                  Where some_int_col = m.id_col)
  And Not Exists (Select * From A_70k_rows_table
                  Where some_int_col = m.id_col)

Select id from @Ids -- this to confirm above code generates the correct list of Ids
return -- this line to stop (Not do insert/deletes) until you have verified @Ids is correct
-- Once you have verified that above @Ids is correctly populated,
-- then delete or comment out the select and return lines above so insert runs.

Begin Transaction

Delete ot -- eliminate row-by-row call to second stored proc
From OtherTable ot
Join MAIN_TABLE m On m.id_col = ot.FKCol
Join @Ids i On i.Id = m.id_col

Insert SECONDERY_TABLE(col1, col2, etc.)
Select col1,col2,col3.....
FROM MAIN_TABLE m Join @Ids i On i.Id = m.id_col

Delete m -- eliminate row-by-row call to second stored proc
FROM MAIN_TABLE m
Join @Ids i On i.Id = m.id_col

Commit Transaction
Explanation:
You had numerous filtering conditions that were not SARGable, i.e. they forced a complete table scan for every iteration of your loop instead of being able to use any existing index. Always try to avoid filter conditions that apply processing logic to a table column value before comparing it to some other value; that eliminates the opportunity for the query optimizer to use an index.
You were executing the inserts one at a time. It is way better to generate the list of PK Ids that need to be processed (all at once) and then do all the inserts at once, in one statement.
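To make the SARGability point concrete, a minimal sketch using the column names from the question:
-- Not SARGable: the function call wraps the column, so an index on someDateCol can't be used.
SELECT id_col FROM MAIN_TABLE WHERE datediff(dd, someDateCol, @tempDate) > 0

-- SARGable: the bare column is compared to a precomputed value, so an index seek is possible.
-- (As noted above, if the column carries a time portion the boundary may need adjusting.)
SELECT id_col FROM MAIN_TABLE WHERE someDateCol < @tempDate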

How to get rows having sum equal to given value

There is a table containing:
ID Qty
----------
1 2
2 4
3 1
4 5
Now if I had to choose rows where the sum of Qty equals 10, how can I do this?
Like 2+4+1 = 7,
but if I add 5 then it's 12,
so ignore 2; then
4+1+5 = 10.
How can I achieve this?
Edit:
I want any possible combination which sums up to the given value.
Suppose 7: then any rows which sum up to 7;
likewise if 8, then any rows summing up to 8.
I want the row/rows whose combination equals the given value.
The problem you want to solve is called the subset sum problem. Unfortunately, it is NP-complete.
This means that, whether you use SQL or any other language to solve it, you will only be able to solve very small instances of the problem, i.e. ones with only a few entries in the table. Otherwise, the runtime will become excessive, since it grows exponentially with the number of rows in the table. The reason for this is that there is essentially no better way of finding the solution than to try all possible combinations.
If an approximate solution is acceptable, there is a polynomial time algorithm, which is described on the Wikipedia page.
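For the very small instances where brute force is feasible, the enumeration can even be written as a single T-SQL recursive CTE. A sketch, assuming positive Qty values and reusing the MyTable name from the answer below:
WITH subsets AS (
    -- start one subset per row
    SELECT ID AS lastId, CAST(ID AS varchar(max)) AS ids, Qty AS total
    FROM MyTable
    UNION ALL
    -- extend each subset only with higher IDs so every combination appears once
    SELECT t.ID, s.ids + ',' + CAST(t.ID AS varchar(10)), s.total + t.Qty
    FROM subsets s
    JOIN MyTable t ON t.ID > s.lastId
    WHERE s.total + t.Qty <= 10 -- prune: only valid while Qty is positive
)
SELECT ids, total
FROM subsets
WHERE total = 10;
This still does exponential work, exactly as described above; it just hides the loop inside the CTE.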
If you want it to be always exactly three numbers that add to 10, then this will do it:
SELECT
*
FROM
MyTable t1
JOIN
MyTable t2 ON t1.ID <> t2.ID
JOIN
MyTable t3 ON t1.ID <> t3.ID AND t2.ID <> t3.ID
WHERE
t1.Qty + t2.Qty + t3.Qty = 10
If you want 2 or 4 or 5 numbers then you can't really do it in SQL
Edit:
Updated to ignore by ID not Qty
And again: If you want 2 or 4 or 5 numbers then you can't really do it in SQL
When you are dealing with such a task in SQL you need to go to a cursor-based approach.
Cursors let you perform row-by-row operations, and that's what you need here. For example:
You want to make groups where the summed quantity is 10.
These are the tasks:
select all the table into a temp table #TBL_ALL
loop row by row, and when the rows sum up to 10, delete them from the temp table and insert into a final temp table #TBL_FINAL
do this until no further sums can be formed from #TBL_ALL
An example:
Test table:
/****** Object: Table [dbo].[tblExample] Script Date: 06/09/2011 11:25:27 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[tblExample](
[Id] [int] IDENTITY(1,1) NOT NULL,
[Qty] [int] NOT NULL,
CONSTRAINT [PK_tblExample] PRIMARY KEY CLUSTERED
(
[Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
Test Data:
INSERT INTO tblExample SELECT 2;
INSERT INTO tblExample SELECT 4;
INSERT INTO tblExample SELECT 1;
INSERT INTO tblExample SELECT 5;
INSERT INTO tblExample SELECT 5;
INSERT INTO tblExample SELECT 11;
INSERT INTO tblExample SELECT 1;
INSERT INTO tblExample SELECT 2;
INSERT INTO tblExample SELECT 3;
INSERT INTO tblExample SELECT 4;
INSERT INTO tblExample SELECT 7;
INSERT INTO tblExample SELECT 9;
INSERT INTO tblExample SELECT 1;
INSERT INTO tblExample SELECT 2;
Stored procedure:
http://pastebin.com/EFeZcKXf
Results:
Grouped table:
ids qty group
12 9 1
7 1 1
11 7 2
9 3 2
4 5 3
5 5 3
2 4 4
10 4 4
14 2 4
Numbers not used:
id qty
1 2
8 2
3 1
13 1
6 11
Hope it helps
The SP, for those who can't access PasteBin:
CREATE PROCEDURE getGroups
(
    @groupByQty int, -- grouping number
    @numberRuns int -- how many loops
    -- usage: getGroups 10, 10
)
AS
SET NOCOUNT ON;

-- declare all variables
DECLARE @rowId int,
        @rowQty int,
        @rowTotal int,
        @groupId int,
        @totalRuns int,
        @continue bit

-- set up our final temporary table
CREATE TABLE #TBL_COUNT
(
    ids NVARCHAR(4000),
    qty int,
    [group] int
)

-- initialize variables
SET @groupId = 1;
SET @continue = 1;
SET @totalRuns = 0;

SELECT Id, Qty INTO #TBL_ALL FROM tblExample ORDER BY Qty DESC;

WHILE @totalRuns <= @numberRuns
BEGIN
    -- declare the cursor
    DECLARE Product CURSOR FOR SELECT Id, Qty FROM #TBL_ALL ORDER BY Qty DESC;
    OPEN Product;
    FETCH Product INTO @rowId, @rowQty;
    PRINT ' ';
    PRINT '### Run: ' + CAST(@totalRuns AS nvarchar(10)) + ' #################################################################';
    PRINT 'Grouping Table by ' + CAST(@groupByQty AS nvarchar(10)) + ' | group id = ' + CAST(@groupId AS nvarchar(10));

    -- Retrieve and process the first row
    SELECT TOP 1 @rowId = Id, @rowQty = Qty FROM #TBL_ALL ORDER BY Qty DESC;
    PRINT 'First Row: id = ' + CAST(@rowId AS nvarchar(10)) + ' | qty = ' + CAST(@rowQty AS nvarchar(10));

    -- sum it up and see if we have @groupByQty
    SELECT @rowTotal = ISNULL(SUM(qty),0) FROM #TBL_COUNT WHERE [group] = @groupId;
    PRINT 'Current sum in #TBL_COUNT: @groupId = '+ CAST(@groupId AS nvarchar(10)) +' | @rowTotal = ' + CAST(@rowTotal AS nvarchar(10)) + ' | (@rowTotal + @rowQty) = ' + CAST((@rowTotal + @rowQty) AS nvarchar(10));

    IF @rowQty > @groupByQty
    BEGIN
        PRINT ' x First row has an unused number';
    END
    ELSE
    BEGIN
        -- handle result
        IF (@rowTotal + @rowQty) = @groupByQty
        BEGIN
            PRINT '+++ Current sum is ' + CAST(@groupByQty AS nvarchar(10)) + ' +++';
            -- save number
            INSERT INTO #TBL_COUNT SELECT @rowId, @rowQty, @groupId;
            PRINT '### Inserted final # into #TBL_COUNT : id = ' + CAST(@rowId AS nvarchar(10)) + ' | qty = ' + CAST(@rowQty AS nvarchar(10)) + ' | group = ' + CAST(@groupId AS nvarchar(10));
            -- remove from table as we used it already
            DELETE FROM #TBL_ALL WHERE Id = @rowId;
            -- we got 10, let's change our grouping
            SET @groupId = (@groupId + 1);
            PRINT 'New group id: ' + CAST(@groupId AS nvarchar(10));
        END
        ELSE
        BEGIN
            IF (@rowTotal + @rowQty) < @groupByQty
            BEGIN
                PRINT '### Inserted into #TBL_COUNT : id = ' + CAST(@rowId AS nvarchar(10)) + ' | qty = ' + CAST(@rowQty AS nvarchar(10)) + ' | group = ' + CAST(@groupId AS nvarchar(10));
                -- save number
                INSERT INTO #TBL_COUNT SELECT @rowId, @rowQty, @groupId;
                -- remove from table as we used it already
                DELETE FROM #TBL_ALL WHERE Id = @rowId;
            END
            ELSE
            BEGIN
                PRINT ' x Unmatched number, will handle this later';
            END
        END
    END

    -- start the main processing loop
    WHILE @@Fetch_Status = 0
    BEGIN
        FETCH Product INTO @rowId, @rowQty;
        PRINT '@@Fetch_Status = ' + CAST(@@Fetch_Status AS nvarchar(100));
        IF @@Fetch_Status < 0
        BEGIN
            BREAK
        END
        -- we have the values of our row, let's use them
        PRINT 'Fetched Row: id = ' + CAST(@rowId AS nvarchar(10)) + ' | qty = ' + CAST(@rowQty AS nvarchar(10));
        -- sum it up and see if we have @groupByQty
        SELECT @rowTotal = ISNULL(SUM(qty),0) FROM #TBL_COUNT WHERE [group] = @groupId;
        PRINT 'Current sum in #TBL_COUNT: @groupId = '+ CAST(@groupId AS nvarchar(10)) +' | @rowTotal = ' + CAST(@rowTotal AS nvarchar(10)) + ' | (@rowTotal + @rowQty) = ' + CAST((@rowTotal + @rowQty) AS nvarchar(10));
        -- handle result
        IF (@rowTotal + @rowQty) = @groupByQty
        BEGIN
            PRINT '+++ Current sum is ' + CAST(@groupByQty AS nvarchar(10)) + ' +++';
            -- save number
            INSERT INTO #TBL_COUNT SELECT @rowId, @rowQty, @groupId;
            PRINT '### Inserted final # into #TBL_COUNT : id = ' + CAST(@rowId AS nvarchar(10)) + ' | qty = ' + CAST(@rowQty AS nvarchar(10)) + ' | group = ' + CAST(@groupId AS nvarchar(10));
            -- remove from table as we used it already
            DELETE FROM #TBL_ALL WHERE Id = @rowId;
            -- we got 10, let's change our grouping
            SET @groupId = (@groupId + 1);
            PRINT 'New group id: ' + CAST(@groupId AS nvarchar(10));
            -- start again
            BREAK;
        END
        ELSE
        BEGIN
            IF (@rowTotal + @rowQty) < @groupByQty
            BEGIN
                PRINT '### Inserted into #TBL_COUNT : id = ' + CAST(@rowId AS nvarchar(10)) + ' | qty = ' + CAST(@rowQty AS nvarchar(10)) + ' | group = ' + CAST(@groupId AS nvarchar(10));
                -- save number
                INSERT INTO #TBL_COUNT SELECT @rowId, @rowQty, @groupId;
                -- remove from table as we used it already
                DELETE FROM #TBL_ALL WHERE Id = @rowId;
            END
            ELSE
            BEGIN
                PRINT ' x Unmatched number, will handle this later';
            END
        END
    END -- END WHILE @@Fetch_Status = 0

    SET @totalRuns = @totalRuns + 1;
    -- Close and deallocate
    CLOSE Product;
    DEALLOCATE Product;
END -- END WHILE @totalRuns <= @numberRuns

-- let's sum our last group and remove it if it's not @groupByQty
SELECT @rowTotal = ISNULL(SUM(qty),0) FROM #TBL_COUNT WHERE [group] = @groupId;
IF @rowTotal <> @groupByQty
BEGIN
    SET IDENTITY_INSERT #TBL_ALL ON
    INSERT INTO #TBL_ALL (Id, Qty) SELECT Ids, Qty FROM #TBL_COUNT WHERE [group] = @groupId;
    DELETE FROM #TBL_COUNT WHERE [group] = @groupId;
END

SET NOCOUNT OFF;
-- Show and delete temp tables
SELECT * FROM #TBL_COUNT;
SELECT * FROM #TBL_ALL;
DROP TABLE #TBL_COUNT;
DROP TABLE #TBL_ALL;
P.S. I'm not a SQL professional, so bear with me if I did something weird, and keep in mind that this is a performance waste. Maybe someone can do the loop without cursors; more on this page.
If you always add exactly 3 numbers, it's like gbn said; if not, then you have to check every combination of your rows, which gives you 2^number_of_rows combinations, and I don't see how that can be done in SQL in one query. If you use a loop in SQL, sure, it's possible, but you should find a good algorithm to finish this task.

SQL: Query timeout expired

I have a simple query that updates a table (30 columns and about 150,000 rows).
For example:
UPDATE tblSomeTable set F3 = @F3 where F1 = @F1
This query affects about 2,500 rows.
tblSomeTable has a trigger:
ALTER TRIGGER [dbo].[trg_tblSomeTable]
ON [dbo].[tblSomeTable]
AFTER INSERT,DELETE,UPDATE
AS
BEGIN
    declare @operationType nvarchar(1)
    declare @createDate datetime
    declare @UpdatedColumnsMask varbinary(500) = COLUMNS_UPDATED()

    -- detect operation type
    if not exists(select top 1 * from inserted)
    begin
        -- delete
        SET @operationType = 'D'
        SELECT @createDate = dbo.uf_DateWithCompTimeZone(CompanyId) FROM deleted
    end
    else if not exists(select top 1 * from deleted)
    begin
        -- insert
        SET @operationType = 'I'
        SELECT @createDate = dbo.uf_DateWithCompTimeZone(CompanyId) FROM inserted
    end
    else
    begin
        -- update
        SET @operationType = 'U'
        SELECT @createDate = dbo.uf_DateWithCompTimeZone(CompanyId) FROM inserted
    end

    -- log data to tmp table
    INSERT INTO tbl1
    SELECT
        @createDate,
        @operationType,
        @status,
        @UpdatedColumnsMask,
        d.F1,
        i.F1,
        d.F2,
        i.F2,
        d.F3,
        i.F3,
        d.F4,
        i.F4,
        d.F5,
        i.F5,
        ...
    FROM (Select 1 as temp) t
    LEFT JOIN inserted i on 1=1
    LEFT JOIN deleted d on 1=1
END
And when I execute the update query, I get a timeout.
How can I optimize the logic to avoid the timeout?
Thank you.
This query:
SELECT *
FROM (
SELECT 1 AS temp
) t
LEFT JOIN
INSERTED i
ON 1 = 1
LEFT JOIN
DELETED d
ON 1 = 1
will yield 2500 ^ 2 = 6250000 records from a cartesian product of INSERTED and DELETED (that is all possible combinations of all records in both tables), which will be inserted into tbl1.
Is that what you wanted to do?
Most probably, you want to join the tables on their PRIMARY KEY:
INSERT
INTO tbl1
SELECT @createDate,
       @operationType,
       @status,
       @UpdatedColumnsMask,
d.F1,
i.F1,
d.F2,
i.F2,
d.F3,
i.F3,
d.F4,
i.F4,
d.F5,
i.F5,
...
FROM INSERTED i
FULL JOIN
DELETED d
ON i.id = d.id
This will treat update to the PK as deleting a record and inserting another, with a new PK.
Thanks Quassnoi, the "FULL JOIN" is a good idea; it helped me.
I also try to update the table in portions (1,000 items at a time) to make my code work faster, because for some companyId values I need to update more than 160,000 rows.
Instead of the old code:
UPDATE tblSomeTable set someVal = @someVal where companyId = @companyId
I use the one below:
declare @rc integer = 0
declare @parts integer = 0
declare @index integer = 0
declare @portionSize int = 1000

-- select Ids for update
declare @tempIds table (id int)
insert into @tempIds
select id from tblSomeTable where companyId = @companyId

-- calculate amount of iterations
set @rc = @@rowcount
set @parts = @rc / @portionSize + 1

-- update table in portions
WHILE (@parts > @index)
begin
    UPDATE TOP (@portionSize) t
    SET someVal = @someVal
    FROM tblSomeTable t
    JOIN @tempIds t1 on t1.id = t.id
    WHERE companyId = @companyId

    delete top (@portionSize) from @tempIds
    set @index += 1
end
What do you think about this? Does it make sense? If yes, how do I choose the correct portion size?
Or is the simple update also a good solution? I just want to avoid locks in the future.
Thanks