I have a trigger for executing two procedures.
ALTER TRIGGER [dbo].[TRG_SP_SYNCH_CAB]
ON [VTBO_INTERFACE].[dbo].[T_TRIGGER_TABLE_FOR_SYNCH]
INSTEAD OF INSERT
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
-- Insert statements for trigger here
INSERT INTO T_TRIGGER_TABLE_FOR_SYNCH (DT)
VALUES (GETDATE());
exec PUMPOMAT_HO.DBO.SP_CM_TransferCAB
exec PUMPOMAT_HO.DBO.SP_CM_UpdateCAB
END
The two procedures take about 5 minutes to run. When I insert a value into the T_TRIGGER_TABLE_FOR_SYNCH table, the other tables used by the stored procedures stay locked for the whole 5 minutes. But when I execute the two procedures directly, like
exec SP_CM_TransferCAB
exec SP_CM_UpdateCAB
no lock happens. What should I write in the trigger to avoid these table locks?
Thanks.
Try calling the second procedure inside (at the end of) the first procedure, since I see no parameters are passed.
Is this table [VTBO_INTERFACE].[dbo].[T_TRIGGER_TABLE_FOR_SYNCH] used in either of the procedures?
You should try to change the design/data flow so it mimics this direct procedure call.
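For example, one way to do that (only a sketch; the scheduled job is an assumption, not something from your setup) is to keep the trigger itself cheap and run the two long procedures outside the insert's transaction:
-- The trigger only records that a synch was requested; the INSERT commits immediately.
ALTER TRIGGER [dbo].[TRG_SP_SYNCH_CAB]
ON [dbo].[T_TRIGGER_TABLE_FOR_SYNCH]
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO T_TRIGGER_TABLE_FOR_SYNCH (DT) VALUES (GETDATE());
END
GO
-- A SQL Agent job (or any polling process) then picks up new rows and runs:
-- EXEC PUMPOMAT_HO.DBO.SP_CM_TransferCAB;
-- EXEC PUMPOMAT_HO.DBO.SP_CM_UpdateCAB;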
Related
I have a stored procedure, usp_region, and it has a select statement with 50 columns as the result set. This procedure is called by multiple other stored procedures in our application.
Most of the stored procedures pass a parameter to this procedure and display the result set it returns. I have one stored procedure, usp_calculatedDisplay, that takes the columns returned by this procedure, inserts the values into a temp table, and does some more calculations on them.
Here's a part of the code in usp_calculatedDisplay.
Begin Procedure
/* some sql statements */
Create Table #tmptable
(
-- all the 50 columns that are returned from the usp_region procedure
)
Insert Into #tmptable
exec usp_region @regionId = @id
Select t.*, /* a few calculated columns here */
From #tmptable t
End of procedure
Every time I add a column to the usp_region procedure, I also have to remember to add it to this procedure; otherwise it breaks. This has become difficult to maintain, since it is easy for someone to miss adding a column to usp_calculatedDisplay when a column is added to usp_region.
In order to overcome this problem, I decided to do this:
Select *
Into #tmptable
From OPENROWSET('SQLNCLI',
'Server=localhost;Trusted_Connection=yes;',
'EXEC [dbo].[usp_region]')
The problem is that the 'Ad Hoc Distributed Queries' component is turned off, so I can't use this approach. I was wondering if there are any other ways of overcoming this problem. I would really appreciate any help. Thank you!
Every time I add a column to the usp_region procedure
SQL Server is a structured database and is not meant for cases where you need to change your structure every day.
If you add/remove columns that often, then you probably did not choose the right type of database, and you would be better off re-designing your system.
It has become difficult to maintain it since it is highly possible for someone to miss adding a column to the usp_calculatedDisplay procedure when the column is added to the usp_region.
There are two simple solutions for this: (1) using DDL triggers - a very bad idea, but simple to implement and it works; (2) using my trick to select from a stored procedure.
Option 1: using DDL trigger
You can automate the entire process and ALTER the stored procedure usp_calculatedDisplay every time the stored procedure usp_region is changed:
https://learn.microsoft.com/en-us/sql/relational-databases/triggers/ddl-triggers
The basic approach is
CREATE OR ALTER TRIGGER NotGoodSolutionTrig
ON DATABASE
FOR ALTER_PROCEDURE
AS
BEGIN
    DECLARE @var_xml XML = EVENTDATA();
    IF (
        -- adjust the database name to the one your procedures live in
        @var_xml.value('(EVENT_INSTANCE/DatabaseName)[1]', 'sysname') = 'tempdb'
        and
        @var_xml.value('(EVENT_INSTANCE/SchemaName)[1]', 'sysname') = 'dbo'
        and
        @var_xml.value('(EVENT_INSTANCE/ObjectName)[1]', 'sysname') = 'usp_region'
    )
    BEGIN
        -- Here you can parse the text of the stored procedure
        -- and execute ALTER on the first SP.
        -- To make it simpler, you can design the procedure usp_region so the column names
        -- appear on a specific row or between two comments, which will help us find them.
        -- The code of the stored procedure which you need to parse is in the value of:
        -- @var_xml.value('(EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'NVARCHAR(MAX)')
        -- For example, we can print it:
        DECLARE @SP_Code NVARCHAR(MAX)
        SET @SP_Code = CONVERT(NVARCHAR(MAX), @var_xml.value('(EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'NVARCHAR(MAX)'))
        PRINT @SP_Code
        -- In your case, you need to execute ALTER on the usp_calculatedDisplay procedure using the text from usp_region
    END
END
Option 2: trick to select from stored procedure using sys.dm_exec_describe_first_result_set
This is a simple and direct way to get what you need.
CREATE OR ALTER PROCEDURE usp_calculatedDisplay AS
-- Option: using a regular table, so it exists outside the scope of the dynamic query
DROP TABLE IF EXISTS MyTable;
DECLARE @sqlCommand NVARCHAR(MAX)
select @sqlCommand = 'CREATE TABLE MyTable(' + STRING_AGG ([name] + ' ' + system_type_name, ',') + ');'
from sys.dm_exec_describe_first_result_set (N'EXEC usp_region', null, 0)
PRINT @sqlCommand
EXECUTE sp_executesql @sqlCommand
INSERT MyTable EXECUTE usp_region;
SELECT * FROM MyTable;
GO
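As a quick sanity check (just a usage example, not part of the solution), you can query the same DMV directly to see the column names and types the dynamic CREATE TABLE above is built from:
SELECT name, system_type_name, is_nullable
FROM sys.dm_exec_describe_first_result_set(N'EXEC usp_region', NULL, 0);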
Note!!! Neither solution is recommended in production. My advice is to avoid such needs by redesigning your system. If you need to re-write 20 stored procedures, then do it and don't be lazy! Your goal should be whatever is best for the database usage.
I have a stored procedure that basically inserts from one table to another.
While this procedure is running, I don't want anyone else to be able to start it. However, I don't want serialization either, i.e. the other person's call simply waiting and running after I am done with it.
What I want is for the other person trying to start it to get an error, while I am running the procedure.
I've tried using sp_getapplock, but I can't manage to completely stop the other person from running the procedure.
I also tried finding the procedure with sys.dm_exec_requests and blocking it. While this does work, I think it's not optimal because on some servers I don't have the permissions to run sys.dm_exec_sql_text(sql_handle).
What is the best way for me to do this?
Cunning stunts:
ALTER PROC MyProc
AS
BEGIN TRY
IF OBJECT_ID('tempdb..##lock_proc_MyProc') IS NOT NULL
RAISERROR('Not now.', 16, 0)
ELSE
EXEC('CREATE TABLE ##lock_proc_MyProc (dummy int)')
...
EXEC('DROP TABLE ##lock_proc_MyProc')
END TRY
BEGIN CATCH
...
EXEC('DROP TABLE ##lock_proc_MyProc')
...
END CATCH
GO
This can be extended by storing the SPID and watching for zombie ## tables.
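A rough sketch of that extension (the spid column and the cleanup check are my additions, so treat them as an assumption):
-- When creating the lock table, record the owning session:
EXEC('CREATE TABLE ##lock_proc_MyProc (spid int)')
INSERT ##lock_proc_MyProc VALUES (@@SPID)
-- A later caller that finds the table can tell a live lock from a zombie one:
IF EXISTS (SELECT 1
           FROM ##lock_proc_MyProc l
           WHERE NOT EXISTS (SELECT 1 FROM sys.dm_exec_sessions s
                             WHERE s.session_id = l.spid))
    EXEC('DROP TABLE ##lock_proc_MyProc')   -- creator is gone; clean up and proceed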
Also, you can just raise the isolation level/lock granularity with hints such as TABLOCK or UPDLOCK:
ALTER PROC MyProc
AS
BEGIN TRAN
DECLARE @dummy INT
SELECT @dummy = 1
FROM t WITH (TABLOCKX, HOLDLOCK)
WHERE 1 = 0
...
COMMIT TRAN
This will have a different effect - callers will wait rather than fail.
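Since the question already mentions sp_getapplock, here is a minimal sketch of that route as well: a session-owned applock with a zero timeout fails immediately instead of queuing (the lock name 'MyProc_single_runner' is just a placeholder):
ALTER PROC MyProc
AS
BEGIN
    DECLARE @rc int
    EXEC @rc = sp_getapplock @Resource = 'MyProc_single_runner',
                             @LockMode = 'Exclusive',
                             @LockOwner = 'Session',
                             @LockTimeout = 0
    IF @rc < 0
    BEGIN
        RAISERROR('Not now.', 16, 0)
        RETURN
    END
    -- ... the actual insert work ...
    EXEC sp_releaseapplock @Resource = 'MyProc_single_runner', @LockOwner = 'Session'
END
GO
With @LockOwner = 'Session' the lock survives until it is explicitly released (or the session ends), so make sure sp_releaseapplock also runs in your error handling.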
I want to return output from a stored procedure by using INSERT INTO ... EXEC. For performance reasons, the target table is a memory-optimized table type.
I have now figured out that, while the stored procedure is running, all rows affected inside the stored procedure are kept locked until the stored procedure completes.
Example:
insert into @ModifiedSecurities (SecurityID, AttributeTypeID)
exec Securities.spSecuritiesImportBody
@ProcessingID = @ProcessingID
During the execution of Securities.spSecuritiesImportBody (which takes up to 10 minutes), all table rows affected by spSecuritiesImportBody are locked until the stored procedure completes (even for tables that have nothing to do with the output of the stored procedure).
While in a single insert statement this behavior might make sense, I don't see any use for it here and therefore want to get rid of these locks.
Is there any way to execute the stored procedure without creating these locks?
Here is a code sample I made:
Execute the preparation
Run the code
Try to select from dbo.ProcessingsTesting while the code is running. It won't be possible, as the table is locked. The lock is created during dbo.UpdProcessing; however, for some reason, it is not released.
select *
from dbo.ProcessingsTesting
-- start of preparation
drop procedure dbo.UpdProcessing
drop table dbo.ProcessingsTesting
drop procedure dbo.spSecuritiesImportBody
go
create table dbo.ProcessingsTesting
(
ProcessingID int,
EndDate datetime
)
insert into dbo.ProcessingsTesting
(
ProcessingID
)
select 1 union all
select 2 union all
select 3 union all
select 4 union all
select 5
-- stored procedure
go
create procedure dbo.spSecuritiesImportBody
(
@ProcessingID int
)
as
begin
exec dbo.UpdProcessing
@ProcessingID = @ProcessingID
WAITFOR DELAY '00:03:00'
-- return data
select 1, 2
end
-- stored procedure
go
create procedure dbo.UpdProcessing
(
@ProcessingID int
)
as
begin
update dbo.ProcessingsTesting
set EndDate = null
where ProcessingID = @ProcessingID
end
-- end of preparation
-- run the code
declare @ModifiedSecurities table
(
[SecurityID] [int] NOT NULL,
[AttributeTypeID] [smallint] NOT NULL
)
insert into @ModifiedSecurities (SecurityID, AttributeTypeID)
exec dbo.spSecuritiesImportBody
@ProcessingID = 1
Unless you begin and commit an explicit transaction, locks will be held on the modified rows until the outermost INSERT...EXEC statement completes. You can add an explicit transaction to the dbo.UpdProcessing proc (or surround the EXEC dbo.UpdProcessing with BEGIN TRAN and COMMIT) to release locks on the updated rows before the INSERT...EXEC completes:
ALTER PROCEDURE dbo.UpdProcessing
(
@ProcessingID int
)
AS
BEGIN TRAN;
UPDATE dbo.ProcessingsTesting
SET EndDate = null
WHERE ProcessingID = @ProcessingID
COMMIT;
GO
Although this will provide the desired results, it doesn't make much sense to me that one would update data unrelated to the SELECT results in the same stored procedure. It seems the procs should be called independently since they perform different functions.
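A sketch of that separation, reusing the sample objects and the @ModifiedSecurities table variable from the question:
-- Run the update on its own, so its locks are released as soon as it finishes...
EXEC dbo.UpdProcessing @ProcessingID = 1;
-- ...and let the INSERT...EXEC only capture the result set
-- (assuming spSecuritiesImportBody no longer calls UpdProcessing itself).
INSERT INTO @ModifiedSecurities (SecurityID, AttributeTypeID)
EXEC dbo.spSecuritiesImportBody @ProcessingID = 1;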
In the procedure spSecuritiesImportBody you should change the call
exec dbo.UpdProcessing @ProcessingID = @ProcessingID
to
exec dbo.UpdProcessing @ProcessingID
because there is no sense in writing it like that.
You cannot avoid the lock, because SQL Server uses locks on rows, pages, and tables to ensure that you are not changing the data while the server is executing statements against those records in the database.
I have a stored procedure, let's call it stored procedure 'B'. Stored procedure 'B' calls stored procedure 'A', which returns a result set that needs to be inserted into a temp table within stored procedure 'B' in order to do further mutations. Because of the nested inserts, I have used OPENROWSET (and tried OPENQUERY too).
Now, it seems to work great! However, besides returning a result set, stored procedure 'A' also does INSERTs into a table. The weird thing is, when stored procedure 'A' is executed from within stored procedure 'B', stored procedure 'A' only returns the result set and does NO insert at all. It just seems to skip the entire INSERT INTO statement. I have tried putting dummy SELECT 'test' breakpoints before and after the INSERT, and they are executed fine! How is this possible?
The query looks like this (I changed the data and columns up a bit):
DECLARE @SQL NVARCHAR(MAX)
SET @SQL = 'INSERT INTO #Temp (1,2,3)
SELECT * FROM OPENROWSET (
''SQLOLEDB'',
''Server=(local);TRUSTED_CONNECTION=yes;'',
''SET FMTONLY OFF EXECUTE StoredProcedureA
#Parameter1 assumption removed
''
)'
EXEC(@SQL)
No errors are returned. The result set (the SELECT statement from procedure A) is correctly loaded into #Temp within procedure B. But the INSERT that is done within procedure A is not executed.
Do OPENQUERY/OPENROWSET not allow INSERTs and only execute SELECT output? I thought maybe it's a security/rights issue? Is there any other way to work around this issue?
Thanks in advance, guys!
It is because you are using a temporary table, denoted by #.
The scope of this table ends when your nested stored procedure ends, and the temporary table is dropped.
So the insert happens; the table just doesn't exist anymore.
If you create the table before starting your nested procedure, you can solve this problem. You can just drop the table in your procedure B if you really want it gone.
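A minimal sketch of that scoping rule (ParentProc and ChildProc are made-up names, not the question's procedures):
CREATE PROCEDURE dbo.ChildProc AS
BEGIN
    -- #Results is not created here; it must already exist in the caller's scope.
    INSERT INTO #Results (val) VALUES (42);
END
GO
CREATE PROCEDURE dbo.ParentProc AS
BEGIN
    CREATE TABLE #Results (val int);   -- created in the outer scope...
    EXEC dbo.ChildProc;                -- ...so it is still there after the nested call
    SELECT val FROM #Results;          -- returns 42
END
GO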
I'm trying to execute a stored procedure directly after its creation; however, it is not getting called. It looks like the stored procedure has not yet been created at the time of the execution call.
Here is what the script looks like:
CREATE PROCEDURE sp_Transfer_RegionData
AS
BEGIN
INSERT INTO Region (regionName)
SELECT column1
FROM openquery(ITDB, 'select * from db.table1')
END
EXEC sp_Transfer_RegionData
The script runs fine, however the needed table is not populated. After replacing the execution part with:
IF OBJECT_ID('sp_Transfer_RegionData') IS NOT NULL
begin
exec [dbo].[sp_Transfer_RegionData]
print 'tada'
end
I could see that the stored procedure does not exist at the time it has to be executed. I couldn't find a solution for this on the internet...
So how can I make the SQL script run synchronously, so that the stored procedure already exists when the execution part runs?
You need a GO after you create the SP; otherwise you have created a recursive SP that calls itself "indefinitely", which is 32 times in SQL Server:
Maximum stored procedure, function, trigger, or view nesting level
exceeded (limit 32).
Try this:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE sp_Transfer_RegionData
AS
BEGIN
INSERT INTO Region (regionName)
SELECT column1
FROM openquery(ITDB, 'select * from db.table1')
END
GO
EXEC sp_Transfer_RegionData
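If the script has to stay a single batch (for example when it is sent as one command and GO is not available), a workaround is to push the CREATE PROCEDURE into its own batch with dynamic SQL; a sketch using the question's objects (note the doubled quotes inside the string):
EXEC ('
CREATE PROCEDURE sp_Transfer_RegionData
AS
BEGIN
    INSERT INTO Region (regionName)
    SELECT column1
    FROM openquery(ITDB, ''select * from db.table1'')
END
');
EXEC sp_Transfer_RegionData;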