SQL Server performance only fast after refreshing the stored procedure - sql

I can run a stored procedure multiple times and it won't hit its cache (duration column: 1665ms).
But if I then ALTER the stored procedure, changing nothing, it's fast (duration column: 240ms).
Problem: how do I get the stored procedure to always be fast (on the second and subsequent calls)?

With some digging I found that when I called the SP initially (after a reboot) with a NULL applicationID:
exec [dbo].[usp_Tab32] @responsibleReviewerID=1135,@applicationID=NULL,@environment=1,@userUIStatus=0,@roleID=NULL
then with a more confined query:
exec [dbo].[usp_Tab32] @responsibleReviewerID=1135,@applicationID=1406,@environment=1,@userUIStatus=0,@roleID=NULL
This would be slow.
However if I hit the more confined query first, then both would be fast.
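This looks like classic parameter sniffing: the plan compiled for @applicationID=NULL gets reused for the more selective call (or vice versa). One common mitigation, sketched below, is to add OPTION (RECOMPILE) to the sniffing-sensitive statement, or OPTIMIZE FOR to pin the plan to a representative value. The procedure body here is hypothetical, since the question does not show the real query:

```sql
-- Sketch only: usp_Tab32's real body is not shown in the question,
-- so the table and column names below are placeholders.
ALTER PROCEDURE [dbo].[usp_Tab32]
    @responsibleReviewerID INT,
    @applicationID         INT,
    @environment           INT,
    @userUIStatus          INT,
    @roleID                INT
AS
BEGIN
    SELECT *
    FROM dbo.Applications a
    WHERE (@applicationID IS NULL OR a.ApplicationID = @applicationID)
      AND (@roleID IS NULL OR a.RoleID = @roleID)
      AND a.Environment = @environment
    OPTION (RECOMPILE);  -- compile a fresh plan for each call's actual values
    -- Alternative: keep one cached plan tuned for "average" statistics instead:
    -- OPTION (OPTIMIZE FOR (@applicationID UNKNOWN));
END
GO
```

OPTION (RECOMPILE) pays a compile cost on every call but never reuses a plan built for unrepresentative parameter values, which matches the symptom described above.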
To clear down the database plan cache:
DECLARE @dbId INTEGER
SELECT @dbId = dbid FROM master.dbo.sysdatabases WHERE name = 'myDatabase'
DBCC FLUSHPROCINDB (@dbId)
All against SQL2012 Developer edition.

Create your stored procedure WITH RECOMPILE and recompile it at runtime:
CREATE PROCEDURE yourprocedurename
WITH RECOMPILE
AS
--your code here
GO
then call it in this way:
EXEC yourprocedurename WITH RECOMPILE
This should give you the behaviour you want because, when a procedure is compiled for the first time or recompiled, its query plan is optimized for the current state of the database.
So this can improve the procedure's processing performance.

Related

How to prevent SQL Stored Procedure Alters while the stored procedure is running?

We have a stored procedure that runs hourly and requires heavy modification. There is an issue where someone will edit it while the stored proc is running and will cause the stored proc to break and end. I am looking for an error to pop up when someone tries to edit a stored procedure while it is running, rather than breaking the execution.
It's a SQL Server Agent job that runs hourly; I get "The definition of object 'stored_procedure' has changed since it was compiled."
Is there something I can add to the procedure? A setting?
I think you can use a database-level DDL trigger to prevent the changes, checking inside the trigger whether the stored procedure in question is currently running, something like this:
USE [YourDatabase]
GO
ALTER TRIGGER [DDLTrigger_Sample]
ON DATABASE
FOR CREATE_PROCEDURE, ALTER_PROCEDURE, DROP_PROCEDURE
AS
BEGIN
    IF EXISTS (SELECT TOP 1 1
               FROM sys.dm_exec_requests req
               CROSS APPLY sys.dm_exec_query_plan(req.plan_handle) sqlplan
               WHERE sqlplan.objectid = OBJECT_ID(N'GetFinanceInformation'))
    BEGIN
        PRINT 'GetFinanceInformation is running and cannot be changed'
        ROLLBACK
    END
END
That way you can prevent the stored procedure from being changed during execution; when it is not executing, changes go through as usual. Hope this helps.
You should do some research and testing to confirm this is the case. Altering a sproc while it is executing should not impact the run.
Open two SSMS windows and run query 1 first and switch to window 2 and run that query.
Query 1
CREATE PROCEDURE sp_altertest
AS
BEGIN
SELECT 'This is a test'
WAITFOR DELAY '00:00:10'
END
GO
EXEC sp_altertest
Query2
ALTER PROCEDURE sp_altertest AS
BEGIN
SELECT 'This is a test'
WAITFOR DELAY '00:00:06'
END
GO
Exec sp_altertest
Query 1 should continue to run with a 10-sec execution time, while query 2 will run with a 6-sec runtime. The sproc's plan is cached in memory at the time of the run, so the ALTER should have no impact.

Index healthy and statistics update after 1000 updates

I have a stored procedure which gets data from 5 tables. The tables receive roughly 1000 inserts and 1000 updates per hour. After the inserting and updating, the stored procedure runs into a timeout.
When I rebuild one of the indexes of a table referenced in the stored procedure, it starts working normally again, but it breaks down again after each new batch of 1000 updates.
What should I do?
OK, I think you are mistaken when you say that rebuilding the index is what solves the problem.
What actually happens is that rebuilding the index invalidates the cached execution plan, so the next execution after the rebuild forces SQL Server to recompile the plan.
Normally SQL Server would use a cached execution plan for a stored procedure, but some factors can cause it to recompile the plan even if there is a cached one in the proc cache. Rebuilding (or making any change to) an index that is used in the execution of a stored procedure results in recompilation of its execution plan.
Since you are inserting 1000 rows every hour, you will also want to keep your statistics up to date. I would run a nightly job to update statistics.
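A nightly statistics refresh can be as simple as the following (the table names are placeholders for the five tables the procedure reads):

```sql
-- Refresh statistics on the heavily-churned tables with a full scan:
UPDATE STATISTICS dbo.Table1 WITH FULLSCAN;
UPDATE STATISTICS dbo.Table2 WITH FULLSCAN;

-- Or refresh every out-of-date statistic in the database in one call:
EXEC sp_updatestats;
```

Updating statistics on a table also marks dependent plans for recompilation, so this doubles as a scheduled plan refresh.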
For your stored procedure, use the WITH RECOMPILE option in the procedure's definition, or use the option when executing the stored procedure; I think it will solve the issue.
To add this option in the sp's definition:
ALTER PROCEDURE myProc
WITH RECOMPILE
AS.......
Or, to add this option when executing your stored procedure, you can do as follows:
EXECUTE myProc WITH RECOMPILE
Or you can use the system stored procedure sp_recompile to force SQL Server to compile a new execution plan even if there is one in cache memory:
EXECUTE sp_recompile N'dbo.MyProc';
GO
EXECUTE dbo.MyProc;
GO

What are the downsides of creating SQL Server stored procedures in the following manner?

We check all our database objects into source control as rerunnable scripts (views, functions, triggers & stored procedures etc...)
When it comes time to deploy, we need to ensure that all the scripts are re-runnable and repeatable, so that a stored procedure is created/updated to the latest version.
Are there any downsides to creating the scripts in the following manner?
IF NOT EXISTS
(
SELECT *
FROM INFORMATION_SCHEMA.ROUTINES
WHERE ROUTINE_SCHEMA = 'dbo'
AND ROUTINE_NAME = 'MyStoredProcedure'
)
BEGIN
EXEC ('CREATE PROCEDURE [dbo].[MyStoredProcedure] AS SELECT 1')
-- ALSO DO ANY INITIAL GRANT PRIVILEGE SCRIPTING HERE
END
GO
ALTER PROCEDURE [dbo].[MyStoredProcedure] (
    @param1 INT,
    @param2 NVARCHAR(50) = 'Default String'
)
AS
BEGIN
    -- DO SOMETHING WITH @param1 AND @param2
SELECT 1;
END
GO
Essentially the script checks whether the object exists in the relevant system view; if it doesn't, some dynamic SQL creates it as a stub, to get around CREATE PROCEDURE/GO statements not being allowed inside conditional blocks. Then it applies the actual functionality of the script through an ALTER.
So the benefits are obvious to me; I'm just wondering whether there are any downsides to doing this, other than the slight overhead of writing slightly more verbose scripts.
10-year SQL Server developer/architect here, and I can't think of any downsides other than the (relatively slight) upfront cost of creating the script that will do this.
If you are concerned that a plan compiled as trivial at the time of creation is not recompiled when the procedure is ALTERed, you could add an explicit call to sp_recompile for each, but I have never had this problem with SQL Server (I have had it with DB2), so I think that is excessive caution.
This is an interesting and I think useful approach.

Stored Procedure returning duplicate results where as firing the sql it runs directly doesn't

We have a stored procedure in SQL Server 2005 with a complicated single-select query. Recently, in one environment, we noticed duplicate records in a small subset of the results returned by the stored proc. When we ran the SQL query directly, we got the correct set of records without any duplicates. The stored procedure uses a lot of views and joins (inner join/left join). One theory I have is that the stored procedure is somehow using a cached execution plan, as we have modified some views recently, but I don't have enough SQL expertise to be sure of that. Does anyone have any idea?
Thanks for your help,
Ashish
Different results might be caused by different connection settings (e.g. ANSI_NULLS, ARITHABORT etc.).
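One way to check this (a sketch; it requires VIEW SERVER STATE permission) is to compare the SET options of the application's session against your own SSMS session using sys.dm_exec_sessions:

```sql
-- Compare connection SET options across sessions; a mismatch in
-- ANSI_NULLS or ARITHABORT means the two callers compile and use
-- different cached plans for the same procedure.
SELECT session_id,
       program_name,
       ansi_nulls,
       arithabort,
       quoted_identifier
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;
```

If the application's session shows different option values from your interactive session, that alone explains getting different plans (and potentially different behaviour) for the same procedure.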
Run sp_recompile on the stored procedure to clear the procedure cache for that stored procedure.
To clear the entire procedure cache execute
DBCC FREEPROCCACHE
Here's an example of recompiling if you want to put it in a re-usable script:
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
/****** Object: Maintenance - StoredProcedure [Sample].[SampleSearch] Script Date: 07/28/2011 14:15:15 ******/
IF (EXISTS (SELECT * FROM INFORMATION_SCHEMA.ROUTINES WHERE ROUTINE_SCHEMA = 'Sample' AND ROUTINE_NAME = 'SampleSearch'))
BEGIN
PRINT 'Marking procedure [Sample].[SampleSearch] for recompile'
EXEC sp_recompile 'Sample.SampleSearch'
PRINT 'Finished marking procedure [Sample].[SampleSearch] for recompile'
END
GO
However, if the query is returning different results, maybe turn on SQL tracing or debug the call from the code to ensure the same in and out parameters are being used in both cases.

add SQL Server index but how to recompile only affected stored procedures?

I need to add an index to a table, and I want to recompile only/all the stored procedures that make reference to this table. Is there any quick and easy way?
EDIT:
from SQL Server 2005 Books Online, Recompiling Stored Procedures:
As a database is changed by such actions as adding indexes or changing data in indexed columns, the original query plans used to access its tables should be optimized again by recompiling them. This optimization happens automatically the first time a stored procedure is run after Microsoft SQL Server 2005 is restarted. It also occurs if an underlying table used by the stored procedure changes. But if a new index is added from which the stored procedure might benefit, optimization does not happen until the next time the stored procedure is run after Microsoft SQL Server is restarted. In this situation, it can be useful to force the stored procedure to recompile the next time it executes.
Another reason to force a stored procedure to recompile is to counteract, when necessary, the "parameter sniffing" behavior of stored procedure compilation. When SQL Server executes stored procedures, any parameter values used by the procedure when it compiles are included as part of generating the query plan. If these values represent the typical ones with which the procedure is called subsequently, then the stored procedure benefits from the query plan each time it compiles and executes. If not, performance may suffer.
You can execute sp_recompile and supply the name of the table you've just indexed. All procs that depend on that table will be flushed from the stored proc cache and recompiled the next time they are executed.
See this from the MSDN docs:
sp_recompile (Transact-SQL)
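For example (the table name is a placeholder), a single call marks every dependent procedure and trigger for recompilation on its next execution:

```sql
-- Marks, but does not immediately recompile, everything that references the table:
EXEC sp_recompile N'dbo.MyNewlyIndexedTable';
```

This is cheaper than DBCC FREEPROCCACHE because only the plans that reference the table are discarded; unrelated cached plans survive.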
They are generally recompiled automatically. I don't know if this is guaranteed, but it has been what I have observed: if you change the objects referenced by the sproc (e.g. add an index), it recompiles.
create table mytable (i int identity)
insert mytable default values
go 100
create proc sp1 as select * from mytable where i = 17
go
exec sp1
If you look at the plan for this execution, it shows a table scan as expected.
create index mytablei on mytable(i)
exec sp1
The plan has changed to an index seek.
EDIT: OK, I came up with a query that appears to work. It gives you the names of all sprocs whose cached plan references a given table. You can concatenate each sproc name with the sp_recompile syntax to generate a batch of sp_recompile statements you can then execute.
;WITH XMLNAMESPACES (default 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
,TableRefs (SProcName, ReferencedTableName) as
(
select
object_name(qp.objectid) as SProcName,
objNodes.objNode.value('@Database', 'sysname') + '.' + objNodes.objNode.value('@Schema', 'sysname') + '.' + objNodes.objNode.value('@Table', 'sysname') as ReferencedTableName
from sys.dm_exec_cached_plans cp
outer apply sys.dm_exec_sql_text(cp.plan_handle) st
outer apply sys.dm_exec_query_plan(cp.plan_handle) as qp
outer apply qp.query_plan.nodes('//Object[@Table]') as objNodes(objNode)
where cp.cacheobjtype = 'Compiled Plan'
and cp.objtype = 'Proc'
)
select
*
from TableRefs
where SProcName is not null
and isnull(ReferencedTableName,'') = '[db].[schema].[table]'
I believe that the stored procedures that would potentially benefit from the presence of the index in question will automatically have a new query plan generated, provided the auto-update statistics option has been enabled.
See the section entitled Recompiling Execution Plans for details of what eventualities cause an automatic recompilation.
http://technet.microsoft.com/en-us/library/ms181055(SQL.90).aspx