Index health and statistics after every 1000 updates - SQL

I have a stored procedure which gets data from 5 tables. The tables receive approximately 1000 inserts and 1000 updates per hour. After the inserts and updates, the stored procedure runs into a timeout.
When I rebuild one of the indexes on a table referenced in the stored procedure, it starts working normally again, but it breaks down again after each new batch of 1000 updated records.
What should I do?

OK, I think you are mistaken here when you say rebuilding the index is solving the problem.
What is actually happening is that rebuilding the index invalidates the cached execution plan, and the next execution after the rebuild forces SQL Server to recompile the execution plan.
Normally SQL Server would use a cached execution plan for a stored procedure, but some factors can cause SQL Server to recompile the plan even if there is a cached one in the plan cache. Rebuilding, or making any change to, an index that is used during the execution of a stored procedure results in recompilation of the execution plan.
Since you are inserting 1000 rows every hour, you will also want to keep your statistics up to date; I would run a nightly job to update statistics.
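As a minimal sketch of such a job (the table name dbo.MyTable is a placeholder), it could run something like:

UPDATE STATISTICS dbo.MyTable WITH FULLSCAN;  -- refresh statistics for one table
-- or, more broadly, for every table in the database:
EXEC sp_updatestats;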
But for your stored procedure, use the WITH RECOMPILE option in the procedure's definition, or use this option when executing the stored procedure, and I think it will solve the issue.
To add this option in the procedure's definition:
ALTER PROCEDURE myProc
WITH RECOMPILE
AS ...
Or, to add this option when executing your stored procedure, do as follows:
EXECUTE myProc WITH RECOMPILE
Or you can use the system stored procedure sp_recompile to force SQL Server to compile a new execution plan even if there is one in cache memory.
EXECUTE sp_recompile N'dbo.MyProc';
GO
EXECUTE dbo.MyProc;
GO

Related

SQL Server : unexpected performance issue between in-line and stored procedure execution

I have run into an enigma of sorts while researching a performance issue with a specific stored procedure. I did not create that stored procedure, and the logic is fairly ugly with nested selects in join statements, etc...
However, when I copy the logic of the stored procedure directly into a new query window, add the parameters at the top and execute it, this runs in under 400 milliseconds. Yet, when I call the stored procedure and execute it with the exact same parameter values, it takes 23 seconds to run!
This makes absolutely no sense at all to me!
Are there some server-level settings that I should check which could potentially be impacting this?
Thanks
Recompile your stored procedure.
The Microsoft documentation says
When a procedure is compiled for the first time or recompiled, the procedure's query plan is optimized for the current state of the database and its objects. If a database undergoes significant changes to its data or structure, recompiling a procedure updates and optimizes the procedure's query plan for those changes. This can improve the procedure's processing performance.
and
Another reason to force a procedure to recompile is to counteract the "parameter sniffing" behavior of procedure compilation. When SQL Server executes procedures, any parameter values that are used by the procedure when it compiles are included as part of generating the query plan. If these values represent the typical ones with which the procedure is subsequently called, then the procedure benefits from the query plan every time that it compiles and executes. If parameter values on the procedure are frequently atypical, forcing a recompile of the procedure and a new plan based on different parameter values can improve performance.
If you run the SP's queries yourself (in SSMS maybe) they get compiled and run.
How to recompile your SP? (See the doc page linked above.)
You can rerun its definition to recompile it once. That may be enough if the procedure was first defined long ago in the life of your database and your tables have grown a lot.
You can put CREATE PROCEDURE ... WITH RECOMPILE AS into its definition so it will recompile every time you run it.
You can EXEC procedure WITH RECOMPILE; when you run it.
You can restart your SQL Server. (This can be a nuisance; booting the server magically makes bad things stop happening and nobody's quite sure why.)
Recompiling takes some time. But it takes far less time than a complex query with a bad execution plan would.
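Putting the recompile options above together in one place (the procedure name dbo.MyProc is illustrative):

-- recompile on every execution (part of the definition):
ALTER PROCEDURE dbo.MyProc WITH RECOMPILE AS ...
-- one-off recompile for this call only:
EXECUTE dbo.MyProc WITH RECOMPILE;
-- mark the cached plan so the next execution builds a fresh one:
EXEC sp_recompile N'dbo.MyProc';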
So... I ended up turning the nested selects on the joins into table variables, and now the sproc is executing in 60 milliseconds, while the in-line SQL is taking 250 ms.
However, I still do not understand why the sproc was performing so much slower than the in-line SQL version with the original nested SQL logic.
I mean, both were using the exact same SQL logic, so why was the sproc taking 23 seconds while the in-line version took 400 ms?

Does "With recompile" recompile all the queries in stored procedure?

So we have quite a large database and a stored procedure which searches through a large number of documents. Depending on the context, it either fetches millions of documents or just one hundred.
Point is, the procedure takes 30 seconds for both millions and hundred documents which is surreal. If I add OPTION (RECOMPILE) after each of the five queries, it takes a second for 100 documents and (expected) 30 seconds for millions of documents.
I've tried creating a procedure with WITH RECOMPILE option but it seems that it doesn't recompile queries in it.
Is this correct? Does the WITH RECOMPILE on stored procedure recompiles inner queries or just the execution plan for an entire SP? How can I do this without repeating OPTION (RECOMPILE) after each query?
Does the WITH RECOMPILE on stored procedure recompiles inner queries or just the execution plan for an entire SP?
"Inner queries or just the execution plan" — I am not sure what the distinction means here. WITH RECOMPILE at the stored procedure level causes recompilation every time the proc is executed, and the plan is not saved to the cache.
How can I do this without repeating OPTION (RECOMPILE) after each query?
CREATE PROC usp_test
WITH RECOMPILE
AS
BEGIN
    --code
END
Some more details:
WITH RECOMPILE compiles a new plan for the entire stored proc every time it is run.
Suppose you have the proc below (note that a GO inside the body would actually terminate the CREATE PROCEDURE batch, so the procedure simply contains multiple statements):
CREATE PROC usp_test
AS
BEGIN
    SELECT * FROM t1
    SELECT * FROM t2
END
Adding RECOMPILE at the top of the stored proc causes SQL Server to recompile every statement in the stored proc.
Instead of recompiling the entire proc, if you know for sure which statement is causing issues, you can add OPTION (RECOMPILE) as below:
CREATE PROC usp_test
AS
BEGIN
    SELECT * FROM t1 OPTION (RECOMPILE)
    SELECT * FROM t2
END
Doing this, you avoid unnecessary recompilation of the other statements.
Both OPTION (RECOMPILE) and WITH RECOMPILE will give you execution plans based on the parameters used each time, but only OPTION (RECOMPILE) allows the "parameter embedding optimization", where the parameters are replaced with literal constants when the queries are parsed. This can allow the optimizer to make simplifications to the query.
I would find out which query is actually causing the performance issue and use OPTION(RECOMPILE) on that.
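As an illustration of parameter embedding (the procedure and table names here are hypothetical), with OPTION (RECOMPILE) the optimizer can treat @Status as the literal value it was called with and estimate row counts for that specific value:

CREATE PROCEDURE dbo.GetDocuments @Status int
AS
BEGIN
    SELECT DocumentId, Title
    FROM dbo.Documents
    WHERE Status = @Status
    OPTION (RECOMPILE);  -- @Status is embedded as a constant during optimization
END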

SQL Server stored procedure and indexes

In SQL Server 2008, if a stored procedure is created before its indexes are created, will the stored procedure use those indexes after they have been created?
The short answer is yes, it will. Stored procedures can even exist before the tables that they use exist.
A longer answer means you need to know about execution plans and the plan cache that SQL Server keeps. When a procedure is run, the plan for it (which can include the indexes to use) is cached and kept for a period of time. So it's possible that the index will get used immediately, or only after the current execution plan has expired from the cache.
Take a look at Execution plan basics for more info.
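If you want to check whether a cached plan currently exists for the procedure, a query along these lines against the plan-cache DMVs works on SQL Server 2008 (the procedure name is a placeholder):

SELECT cp.usecounts, cp.cacheobjtype, cp.objtype
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%MyProc%';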

A Sybase stored procedure runs faster the second time it is run

I have a stored procedure in Sybase which takes more time for the first run than for runs directly after.
While creating this stored procedure, I am using the WITH RECOMPILE option, so it shouldn't save any plan for the stored procedure; it will create a new plan every time the procedure executes.
Why would a stored procedure run faster the second time it is run?
This is most likely because of Sybase's internal data cache.
The first run reads the relevant pages from disk and stores them in the cache, so the second run is faster because it reads them from memory.
You can check with sp_helpcache to see what's configured.

Stored Procedure taking time in execution

I have been breaking my head on this issue for a long time. I have a stored procedure in MS SQL, and when I execute that procedure by providing all the parameters in a SQL query, it takes a long time to execute; but when I run the query that is inside the SP directly, it executes in no time. This is affecting my application's performance as well, since we use stored procedures to fetch data from the DB server.
Please help.
Regards,
Vikram
Looks like parameter sniffing.
Here is a nice explanation: I Smell a Parameter!
Basically, SQL Server has cached a query execution plan for the parameters the procedure was first run with, so the plan is not optimal for the new values you are passing. When you run the query directly, the plan is generated at that moment, which is why it's fast.
You can mark the procedure for recompilation manually using sp_recompile, or use the WITH RECOMPILE option in its definition so it is compiled on every run.
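As a sketch (the procedure name dbo.MyProc is a placeholder):

EXEC sp_recompile N'dbo.MyProc';  -- next execution builds a fresh plan
-- or, to recompile on every run, put it in the definition:
ALTER PROCEDURE dbo.MyProc WITH RECOMPILE AS ...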