I have a query, issued through a DataContext object, that produces an extremely inefficient execution plan. I would like to add an "OPTION(RECOMPILE)" query hint to the query, but I do not know how to add this hint to a DataContext object's query.
I ran a SQL trace to capture the query. Run manually as-is, it took almost four minutes; adding "OPTION(RECOMPILE)" reduced the run time to a second. The query contains many variables, a couple of table-valued functions, and a view with an embedded table-valued function. All the input variables are numbers. The query plans between the two executions were very different.
I do not need help optimizing the code to avoid the poor execution plan; I can do that myself if I need to go that route. All I need to know is whether there is a way to add the OPTION(RECOMPILE) query hint to my LINQ query. I'm not going to post the code; it is irrelevant to my question.
If it's possible to add the RECOMPILE query hint, please let me know how. If it is not possible, I would appreciate a link to documentation that indicates this to be the case.
I'm using SQL Server 2012 as my RDBMS.
There is an issue against EF requesting that hints be added in a future release - http://entityframework.codeplex.com/workitem/261.
If you're lucky, it will make it into EF 6.
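In the meantime, one workaround that LINQ to SQL does support is mapping a stored procedure on the DataContext and carrying the hint inside the procedure. A minimal sketch, assuming a hypothetical dbo.Orders table and parameters:

    -- Hypothetical names throughout: wrap the problematic query in a
    -- stored procedure that carries the hint, then map the procedure
    -- on the DataContext instead of running the LINQ query directly.
    CREATE PROCEDURE dbo.usp_GetOrderTotals
        @CustomerId INT,
        @Year       INT
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT o.OrderId, o.Total
        FROM dbo.Orders AS o
        WHERE o.CustomerId = @CustomerId
          AND YEAR(o.OrderDate) = @Year
        OPTION (RECOMPILE);  -- a fresh plan is built on every call, using the actual parameter values
    END

The DataContext designer can then map dbo.usp_GetOrderTotals like any other stored procedure, so the hint rides along with every call.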
Related
I have a table-valued function with quite a lot of code inside; it does multiple join selects, calls sub-functions, and returns a result set. During development of this function, at some point I hit a performance degradation when executing it. Normally it shouldn't take more than 1 sec, but it started taking about 10 sec. I played a bit with joins and also with indexes, but nothing changed dramatically.
After some time of changes and research, I wanted to test it another way. I created the exact same code, with the exact same parameters, as a stored procedure, then executed the SP. Boom! It takes less than 1 sec. The exact same code takes about 10 sec as a function.
I really cannot figure out what this is all about, and I have no time to do more research. I need it as a function for several reasons, but I don't know what to do at this point. I thought I could create it as a proc and then call that from within the function, but then I realized calling procedures from functions is not allowed.
I wanted to hear some good views and advice from the experts here.
Thanks in advance.
PS: I did not add any code here, as the code is not in good shape and is quite dirty. I will share it if anybody is interested. The server is SQL Server 2014 Enterprise, 64-bit.
Edit: I saw the possible duplicate question before, but it did not satisfy me, as my question is specifically about the performance hit. The other question has many answers about the general differences between procedures and functions; I want to focus on possible performance-related differences.
These are the differences from my experience:
When you first started writing the function, you are likely to run it with the same parameters again & again until it works correctly. This enables page caching in which SQL Server keeps the relevant data in memory.
Functions do not cache their execution plans. As you add more data, it takes longer to come up with a plan. Use SET STATISTICS TIME ON to see query compilation time vs. execution time (see the sketch below).
Functions can only use table variables, and there are no statistics on those. That can make for some horrendous JOIN decisions later.
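A minimal sketch of that timing check (the function name and parameter are hypothetical):

    DECLARE @SomeParam INT = 1;                    -- hypothetical parameter

    SET STATISTICS TIME ON;

    SELECT * FROM dbo.fcn_somefunc(@SomeParam);    -- hypothetical table-valued function

    SET STATISTICS TIME OFF;
    -- The Messages tab now reports "SQL Server parse and compile time"
    -- separately from "SQL Server Execution Times".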
Some people prefer table-valued functions because they are easier to query:
SELECT * FROM fcn_myfunc(...) WHERE <some_conditions>
instead of creating a temp table, EXECing the stored procedure into it, and then filtering off that temp table. If your code is performance critical, though, turn it into a stored procedure.
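For comparison, the stored-procedure route would look something like this (procedure, column, and parameter names are hypothetical):

    -- Capture the proc's result set in a temp table, then filter it.
    CREATE TABLE #results (OrderId INT, Total MONEY);

    INSERT INTO #results (OrderId, Total)
    EXEC dbo.usp_myproc @CustomerId = 42;

    SELECT * FROM #results WHERE Total > 100;

    DROP TABLE #results;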
I have a statement that takes around 15 seconds to run, which is way too long. I would like to know the best way to cache this data in memory. Would I use some kind of view or stored procedure for this? I'm aware I can use triggers and another table, but I would like to avoid that at all costs; there is quite a bit of memory to spare.
Any suggestions?
You could check out indexed views (usually called materialized views in other RDBMS).
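A minimal sketch of an indexed view, assuming a hypothetical dbo.Orders table with a non-nullable Total column:

    -- SCHEMABINDING is required, and the first index on the view must be
    -- unique and clustered. COUNT_BIG(*) is mandatory when the view uses
    -- GROUP BY, and SUM over a nullable expression is not allowed,
    -- hence the NOT NULL assumption on Total.
    CREATE VIEW dbo.vw_OrderTotals
    WITH SCHEMABINDING
    AS
        SELECT CustomerId,
               COUNT_BIG(*) AS OrderCount,
               SUM(Total)   AS TotalAmount
        FROM dbo.Orders
        GROUP BY CustomerId;
    GO

    CREATE UNIQUE CLUSTERED INDEX IX_vw_OrderTotals
        ON dbo.vw_OrderTotals (CustomerId);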
Do you know why your query is taking 15 seconds to run? Is the query working off the correct indexes? As others have mentioned, running the same query within a stored procedure is going to produce the same performance as the execution plan will be the same.
You might get better mileage out of using the SQL Query Optimizer and optimizing out the bottlenecks in your query. This is a good article on using the SQL Query Optimizer.
It all depends on your situation. Make sure to check your execution plan and try to avoid too many scans; you will get better performance. I hope this helps.
I am working on tuning a stored procedure. It is a huge stored proc and joins tables that have about 6-7 million records.
My question is: how do I determine the time spent in the components of the proc? The proc has one big SELECT with many temp tables created on the fly (read: subqueries).
I tried using SET STATISTICS TIME ON and SET SHOWPLAN_ALL ON.
I am looking to isolate a chunk of code that takes the most time and not sure of how to do it.
Please help.
PS: I did try to Google it and searched on Stack Overflow, with no luck. Here is one question that I looked at:
How to improve SQL Server query containing nested sub query
Any help is really appreciated. Thanks in advance.
I would try out SQL Sentry's SQL Plan Explorer. It gives you visual help in finding the problem, and it is a free tool. It highlights the bits that cost a lot of I/O or CPU, rather than just giving a generic percentage.
Here's where you can check it out:
http://www.sqlsentry.net/plan-explorer/sql-server-query-view.asp
Eric
I realize you're asking for "time" (the how long), but maybe you should focus on the "what". What I mean is: tune to the results of the execution plan. Ideally, using "Show Execution Plan" is going to give you the biggest bang, and it will tell you, via percentages, where it costs the most resources.
If you are in SSMS 2008 you can right-click in your query window and click "Include Execution Plan".
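If you prefer a script to the menu item, a rough T-SQL equivalent (the sample query is just a placeholder):

    SET STATISTICS XML ON;

    SELECT name, object_id FROM sys.objects;   -- any query you want to profile

    SET STATISTICS XML OFF;
    -- Each statement now returns its actual execution plan
    -- as an additional XML result set.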
In your scenario, the best way to do this is to just run the components individually. Bear in mind that the below is primarily about tuning for execution time (in a low-contention/low-concurrency environment). You may have other priorities under a heavy concurrent load.
I have to do a very similar break down on a regular basis for different procedures I have to tune. As a rule the general methodology I follow is:
1 - Do a baseline run
2 - Add PRINT or RAISERROR commands that output the current time between portions, to aid in identifying which steps take the longest (see the sketch after this list).
3 - Break down the queries individually. I normally run portions on their own (omit JOIN conditions) to see what the variance is. If it is a very long-running query you can add a TOP clause to any SELECTs to limit the returns. As long as you are consistent this will still give you a good idea.
4 - Tweak the components from step 3 that take the most time. If you have complicated subqueries, maybe make them indexed #temp tables to see if that helps. CTEs as a rule never help performance, so you may need to materialize those as well.
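A minimal sketch of the timing commands from step 2, assuming two hypothetical steps against a temp table:

    DECLARE @t0  DATETIME2(3) = SYSDATETIME();
    DECLARE @msg NVARCHAR(200);

    -- Step 1 (hypothetical): stage some data
    SELECT o.object_id, o.name
    INTO   #stage
    FROM   sys.objects AS o;

    SET @msg = N'Step 1 finished after '
             + CAST(DATEDIFF(MILLISECOND, @t0, SYSDATETIME()) AS NVARCHAR(20))
             + N' ms';
    RAISERROR(@msg, 0, 1) WITH NOWAIT;  -- unlike PRINT, this flushes to the Messages tab immediately

    -- Step 2 (hypothetical): query the staged data
    SELECT COUNT(*) FROM #stage;

    SET @msg = N'Step 2 finished after '
             + CAST(DATEDIFF(MILLISECOND, @t0, SYSDATETIME()) AS NVARCHAR(20))
             + N' ms';
    RAISERROR(@msg, 0, 1) WITH NOWAIT;

    DROP TABLE #stage;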
Are there any hints in Oracle that work the same way as these SQL Server hints?
Recompile: the query is recompiled every time it's run (for when execution plans vary greatly depending on parameters). Would this be best compared to cursor_sharing in Oracle?
Optimize for: for when you want the plan optimized for a certain parameter value, even if a different one is used the first time the SQL is run. I guess this could maybe be covered by cursor_sharing as well?
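For reference, this is what the two SQL Server hints look like in use (table, column, and variable names are hypothetical):

    DECLARE @Status INT = 3;

    -- RECOMPILE: build a fresh plan on every execution.
    SELECT OrderId FROM dbo.Orders WHERE StatusId = @Status
    OPTION (RECOMPILE);

    -- OPTIMIZE FOR: always build the plan as if @Status were 1,
    -- regardless of the value actually passed in.
    SELECT OrderId FROM dbo.Orders WHERE StatusId = @Status
    OPTION (OPTIMIZE FOR (@Status = 1));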
Since you're using 11g, Oracle should use adaptive cursor sharing by default. If you have a query that uses bind variables and the histogram on the column with skewed data indicates that different bind variable values should use different query plans, Oracle will maintain multiple query plans for the same SQL statement automatically. There would be no need to specifically hint the queries to get this behavior, it's already baked in to the optimizer.
I didn't know, but found a discussion with some solutions here on forums.oracle.com
My primary concern is with SQL Server 2005... I went through many websites, and each tells me something different.
What are the scenarios that are good / OK to use? For example, does it hurt to even set variable values inside an IF, or only if I run a query? Supposing my SP builds dynamic SQL based on several conditions in the input parameters, do I need to rethink the query? What about an SP that runs a different query based on whether some record exists in a table? Etc. My question is not limited to just these scenarios; I'm looking for a somewhat more generalised answer so that I can improve my future SPs.
In essence: which statements are good to use in branching conditions / loops, which are bad, and which are okay?
Generally... Avoid procedural code in your database, and stick to queries. That gives the Query Optimizer the chance to do its job much better.
The exceptions would be code that is designed to do many things rather than produce a result set, and cases where a query would need to join rows exponentially to get a result.
It is very hard to answer this question if you don't provide any code. No language construct is good/bad/okay by itself; it's what you want to achieve and how well that can be expressed with those constructs.
There's no definitive answer as it really depends on the situation.
In general, I think it's best to keep the logic within a sproc as simple and set-based as possible. Making it too complicated, with multiple nested IF conditions for example, may complicate things for the query optimiser, meaning it can't create a good execution plan suitable for all paths through the sproc. For example, the first time the sproc is run, it takes path A through the logic, and the execution plan reflects this. The next time it runs with different parameters, it takes path B but reuses the original execution plan, which is not optimal for this second path. One solution is to break the load into separate stored procedures that are called depending on the path being followed; this allows each sub-sproc to be optimised and its execution plan cached independently.
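A minimal sketch of that split, with hypothetical procedure and table names:

    CREATE PROCEDURE dbo.usp_Search_ByCustomer @CustomerId INT
    AS
        SELECT OrderId, Total FROM dbo.Orders WHERE CustomerId = @CustomerId;
    GO

    CREATE PROCEDURE dbo.usp_Search_ByDate @From DATETIME
    AS
        SELECT OrderId, Total FROM dbo.Orders WHERE OrderDate >= @From;
    GO

    -- The outer proc only branches; each sub-proc gets its own cached plan.
    CREATE PROCEDURE dbo.usp_Search
        @CustomerId INT      = NULL,
        @From       DATETIME = NULL
    AS
        IF @CustomerId IS NOT NULL
            EXEC dbo.usp_Search_ByCustomer @CustomerId;
        ELSE
            EXEC dbo.usp_Search_ByDate @From;
    GO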
Loops can sometimes be the only viable option, but in general I'd try not to use them; always try to do things in a set-based fashion if possible.
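As an illustration of the set-based point, the two styles side by side (names are hypothetical):

    -- Loop version: processes one row per iteration.
    DECLARE @Id INT;
    DECLARE c CURSOR FOR
        SELECT OrderId FROM dbo.Orders
        WHERE OrderDate < DATEADD(YEAR, -1, GETDATE());
    OPEN c;
    FETCH NEXT FROM c INTO @Id;
    WHILE @@FETCH_STATUS = 0
    BEGIN
        UPDATE dbo.Orders SET Status = 'Archived' WHERE OrderId = @Id;
        FETCH NEXT FROM c INTO @Id;
    END
    CLOSE c;
    DEALLOCATE c;

    -- Set-based version: one statement does the same work.
    UPDATE dbo.Orders
    SET    Status = 'Archived'
    WHERE  OrderDate < DATEADD(YEAR, -1, GETDATE());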