A client sent me a trace file from their production environment.
I want to know which stored procedures are taking longest.
The events recorded in the trace file include RPC:Starting and RPC:Completed.
I noticed that the trace columns include both StartTime/EndTime and Duration.
Which one should I use for my purpose?
And to find out how long a stored procedure took, should I take the difference between the StartTime of RPC:Starting and the EndTime of RPC:Completed?
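For context, here is the kind of aggregation I have in mind, assuming Duration is the right column to use (a sketch: the file path is hypothetical, and in SQL Server 2005+ trace files Duration is recorded in microseconds):

-- Total duration per procedure call text, using RPC:Completed rows only
-- (EventClass 10 = RPC:Completed)
SELECT CAST(TextData AS nvarchar(4000)) AS ProcCall,
       COUNT(*) AS Calls,
       SUM(Duration) / 1000000.0 AS TotalSeconds
FROM fn_trace_gettable('C:\traces\prod.trc', DEFAULT)
WHERE EventClass = 10
GROUP BY CAST(TextData AS nvarchar(4000))
ORDER BY TotalSeconds DESC;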
If it helps, you can run the stored procedure using the Display Estimated Execution Plan tool (Ctrl + L) in SQL Server Management Studio.
It will show the estimated cost of each operation in the plan.
I hope I have helped.
Related
What is the technique if I want to catch/monitor/log/save the native SQL commands issued by the application we developed? We have an Oracle database.
I have already tried the SQL Developer Tools/Monitor Sessions function, but it does not show the SQL statements of our apps. The Real-Time SQL Monitoring function contains only part of the required commands and a lot of useless entries…
Practically what I want:
- "Switch on" the trace function (e.g. in SQL Developer or SQL*Plus)
- Launch the application and exercise some functionality with real data (e.g. the slow queries)
- As soon as I think I have enough measurements: "switch off" the trace function and…
- Start analyzing/tuning the SQL commands (e.g. with SQL Developer/Explain Plan, etc.)
1) You can always use AWR reports for a specified time period to see which queries were running in the database during that window.
Run the AWR report from SQL*Plus:
@$ORACLE_HOME/rdbms/admin/awrrpt.sql
AWR captures only the top SQL by default, but you can increase the number of SQL statements captured per snapshot.
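For example, a sketch of raising that limit with DBMS_WORKLOAD_REPOSITORY (the value 100 is an arbitrary choice for illustration):

-- Capture the top 100 SQL statements per AWR snapshot
BEGIN
  DBMS_WORKLOAD_REPOSITORY.MODIFY_SNAPSHOT_SETTINGS(topnsql => 100);
END;
/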
2) A database-level trace can capture all SQL statements run, along with execution plans and other statistics. You can start and stop the trace manually. Once the trace is stopped, you can use tkprof to generate readable files. Tracing always causes a bit of performance and space overhead, but in my experience it is not massive.
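A minimal sketch of that workflow, assuming you have identified the application's session in v$session (the SID and serial# values 123/456 are hypothetical):

-- Switch the trace on for the application's session
BEGIN
  DBMS_MONITOR.SESSION_TRACE_ENABLE(session_id => 123, serial_num => 456,
                                    waits => TRUE, binds => TRUE);
END;
/

-- ... exercise the application, then switch the trace off
BEGIN
  DBMS_MONITOR.SESSION_TRACE_DISABLE(session_id => 123, serial_num => 456);
END;
/

-- Finally, format the raw trace file from the OS shell:
-- tkprof mytrace.trc report.txt sys=no sort=exeela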
Abhi
I'm having a real problem with my application and SQL Server (2008 R2).
I have a bug whereby a stored procedure is failing because of a misconfigured SQLCMD variable, but the call to the stored procedure is in an assembly for which I don't have the source code.
Is there a way to watch which stored procedures are being executed? Or is there a way to determine with an SQL query which stored procedures have been executed and when?
I am really stuck. Please help!
M
You could try running this against your database:
SELECT OBJECT_NAME([object_id], database_id) AS procedure_name,
       last_execution_time,
       execution_count
FROM sys.dm_exec_procedure_stats
ORDER BY last_execution_time DESC;
Documentation: http://msdn.microsoft.com/en-us/library/cc280701.aspx
That gives you a snapshot, at the moment you run it, of what was last run and how many times each procedure has been executed since it was last compiled. Unfortunately the DMV doesn't give you a log per se of the stored procedures being run, just when they were last run and some other helpful information.
For a much more involved approach, you could look at SQL Server Audit, a feature new in SQL Server 2008. I don't have much experience with it, but it should give you a starting point if you're really stuck.
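If you do go down that route, a minimal sketch of auditing EXECUTE activity might look like the following (the audit names, database name, and file path are all hypothetical):

-- Server-level audit writing to a file (create this in master)
CREATE SERVER AUDIT ProcExecAudit
TO FILE (FILEPATH = 'C:\Audits\');
ALTER SERVER AUDIT ProcExecAudit WITH (STATE = ON);

-- Database-level specification: record every EXECUTE in the database
USE MyDatabase;
CREATE DATABASE AUDIT SPECIFICATION ProcExecSpec
FOR SERVER AUDIT ProcExecAudit
ADD (EXECUTE ON DATABASE::MyDatabase BY public)
WITH (STATE = ON);

You can then read the captured events back with sys.fn_get_audit_file.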
The query has been canceled because the estimated cost of this query (1660) exceeds the configured threshold of 1500. Contact the system administrator.
I am getting the above error in production while running one of the stored procedures, whose parameter contains an XML variable.
I have checked the configured value of QUERY_GOVERNOR_COST_LIMIT; it is set to 1500.
To resolve this problem I added SET QUERY_GOVERNOR_COST_LIMIT 0 to the stored procedure, and it works fine.
When I run the stored procedure directly, with or without the SET QUERY_GOVERNOR_COST_LIMIT 0 statement, it runs fine and completes in under a second.
But the .NET application still hits the error.
So why does it give the error from the application but not from SQL Query Analyzer?
And since the query completes in under a second, how can it exceed the configured cost threshold of 1500?
Please share your ideas on the analysis and a solution.
It could be because SET ARITHABORT is OFF by default from .NET.
It could also be a conversion problem. Look at your execution plan: do you see any conversions? How are you executing this from .NET, and are you using the correct data types?
Usually this happens because of different default ANSI settings for SSMS and .NET; they can produce different execution plans.
The first thing you need to check is the execution plans from both sources.
You can capture them with SQL Profiler's Showplan XML event.
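As an additional quick check, you can compare the relevant SET options of the SSMS session and the application's session directly while both are connected (a sketch; filter to the two session IDs you care about):

-- Compare ANSI/ARITHABORT settings across current connections
SELECT session_id, program_name,
       ansi_nulls, ansi_padding, ansi_warnings,
       arithabort, quoted_identifier
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;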
QUERY_GOVERNOR_COST_LIMIT is a connection-level runtime setting, so it needs to be set when the connection is made. When you test in an SSMS query window, you need to set it in the Query Options window (right-click inside the query window, then Query Options, Advanced, ...).
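For example, the application could issue the SET on its own connection before making the call (a sketch; the procedure name is hypothetical):

-- Run once per connection, before the offending call
SET QUERY_GOVERNOR_COST_LIMIT 0;
EXEC dbo.MyXmlProc;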
You also mentioned that the query executes in 0 seconds, so why does it still error out from .NET? Because the setting works on the estimated query execution cost, not the actual one. So the right question is why SQL Server estimated the execution cost above the configured threshold, and there is no single answer to that.
Though I would like to know the workflow/situation in which you actually need this setting. The estimated cost is often a lot different from the actual one, so unless the dev/DBA knows exactly what they are doing and what the execution will look like... I don't really see the practical usage of this setting.
I've an SSIS package that runs a stored proc to export to an Excel file. Everything worked like a champ until I needed to do a bit of rewriting on the stored proc. The proc now takes about 1 minute to run and the exported columns are different, so my problems are the following:
1) SSIS complains when I hit the preview button "No column information returned by command"
2) It times out after about 30 seconds.
What I've done.
Tried to clean up/optimize the query. That helped a bit, but it still does some major calculations, and it runs just fine in SSMS.
Changed the timeout values to 90 seconds. Didn't seem to help. Maybe someone here can?
Thanks,
Found this little tidbit which helped immensely.
No Column Names
Basically all you need to do is add the following to your SQL query text in SSIS.
SET FMTONLY OFF
SET NOCOUNT ON
Only problem now is it runs slow as molasses :-(
EDIT: It's running just too damn slow.
Changed from using #tempTable to a permanent tempTable and added the appropriate DROP statements. Argh...
Although it appears you may have answered part of your own question, you are probably getting the "No column information returned by command" error because the table doesn't exist at the time it tries to validate the metadata. Creating the tables as non-temporary tables resolves this issue.
If you insist on using temporary tables, you can create the temporary tables in the step preceding the data flow. You would need to create it as a ## (global) temp table and turn off connection sharing for the connection for this to work, but it is an alternative to creating permanent tables.
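For example, the preceding Execute SQL Task could run something like this (a sketch; the table name and columns are hypothetical):

-- Executed in an Execute SQL Task before the data flow
IF OBJECT_ID('tempdb..##ExportStage') IS NOT NULL
    DROP TABLE ##ExportStage;

CREATE TABLE ##ExportStage
(
    CustomerID int,
    TotalSales money
);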
A shot in the dark based on something obscure I hit years ago: When you modified the procedure, did you add a call to a second procedure? This might mess up SSIS's ability to determine the returned data set.
As for (2), does the procedure take 30+ or 90+ seconds to run in SSMS? If not, do you know that the query is actually getting into SQL from SSIS? Might be worth firing up SQL Profiler to see what's actually being sent to SQL Server. [Which was the way I found out my obscure factoid.]
When the SQL Server (2000/2005/2008) is running sluggish, what is the first command that you run to see where the problem is?
The purpose of this question is that, when all the answers are compiled, other users can benefit by running your command of choice to narrow down where the problem might be.
There are other troubleshooting posts regarding SQL Server performance but they can be useful only for specific cases.
If you roll out and run your own custom SQL script, would you let others know:
- the purpose of the script
- what it returns (return value)
- what to do with the output to figure out where the problem is
If you could provide source for the script, please post it.
In my case,
sp_lock
I run it to figure out if there are any locks (purpose); it returns SQL Server lock information. Since the result set displays object IDs (thus not very human-readable), I usually skim through the results to see if there are abnormally many locks.
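To make the output more readable, on SQL Server 2005 and later you can query sys.dm_tran_locks instead and resolve the object IDs to names (a sketch):

-- Object-level locks with resolved names instead of raw IDs
SELECT request_session_id,
       resource_type,
       DB_NAME(resource_database_id) AS database_name,
       OBJECT_NAME(resource_associated_entity_id, resource_database_id) AS object_name,
       request_mode,
       request_status
FROM sys.dm_tran_locks
WHERE resource_type = 'OBJECT';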
Feel free to update tags
Why run a single query when a picture is worth a thousand words!
I prefer to run the freely available Performance Dashboard Reports.
They provide a complete snapshot overview of your server's performance in seconds. You can then choose a specific area to investigate (locking, currently running queries, wait requests, etc.) simply by clicking the appropriate area on the dashboard.
http://www.microsoft.com/downloads/details.aspx?FamilyId=1d3a4a0d-7e0c-4730-8204-e419218c1efc&displaylang=en
One slight caveat: I believe these are only available for SQL Server 2005 and above.
sp_who
http://msdn.microsoft.com/en-us/library/aa260384(SQL.80).aspx
I want to see "who": which machines/users are running which queries, for how long, etc. I can also easily scan for blocks.
If something is blocking a bunch of other transactions, I can use the spid to issue a KILL command if necessary.
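For example (the spid 57 is hypothetical; always check what a spid is doing before killing it):

-- See who is connected and who is blocked
EXEC sp_who;

-- Kill the offending spid once you are sure it is safe
KILL 57;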
sp_who3 - Provides a lot of information that is available elsewhere, but in one nice output. It also has several parameters that allow customized output.
A custom query which combines what you would expect from sp_who with DBCC INPUTBUFFER(spid), to get the last query text on each spid, ordered by the blocked/blocking graph.
Process data is available via master..sysprocesses.
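A minimal sketch of that idea, short of the full blocking-graph ordering (the spid 57 is hypothetical):

-- Sessions that are currently blocked, and who blocks them
SELECT spid, blocked, waittime, lastwaittype, loginame
FROM master..sysprocesses
WHERE blocked <> 0;

-- Last batch sent on a suspect spid
DBCC INPUTBUFFER(57);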
sp_who3 returns the standard sp_who2 output until you specify a particular spid; then it returns six different recordsets about that spid, including locks, blocks, what it's currently doing, the T-SQL it's running, and the statement within that T-SQL that is currently executing.
Ian Stirk has a great script I like to use as detailed in this article: http://msdn2.microsoft.com/en-ca/magazine/cc135978.aspx
In particular, I like the missing indexes one:
SELECT DatabaseName = DB_NAME(database_id),
       [Number Indexes Missing] = COUNT(*)
FROM sys.dm_db_missing_index_details
GROUP BY DB_NAME(database_id)
ORDER BY 2 DESC;
DBCC OPENTRAN to see what the oldest active transaction is
Displays information about the oldest active transaction and the oldest distributed and nondistributed replicated transactions, if any, within the specified database. Results are displayed only if there is an active transaction or if the database contains replication information. An informational message is displayed if there are no active transactions.
followed by sp_who2
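For example (the database name is hypothetical):

-- Oldest active transaction in the given database
DBCC OPENTRAN ('MyDatabase');

-- Then look up the owning session
EXEC sp_who2;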
I use queries like these:
Number of open/active connections in ms sql server 2005
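For instance, a sketch of that kind of query, counting open connections per database via master..sysprocesses:

-- Open/active connections grouped by database (SQL 2000 and later)
SELECT DB_NAME(dbid) AS database_name,
       COUNT(*) AS connection_count
FROM master..sysprocesses
WHERE dbid > 0
GROUP BY dbid
ORDER BY connection_count DESC;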