Are there any standard queries that can be run that will show the performance of a SQL Server 2005 database?
Note: I need to know the performance of every aspect of the database.
EDIT:
I am looking for a way to measure the time it takes for typical queries to execute. I am then going to apply indexing to certain tables in the database and then time how long the same queries take to execute and see if there is a significant difference. Is there an easy way to do this?
Thanks!
For general background research/analysis of SQL Server performance, I prefer to watch how SQL Server is performing as it runs. The best tools for that are SQL Profiler and sometimes Windows System Monitor (aka Performance Monitor, aka PerfMon). Alas, neither is particularly simple, let alone a simple query against the system -- though some PerfMon counters are exposed through a few DMVs I can't dig up just now.
BOL has reasonable information on these; a good top-level (online) page for this is here. Be wary: there is serious DBA stuff beyond that point.
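One DMV that does surface PerfMon counters is sys.dm_os_performance_counters; a quick sketch (counter names vary by instance, this just grabs the cache hit ratios):

-- PerfMon counters exposed as rows inside the engine
SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name LIKE '%cache hit ratio%';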
There are some dynamic management views and functions built in:
http://msdn.microsoft.com/en-us/library/ms188754(SQL.90).aspx
SELECT * FROM sys.dm_db_index_usage_stats;
SELECT * FROM sys.dm_os_memory_objects;
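To address the question's edit about timing queries before and after indexing, a minimal sketch using the session statistics switches; the query here is hypothetical, and the DBCC cache flushes should only be run on a test box:

-- Clear caches so both runs start cold (test servers only!)
DBCC FREEPROCCACHE;
DBCC DROPCLEANBUFFERS;

SET STATISTICS TIME ON;   -- reports parse/compile and execution CPU/elapsed times
SET STATISTICS IO ON;     -- reports logical/physical reads per table

SELECT * FROM orders WHERE customer_id = 42;   -- query under test

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;

Run it once without the index, add the index, then run it again and compare the elapsed times and logical reads reported on the Messages tab.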
I am trying to connect Tableau to a SQL view I made in PostgreSQL.
This view returns ~80k rows with 12 fields. On my local PostgreSQL database, it takes 7 seconds to execute. But when I try to create a chart in a worksheet using this view, it takes forever to display anything (more than 2 minutes just to add a field).
The view is complex and involves many joins, coalesces and cases due to business specifics.
Do you have any ideas for improving this?
Thank you very much for your help! :-)
Best,
Max
The Tableau documentation has helpful info for performance optimization:
https://help.tableau.com/current/pro/desktop/en-us/performance_tips.htm
I highly recommend the whitepaper on designing efficient dashboards mentioned on that site - a bit dated, but timeless advice.
For starters, learn to use the Performance Recorder in Tableau to find out what tasks are causing delays, and if they involve queries, to capture the SQL that Tableau emits.
With Tableau, and many other client tools, the standard first approach is to see what SQL the client tool generates, then execute that SQL without the client tool, say just in psql in your case. If you can reproduce the slow query in plain SQL, then you are better positioned to either:
Optimize your database, say with indices or by restructuring your schema, OR
Understand why your client tool, Tableau in this case, generated that inefficient query, and reason about what you could do differently in Tableau that would cause it to generate different SQL.
The whitepaper I mentioned should be helpful
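If it helps, here is what that reproduction step might look like in psql, with my_view standing in for your view's name:

-- Show the actual plan and per-node timings PostgreSQL produces
EXPLAIN ANALYZE
SELECT * FROM my_view;

-- psql can also report wall-clock time per statement:
-- \timing on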
I know that's a crazy question. I know I can use the built-in SQL Profiler, client statistics, and the other statistics built into SQL Server (oh, just view the query plan) -- but that's all ugly, takes too much time to set up, and the results are not intuitive.
What I was hoping for was something like JetBrains DotTrace where you could see hotspots of slow code - but applied to stored procedures.
Let me also add that I am working with existing stored procs that are lengthy - some are 10k+ lines. While this is not ideal, I only want to start by refactoring small parts of the worst-performing stored procs - and so that I don't have to spend all day looking at performance numbers/timings/etc., I just want a profiler that shows me where in those stored procs the time is being spent (which blocks or lines).
Crazy request, I know - hopefully someone knows something that would be helpful.
Have a look at DBSophic's tools. The free tool helps you analyze a trace you've already collected and gives recommendations for T-SQL rewrites, schema changes, etc., focusing on the most painful parts (regardless of the number of lines in the modules).
If you combine their Workload Analyzer with our tool, SQL Sentry Performance Advisor, you can point it at workload data that we've already collected - so no worry about going and manually collecting your own trace. I wrote a blog post about this.
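As a built-in complement to such tools, the plan-cache DMVs can approximate per-statement timings inside a proc. A sketch, assuming the statements of interest are still in the cache (offsets are byte-based, hence the division by 2):

SELECT TOP 20
    OBJECT_NAME(st.objectid, st.dbid) AS proc_name,
    qs.execution_count,
    qs.total_elapsed_time / qs.execution_count AS avg_elapsed,
    SUBSTRING(st.text, qs.statement_start_offset / 2 + 1,
        (CASE qs.statement_end_offset
             WHEN -1 THEN DATALENGTH(st.text)
             ELSE qs.statement_end_offset
         END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.objectid IS NOT NULL           -- only statements inside modules
ORDER BY qs.total_elapsed_time DESC;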
I know there are things out there to help optimize queries, etc., but is there anything else - something like a full package that can scan your database and highlight all the performance issues, naming conventions, tables not properly normalized, and so on?
I know this is the job of a DBA, and a good DBA shouldn't need a tool like that, but sometimes you start a new job, you are put in charge of an existing database, and the DB is a mess, so you don't know where to start...
Thanks to everyone
Dave
In my opinion a normalized database does not guarantee good performance. Normalization is concerned primarily with data consistency.
It's not really practical for an automated tool to handle what you are proposing, because there is no one-size-fits-all implementation of best practice, which after all is to be treated as more of a guideline. That's why businesses hire people like me to take an objective look at their unique SQL Server environment (shameless plug).
The performance aspect can be addressed either by the wealth of features already available in the SQL Server product or by off-the-shelf tools such as those from Quest/Redgate and the like.
If you want to get a quick overall feel for the performance of a new SQL Server box that has come under your administrative control, then I suggest either using the freely available Performance Dashboard Reports or the SQL Server DMVs. You could also take a look at the current Wait Types on the server (see the sketch after the links below).
SQL Server 2005 Performance Dashboard Reports
SQL Server Wait Type Example
How to identify the most costly SQL Server queries using DMVs
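For the Wait Types angle, a quick sketch against sys.dm_os_wait_stats; the exclusion list below is just a small sample of benign/idle waits:

-- Top waits accumulated since the last restart (or stats clear)
SELECT TOP 10
    wait_type,
    waiting_tasks_count,
    wait_time_ms,
    wait_time_ms / NULLIF(waiting_tasks_count, 0) AS avg_wait_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ('SLEEP_TASK', 'LAZYWRITER_SLEEP',
                        'SQLTRACE_BUFFER_FLUSH', 'WAITFOR',
                        'BROKER_RECEIVE_WAITFOR', 'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;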
I hope this answers your question and do let me know if I can assist further.
Edited in response to comment:
Maybe a general Health Check could provide useful info.
SQL Server Best Practices Analyzer
I'm looking for a tool which would help with creating complex SQL queries. Sometimes it's difficult even to verify whether the results of a query are correct. It's especially easy for queries joining several tables to return too little or too much data.
The tool should enable at least the creation of test tables, some kind of visualization of how the queries gather their data, and hopefully better parsing of error cases than, for example, Oracle provides.
Are there tools like this, or do I have to stick with creating test tables manually, filling them with test data, and committing all kinds of queries with SQuirreL SQL?
When you have a very complex query it is usually easiest to validate by breaking it up into multiple queries that populate temp tables. These intermediary results can be individually verified and then you bring them together to produce the final result set. Depending on performance needs you can stick with the temp table approach or you can then rewrite to a single statement. Typically when I have a huge query it is for background processing so I stick with the temp table approach.
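A minimal sketch of that approach, with made-up table names:

-- Stage an intermediate result where it can be eyeballed on its own
SELECT o.customer_id, SUM(o.amount) AS total_amount
INTO #order_totals
FROM orders AS o
GROUP BY o.customer_id;

-- Sanity-check the intermediate set in isolation, e.g.:
-- SELECT * FROM #order_totals WHERE total_amount < 0;

-- Then bring the verified pieces together for the final result
SELECT c.name, t.total_amount
FROM customers AS c
JOIN #order_totals AS t ON t.customer_id = c.customer_id;

DROP TABLE #order_totals;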
What RDBMS are you using? All of the major ones have some type of console available (e.g., SSMS for SQL Server, Toad for Oracle, MySQL Query Browser/Administrator for MySQL, etc.), and they all have query execution plans where you can see how the query will actually run. So the answer to your question is that it's entirely dependent on what RDBMS you're using, but the safe-bet answer is: yes.
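As a tiny illustration in SQL Server's dialect (hypothetical table name), SSMS can return the estimated plan without executing the statement at all:

SET SHOWPLAN_XML ON;
GO
SELECT * FROM orders WHERE customer_id = 42;  -- plan comes back, query does not run
GO
SET SHOWPLAN_XML OFF;
GO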
I recommend trying SQL Server 2008 Management Studio Express (SSMSE) if you are working with SQL Server. I have used it at work and I believe it does everything you are looking for.
You can get it and SQL Server (express editions) here.
Certainly not a free, open-source solution, but I believe Quest Software's TOAD will fit your requirements. Quest seems to offer a lot of tools in that space... they have tools for modeling and analysis, though I've never used the modeler or analyzer.
I personally have experience with the commercial version of TOAD for Oracle. Its GUI is overwhelming at first, but after you mentally filter out all of the extra buttons that you'll never use, it's manageable.
In the never-ending search for performance (and my own bludgeoning experience), I've learnt a few things that could drag down the performance of a SQL statement.
Obsessive Compulsive Subqueries Disorder
Doing crazy type conversions (and nesting those into oblivion) - see the sketch after this list
Group By on aggregate functions of said crazy type conversions
Where fldID in (select EVERYTHING from my 5mil record table)
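To make the type-conversion point concrete, a small sketch with a hypothetical orders table: wrapping the column in a conversion defeats index seeks, while a range comparison stays sargable.

-- Anti-pattern: the CONVERT on the column forces a scan even with an
-- index on order_date
SELECT * FROM orders
WHERE CONVERT(VARCHAR(10), order_date, 120) = '2009-01-01';

-- Sargable rewrite: compare the raw column against a half-open range
SELECT * FROM orders
WHERE order_date >= '20090101' AND order_date < '20090102';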
I typically work with MSSQL. What tools are available to test the performance of a SQL statement? Are these tools built in and specific to each type of DB server? Or are there general tools available?
SQL Profiler (built-in): Monitoring with SQL Profiler
SQL Benchmark Pro (Commercial)
SQL Server 2008 has the new Data Collector
SQL Server 2005 (onwards) has a missing indexes Dynamic Management View (DMV) which can be quite useful (but only for query plans currently in the plan cache): About the Missing Indexes Feature.
There is also the SQL Server Database Engine Tuning Advisor which does a reasonable job (just don't implement everything it suggests!)
I mostly just use Profiler and the execution plan viewer.
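For the missing-index DMVs mentioned above, a sketch of one common way to rank the suggestions; the improvement_measure formula is a popular heuristic, not an official metric:

SELECT TOP 10
    mid.statement AS table_name,
    mid.equality_columns,
    mid.inequality_columns,
    mid.included_columns,
    migs.user_seeks,
    migs.avg_user_impact,
    migs.avg_total_user_cost * migs.user_seeks * (migs.avg_user_impact / 100.0)
        AS improvement_measure
FROM sys.dm_db_missing_index_details AS mid
JOIN sys.dm_db_missing_index_groups AS mig
    ON mig.index_handle = mid.index_handle
JOIN sys.dm_db_missing_index_group_stats AS migs
    ON migs.group_handle = mig.index_group_handle
ORDER BY improvement_measure DESC;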
Execution Plans are one of the first things to look at when debugging query performance problems. An execution plan will tell you how much time is roughly spent in each portion of your query, and can be used to quickly identify if you are missing indexes or have expensive joins or loops.
MSSQL has a Database Tuning Advisor that will often recommend indexes for tables based upon common queries run during the tuning period; however, it won't rewrite a query for you.
In my opinion, experience and experimentation are the best tools for writing good SQL queries.
In MySQL (and maybe in other databases too) you can EXPLAIN your query to see what the database server thinks about it. This is usually used to decide which indexes should be created. And this one is built in, so you can use it without installing additional software.
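For instance, against a hypothetical orders table:

EXPLAIN SELECT * FROM orders WHERE customer_id = 42;

The output shows which indexes the optimizer considered and roughly how many rows it expects to examine.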
Adam Machanic has a simple tool called SqlQueryStress that might be of use. It is designed to be used to "run a quick performance test against a single query, in order to test ideas or validate changes".