Hi, I have a web application that uses NHibernate to talk to the database.
The application is running very slowly because of the many chatty DB calls NHibernate makes.
I want to run a profiler or something similar that can give me stats, for example:
- the number of DB round trips
- table names and how often each is accessed
I saw a product called NHProf. I am wondering if there is something open source or free available out there, or some other technique I can use to meet the goal here.
Edit: I am using SQL Server 2005.
If you just want to know the two things you mention, you can create a log4net appender to capture the information you want (NHibernate writes every SQL statement to the NHibernate.SQL logger). I use an HTTP module that adds the query information to the HTML of a web application when it runs in debug mode. If you want real statistics or more information than that, NHProf might be worth the money.
The NHibernate statistics are available as ISessionFactory.Statistics and ISession.Statistics. For the basic stats you describe (counts of queries executed, entities loaded, and so on), this may well be enough. Note that statistics collection may need to be switched on first (the generate_statistics configuration property, if I recall correctly).
That said, NHProf does more and is well worth the price.
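Since you are on SQL Server 2005, another free technique is to measure from the database side with the dynamic management views; nothing NHibernate-specific is needed. Here is a minimal sketch that counts how often each cached statement has been executed; a high execution count on tiny statements is usually the signature of chatty data access:

    -- How often has each cached statement run since it was cached?
    -- (Counters reset when the plan leaves the cache or the server restarts.)
    SELECT TOP 20
        qs.execution_count,
        qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
        SUBSTRING(st.text, qs.statement_start_offset / 2 + 1,
            (CASE qs.statement_end_offset
                 WHEN -1 THEN DATALENGTH(st.text)
                 ELSE qs.statement_end_offset
             END - qs.statement_start_offset) / 2 + 1) AS statement_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY qs.execution_count DESC;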
Related
We're having some queries in an Azure SQL database that are occasionally running very slowly. The issue has been difficult to properly diagnose, as the same queries will run fine at other times, even when the server is under a similar load.
To help, I'd like to be able to view log information for the server. If I could see a list of transactions, by time, and their outcome (completed, terminated/rolled back, etc.), I believe it would be helpful. Several other SQL pages seem to allude to log files you can access, but since this is an Azure SQL instance, there isn't a physical server I can just download a file from.
I know I can query sys.event_log to see when particular events occurred (and in fact, I do see a high number of deadlocks around our problem times), but I'm unaware of any way to see which queries were being handled at the time of these locks.
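(For reference, the kind of sys.event_log query I mean, filtered to deadlocks, is roughly:)

    -- Run against the master database of the Azure SQL logical server.
    SELECT start_time, end_time, database_name,
           event_type, event_count, description
    FROM sys.event_log
    WHERE event_type = 'deadlock'
    ORDER BY start_time DESC;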
I'd like to be able to view log information for the server. If I could see a list of transactions, by time, and their outcome (completed, terminated/rolled back, etc.), I believe it would be helpful.
The log information you are trying to view is not helpful here.
You can view slow-running queries in the same manner as on-premises, using the DMVs.
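For example, a minimal sketch (not tuned to your workload) that surfaces the slowest cached statements by average elapsed time:

    -- Slowest statements on average, from the plan cache.
    SELECT TOP 20
        qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
        qs.execution_count,
        qs.last_execution_time,
        st.text AS query_text
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
    ORDER BY avg_elapsed_us DESC;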
You can also enable Query Store, which can show you a query's different plans and runtime stats over time. I think this will help you more in troubleshooting slow queries, and it is not tied to Premium databases only.
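A sketch of turning Query Store on and reading the longest-running queries back out of it:

    -- Enable Query Store on the current database
    -- (it is already on by default for newer Azure SQL databases).
    ALTER DATABASE CURRENT SET QUERY_STORE = ON;

    -- Top queries by average duration (microseconds) recorded by Query Store.
    SELECT TOP 20
        qt.query_sql_text,
        AVG(rs.avg_duration)     AS avg_duration_us,
        SUM(rs.count_executions) AS executions
    FROM sys.query_store_query_text AS qt
    JOIN sys.query_store_query AS q
        ON q.query_text_id = qt.query_text_id
    JOIN sys.query_store_plan AS p
        ON p.query_id = q.query_id
    JOIN sys.query_store_runtime_stats AS rs
        ON rs.plan_id = p.plan_id
    GROUP BY qt.query_sql_text
    ORDER BY avg_duration_us DESC;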
I've been a SQL Azure Database user for some time (over a year). I have a mostly read-only 5 GB database that fuels my website. Queries hit the database about once or twice a second, and response times are generally sub-100ms.
There have been a few times when performance for all queries goes down the toilet. Today for example, I awoke to alarms that the database was performing poorly. Simple queries that normally take 30ms are taking over 3 minutes! My load on the server is no greater than usual, so I attribute this decline in performance to my DB sharing an instance with one or more DBs from other Azure users.
To solve this problem, I copy the database to a new instance (CREATE DATABASE NEW_DB AS COPY OF OLD_DB), and point the website to the new instance. All is well until this happens the next time. In about a year's time, this has happened 4 or 5 times.
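(Incidentally, while a copy is in flight you can watch its progress from the server's master database; the check I run is something like:)

    -- Run in master while CREATE DATABASE ... AS COPY OF is in progress.
    SELECT d.name, c.start_date, c.percent_complete, c.error_desc
    FROM sys.dm_database_copies AS c
    JOIN sys.databases AS d
        ON d.database_id = c.database_id;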
My question: does anyone have some advice on how to mitigate this? If this is just life under Azure, it's pretty unacceptable.
EDIT: I just realized that this question is from 2014. If you're still having issues, the questions and suggestions below may guide you in the right direction. If you've resolved the performance issues, feel free to share any actions you took to improve performance.
What tier are you on right now?
Reference: http://searchsqlserver.techtarget.com/tip/SQL-Azure-database-recommendations-and-best-practices
Are your users coming from different geographical regions? If so, are you using endpoint monitoring for the web app that accesses your SQL Azure db?
Reference: https://azure.microsoft.com/en-us/documentation/articles/web-sites-monitor/#webendpointstatus
Have you tried reading through the official performance guide?
Reference: https://msdn.microsoft.com/en-us/library/azure/dn369873.aspx
Here's a 3rd-party writeup that mentions "the differences in connectivity behavior or that SQL Azure resources get throttled when you overload the database require you to take such things into account and code your application to handle issues you may not have using a traditional SQL Server application."
Reference: http://searchsqlserver.techtarget.com/tip/SQL-Azure-database-recommendations-and-best-practices
This article requires (free) email signup before reading the full article, but it may help you with some recommendations and best practices.
Hope that helps!
We are having significant performance problems on Azure. Various factors have made this difficult to examine precisely on Azure itself. If the problems are in the performance of the code or of the database, I would like to examine them by running locally. However, it appears that the default configuration of an Azure-created database is different from a locally created one; for example, as I understand it, Azure defaults to READ_COMMITTED_SNAPSHOT, which is not the default for a database I create in SQL Server. That means the performance characteristics of the two can differ.
My question is: how can I find all such discrepancies between the two setups and correct them, so that when I find speed issues locally I will know they represent speed issues on Azure? I am a SQL Server novice. I recognize that I cannot recreate "time to database" and "network time" issues that way, but I don't think those are what are killing us.
You might find my answer to this post useful.
We saw great benefits from implementing telemetry to gather information and use it later for analysis, to find out where and how the application spends its time interacting with SQL and therefore how to improve the query plans. Here is a link to the CAT blog post: http://blogs.msdn.com/b/windowsazure/archive/2013/06/28/telemetry-basics-and-troubleshooting.aspx
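For the configuration-discrepancy part of the question specifically, you can compare database-level settings on the two servers straight out of sys.databases and diff the results; a minimal sketch:

    -- Run on both the Azure database and the local one, then compare.
    SELECT name,
           compatibility_level,
           collation_name,
           is_read_committed_snapshot_on,   -- the difference mentioned in the question
           snapshot_isolation_state_desc,
           is_auto_update_stats_async_on
    FROM sys.databases;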
All-
I'm trying to determine which SQL databases are currently being used the most (as well as what applications are requesting information from them).
Is there a log-analyzing tool, or something built into SQL Server, that could help me achieve this?
Ideally I'd like to show a map of server usage and understand which applications are actually hitting them.
Thanks!
sys.dm_db_index_usage_stats shows exactly how many times each index/table was read/scanned/updated since the server started up. This is the most important piece of information, since everything else (IO, RAM, CPU) can ultimately be traced back to these operations. The one piece of information not revealed here is blocking and contention, for which a good starting point is sys.dm_os_wait_stats. And finally there is sys.dm_exec_query_stats, which will drill down to individual query CPU and execution times.
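A minimal sketch of reading it, resolving object names and rolling the counters up to reads vs. writes per table:

    -- Reads vs. writes per table in the current database since the last restart.
    SELECT
        OBJECT_NAME(s.object_id) AS table_name,
        SUM(s.user_seeks + s.user_scans + s.user_lookups) AS reads,
        SUM(s.user_updates) AS writes
    FROM sys.dm_db_index_usage_stats AS s
    WHERE s.database_id = DB_ID()
    GROUP BY s.object_id
    ORDER BY reads DESC;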
If you right-click on the server in Management Studio you will see a 'Reports' option. There are a lot of built in reports which might give you what you need (the 'Server Dashboard' report in particular shows which databases are consuming the most CPU and I/O).
Alternatively the Profiler provides a lot of (perhaps too much) valuable data.
I often find myself writing one-off queries to either answer someone's question or troubleshoot something, and I would like to be able to quickly expose the on-demand, refreshable results of a query graphically, so that I can share the results with others without going through the process of creating an SSRS report and publishing it to a Reporting Services server.
I have thought about using Excel to do this, or maybe running a local SSRS server, but both of these options are still labor-intensive, and I cannot justify the time they would take since no one has officially requested that I turn this data into a report.
The way I see it, the business I work for has invested money in me creating these queries, which often return potentially useful data that other people in the organization might want. But since the results aren't exposed in any way, and people may not even realize they want this data, the potential value of each query goes unrealized. I want to increase the company's return on investment on all these one-off queries that I and other developers write by exposing their results graphically, so they can be browsed by others and potentially turned into more formalized SSRS reports if they prove valuable enough to justify the development.
What is the fastest way for me to take a query and turn it into a refreshable graph of the result set?
Why don't you simply use what you may already have: Excel? You can import data via an ODBC / Oracle / SQL connection (Get Data), and bam, you can run the query, format it right in the spreadsheet, and provide sorting, etc. All you need to supply is the database name and a username and password to connect to the DB.
JonH is right regarding Excel's built-in ODBC support, but I have had tons of trouble with this. In my case, the ODBC connection required the client software to be installed so that it could use the encryption methods, etc. Also, even if that were not the case, the user (I believe) would still have to manually install and set up an ODBC connection.
Now if you just want something on your machine to run the queries and refresh them, JonH's solution is great and my caveats are probably irrelevant. But if you want other users to have access, you should consider a middle-man app (basically a PHP script, assuming a web server is an option for you) that runs a query, transforms the results into XML, and outputs it as "report-xyz.xml". You can then point anybody running a newer version of Excel to that address, and they can very easily import the data into Excel with no overhead (basically a kind of web service).
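If the backing database happens to be SQL Server, you could even skip the transform step and have the server emit the XML itself with FOR XML; a rough sketch (table and column names made up):

    -- Hypothetical query: emit a result set as XML that a script
    -- can save as report-xyz.xml for Excel to pick up.
    SELECT OrderId  AS [@id],
           Customer AS [customer],
           Total    AS [total]
    FROM dbo.Orders              -- hypothetical table
    ORDER BY OrderId
    FOR XML PATH('row'), ROOT('report');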
Keep in mind, I don't think you should have a web script that allows users to send arbitrary queries to your database server! You would have some admin page where you pass the query in, and a new XML file with the results gets made. So my idea is based on running the same queries over and over without any specifics passed in. (If that weren't the case, I'd look into finding a pre-built web-services bridge for your database that already has security features built in; then you could let users make the limited changes allowed.)