How can I capture all the queries that are being executed against a table? - sql

I have an ecommerce application that I believe is not properly caching all of our images and so I would like to capture all the queries that are occurring against our images table.
I need to be able to do this without installing anything or adding any code to the solution.
Can this be accomplished with SQL Profiler or another tool that does not require code modification?

The SQL Profiler is indeed the right tool for this.
You can attach it to your database, set some filters (for example, that the statement text must contain the table name), choose which events to log, and off you go.
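Profiler's filters are configured in its GUI, so there is no script for that step, but on newer versions roughly the same capture can be set up as an Extended Events session if you prefer something scriptable. This is only a minimal sketch; the session name, file target, and the assumption that the table is called Images are illustrative, not from the question:

CREATE EVENT SESSION capture_images_queries ON SERVER
ADD EVENT sqlserver.sql_statement_completed
(
    ACTION (sqlserver.sql_text, sqlserver.client_app_name, sqlserver.username)
    -- keep only statements whose text mentions the table name
    WHERE sqlserver.like_i_sql_unicode_string(sqlserver.sql_text, N'%Images%')
)
ADD TARGET package0.event_file (SET filename = N'capture_images_queries');
GO
ALTER EVENT SESSION capture_images_queries ON SERVER STATE = START;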

SQL Profiler will capture the queries, as you have identified. You could also inspect the query cache, but it will not necessarily still hold every query that has run against that table, so it should not be relied on by itself.
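For the plan-cache route, a sketch like the following can be used (it assumes the table is called Images; remember the caveat above that the cache will not hold everything):

SELECT TOP (50)
       st.text               AS statement_text,
       qs.execution_count,
       qs.last_execution_time
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
WHERE st.text LIKE N'%Images%'      -- crude text match on the table name
ORDER BY qs.last_execution_time DESC;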

Related

Viewing Logs in Azure SQL

We're having some queries in an Azure SQL database that are occasionally running very slowly. The issue has been difficult to properly diagnose, as the same queries will run fine at other times, even when the server is under a similar load.
To help, I'd like to be able to view log information for the server. If I could see a list of transactions, by time, and their outcome (completed, terminated/rolled back, etc) I believe it would be helpful. Several other SQL pages seem to allude to log-files you can access, but since this is an Azure SQL instance, there isn't a physical server I can just download a file from.
I know I can query sys.event_log to see when particular events are occurring (and in fact, I do see a high number of deadlocks around our problem times), but I'm unaware of any way to see which queries were being handled at the time of these locks.
I'd like to be able to view log information for the server. If I could see a list of transactions, by time, and their outcome (completed, terminated/rolled back, etc) I believe it would be helpful.
The log information you are trying to view will not help you here.
You can view slow-running queries in the same manner as on-premises, using DMVs.
You can also enable Query Store, which can show you the different stages of a query. I think this will help you more in troubleshooting slow queries, and it is not tied to Premium databases only.
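A minimal sketch of both suggestions; the specific queries below are illustrative rather than taken from the answer:

-- What is running right now, via DMVs (same approach as on-premises):
SELECT r.session_id, r.status, r.wait_type, r.total_elapsed_time, t.text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id <> @@SPID;

-- Turn on Query Store for the current database, then look at it after a slow period:
ALTER DATABASE CURRENT SET QUERY_STORE = ON;

SELECT TOP (20)
       qt.query_sql_text,
       rs.avg_duration,
       rs.count_executions,
       rs.last_execution_time
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query         AS q  ON q.query_text_id = qt.query_text_id
JOIN sys.query_store_plan          AS p  ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs ON rs.plan_id = p.plan_id
ORDER BY rs.avg_duration DESC;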

Include but not Delete SQL Schema Compare

I am attempting to use SQL Schema Compare in Visual Studio 2013/15 and am running into the problem that excluding tables from the delete removes them from being processed at all.
The issue is that the tables it is trying to delete are customer made tables, so when we sync our version against their databases it asks to delete them. We do not want to delete them, but some of their tables have constraints on ours so when it attempts to CCDR it fails due to table constraints. Is there a way to add the table to be (re-created? like the rest of them?), without writing scripts for each client to do what SQL Schema Compare already does just for those few tables?
Red Gate's SQL Compare does this somehow, but it's hidden from us, so I'm not quite sure how it's achieved. Excluding doesn't delete, but it doesn't error in the script either.
UPDATE:
The option "Drop constraints not in source" does not appear to work correctly. It does drop some, however there are others that it just does not drop the constraints. In red-gate's tool, when we compared I found how to get the SQL from it, and their product doesn't say the table needs to be updated at all, while Visual Studio's does. They seem to work almost identical, but the tables that fail are the ones that shouldn't be update at all (read below)
Update 2:
Another problem I've found is that "Ignore column collation" also doesn't work correctly: tables that shouldn't be getting dropped are being flagged as needing an update even though only the order of the columns has changed, not the actual columns or data, which makes this feel more like a bug report than anything.
My suggestion with these types of advanced data calculations is to not use Visual Studio. Put the logic on the SQL engine and write the code for this in SQL. Because of the multi-user locking issues of a SQL engine, these types of processes are prone to fail when the wrong combinations of user actions happen at the same time. The Visual Studio tool cannot cope with the data-locking issues caused by records changing the way the SQL engine can. Even if you get this to work, it will only be safe to run in single-user mode.
It is a nice tool to use, easier than writing SQL, but there are huge reliability and consistency risks in going down this path.
I don't know if this will help, but I found this paragraph on the following page:
https://msdn.microsoft.com/en-us/library/hh272690(v=vs.103).aspx
The update will fail because our change involves changing a column from NOT NULL to NULL and as a result causes data loss. If you want to proceed with the update, click on the Options button (the fifth one from the left) on the toolbar for the Schema Compare and uncheck the "block incremental deployment if data loss" option.

Is there a way to reference a SQL statement to the C# EF code which generated the SQL?

When I troubleshoot a large .NET app which uses only stored procedures, I capture the SQL, which includes the SP name, from SQL Server Profiler, and then it's easy to do a global search for the SP in the source files and find the exact line which produced the SQL.
When using Entity Framework, this is not possible due to the dynamic creation of SQL statements. However there are times when I capture some problematic sql statements from production and want to know where in the code they were generated from.
I know one can have EF generate logs and tracing on demand. That would probably be taxing for a busy server and produce too many logs. I have read some things about using MiniProfiler, but I'm not sure it fits my needs, as I don't have access to the production server. I do, however, have access to attach SQL Server Profiler to the database server.
My idea is to find a way to have EF attach/inject a unique code into the generated SQL without affecting the outcome of the SQL. I can then use it to cross-reference back to the line of code which injected it. The unique code is static, meaning a distinct static code is used for every EF LINQ statement. It could be sent as a dummy statement or a comment along with the SQL.
I know this will add some extra traffic but in my case, it will add extra flexibility and cut a lot of troubleshooting time.
Any ideas of how to do this or any alternatives?
One very simple approach would be to execute something via ExecuteStoreCommand(): Refresh data from stored procedure. I'm not sure if you can "execute" just a comment, but at the very least you should be able to do something like:
ExecuteStoreCommand("DECLARE #MyTag VARCHAR(100) = 'some_unique_id';");
This is very simple, but you would have to find the association in two steps (a rough sketch of scripting this from a saved trace file follows the steps):
Get the SessionID (i.e. SPID) from poorly performing query in SQL Server Profiler
Search the Profiler entries for the prior SQL statement for that same SPID
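If the trace is saved to a .trc file, that two-step lookup can be scripted. This is only a sketch; the file path, the tag prefix, and the duration threshold are assumptions:

-- For each slow statement, find the most recent tag batch issued on the same SPID.
SELECT slow.SPID,
       slow.StartTime,
       slow.Duration,                               -- microseconds in traces from SQL Server 2005 onwards
       CAST(slow.TextData AS NVARCHAR(MAX)) AS slow_statement,
       CAST(tag.TextData  AS NVARCHAR(MAX)) AS tag_statement
FROM sys.fn_trace_gettable(N'C:\Traces\ef_tagging.trc', DEFAULT) AS slow
CROSS APPLY
(
    SELECT TOP (1) t.TextData
    FROM sys.fn_trace_gettable(N'C:\Traces\ef_tagging.trc', DEFAULT) AS t
    WHERE t.SPID = slow.SPID
      AND t.StartTime <= slow.StartTime
      AND CAST(t.TextData AS NVARCHAR(MAX)) LIKE N'DECLARE @MyTag%'
    ORDER BY t.StartTime DESC
) AS tag
WHERE slow.Duration >= 1000000                      -- statements taking a second or more
  AND CAST(slow.TextData AS NVARCHAR(MAX)) NOT LIKE N'DECLARE @MyTag%';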
Another option that might be a little more complicated but would remove that additional step when it comes to making that association is to "intercept" the commands before they get executed and inject a comment with your unique id. Please see the following S.O. Answer for details. You shouldn't need the full extent of what they did, but even if you do, it seems like all of the code (or all the relevant stuff) is there:
Adding a query hint when calling Table-Valued Function
By the way, this situation is a point in favor of using stored procedures instead of an ORM. And what do you expect to be able to do in terms of performance tuning once you do find the offending app code? (Another point in favor of stored procedures ;-).

SQL Server : list all columns used in queries

Is there a way to detect which columns and which tables are used in a SQL Server database?
Just against SQL Server 2012 would be fine.
We can assume there are no '*' for column usage in the legacy site.
Details:
I'm working on updating the table structure of a legacy system to work on a newer database (2005 to 2012)
There are a lot of bloated tables, with columns that are never used, and even tables that are never used. Identifying all of them would be a pain by manually going through the code.
(My assumption is that we can run SQL Server profiler while running a complete test pass on the app, but I don't know a convenient way to extract the columns)
Thanks.
You can list dependencies for a table in Management Studio, which will show you which SPs, UDFs, etc. depend on the table in question; you can't do that for a single field. However, that only shows the internal dependencies. SQL Profiler would theoretically show you all fields that get requested by your app, but even that would not really tell you much, as the app may not do anything with the values it retrieves. If you are going to change the DB, it would only really make sense to put in the effort if you were also going to change the app, and then you should really get some input from users on what features are still useful and what is broken before you get too involved in a back-end refresh. IMHO.
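For the internal dependencies mentioned above, the dependency DMVs can be scripted instead of clicking through Management Studio; the object names below (dbo.Orders, dbo.usp_GetOrders) are placeholders:

-- Objects in the database that reference a given table:
SELECT referencing_schema_name, referencing_entity_name
FROM sys.dm_sql_referencing_entities(N'dbo.Orders', N'OBJECT');

-- Column-level usage by a specific module (procedure, view, or function):
SELECT referenced_schema_name,
       referenced_entity_name AS table_name,
       referenced_minor_name  AS column_name
FROM sys.dm_sql_referenced_entities(N'dbo.usp_GetOrders', N'OBJECT')
WHERE referenced_minor_name IS NOT NULL;

As the answer notes, this only covers objects inside the database, not ad hoc SQL sent by the application, which is why a Profiler trace during a full test pass is still needed.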

Find details of a query or statement that caused an unexpected table update

We have been having problems with ghost updates in our DB (SQL Server 2005). Fields are changing, and we cannot find the routine that is performing the update.
Is there any way (perhaps using an update trigger?) to determine what caused the update: the SQL statement, process, username/login, etc.?
Use SQL Server Profiler
You'll probably want to filter away the things you don't need, so it might take a while to get it set up.
At least it'll get you to the procedure or query responsible, as well as the user and computer making the alterations, which leaves only finding that in your code.
I found an article that might help you out, over here:
http://aspadvice.com/blogs/andrewmooney/archive/2007/08/20/SQL-Server-2005-Audit-Log-Using-Triggers.aspx
All the information that you are asking for is available at the time the update is performed. SQL Profiler will certainly work, but it is a bit of work to craft a filter that does not overwhelm you with data, particularly if you need to run it for days or weeks at a time. An update trigger is easy enough to create, and you can log the information that you need in a new table.
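As a sketch of that trigger approach (the dbo.Orders table and its Status column are placeholders for whichever table and fields are being ghost-updated):

CREATE TABLE dbo.OrdersAudit
(
    AuditId   INT IDENTITY(1,1) PRIMARY KEY,
    AuditTime DATETIME      NOT NULL DEFAULT GETDATE(),
    LoginName SYSNAME       NOT NULL DEFAULT SUSER_SNAME(),
    HostName  NVARCHAR(128) NOT NULL DEFAULT HOST_NAME(),
    AppName   NVARCHAR(128) NOT NULL DEFAULT APP_NAME(),
    OrderId   INT           NOT NULL,
    OldStatus NVARCHAR(50)  NULL,
    NewStatus NVARCHAR(50)  NULL
);
GO

CREATE TRIGGER dbo.trg_Orders_AuditUpdate
ON dbo.Orders
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Log the before/after values of the column; the login, host and
    -- application name columns are filled in by their defaults.
    INSERT INTO dbo.OrdersAudit (OrderId, OldStatus, NewStatus)
    SELECT d.OrderId, d.Status, i.Status
    FROM deleted AS d
    JOIN inserted AS i ON i.OrderId = d.OrderId
    WHERE ISNULL(d.Status, N'') <> ISNULL(i.Status, N'');
END;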
I would probably use AutoAudit to generate triggers on the table first.
It's somewhat limited in terms of knowing exactly what is changing your data, but it's a start.
You could always look at the triggers and modify them to only log certain columns you are interested in and perhaps get more information which it doesn't currently log.