Auditing execution of stored procedures in SQL Server

My boss and I have been trying to work out an auditing plan for our stored procedures. Currently there are two external applications taking information from our database through stored procedures, and we're interested in auditing when they're being executed and what values are passed as parameters. So far what I've done is simply create a table for the stored procedures one of the apps is using; since they use the same input parameters, it has one column per parameter. Obviously this isn't the best choice, but we wanted quick information to see whether they were running batch processes and when they were running them. I've tried SQL Server Audit, but it doesn't capture the parameters unless you're executing an SP in a query.

SQL Server Profiler will do this for you; it's included for free. Set up a trace and let it run.
You can also apply quite a bit of filtering to the trace, so you don't need to track everything; you can also direct the output to a file or a SQL table for later analysis. This is probably your best bet for a time-limited audit.
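If you're on SQL Server 2012 or later, an Extended Events session can capture the same information with less overhead than Profiler. A minimal sketch, assuming placeholder session, database, and file names:

CREATE EVENT SESSION AuditProcCalls ON SERVER
ADD EVENT sqlserver.rpc_completed(
    ACTION (sqlserver.client_app_name, sqlserver.username)
    WHERE sqlserver.database_name = N'YourDb')   -- filter to the database the apps hit
ADD TARGET package0.event_file(SET filename = N'C:\Traces\AuditProcCalls.xel');

ALTER EVENT SESSION AuditProcCalls ON SERVER STATE = START;

-- rpc_completed fires for procedure calls made over RPC and its statement
-- field includes the full call text with the parameter values.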

I think I've used the SQL Server Profiler (http://msdn.microsoft.com/en-us/library/ms181091.aspx) in the past to audit SQL execution. It's not something you would run all the time, but you can get a snapshot of what's running and how it's being executed.

I haven't tried using them, but you might look at event notifications and see if they will work for you.
From BOL
Event notifications can be used to do the following:
Log and review changes or activity occurring on the database.
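A minimal sketch of a database-level event notification, assuming placeholder queue and service names; note this example captures DDL changes on the database rather than procedure parameters:

-- event notifications deliver event data as XML to a Service Broker queue
CREATE QUEUE dbo.AuditQueue;

CREATE SERVICE AuditService ON QUEUE dbo.AuditQueue
    ([http://schemas.microsoft.com/SQL/Notifications/PostEventNotification]);

CREATE EVENT NOTIFICATION CaptureDbChanges
    ON DATABASE
    FOR DDL_DATABASE_LEVEL_EVENTS
    TO SERVICE 'AuditService', 'current database';

-- read the captured events back out of the queue
RECEIVE CAST(message_body AS XML) AS EventData FROM dbo.AuditQueue;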

Related

How to regularly update or create a SQL Server table?

I need to collect data from a SQL Server table, format it, and then put it into a different table.
I have access to SQL Server but cannot set up triggers or scheduled jobs.
I can create tables, stored procedures, views and functions.
What can I set up that will automatically collect the data and insert it into a SQL Server table for me?
I would probably create a stored procedure to do this task.
In the stored procedure you can create a CTE or use temp tables (depending on the task) and do all the data manipulation you require; once done, you can use SELECT INTO to create a new table from the temp table, or INSERT INTO ... SELECT to load the data into an existing SQL Server table, as sketched below.
https://www.w3schools.com/sql/sql_select_into.asp
You can then schedule this stored procedure to run at whatever time you need.
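A minimal sketch of such a procedure, with hypothetical table and column names:

CREATE PROCEDURE dbo.usp_LoadFormattedData
AS
BEGIN
    SET NOCOUNT ON;

    -- stage and reshape the source rows into a temp table
    SELECT src.Id,
           UPPER(src.Name) AS FormattedName
    INTO   #Staging
    FROM   dbo.SourceTable AS src;

    -- load the existing target table
    INSERT INTO dbo.TargetTable (Id, FormattedName)
    SELECT Id, FormattedName
    FROM   #Staging;
END;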
A database is just a storage container. It doesn't "do" things automatically all by itself. Even if you did have the access to create triggers, something would have to happen to the table to cause the trigger to fire, typically a CRUD operation on the parent table. And something external needs to happen to initiate that CRUD operation.
When you start talking about automating a process, you're talking about the function of a scheduler program. SQL Server has one built in, the SQL Agent, and depending on your needs you may find that it's appropriate to enlist help from whoever in your organization does have access to it. I've worked in a couple of organizations, though, that only used the SQL Agent to schedule maintenance jobs, while data manipulation jobs were scheduled through an outside resource. The most common one I've run across is Control-M, but there are other players in that market. I even ran across one homemade scheduler protocol that was just built in C#.NET that worked great.
Based on the limitations you lay out in your question, and the comments you've made in response to others, it sounds to me like you need to socialize your challenge within your organization to find out what the routine mechanism is for setting up data transfers. It's unlikely that this is the first time it's come up, unless the company was founded in the last week or two. It will probably require that you set up your code, probably a stored procedure or maybe an SSIS package, and then work with someone else, perhaps a DBA or a Site Operations team or some such, to get that process automated to fire when you need it to, whether through an Agent job or maybe a file listener.
Well, you have two major options: an SP or an SSIS package.
Both of them can be scheduled to run at a given time with a simple Job from the SQL Server Agent. Keep in mind that if you are doing this on a separate server you might need to add the source server as a Linked Server so you can access it from the script.
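For reference, such a job can also be created in T-SQL rather than through the UI. A minimal sketch using the msdb procedures (job, schedule, database, and procedure names are placeholders):

USE msdb;

EXEC dbo.sp_add_job @job_name = N'LoadFormattedData';

EXEC dbo.sp_add_jobstep
    @job_name      = N'LoadFormattedData',
    @step_name     = N'Run procedure',
    @subsystem     = N'TSQL',
    @command       = N'EXEC dbo.usp_LoadFormattedData;',
    @database_name = N'YourDb';

-- daily at 02:00 (@active_start_time is HHMMSS)
EXEC dbo.sp_add_schedule
    @schedule_name     = N'NightlyAt2am',
    @freq_type         = 4,
    @freq_interval     = 1,
    @active_start_time = 020000;

EXEC dbo.sp_attach_schedule
    @job_name      = N'LoadFormattedData',
    @schedule_name = N'NightlyAt2am';

EXEC dbo.sp_add_jobserver @job_name = N'LoadFormattedData';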
I've used this approach in the past and it has worked great. Note that, for security reasons, I am not able to access the remote server's task scheduler, so I go through the SQL Server Agent:
Set up a SQL Server Agent job on a schedule of your choice
Have the Agent job call an SSIS package
The SSIS package then calls an executable which can pull the data you want from your original table, evaluate it, and insert a formatted version of it, one record at a time. Alternatively, you can simply write a C# script within the SSIS package via a Script Task.
I hope this helps. Please let me know if you need more details.

Fetch stored procedure log from system tables in SQL Server

I am looking to retrieve the SP execution log from the system tables.
I want to retrieve the parameters passed to the SP at the time of execution.
You can't. There is no such thing available without running a trace or capturing system events.
There are NO logs that SQL Server maintains itself that will help you.
Your best course of action in the future would be to log this inside the procedure when it's called, as sketched below. Profiler traces might help you, but depending on how the procedure was called, they also might not.
Not the answer you were looking for, but it's the answer nonetheless.
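A minimal sketch of that kind of in-procedure logging, with hypothetical table, procedure, and parameter names (CONCAT handles NULL parameters and is available from SQL Server 2012):

-- one generic log table instead of one table per procedure
CREATE TABLE dbo.ProcExecutionLog (
    LogId      INT IDENTITY(1,1) PRIMARY KEY,
    ProcName   SYSNAME       NOT NULL,
    Parameters NVARCHAR(MAX) NULL,
    ExecutedAt DATETIME2     NOT NULL DEFAULT SYSUTCDATETIME(),
    ExecutedBy SYSNAME       NOT NULL DEFAULT SUSER_SNAME()
);
GO
ALTER PROCEDURE dbo.usp_DoWork
    @CustomerId INT,
    @Amount     DECIMAL(10, 2)
AS
BEGIN
    -- record the call and its parameter values before doing the real work
    INSERT INTO dbo.ProcExecutionLog (ProcName, Parameters)
    VALUES (OBJECT_NAME(@@PROCID),
            CONCAT('@CustomerId=', @CustomerId, ', @Amount=', @Amount));

    -- ... original procedure body ...
END;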

Is there a way to reference a SQL statement to the C# EF code which generated the SQL?

When I troubleshoot a large .NET app which uses only stored procedures, I capture the SQL (which includes the SP name) from SQL Server Profiler, and then it's easy to do a global search for the SP in the source files and find the exact line which produced the SQL.
When using Entity Framework, this is not possible due to the dynamic creation of SQL statements. However, there are times when I capture some problematic SQL statements from production and want to know where in the code they were generated.
I know one can have EF generate logs and tracing on demand. This would probably be taxing for a busy server and produce too many logs. I read some things about using MiniProfiler, but I'm not sure it fits my needs as I don't have access to the production server. I do, however, have access to attach SQL Server Profiler to the database server.
My idea is to find a way to have EF attach/inject a unique code into the generated SQL without affecting the outcome of the SQL. I can then use it to cross-reference the statement to the line of code which injected it. The unique code is static, meaning a distinct static code is used for every EF LINQ statement, perhaps sent as a dummy statement or a comment along with the SQL.
I know this will add some extra traffic, but in my case it will add extra flexibility and cut a lot of troubleshooting time.
Any ideas of how to do this or any alternatives?
One very simple approach would be to execute something via ExecuteStoreCommand(): Refresh data from stored procedure. I'm not sure if you can "execute" just a comment, but at the very least you should be able to do something like:
ExecuteStoreCommand("DECLARE #MyTag VARCHAR(100) = 'some_unique_id';");
This is very simple, but you would have to find the association in two steps:
Get the SessionID (i.e. SPID) of the poorly performing query in SQL Server Profiler
Search the Profiler entries for the prior SQL statement for that same SPID
Another option that might be a little more complicated but would remove that additional step when it comes to making that association is to "intercept" the commands before they get executed and inject a comment with your unique id. Please see the following S.O. Answer for details. You shouldn't need the full extent of what they did, but even if you do, it seems like all of the code (or all the relevant stuff) is there:
Adding a query hint when calling Table-Valued Function
By the way, this situation is a point in favor of using Stored Procedures instead of an ORM. And, what do you expect to be able to do in terms of performance tuning once you do find the offending app code? (another point in favor of using Stored Procedures instead of an ORM ;-).

SQL Server 2005 system stored procedure to find out the list of tables affected

Is there any system-defined SP available in SQL Server 2005 to find out which tables are affected when the application is running and we navigate from one page to another?
There's really no easy way (if any at all) to find that out, unfortunately.
As SQL Server MVP Aaron Bertrand puts it in his excellent blog post When was my database / table last accessed?:
A frequently asked question that surfaced again today is, "how do I see when my data has been accessed last?" SQL Server does not track this information for you. SELECT triggers still do not exist. Third party tools are expensive and can incur unexpected overhead. And people continue to be reluctant or unable to constrain table access via stored procedures, which could otherwise perform simple logging. Even in cases where all table access is via stored procedures, it can be quite cumbersome to modify all the stored procedures to perform logging.
However, with the help of the sys.dm_db_index_usage_stats DMV (dynamic management view) and some clever T-SQL programming by Aaron, you can find out a few of those answers - check out his very enlightening blog post for the details!
However: since this information is based on a DMV and the "D" in DMV stands for dynamic, those values are only valid since the last restart: they are wiped out, not preserved, whenever you restart the SQL Server process or reboot the server machine.
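For example, a query along these lines (run in the database in question, and requiring VIEW SERVER STATE permission) shows the last recorded read and write per table since the last restart:

SELECT o.name                  AS table_name,
       MAX(u.last_user_seek)   AS last_seek,
       MAX(u.last_user_scan)   AS last_scan,
       MAX(u.last_user_update) AS last_update
FROM   sys.dm_db_index_usage_stats AS u
JOIN   sys.objects AS o
       ON o.[object_id] = u.[object_id]
WHERE  u.database_id = DB_ID()
       AND o.type = 'U'          -- user tables only
GROUP  BY o.name
ORDER  BY o.name;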
I know of none, but Profiler offers a solution. Run Profiler (it can be on a developer box) and navigate the application. It will create an output file of what is being run.
There are also code tools that show dependencies. I would imagine at least one shows dependencies on SQL objects.
I don't think so. You can run SQL Profiler to see which commands are fired against the SQL Server, but you will have to parse them yourself.
You could also try emptying the query cache and then looking at it when your navigation is done, but the cache will be contaminated by other queries running on the server (including the ones run by SQL Server itself).
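Roughly like this; note that DBCC FREEPROCCACHE throws away all cached plans, so don't do it on a busy production server:

-- clear the plan cache, then navigate the application
DBCC FREEPROCCACHE;

-- afterwards, inspect what was executed
SELECT t.[text],
       s.execution_count,
       s.last_execution_time
FROM   sys.dm_exec_query_stats AS s
CROSS APPLY sys.dm_exec_sql_text(s.sql_handle) AS t
ORDER BY s.last_execution_time DESC;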

Creating stored procedure on the fly. What are the risks/problems?

I am thinking about creating stored procedures on the fly.
i.e. running CREATE PROCEDURE... while the (web) application is running.
What are the risks or problems that it can cause?
I know that the database account needs to have the extra privileges.
It does NOT happen everyday. Only from time to time.
I am using sql server and interested in mysql and postgres as well.
Update1:
Thanks to the comments, I am considering creating a new version of the stored procedure and switching over instead of ALTERing the SP. Example: sp1 -> sp2 -> sp3
Update2:
The reason:
My schema changes because of custom fields (unknown number and type of columns)
I tried dynamic SQL and sp_executesql first. Of course it works; dynamic SQL works great for one, two, three simple updates and inserts.
But it got too ugly and was a lot of work, and it does not mix well with stored procedures: there are problems with SQL parameterization because it is used inside a stored procedure and the number and type of parameters is not known at compile time (long story).
At least the basic scenario for this solution is not that complicated.
The logic of the SP does NOT change. For each custom field I have to add a new parameter to the SP and add a column to the update, insert, etc.
I also considered making the stored procedure's parameters dynamic, like sp_executesql, which accepts any number and type of parameters, but I could not find a way. (A sketch of the sp_executesql pattern follows.)
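For reference, a sketch of the sp_executesql pattern the question describes, with hypothetical table and variable names; note that the column name itself cannot be a parameter, which is where the ugliness starts:

DECLARE @customColumn SYSNAME       = N'CustomField1';
DECLARE @customValue  NVARCHAR(100) = N'example';
DECLARE @orderId      INT           = 42;

-- the column name has to be concatenated in (quoted for safety);
-- only the values can be real parameters
DECLARE @sql NVARCHAR(MAX) =
    N'UPDATE dbo.Orders SET ' + QUOTENAME(@customColumn) +
    N' = @value WHERE OrderId = @id;';

EXEC sys.sp_executesql @sql,
    N'@value NVARCHAR(100), @id INT',
    @value = @customValue,
    @id    = @orderId;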
For a transactional system it's probably quite expensive. If you have a large batch job and want to use a code generator for some reason (quite a common architecture in ETL tools, notably Oracle Warehouse Builder and Wherescape Red), it's not unreasonable to do this.
You mentioned that you would be adding and/or changing the calling profile of the stored procedure when you do this alteration. How are you lock-stepping the new calling profile with the application that makes the call to this? What's your roll-back plan if you ever have to revert a change that was made?
In the past what I've done is just append an incrementing numeric suffix to the stored procedure name with the new calling profile -- then you can modify the old version of the SP to call the new one with a default value for the parameter, and then you can release your software calling the new version.
If something breaks in your new version and you have to rollback, calls to the old stored proc will still work without error, and just populate the custom fields with your default values.
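A rough sketch of that versioning pattern, with hypothetical names; the old name keeps its original calling profile and forwards to the new version:

-- new version with the extended calling profile
CREATE PROCEDURE dbo.usp_SaveOrder2
    @OrderId      INT,
    @CustomField1 NVARCHAR(100) = NULL
AS
BEGIN
    -- ... actual work, now also using @CustomField1 ...
    SELECT @OrderId AS OrderId, @CustomField1 AS CustomField1;  -- placeholder body
END;
GO

-- old name becomes a shim with the original calling profile
ALTER PROCEDURE dbo.usp_SaveOrder
    @OrderId INT
AS
BEGIN
    EXEC dbo.usp_SaveOrder2 @OrderId = @OrderId, @CustomField1 = NULL;
END;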
Firstly, the answer to this question really depends on what exactly this stored procedure is intended to do. If it's just reading data or creating a result set for reporting and you don't mind if it's a little inconsistent, then you're probably fine. If it's doing anything remotely interesting with your data then it's a very risky thing to be doing. You should think about whether it's possible (and what would happen) for two users (or the same user twice) to run multiple versions of the same stored procedure at the same time. Smells like a train wreck to me. One option is to only allow this procedure alteration to take place when no other users are logged into the system, or to forcibly boot them off the database if they are. Another option is to create your new stored procedure with a slightly different name and swap them over when you deem it safe to do so.
Another issue is that one of the major benefits of stored procedures is that the execution plan is cached, meaning it will execute faster. If you are creating them on the fly you lose that advantage.
If you really need to do this then you should randomise the name of the procedure to avoid clashing with other users. Remember always that other users may be doing their own thing at the same time - most database systems won't give transactional isolation for stored procedures (Postgres is the only one I know of that does).
It would be extremely rare that this would be a desirable thing to do - could you elaborate at all on what made you choose this approach?
I would not do that personally.
As you mentioned you will need extra privileges to grant access to create/alter database objects. That can create a serious security risk as nothing would stop your application from creating a malicious stored procedure if someone discovered a security hole in it.
If your schema changes, change the stored procedures with the schema.
You will not be able to alter the procedure if one or more users are running the procedure, or another procedure that references your procedure. You will block until the procedure you want to compile and all its dependent procedures (and, I think, the procedures you invoke from your procedure, but I am not certain) are no longer in use. This may be a long time on a busy production system, and if you are unlucky, you may time out waiting for all the dependencies to fall out of use (5 minutes on Oracle).
You can also get into very ugly situations (I have). Take, for example, stored procedures B and C, both of which call A, the procedure that you are trying to compile. When no one is running B, the system locks B. Now any user trying to run B will stall. The system then tries to lock C, but C is generating a very lengthy report that will not be done for another 10 minutes. You will time out waiting for the lock, and some of your users will have an unresponsive system for 5 minutes. My experience is with Oracle; I would make sure your target DBMS does not behave in the same fashion, or that it fails faster or has a better lock acquisition strategy.
I guess I am cautioning that what looks like may work on a development server may fail dramatically on a busy production system.
I'm not sure that the locking discussed by Tony BanBrahim is true in SQL Server 2005.
I have some long-running SPs (a 3-hour batch process of about 30 sub-processes), and I have been able to alter an SP while it is still running. (I don't believe the changes take effect until the next run, but it doesn't cause any blocking or any error.) The outer long-running SP calls SPs both dynamically with EXEC and statically, and I've changed both the root and nested SPs while they were running without error messages or blocks.
WRT your original question, I would think that your tactic is fine if used in a controlled way.
I don't know for sure, but it sounds like one or both of the following:
an architectural problem
existing code locking the schema tables from the application
I'd take a look to see what code is locking the schema tables and rewrite it. Do you have a third-party something-or-other that is locking those tables?