Getting specific SQL Commands from DataAdapter.Update with parameters

I'm trying to improve the current auditing I have on one of my databases. Currently this is done with Access data macros; however, I use VB.NET as a front end.
Most of the updates use a data adapter and call the following to update the backend:
CurrentDataAdapter.Update
For the purposes of inserting the information into an audit table, I would like to be able to list the SQL commands that take place during this call. Using the command text just gives a single SQL command with the parameter placeholders:
CurrentDataAdapter.UpdateCommand.CommandText
Gives
UPDATE [Table] SET F1=@P1 WHERE ID=@P2
However, I'm more after a list like:
UPDATE [Table] SET F1=a WHERE ID=1
UPDATE [Table] SET F2=b WHERE ID=2
UPDATE [Table] SET F3=c WHERE ID=3
Is this possible? (Multiple SQL statements in one command are not supported with an Access backend.)
Many thanks

Related

Access SQL - Update statement can't set field to NULL, says rows were affected

I've inherited an application that uses Access as the database. I have a field (Date/Time, not required) with some values I need to set to null after a specific date.
To update it I have a little program that runs the query and tells me how many rows were affected. (Access isn't installed on the server, and the .mdb is constantly locked, so I can't download, update, and replace it. But I can use a simple VB program.)
Anyway I need to set some values to null, and to do that I use the following query:
UPDATE [AppPosting] SET [approvedTime] = NULL WHERE [approvedTime] >= #25/10/2022 00:00:00#
Running it gives me "82 affected rows", and running it again gives me the same number of affected rows. But if I open Access and look in the (local copy of the) database, I can see the rows haven't updated. If I run the same query in Access, I also get 82 affected rows, and there the values actually are set to null.
So what gives? My update through OleDbConnection says it updated, but doesn't actually update; whereas through Access it says it updated, and actually does?

Identify the action that is deleting all rows in a table

There is a SQL Server 2012 database that is used by three different applications. In that database there is a table that contains ~500k rows, and for some mysterious reason this table gets emptied every now and then. I think this is possibly caused by:
A delete query without a where clause
A delete query in a loop gone wild
I am trying to locate the cause of this issue by reviewing code but no joy. I need an alternate strategy. I think I can use triggers to detect what/why all rows get deleted but I am not sure how to go about this. So:
Can I use triggers to check if a query is attempting to delete all rows?
Can I use triggers to log the problematic query and the application that issues that query?
Can I use triggers to log such actions into a text file/database table/email?
Is there a better way?
You can use Extended Events to monitor your system. A simple event session can monitor for DELETE and TRUNCATE statements. When these events are raised, the details are written to a file (you can configure the script to collect more data for each captured statement). Here is the script used; modify the output file path:
CREATE EVENT SESSION [CheckDelete] ON SERVER
ADD EVENT sqlserver.sql_statement_completed(SET collect_statement=(1)
    ACTION(sqlserver.client_connection_id, sqlserver.client_hostname)
    WHERE ([sqlserver].[like_i_sql_unicode_string]([statement], N'%delete%')
        OR [sqlserver].[like_i_sql_unicode_string]([statement], N'%truncate%')))
ADD TARGET package0.event_file(SET filename=N'C:\temp\CheckDelete.xel', max_file_size=(50))
WITH (MAX_MEMORY=4096 KB, EVENT_RETENTION_MODE=ALLOW_SINGLE_EVENT_LOSS, MAX_DISPATCH_LATENCY=30 SECONDS,
      MAX_EVENT_SIZE=0 KB, MEMORY_PARTITION_MODE=NONE, TRACK_CAUSALITY=OFF, STARTUP_STATE=OFF)
GO
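Once the session exists you still have to start it, and later you can read the captured events back from the .xel file. A minimal sketch, using the session name and file path from the script above:
-- Start the session
ALTER EVENT SESSION [CheckDelete] ON SERVER STATE = START;
GO
-- Later: read back what was captured (each row is one captured statement, as XML)
SELECT CAST(event_data AS XML) AS event_data
FROM sys.fn_xe_file_target_read_file(N'C:\temp\CheckDelete*.xel', NULL, NULL, NULL);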
This is a possibility that may help you. It creates a trigger that sends an email when a process deletes more than 100 records. I'd modify the message to include some useful data, like:
Process ID (@@SPID)
Host (HOST_NAME())
Name of app (APP_NAME())
And possibly the entire query (see the sketch after the trigger below)
CREATE TRIGGER Table1MassDeleteTrigger
ON dbo.Activities
FOR DELETE
AS
    DECLARE @DeleteCount INT = (SELECT COUNT(*) FROM deleted);
    IF (@DeleteCount > 100)
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = 'MailProfileName',
            @recipients = 'admin@yourcompany.com',
            @body = 'Something is deleting all your data!',
            @subject = 'Oops!';
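Inside the trigger, the body could be composed from those details along these lines; this is just an illustrative fragment (it reuses the @DeleteCount variable declared above), not a complete trigger:
-- Illustrative fragment for use inside the trigger above
DECLARE @msg NVARCHAR(MAX) =
      N'Rows deleted: ' + CAST(@DeleteCount AS NVARCHAR(10))
    + N'; SPID: '       + CAST(@@SPID AS NVARCHAR(10))
    + N'; Host: '       + ISNULL(HOST_NAME(), N'?')
    + N'; App: '        + ISNULL(APP_NAME(), N'?');
-- then pass @msg to sp_send_dbmail as the @body parameter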

SQL Server Update Permissions

I'm currently working with SQL Server 2008 R2, and I have only READ access to a few tables that house production data.
I'm finding that in many cases, it would be extremely nice if I could run something like the following and get back the total count of records that would be affected:
USE DB
GO
BEGIN TRANSACTION
UPDATE Person
SET pType = 'retailer'
WHERE pTrackId = 20
AND pWebId LIKE 'rtlr%';
ROLLBACK TRANSACTION
However, seeing as I don't have the UPDATE permission, I cannot successfully run this script without getting:
Msg 229, Level 14, State 5, Line 5
The UPDATE permission was denied on the object 'Person', database 'DB', schema 'dbo'.
My questions:
Is there any way that my account in SQL Server can be configured so that if I want to run an UPDATE script, it would automatically be wrapped in a transaction with a rollback (so no data is actually affected)?
I know I could make a copy of that data and run my script against a local SSMS instance, but I'm wondering if there is a permission-based way of accomplishing this.
I don't think there is a way to bypass SQL Server permissions. And I don't think it's a good idea to develop against a production database anyway. It would be much better to have a development version of the database to work with.
If the number of affected rows is all you need, then you can run a SELECT instead of the UPDATE.
For example:
select count(*)
from Person
where pTrackId = 20
AND pWebId LIKE 'rtlr%';
If you are only after the number of rows that would be affected by this update, that is the same number of rows that currently match the WHERE clause.
So you can just run a SELECT statement as such:
SELECT COUNT(pType)
FROM Person WHERE pTrackId = 20
AND pWebId LIKE 'rtlr%';
And you'd get the number of rows that would potentially be affected.
1. First, log in to SQL Server as an admin.
2. Go to Logins -> your login -> check the roles.
3. If you have write access, then you can accomplish the above task.
4. If not, make sure you are granted write access.
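For reference, a quick way to check what your login can actually do on the table from the question (the commented GRANT is something a DBA would run; the login name is hypothetical):
-- What can the current login do on dbo.Person?
SELECT permission_name
FROM sys.fn_my_permissions('dbo.Person', 'OBJECT');
-- Someone with sufficient rights could then grant write access, e.g.:
-- GRANT UPDATE ON dbo.Person TO [YourLogin];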
If it's strictly necessary to try the update, you could write a stored procedure accepting dynamic SQL as a string (your UPDATE query) and wrapping the dynamic SQL in a transaction context which is then rolled back. Your account could then be granted access to that stored procedure.
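A minimal sketch of what such a wrapper might look like (the procedure and login names are hypothetical; note that dynamic SQL runs under the caller's permissions, so it would likely also need EXECUTE AS OWNER):
-- Hypothetical "dry run" wrapper: runs the caller's statement, then rolls it back
CREATE PROCEDURE dbo.usp_DryRun
    @sql NVARCHAR(MAX)
WITH EXECUTE AS OWNER
AS
BEGIN
    BEGIN TRANSACTION;
        EXEC sys.sp_executesql @sql;   -- the UPDATE executes and reports its row count
    ROLLBACK TRANSACTION;              -- nothing is persisted
END
GO
-- GRANT EXECUTE ON dbo.usp_DryRun TO [ReadOnlyLogin];   -- hypothetical login name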
Personally, I think that's a terrible idea, and incredibly unsafe - some queries break out of such transaction contexts (e.g. ALTER TABLE). You may be able to block those somehow, but it would still be a security/auditing problem.
I recommend writing a query to count the relevant rows:
SELECT COUNT(*)
FROM --tables
WHERE --your where clause
-- any other clauses here e.g. GROUP BY, HAVING ...

Debug Insert and temporal tables in SQL 2012

I'm using SQL Server 2012, and I'm debugging a stored procedure that does some INSERT INTO #temporal ... SELECT statements.
Is there any way to view the data selected by the command (the SELECT part of the INSERT INTO)?
Is there any way to view the data inserted and/or the temp table where the insert made the changes?
It doesn't matter if it's all the rows at once rather than one by one.
UPDATE:
Requirements from AT Compliance and Company Policy require that any modification go through the test process, and it's probable this will be managed by another team. Is there any way to avoid any change to the script?
The main idea is that the AT user checks the outputs on their own desktop, and copies and pastes them, without making any change to the environment or the product.
Thanks and kind regards.
If I understand your question correctly, then take a look at the OUTPUT clause:
Returns information from, or expressions based on, each row affected by an INSERT, UPDATE, DELETE, or MERGE statement. These results can be returned to the processing application for use in such things as confirmation messages, archiving, and other such application requirements.
For instance:
INSERT INTO #temporaltable
OUTPUT inserted.*
SELECT *
FROM ...
Will give you all the rows that the INSERT statement inserted into the temp table, which were selected from the other table.
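For instance, a small self-contained version (the temp table and source here are just examples) looks like this:
-- Hypothetical, self-contained example of the OUTPUT clause described above
CREATE TABLE #temporal (id INT, name SYSNAME);
INSERT INTO #temporal (id, name)
OUTPUT inserted.id, inserted.name      -- echoes each inserted row back to the client
SELECT object_id, name
FROM sys.objects;                      -- any source table works here
DROP TABLE #temporal;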
Is there any reason you can't just do this: SELECT * FROM #temporal? (And debug it in SQL Server Management Studio, passing in the same parameters your application is passing in).
It's a quick and dirty way of doing it, but one reason you might want to do it this way over the other (cleaner/better) answer, is that you get a bit more control here. And, if you're in a situation where you have multiple inserts to your temp table (hopefully you aren't), you can just do a single select to see all of the inserted rows at once.
I would still probably do it the other way though (now that I know about it).
I know of no way to do this without changing the script. However, for the future, you should never write a complex stored proc or script without a debug parameter that lets you add the data tests you will want. Make it the last parameter with a default value of 0 and you won't even have to change your current code that calls the proc.
Then you can add statements like the below everywhere you want to check intermediate results. Further, in debug mode you might always roll back any transactions so that a bug will not affect the data.
IF @debug = 1
BEGIN
SELECT * FROM #temp
END
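For the record, a minimal sketch of the debug-parameter pattern described above (the procedure, table, and column names are made up for illustration):
-- Hypothetical procedure showing a trailing @debug parameter with a default of 0
CREATE PROCEDURE dbo.usp_LoadSomething
    @CutoffDate DATE,
    @debug      BIT = 0                  -- last parameter; existing callers are unaffected
AS
BEGIN
    SELECT o.object_id, o.name
    INTO #temp
    FROM sys.objects AS o
    WHERE o.create_date < @CutoffDate;
    IF @debug = 1
        SELECT * FROM #temp;             -- intermediate check, only when debugging
    -- ...the rest of the real work would go here...
END
GO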

multiple select statements in single ODBCdataAdapter

I am trying to use an OdbcDataAdapter in C# to run a query which needs to select some data into a temporary table as a preliminary step. However, this initial select statement is causing the query to terminate, so the data gets put into the temp table but I can't run the second query to get it out. I have determined that the problem is the presence of two select statements in a single data adapter query. That is to say, the following code only runs the first select:
select 1
select whatever from wherever
When I run my query directly through SQL Server Management Studio it works fine. Has anyone encountered this sort of issue before? I have tried the exact same query previously on similar databases using the same C# code (only the connection string is different) and had no problems.
Before you ask, the temp table is helpful because otherwise I would be running a whole lot of inner select statements which would bog down the database.
Assuming you're executing a Command whose CommandType is Text, you need a ; to separate the statements:
select 1;
select whatever from wherever;
You might also want to consider using a Stored Procedure if possible. You should also use the SQL client objects instead of the ODBC client. That way you can take advantage of additional methods that aren't available otherwise. You're supposed to get better perf as well.
If you need to support multiple databases you can just use the DataAdapter class and use a factory to create the concrete types. This gives you the benefits of using the native drivers without being tied to a specific backend. ORMs that support multiple back ends typically do this. The Enterprise Library Data Access Application Block, while not an ORM, does this as well.
Unfortunately I do not have write access to the DB, as my organization has been contracted just to extract information to a data warehouse. The program is a generalized one for use on multiple systems, which is why we went with ODBC. I suppose it would not be terrible to rewrite it using SQL Management Objects.
An ODBC connection expects a single SELECT statement and retrieves its result set from SQL Server.
If such functionality is required, a workaround can serve the purpose:
add
SET NOCOUNT ON
at the top of your batch.
When SET NOCOUNT is ON, the count (indicating the number of rows affected by a Transact-SQL statement) is not returned.
When SET NOCOUNT is OFF, the count is returned. It is used with any SELECT, INSERT, UPDATE, DELETE statement.
The setting of SET NOCOUNT is set at execute or run time and not at parse time.
SET NOCOUNT ON mainly improves stored procedure (SP) performance.
Syntax:
SET NOCOUNT { ON | OFF }
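For illustration, a batch in the shape described in the question might look like this (the table names are hypothetical); with NOCOUNT on, the row-count messages from the intermediate statements no longer get in the way, and the adapter fills from the final SELECT:
SET NOCOUNT ON;
-- Preliminary step: stage data in a temp table (names are made up)
SELECT id, name
INTO #stage
FROM dbo.SourceTable;
-- Final statement: this is the result set the data adapter ends up with
SELECT s.id, s.name
FROM #stage AS s;
DROP TABLE #stage;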