Store T-SQL warning messages in a table - sql

Is there any way I can insert warning messages into a table?
Specifically, I mean warnings like 'The module 'x' depends on the missing object 'y'. The module will still be created; however, it cannot run successfully until the object exists.', which are not picked up by the system error messages.

You probably don't want to know about every single message, as some of them are completely benign, for example 'Warning: Null value is eliminated by an aggregate or other SET operation.'
You can use sp_altermessage to log specific errors that are not logged by default.
EXEC sp_altermessage
    @message_id = 2007,
    @parameter = 'WITH_LOG',
    @parameter_value = 'true';
The WITH_LOG parameter specifies whether the message is also written to the SQL Server error log and the Windows application log. The error you are talking about is error 2007. You can view all messages using
select * from sys.messages
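For example, to check whether a given message is currently set to be logged, you can filter on the message number (a sketch; 1033 is the language_id for US English):

```sql
-- is_event_logged shows whether the message is written to the error log
-- and Windows application log when it is raised.
SELECT message_id, severity, is_event_logged, text
FROM   sys.messages
WHERE  message_id = 2007
  AND  language_id = 1033;
```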
Another option is to set up an Extended Events session, which logs all errors.
For example, I have this one on my server (note that it only logs to the ring buffer, but you can log to a file also).
CREATE EVENT SESSION [Errors] ON SERVER
ADD EVENT sqlserver.error_reported (
    ACTION (
        sqlserver.client_hostname,
        sqlserver.sql_text,
        sqlserver.tsql_stack
    )
    WHERE ([severity] > 10 AND [error_number] <> 3980 AND [error_number] <> 17830)
)
ADD TARGET package0.ring_buffer (
    SET max_events_limit = 50,
        occurrence_number = 50
)
WITH (
    MAX_MEMORY = 4096 KB,
    EVENT_RETENTION_MODE = ALLOW_SINGLE_EVENT_LOSS,
    MAX_DISPATCH_LATENCY = 30 SECONDS,
    MAX_EVENT_SIZE = 0 KB,
    MEMORY_PARTITION_MODE = NONE,
    TRACK_CAUSALITY = OFF,
    STARTUP_STATE = ON
);
GO
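To land the captured errors in a table, you can shred the ring buffer's XML and wrap the query in an INSERT ... SELECT. A sketch for the session above (the XPath may need adjusting for your payload):

```sql
-- Pull the [Errors] session's ring buffer and turn each event into a row.
SELECT
    ev.value('@timestamp', 'datetime2')                           AS event_time,
    ev.value('(data[@name="error_number"]/value)[1]', 'int')      AS error_number,
    ev.value('(data[@name="severity"]/value)[1]', 'int')          AS severity,
    ev.value('(data[@name="message"]/value)[1]', 'nvarchar(max)') AS msg
FROM sys.dm_xe_sessions AS s
JOIN sys.dm_xe_session_targets AS st
    ON st.event_session_address = s.address
CROSS APPLY (SELECT CAST(st.target_data AS xml)) AS t(x)
CROSS APPLY t.x.nodes('//RingBufferTarget/event') AS n(ev)
WHERE s.name = N'Errors'
  AND st.target_name = N'ring_buffer';
```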

Related

PHP PDO MSSQL - Aborting DB transaction in query which raises error with severity 0 (warnings)

I am creating a web app in Laravel which connects to an MSSQL DB (using PDO), and I'm stuck on an issue when running a query (UPDATE in my case)
update dba.[vl_zahlavi] set [xcmd] = ' AFTERCOMMIT', [vl_zahlavi].[ts] = '2023-01-10 14:16:06.505' where [doklad] = '23VLTU0100000001'
which raises an error with severity 0 within an update trigger (used for sending progress-bar information to the client). The trigger is created by the information system, so I'm not able to disable this functionality.
select @progress_count = convert(int/*BY*/, @akt_pocet), @progress_current = 0, @progress_dt_o_skl_o_skl_gen_poh_prub = DateAdd(ms, -1000, GetDate())
select @progress_arg = '@SQL_PROGRESS=O_SKL,O_SKL_GEN_POH_PRUB,' + convert(varchar(255)/*BY*/, @@nestlevel) + ',OPEN,' + convert(varchar(255)/*BY*/, @progress_count) + ';'
RAISERROR ('%s', 0, 1, @progress_arg) WITH NOWAIT
The problem is that the PHP client sends an abort and the transaction is automatically rolled back. When I comment out this section of code (or change the severity to 16), everything works as expected. Below is a screenshot from SQL Server Profiler.
Is PDO able to catch user error messages?
Thank you in advance.
I have tried changing DB settings and PDO options, but without success.

SSIS package precedence Constraint not working for multiple expressions

I have an SSIS package in which we want to check: if DataLoadStatusID == 2, take the path that copies the file and deletes it from the initial import folder. This part works as expected.
And if we load the same file again, the duplicate values mean DataLoadStatusID != 2, so it should take the path where the file does not get copied or deleted; the file should remain in the initial import folder. This is not working: when I re-load the same file, the DataLoadStatusID != 2 path is not taken.
I tried changing the Result Set to Single Row, set the Result Set to DataLoadStatusID, and set the parameter. One thing to note is that DataLoadId is the only parameter set up in the stored procedure; DataLoadStatusId is set up as a variable in the SSIS package only and is not a parameter of the stored procedure. But we have an update statement in the stored procedure.
-- If there were validation errors (not just warnings), mark the data load as failed and exit the procedure
DECLARE @ErrorCount INT = 0
SELECT @ErrorCount = COUNT(*) FROM dbo.ClaimLoadValidation WHERE DataLoadId = @DataLoadId AND ValidationStatusId = 2
IF (@ErrorCount > 0)
BEGIN
    UPDATE dbo.DataLoad SET DataLoadStatusId = 3 WHERE DataLoadId = @DataLoadId
    RETURN
END
-- If everything worked, mark the record as successful
UPDATE dbo.DataLoad SET DataLoadStatusId = 2 WHERE DataLoadId = @DataLoadId
Once I run the package it gives me below error:
"[Execute SQL Task] Error: Executing the query "execute uspLoadClaimHartford ?" failed with the following error: "Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done. Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly."
Are you sure the DataLoadStatus variable is != 2? Maybe put a breakpoint and a watch on it to check.
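Since DataLoadStatusID exists only as an SSIS variable, the stored procedure has to return it as a result set for the Execute SQL Task's Single-row mapping to work. A minimal sketch of what the end of the procedure could look like (names taken from the question; this is an assumption about the intended mapping, not the asker's actual code):

```sql
-- Hypothetical final statement for uspLoadClaimHartford: return the status
-- as a one-row result set so the Execute SQL Task (ResultSet = Single row)
-- can map the DataLoadStatusId column to the package variable DataLoadStatusID.
SELECT DataLoadStatusId
FROM   dbo.DataLoad
WHERE  DataLoadId = @DataLoadId;
```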

How to check duplicate records in a table using ADF?

I am trying to send an alert if there are no records in the destination table after the copy activity completes. Right now I am trying a Lookup activity together with an If activity, but I get the error below:
Operation on target Alert If no records in activity ts failed: The function 'int' was invoked with a parameter that is not valid. The value cannot be converted to the target type
You can do this more simply.
Add a variable to your pipeline, varError.
In your Lookup, use a select count(*) from your_table.
Add a "Set variable" activity where you set the variable to
@string(div(100, int(activity('IsRecordExist').output.firstRow.yourColumn)))
This will divide by zero if no rows were copied, causing the pipeline to fail.
You can then set up a monitor for failed pipelines that sends you an alert.
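If you'd rather not rely on the divide-by-zero trick, a more explicit variant (a sketch, assuming the same Lookup activity name and column) is an If Condition activity with an expression like:

```
@greater(int(activity('IsRecordExist').output.firstRow.yourColumn), 0)
```

with a Fail activity (or whatever posts your alert) on the False branch.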

synchronization between 2 applications polling a SQL table

I have 2 instances of a VB.NET application, each running on its own dedicated server. The application runs a While True loop with a 5-second sleep when idle (idle meaning the table has no ProcessQuery rows to be treated). On each iteration, the application queries a table in the SQL database to see if there is anything it can process.
The problem is that both instances sometimes "take" the same ProcessQuery.
I'm using Entity Framework 6. I have looked into EntityState, but I don't think it does exactly what I'm trying to accomplish.
I was wondering what my solution would be for running the instances safely in parallel. It's not impossible that at some point I'll have 12 instances running on 12 machines.
Thanks!
Dim conn As New Info_IndusEntities()
Dim DemandeWilma As WilmaDemandes = conn.WilmaDemandes.Where(Function(x) x.Site = "LONDON" AndAlso x.Statut = "toProcess").OrderBy(Function(x) x.RequestDate).FirstOrDefault()
If Not IsNothing(DemandeWilma) Then
    DemandeWilma.Statut = Statuts.EnTraitement.ToString()
    DemandeWilma.ServerName = Environment.MachineName
    DemandeWilma.ProcessDate = DateTime.Now
    conn.SaveChanges()
    Return DemandeWilma
End If
UPDATE (21/06/19)
I found an article that I find interesting. I started by adding a RowVersion column to my table, refreshed my model, and changed the Concurrency Check property of the RowVersion column in my ORM.
When I tested the update, here's the EF6 log:
UPDATE [dbo].[WilmaDemandes] SET [Statut] = @0, [ServerName] = @1,
[DateDebut] = @2 WHERE (([ID] = @3) AND ([RowVersion] = @4)) SELECT
[RowVersion] FROM [dbo].[WilmaDemandes] WHERE @@ROWCOUNT > 0 AND [ID]
= @3
-- @0: 'EnTraitement' (Type = String, Size = 20)
-- @1: 'TRB5995' (Type = String, Size = 20)
-- @2: '2019-06-25 7:31:01 AM' (Type = DateTime2)
-- @3: '124373' (Type = Int32)
-- @4: 'System.Byte[]' (Type = Binary, Size = 8)
-- Executing at 2019-06-25 7:31:24 AM -04:00
-- Completed in 95 ms with result: SqlDataReader
Closed connection at 2019-06-25 7:31:24 AM -04:00
Exception thrown: 'System.Data.Entity.Infrastructure.DbUpdateConcurrencyException' in EntityFramework.dll
UPDATE (25/06/19)
The problems, as explained in this post, start when you use DB-First instead of Code-First: the property gets silently overwritten as soon as you update the model. Some people coded a console-app workaround that they run pre-build; I'm not sure I'm ready to take that as the final solution.
There is also an interesting tutorial on how to test optimistic concurrency and ways to resolve such an exception.
Add an "owner" column to your queue table
Your application updates one record (TOP 1) and sets the owner value to its own identifier (WHERE Owner IS NULL)
Now your application goes back, reads the rows it owns, and processes them
It's a simple pattern and it works great. If any processes happen to take ownership 'simultaneously', only one will actually get the reservation.
I'm not very good at LINQ, so here's a brute-force method, split across lines for clarity:
' First try to reserve a row
conn.Database.ExecuteSqlCommand(
    "WITH UpdateTop1 AS
     (SELECT TOP 1 * FROM WilmaDemandes
      WHERE Owner IS NULL
        AND Site = 'LONDON'
      ORDER BY RequestDate)
     UPDATE UpdateTop1 SET Owner = 'ThisApplication'")
' See if we got one
Dim DemandeWilma As WilmaDemandes =
    conn.WilmaDemandes.
        Where(Function(x) x.Owner = "ThisApplication").FirstOrDefault()
' If we got a row, process it. Otherwise idle and repeat.
// If we got a row, process it. Otherwise Idle and repeat
There's also no reason you must reserve only one row. You could reserve all the free rows and work your way through them; meanwhile, other processes will pick up any subsequently arriving rows.
Personally, I would refactor your status column so that it is NULL for new records ready to be processed, and otherwise holds the worker ID that has reserved the row.
It also helps to add things like timestamp columns to record when the row was reserved, etc.
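The "owner" reservation described above can also be done in a single atomic T-SQL statement, which removes any window between selecting and updating. A sketch, assuming the table from the question and a hypothetical @WorkerId variable:

```sql
-- Atomically claim one unowned row and return it. READPAST lets concurrent
-- workers skip rows another instance has locked; UPDLOCK/ROWLOCK make the
-- claim itself atomic. @WorkerId is a placeholder for the instance name.
UPDATE q
SET    q.Owner = @WorkerId
OUTPUT inserted.ID, inserted.Site, inserted.RequestDate
FROM  (SELECT TOP (1) *
       FROM   dbo.WilmaDemandes WITH (ROWLOCK, UPDLOCK, READPAST)
       WHERE  Owner IS NULL
         AND  Site = 'LONDON'
       ORDER BY RequestDate) AS q;
```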

Can I save a trace file/extended events file to another partition other than the C drive on the server? Or another server altogether?

I've recently set up some traces and extended events in SQL on our new virtual server to show the access users have to each database and whether they have logged in recently, and have set the output to a physical file on the server rather than writing to a SQL table, to save resources. I've set the traces up as jobs running at 8 am each morning with a 12-hour delay so we can record as much information as possible.
Our IT department would ideally prefer nothing other than the OS on the C drive of the virtual server, so I'd like to write the trace from my SQL script either to a different partition or to another server altogether.
I have attempted to put a direct path to a different server in my code, and have tried a partition other than C, but unless I write the trace/extended-event files to the C drive I get an error message.
CREATE EVENT SESSION [LoginTraceTest] ON SERVER
ADD EVENT sqlserver.existing_connection (
    SET collect_database_name = (1), collect_options_text = (1)
    ACTION (
        package0.event_sequence, sqlos.task_time, sqlserver.client_pid,
        sqlserver.database_id, sqlserver.database_name, sqlserver.is_system,
        sqlserver.nt_username, sqlserver.request_id, sqlserver.server_principal_sid,
        sqlserver.session_id, sqlserver.session_nt_username,
        sqlserver.sql_text, sqlserver.username
    )
),
ADD EVENT sqlserver.login (
    SET collect_database_name = (1), collect_options_text = (1)
    ACTION (
        package0.event_sequence, sqlos.task_time, sqlserver.client_pid,
        sqlserver.database_id, sqlserver.database_name, sqlserver.is_system,
        sqlserver.nt_username, sqlserver.request_id, sqlserver.server_principal_sid,
        sqlserver.session_id, sqlserver.session_nt_username,
        sqlserver.sql_text, sqlserver.username
    )
)
ADD TARGET package0.asynchronous_file_target (
    SET FILENAME = N'\\SERVER1\testFolder\LoginTrace.xel',
        METADATAFILE = N'\\SERVER1\testFolder\LoginTrace.xem'
);
The error I receive is this:
Msg 25641, Level 16, State 0, Line 6
For target, "package0.asynchronous_file_target", the parameter "filename" passed is invalid. Target parameter at index 0 is invalid
If I change it to another partition rather than a different server:
SET FILENAME = N'D:\Traces\LoginTrace\LoginTrace.xel',
METADATAFILE = N'D:\Traces\LoginTrace\LoginTrace.xem' );
SQL Server states that the command completed successfully, but the file isn't written to the partition.
Any ideas as to what I can do to write the files to another partition or server?
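For what it's worth, the file path in a file target is evaluated by the SQL Server service on the server itself, so D:\ means the server's D: drive and the service account needs write permission on that folder (for a UNC path, rights on the share). One way to check whether anything was actually written is to read the files back from T-SQL (a sketch; adjust the path to match your session):

```sql
-- Reads whatever .xel files exist at that path, as seen from the server.
-- If this returns an error or no rows, the service account most likely
-- cannot write to (or see) the folder.
SELECT CAST(event_data AS xml) AS event_xml
FROM   sys.fn_xe_file_target_read_file(
           N'D:\Traces\LoginTrace\LoginTrace*.xel', NULL, NULL, NULL);
```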