Rollback transaction if parameter name too long - sql

I'm using the SimpleMembershipProvider that you get out of the box when you create a new .NET MVC application, and I wanted to allow an admin user the ability to add roles. For learning purposes (and I don't know if this is the correct way to do it) I wanted to limit the length of the RoleName column to 15 characters, so I wrote this stored procedure:
create proc spInsertRole
(
    @roleName varchar(50)
    --really shouldn't be 50, but that's
    --how I originally wrote my code
)
as
begin
    set nocount on
    begin try
        begin tran
            insert into dbo.webpages_Roles(RoleName)
            values (@roleName)
        commit transaction
    end try
    begin catch
        select ERROR_MESSAGE() as ErrorMessage
        if len(@roleName) > 15
            rollback transaction
    end catch
end
There is no check constraint on the table for the length of RoleName. This proc compiles, but it also lets me add a RoleName longer than 15 characters. What am I missing, and what is the best way to do this?

You should check the length before you run the insert statement. By putting the length check in the catch block, you are telling the program to only check the length and roll back if there is some other error condition.
(My T-SQL is rusty and I don't have a database to test on so please verify before accepting. Also, given these changes, you probably don't need transactions anymore.)
create proc spInsertRole
(
    @roleName varchar(50)
    --really shouldn't be 50, but that's
    --how I originally wrote my code
)
as
begin
    set nocount on
    begin try
        begin tran
            -- length check moved here. Raise error when > 15.
            -- Severity (argument 2) needs to be higher than 10
            -- to stop execution and trigger the catch block.
            -- State (argument 3) is an arbitrary value between 0 and 255.
            if len(@roleName) > 15
                raiserror('Role name is too long.', 11, 5)

            insert into dbo.webpages_Roles(RoleName)
            values (@roleName)
        commit transaction
    end try
    begin catch
        select ERROR_MESSAGE() as ErrorMessage
        -- length check was here. program will always roll back now.
        rollback transaction
    end catch
end
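As noted above, once the check happens before the insert you probably don't need the transaction at all. Stripped down, the proc could look something like this (an untested sketch, same table and parameter as above):
create proc spInsertRole
(
    @roleName varchar(50)
)
as
begin
    set nocount on
    -- reject over-long names up front and stop; severity 11 reports an error to the caller
    if len(@roleName) > 15
    begin
        raiserror('Role name is too long.', 11, 5)
        return
    end
    insert into dbo.webpages_Roles(RoleName)
    values (@roleName)
end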
See RAISERROR for more information about how that works.
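Separately, since there is no check constraint on the table: if you are free to alter the schema, a CHECK constraint enforces the 15-character limit for every code path, not just this procedure. A minimal sketch (assuming the standard SimpleMembership webpages_Roles table, and that existing rows already satisfy the rule):
-- Enforce the length rule at the table level.
alter table dbo.webpages_Roles
    add constraint CK_webpages_Roles_RoleName_MaxLen
    check (len(RoleName) <= 15);
With that in place, an over-long RoleName makes the insert fail with a constraint violation, which the catch block then reports.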

Properly understanding the error Cannot use the ROLLBACK statement within an INSERT-EXEC statement - Msg 50000, Level 16, State 1

I understand there is a regularly quoted answer that is meant to address this question, but I believe there is not enough explanation on that thread to really answer the question.
Why earlier answers are inadequate
The first (and accepted) answer simply says this is a common problem and talks about having only one active insert-exec at a time (which is only the first half of the question asked there and doesn't address the ROLLBACK error). The given workaround is to use a table-valued function - which does not help my scenario where my stored procedure needs to update data before returning a result set.
The second answer talks about using openrowset but notes you cannot dynamically specify argument values for the stored procedure - which does not help my scenario because different users need to call my procedure with different parameters.
The third answer provides something called "the old single hash table approach" but does not explain whether it is addressing part 1 or 2 of the question, nor how it works, nor why.
No answer explains why the database is giving this error in the first place.
My use case / requirements
To give specifics for my scenario (although simplified and generic), I have procedures something like below.
In a nutshell though - the first procedure will return a result set, but before it does so, it updates a status column. Effectively these records represent records that need to be synchronised somewhere, so when you call this procedure the procedure will flag the records as being "in progress" for sync.
The second stored procedure calls that first one. Of course the second stored procedure wants to take those records and perform inserts and updates on some tables - to keep those tables in sync with whatever data was returned from the first procedure. After performing all the updates, the second procedure then calls a third procedure - within a cursor (ie. row by row on all the rows in the result set that was received from the first procedure) - for the purpose of setting the status on the source data to "in sync". That is, one by one it goes back and says "update the sync status on record id 1, to 'in sync'" ... and then record 2, and then record 3, etc.
The issue I'm having is that calling the second procedure results in the error
Msg 50000, Level 16, State 1, Procedure getValuesOuterCall, Line 484 [Batch Start Line 24]
Cannot use the ROLLBACK statement within an INSERT-EXEC statement.
but calling the first procedure directly causes no error.
Procedure 1
-- Purpose here is to return a result set,
-- but for every record in the set we want to set a status flag
-- to another value as well.
alter procedure getValues @username, @password, @target
as
begin
    set xact_abort on;
    begin try
        begin transaction;

        declare @tableVariable table (
            ...
        );

        update someOtherTable
        set something = somethingElse
        output
            someColumns
            into @tableVariable
        from someTable
        join someOtherTable
        join etc
        where someCol = @username
        and etc
        ;

        select
            someCols
        from @tableVariable
        ;

        commit;
    end try
    begin catch
        if @@trancount > 0 rollback;
        declare @msg nvarchar(2048) = error_message() + ' Error line: ' + CAST(ERROR_LINE() AS nvarchar(100));
        raiserror (@msg, 16, 1);
        return 55555
    end catch
end
Procedure 2
-- Purpose here is to obtain the result set from earlier procedure
-- and then do a bunch of data updates based on the result set.
-- Lastly, for each row in the set, call another procedure which will
-- update that status flag to another value.
alter procedure getValuesOuterCall @username, @password, @target
as
begin
    set xact_abort on;
    begin try
        begin transaction;

        declare @anotherTableVariable table ( ... );
        insert into @anotherTableVariable
        exec getValues @username = 'blah', @password = @somePass, @target = ''
        ;

        with CTE as (
            select someCols
            from @anotherTableVariable
            join someOtherTables, etc
        )
        merge anUnrelatedTable as target
        using CTE as source
        on target.someCol = source.someCol
        when matched then update set
            yetAnotherCol = source.yetAnotherCol,
            etc
        when not matched then
            insert (someCols, andMoreCols, etc)
            values ((select someSubquery), source.aColumn, source.etc)
        ;

        declare @myLocalVariable int;
        declare @mySecondLocalVariable int;

        declare lcur_myCursor cursor for
            select keyColumn
            from @anotherTableVariable
        ;
        open lcur_myCursor;
        fetch lcur_myCursor into @myLocalVariable;
        while @@fetch_status = 0
        begin
            select @mySecondLocalVariable = someCol
            from someTable
            where someOtherCol = @myLocalVariable;

            exec thirdStoredProcForSettingStatusValues @id = @mySecondLocalVariable, etc

            -- advance the cursor
            fetch lcur_myCursor into @myLocalVariable;
        end
        close lcur_myCursor;
        deallocate lcur_myCursor;

        commit;
    end try
    begin catch
        if @@trancount > 0 rollback;
        declare @msg nvarchar(2048) = error_message() + ' Error line: ' + CAST(ERROR_LINE() AS nvarchar(100));
        raiserror (@msg, 16, 1);
        return 55555
    end catch
end
The parts I don't understand
Firstly, I have no explicit 'rollback' (well, except in the catch block) - so I have to presume that an implicit rollback is causing the issue - but it is difficult to understand where the root of this problem is; I am not even entirely sure which stored procedure is causing the issue.
Secondly, I believe the statements to set xact_abort and begin transaction are required - because in procedure 1 I am updating data before returning the result set. In procedure 2 I am updating data before I call a third procedure to update further data.
Thirdly, I don't think procedure 1 can be converted to a table-valued function because the procedure performs a data update (which would not be allowed in a function?)
Things I have tried
I removed the table variable from procedure 2 and actually created a permanent table to store the results coming back from procedure 1. Before calling procedure 1, procedure 2 would truncate the table. I still got the rollback error.
I replaced the table variable in procedure 1 with a temporary table (ie. single #). I read the articles about how such a table persists for the lifetime of the connection, so within procedure 1 I had drop table if exists... and then create table #.... I still got the rollback error.
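For reference, the temp-table version of procedure 1 was shaped roughly like this (the table and column names here are simplified placeholders, not my real schema):
-- Simplified sketch of the temp-table attempt inside procedure 1.
drop table if exists #syncRows;

create table #syncRows
(
    keyColumn int         not null,
    someCol   varchar(50) not null
);

-- flag the rows as "in progress" and capture them for the result set
update dbo.sourceRecords
set    syncStatus = 'in progress'
output inserted.id, inserted.someCol
into   #syncRows (keyColumn, someCol)
where  syncStatus = 'pending';

select keyColumn, someCol
from   #syncRows;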
Lastly
I still don't understand exactly what is the problem - what is Microsoft struggling to accomplish here? Or what is the scenario that SQL Server cannot accommodate for a requirement that appears to be fairly straightforward: One procedure returns a result set. The calling procedure wants to perform actions based on what's in that result set. If the result set is scoped to the first procedure, then why can't SQL Server just create a temporary copy of the result set within the scope of the second procedure so that it can be acted upon?
Or have I missed it completely and the issue has something to do with the final call to a third procedure, or maybe to do with using try ... catch - for example, perhaps the logic is totally fine but for some reason it is hitting the catch block and the rollback there is the problem (ie. so if I fix the underlying reason leading us to the catch block, all will resolve)?

SQL Server @@ERROR not working. Doesn't go into "if"

I wrote a very simple procedure and am deliberately making a mistake in it, but the error handling is not working. I want to return my own error with RAISERROR, but execution doesn't even go inside the "if".
ALTER PROCEDURE dene
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO bddksektor ([tcmbkodu], [aktif], [bddksektorkodu])
    VALUES ('a', 1, 1)
    -- I knowingly made a mistake: normally the first parameter is int.
    -- I'm entering a varchar so that it enters the "if" and returns an error, but it doesn't.

    IF @@ERROR <> 0
    BEGIN
        RAISERROR('Hata', 16, 1, 61106)
        RETURN 61106
    END
    ELSE
    BEGIN
        SELECT
            [tcmbkodu], [aktif], [bddksektorkodu]
        FROM
            [dbfactoringtest].[dbo].[bddksektor]
        ORDER BY
            tcmbkodu ASC
    END

    SET NOCOUNT OFF
END
SQL Server returns its own error when I run the procedure. It does not return the error I wrote because it does not enter the "if"
Error:
Msg 245, Level 16, State 1, Procedure dene, Line 11 [Batch Start Line 1]
Conversion failed when converting the varchar value 'a' to data type int
As I mention in the comment, use a TRY...CATCH. For your SQL, if the first INSERT fails the batch is aborted, so the IF is never reached and therefore never entered. I also recommend switching to THROW instead of RAISERROR, as noted in the documentation:
Note
The RAISERROR statement does not honor SET XACT_ABORT. New applications should use THROW instead of RAISERROR.
This gives you something like this:
CREATE OR ALTER PROCEDURE dbo.dene AS --Always schema qualify
BEGIN
SET NOCOUNT ON;
SET XACT_ABORT ON;
BEGIN TRY
INSERT INTO dbo.bddksektor ([tcmbkodu],[aktif],[bddksektorkodu]) --Always schema qualify
VALUES ('a',1,1);
SELECT [tcmbkodu]
,[aktif]
,[bddksektorkodu]
FROM [dbfactoringtest].[dbo].[bddksektor] --Why is this using 3 part naming? Is this copy of bddksektor in a different database?
ORDER BY tcmbkodu ASC;
END TRY
BEGIN CATCH
THROW 61106, N'Hata', 16;
RETURN 61106;
END CATCH;
SET NOCOUNT OFF;
END

T-SQL install / upgrade script as a transaction

I'm trying to write a single T-SQL script which will upgrade a system which is currently in deployment. The script will contain a mixture of:
New tables
New columns on existing tables
New functions
New stored procedures
Changes to stored procedures
New views
etc.
As it's a reasonably large upgrade I want the script to roll back if a single part of it fails. I have an outline of my attempted code below:
DECLARE @upgrade NVARCHAR(32);
SELECT @upgrade = 'my upgrade';

BEGIN TRANSACTION @upgrade
BEGIN
    PRINT 'Starting';
    BEGIN TRY
        CREATE TABLE x ( --blah...
        );
        ALTER TABLE y --blah...
        );
        CREATE PROCEDURE z AS BEGIN ( --blah...
        END
        GO --> this is causing trouble!
        CREATE FUNCTION a ( --blah...
    END TRY
    BEGIN CATCH
        PRINT 'Error with transaction. Code: ' + CAST(@@ERROR AS NVARCHAR(10)) + '; Message: ' + ERROR_MESSAGE();
        ROLLBACK TRANSACTION @upgrade;
        PRINT 'Rollback complete';
        RETURN;
    END CATCH
END

PRINT 'Upgrade successful';
COMMIT TRANSACTION @upgrade;
GO
Note - I know some of the syntax is not perfect - I'm having to re-key the code
It seems as though I can't put Stored Procedures into a transaction block. Is there a reason for this? Is it because of the use of the word GO? If so, how can I put SPs into a transaction block? What are the limitations as to what can go into a transaction block? Or, what would be a better alternative to what I'm trying to achieve?
Thanks
As Thomas Haratyk said in his answer, your issue was the "go". However, you can have as many batches in a transaction as you want. It's the try/catch that doesn't like this. Here's a simple proof-of-concept:
begin tran
go
select 1
go
select 2
go
rollback
begin try
select 1
go
select 2
go
end try
begin catch
select 1
end catch
Remove the GO and create your procedure by using dynamic SQL or it will fail.
EXEC ('create procedure z
as
begin
    print ''hello world''
end')
GO is not a SQL keyword; it is a batch separator, so it cannot appear inside a single-batch construct such as your TRY...CATCH block or a CREATE PROCEDURE body.
Please refer to these topics for further information:
sql error:'CREATE/ALTER PROCEDURE' must be the first statement in a query batch?
Using "GO" within a transaction
http://msdn.microsoft.com/en-us/library/ms188037.aspx

TSQL error checking code not working

I'm trying to add an alert to my log for a running insert of a view into a table when there is an error executing the query in the view. When I run the view alone, I get an invalid input into SUBSTRING error (I can't remember the exact wording). When I run it as part of my view -> table stored procedure, the error is ignored, and then I have to go digging for the offending line and make an exception in the view's code to omit that line from the results (I know, it sounds kludge-y, but I'm doing data reduction on huge web-log files from a specialized webapp), but I digress.
I've tried two different methods for trying to catch the error and neither is triggered in such a way as to insert the row indicating an error into my execution result table (refresh_results). I think I may be missing something fundamental - perhaps the errors are being encapsulated in some way. If I can't detect the error, the only way to notice an error is if someone notices the number of entries into the table is low for a given period of time.
SELECT @TransactionName = 'tname';
BEGIN TRANSACTION @TransactionName;
BEGIN TRY
    print 'tname ***In Try***';

    if exists (select name from sysobjects where name='tablename')
    begin
        drop table tablename;
    end

    select * into tablename
    from opendatasource('SQLNCLI', 'Data Source=DATABASE;UID=####;password=####').dbo.viewname;

    COMMIT TRANSACTION @TransactionName;
END TRY
BEGIN CATCH
    print 'tablename ***ERROR - check for SUBSTRING***';

    begin transaction
        set @result_table = 'tablename ***ERROR - check for SUBSTRING***'
        select @result_time = getdate(),
               @result_rows = count(logtime)
        from tablename
        insert INTO [dbo].[refresh_results] (result_time, result_table, result_rows)
        values (@result_time, @result_table, @result_rows);
    commit transaction

    ROLLBACK TRANSACTION @TransactionName;
END CATCH
or
if exists (select name from sysobjects where name='tablename')
begin
    drop table tablename;
end

select * into tablename
from opendatasource('SQLNCLI', 'Data Source=DATABASE;UID=####;password=####').dbo.viewname;

print '@@error'
print @@error

if @@error <> 0
Begin
    print 'tablename ***ERROR - check for SUBSTRING***';
    set @result_table = 'tablename ***ERROR - check for SUBSTRING***'
    select @result_time = getdate(),
           @result_rows = count(logtime)
    from tablename
    insert INTO [dbo].[refresh_results] (result_time, result_table, result_rows)
    values (@result_time, @result_table, @result_rows);
End
Your nested transactions aren't doing what you think. You are rolling back the error you thought you stored. Roll back the initial transaction and then, if you feel the need, start a new transaction for logging the error.
See here.
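In other words, something like this (a rough sketch of that ordering; the refresh_results columns come from your post, and the zero row count is just a placeholder, since the failed table can no longer be counted):
BEGIN CATCH
    -- Undo the work first; this also undoes anything written since BEGIN TRANSACTION.
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;

    -- Now log in a fresh, independent transaction so the log row survives.
    BEGIN TRANSACTION;
        INSERT INTO [dbo].[refresh_results] (result_time, result_table, result_rows)
        VALUES (GETDATE(), 'tablename ***ERROR - check for SUBSTRING***', 0);
    COMMIT TRANSACTION;
END CATCH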
You have two separate problems.
In your first example you are running transactions that do the following:
BEGIN TRAN
...error...
BEGIN TRAN
...log error...
COMMIT TRAN
ROLLBACK TRAN
The inner transaction is rolled back with the outer transaction. Maybe try:
BEGIN TRAN
...error...
ROLLBACK TRAN
BEGIN TRAN
...log error...
ROLLBACK TRAN
In the second example you are using @@ERROR. As I understand it, as soon as you run another statement @@ERROR is replaced, and I think that includes the print statement.
If you change it to something like:
DECLARE @Error INT

select * into tablename
from opendatasource('SQLNCLI', 'Data Source=DATA3;UID=;password=').dbo.viewname;
SET @Error = @@ERROR

print '@@error'
print @Error

if @Error <> 0
    ...log the error
The advantage of TRY...CATCH is that if you have an error it will catch it. The @@ERROR method works, but only for the last statement run, so if you have an error with DROP TABLE tablename, @@ERROR won't get it (unless you add another check).
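For completeness, "another check" would look something like this (a sketch only, reusing the table and opendatasource call from the question and capturing @@ERROR immediately after each statement):
DECLARE @Error INT = 0;

IF EXISTS (SELECT name FROM sysobjects WHERE name = 'tablename')
    DROP TABLE tablename;
SET @Error = @@ERROR;   -- capture immediately; the next statement would overwrite it

IF @Error = 0
BEGIN
    SELECT * INTO tablename
    FROM opendatasource('SQLNCLI', 'Data Source=DATABASE;UID=####;password=####').dbo.viewname;
    SET @Error = @@ERROR;
END

IF @Error <> 0
BEGIN
    -- log to refresh_results here, as in the original code
    PRINT 'tablename ***ERROR - check for SUBSTRING***';
END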
Ok, so I had to use a helper procedure to add a log entry. I think what was going on is that the rollback was also rolling back the log entry.
This is what I had to do:
DECLARE @myError tinyint = 0;

BEGIN TRY
    BEGIN TRANSACTION;

    if exists (select name from sys.sysobjects where name='table_name')
    begin
        drop table table_name
    end

    select * into table_name
    from opendatasource('SQLNCLI', 'Data Source=###;UID=###;password=###').view_Table

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    set @myError = 1
    ROLLBACK TRANSACTION;
END CATCH

if @myError <> 0
begin
    exec dbo.table error
end
ELSE
    exec dbo.table normal row

Add column to table and then update it inside transaction

I am creating a script that will be run on a MS SQL server. This script will run multiple statements and needs to be transactional: if one of the statements fails, overall execution is stopped and any changes are rolled back.
I am having trouble creating this transactional model when issuing ALTER TABLE statements to add columns to a table and then updating the newly added column. In order to access the newly added column right away, I use a GO command to execute the ALTER TABLE statement, and then call my UPDATE statement. The problem I am facing is that I cannot issue a GO command inside an IF statement. The IF statement is important within my transactional model. This is a sample of the script I am trying to run. Also notice that issuing a GO command discards the @errorCode variable, so it would need to be declared again further down in the code before being used (this is not shown in the code below).
BEGIN TRANSACTION

DECLARE @errorCode INT
SET @errorCode = @@ERROR

-- **********************************
-- * Settings
-- **********************************
IF @errorCode = 0
BEGIN
    BEGIN TRY
        ALTER TABLE Color ADD [CodeID] [uniqueidentifier] NOT NULL DEFAULT ('{00000000-0000-0000-0000-000000000000}')
        GO
    END TRY
    BEGIN CATCH
        SET @errorCode = @@ERROR
    END CATCH
END

IF @errorCode = 0
BEGIN
    BEGIN TRY
        UPDATE Color
        SET CodeID = 'B6D266DC-B305-4153-A7AB-9109962255FC'
        WHERE [Name] = 'Red'
    END TRY
    BEGIN CATCH
        SET @errorCode = @@ERROR
    END CATCH
END

-- **********************************
-- * Check @errorCode to issue a COMMIT or a ROLLBACK
-- **********************************
IF @errorCode = 0
BEGIN
    COMMIT
    PRINT 'Success'
END
ELSE
BEGIN
    ROLLBACK
    PRINT 'Failure'
END
So what I would like to know is how to go around this problem, issuing ALTER TABLE statements to add a column and then updating that column, all within a script executing as a transactional unit.
GO is not a T-SQL command; it is a batch delimiter. The client tool (SSMS, sqlcmd, osql, etc.) uses it to effectively cut the file at each GO and send the individual batches to the server. So obviously you cannot use GO inside an IF, nor can you expect variables to span scope across batches.
Also, you cannot catch exceptions without checking for the XACT_STATE() to ensure the transaction is not doomed.
Using GUIDs for IDs is always at least suspicious.
Using NOT NULL constraints and providing a default 'guid' like '{00000000-0000-0000-0000-000000000000}' also cannot be correct.
Updated:
Separate the ALTER and UPDATE into two batches.
Use sqlcmd extensions to break the script on error. This is supported by SSMS when sqlcmd mode is on and by sqlcmd itself, and it is trivial to support in client libraries too: dbutilsqlcmd.
Use XACT_ABORT to force an error to interrupt the batch. This is frequently used in maintenance scripts (schema changes). Stored procedures and application logic scripts generally use TRY-CATCH blocks instead, but with proper care: Exception handling and nested transactions.
example script:
:on error exit
set xact_abort on;
go
begin transaction;
go
if columnproperty(object_id('Code'), 'ColorId', 'AllowsNull') is null
begin
alter table Code add ColorId uniqueidentifier null;
end
go
update Code
set ColorId = '...'
where ...
go
commit;
go
Only a successful script will reach the COMMIT. Any error will abort the script and roll back.
I used COLUMNPROPERTY to check for column existence; you could use any method you like instead (eg. look up sys.columns).
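For example, a sys.columns version of the same existence check might look like this (a sketch, using the same Code/ColorId names as the script above):
if not exists (
    select 1
    from sys.columns
    where object_id = object_id('Code')
      and name = 'ColorId'
)
begin
    alter table Code add ColorId uniqueidentifier null;
end
go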
Orthogonal to Remus's comments, what you can do is execute the update in an sp_executesql.
ALTER TABLE [Table] ADD [Xyz] NVARCHAR(256);
DECLARE @sql NVARCHAR(2048) = 'UPDATE [Table] SET [Xyz] = ''abcd'';';
EXEC sys.sp_executesql @stmt = @sql;
We've needed to do this when creating upgrade scripts. Usually we just use GO but it has been necessary to do things conditionally.
I almost agree with Remus, but you can do this with SET XACT_ABORT ON and XACT_STATE().
Basically
SET XACT_ABORT ON will abort each batch on error and ROLLBACK
Each batch is separated by GO
Execution jumps to the next batch on error
Use XACT_STATE() will test if the transaction is still valid
Tools like Red Gate SQL Compare use this technique
Something like:
SET XACT_ABORT ON
GO
BEGIN TRANSACTION
GO
IF COLUMNPROPERTY(OBJECT_ID('Color'), 'CodeID', 'ColumnId') IS NULL
ALTER TABLE Color ADD CodeID [uniqueidentifier] NULL
GO
IF XACT_STATE() = 1
UPDATE Color
SET CodeID= 'B6D266DC-B305-4153-A7AB-9109962255FC'
WHERE [Name] = 'Red'
GO
IF XACT_STATE() = 1
COMMIT TRAN
--else would be rolled back
I've also removed the default. No value = NULL for GUID values. It's meant to be unique: don't try and set every row to all zeros because it will end in tears...
Have you tried it without the GO?
Normally you should not mix table changes and data changes in the same script.
Another alternative, if you don't want to split the code into separate batches, is to use EXEC to create a nested scope/batch
as here
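For example (a sketch reusing the Color/CodeID example from above): the string passed to EXEC is compiled as its own nested batch, so it only resolves the new column after the ALTER TABLE has run.
ALTER TABLE Color ADD CodeID uniqueidentifier NULL;

EXEC ('UPDATE Color
       SET CodeID = ''B6D266DC-B305-4153-A7AB-9109962255FC''
       WHERE [Name] = ''Red'';');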