I execute the code below:
use AdventureWorks2008R2
begin transaction
BEGIN
alter table HumanResources.Department add newcolumn int
update HumanResources.Department set newcolumn=1 where departmentid=1
END
commit
The error I get is:
Invalid column name 'newcolumn'.
Can ALTER statements be included in Transactions like this? If so, how can I prevent this error?
I have researched this online, e.g. here, but I have not found an answer to my specific question.
Yes, you can include an ALTER in a transaction. The problem is that the parser validates the syntax for your UPDATE statement, and can't "see" that you are also performing an ALTER. One workaround is to use dynamic SQL, so that the parser doesn't inspect your syntax (and validate column names) until runtime, where the ALTER will have already happened:
BEGIN TRANSACTION;
ALTER TABLE HumanResources.Department ADD newcolumn INT;
EXEC sp_executesql N'UPDATE HumanResources.Department
SET newcolumn = 1 WHERE DepartmentID = 1;';
COMMIT TRANSACTION;
Note that indentation makes code blocks much more easily identifiable (and your BEGIN/END was superfluous).
If you check for the existence of the column first, then it should work.
BEGIN TRANSACTION;
IF COL_LENGTH('table_name', 'newcolumn') IS NULL
BEGIN
ALTER TABLE table_name ADD newcolumn INT;
END
EXEC sp_executesql N'UPDATE table_name
SET newcolumn = 1 WHERE DepartmentID = 1;';
COMMIT TRANSACTION;
Aaron has explained everything already. Another alternative that works for ad-hoc scripts in SSMS is to insert the batch separator GO so that the script is sent as two parts to the server. This only works if it is valid to split the script in the first place (you can't split an IF body for example).
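For example, a sketch of the original script split with GO (this only works in SSMS/sqlcmd-style tools, and the open transaction carries over to the second batch on the same session):
BEGIN TRANSACTION;
ALTER TABLE HumanResources.Department ADD newcolumn INT;
GO
UPDATE HumanResources.Department SET newcolumn = 1 WHERE DepartmentID = 1;
COMMIT TRANSACTION;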
Related
I have one trigger called dbo.SendMail and multiple databases;
not all of the databases have the trigger dbo.SendMail.
I am using FluentMigrator to manage database versions and I want to do something like the following:
IF EXISTS (SELECT * FROM sys.triggers WHERE object_id = OBJECT_ID(N'[dbo].[SendMail]'))
BEGIN
ALTER TRIGGER [dbo].[SendMail]
ON [dbo].[Notification]
FOR INSERT
AS
BEGIN
some sql code
END
END
It gives me the error Incorrect syntax near begin, Expecting EXTERNAL.
Is there any way to achieve this?
Thanks in advance.
Try this:
IF OBJECT_ID(N'[dbo].[SendMail]', N'TR') IS NOT NULL
    -- Do whatever
ELSE
    -- Do something else
Here is your trigger code with some dynamic SQL. You have to use dynamic SQL here because CREATE/ALTER TRIGGER must be the first statement in a batch, so you can't wrap the create/alter logic directly inside an IF statement.
IF OBJECT_ID('SendMail') IS NOT NULL
BEGIN
    DECLARE @SQL nvarchar(max)
    SET @SQL = 'ALTER TRIGGER [dbo].[SendMail]
    ON [dbo].[Notification]
    FOR INSERT
    AS
    BEGIN
        some sql code
    END'
    EXEC sp_executesql @SQL
END
I am pretty new to SQL. I am working with SQL Server 2012. I need to do the following: add a column to an existing table and fill all rows in that column with the same value. To do this, I have come up with the following based on searching online:
ALTER TABLE myTable ADD myNewColumn VARCHAR(50) NULL
UPDATE myTable SET myNewColumn = 'test'
The problem is that in SQL server, I get the following error for the second statement:
Invalid column name 'myNewColumn'.
So, my guess is that a new column called myNewColumn wasn't created by the first statement.
You need to perform the update in a separate batch. Otherwise SQL Server tries to parse and validate that the column exists before ever trying to run the ALTER that creates it. You get the invalid column name at parse time, not at run time.
One workaround is to use GO between the two batches:
ALTER TABLE dbo.myTable ADD myNewColumn VARCHAR(50) NULL;
GO
UPDATE dbo.myTable SET myNewColumn = 'test';
(Always use schema prefixes to reference objects and always terminate statements with semi-colons.)
However this only works in Management Studio and other certain client applications, because it is not actually part of the T-SQL language; these client tools see it as a batch separator and it tells them to submit and evaluate these two batches separately. It will NOT work in code blocks submitted to SQL Server in other ways and as a single batch, e.g. in the body of a stored procedure:
CREATE PROCEDURE dbo.foo
AS
BEGIN
SET NOCOUNT ON;
SELECT 1;
GO
SELECT 2;
END
GO
This yields the following errors, because it actually splits the stored procedure into two separate batches:
Msg 102, Level 15, State 1, Procedure foo, Line 8
Incorrect syntax near ';'.
Msg 102, Level 15, State 1, Line 11
Incorrect syntax near 'END'.
What you can do as a different workaround is force the update into its own batch by executing it in dynamic SQL.
DECLARE @sql NVARCHAR(MAX), @value VARCHAR(50) = 'test';
ALTER TABLE dbo.myTable ADD myNewColumn VARCHAR(50) NULL;
SET @sql = N'UPDATE dbo.myTable SET myNewColumn = @value;';
EXEC sp_executesql @sql, N'@value VARCHAR(50)', @value;
(Why you should use EXEC sp_executesql vs. EXEC(@sql).)
Another workaround; perform the add and the update in one step:
ALTER TABLE dbo.myTable ADD myNewColumn VARCHAR(50) DEFAULT 'test' WITH VALUES;
(You can later drop the default constraint if you don't actually want any future rows to inherit that value in circumstances that would cause that behavior.)
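If you do plan to drop the default later, naming the constraint up front makes that easier; a sketch (the constraint name DF_myTable_myNewColumn is my own choice):
ALTER TABLE dbo.myTable ADD myNewColumn VARCHAR(50)
    CONSTRAINT DF_myTable_myNewColumn DEFAULT 'test' WITH VALUES;
-- later, once existing rows have been populated:
ALTER TABLE dbo.myTable DROP CONSTRAINT DF_myTable_myNewColumn;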
Place the word GO after your ALTER statement.
The ALTER and the UPDATE cannot be compiled in the same batch. You need to separate them, e.g. by using the built-in stored procedure sp_executesql to execute the UPDATE statement, as below:
ALTER TABLE myTable ADD myNewColumn VARCHAR(50) NULL
Exec sp_executesql N'UPDATE myTable SET myNewColumn = ''test'''
This should definitely solve this problem.
Try something like:
ALTER TABLE myTable ADD myNewColumn VARCHAR(50) NULL DEFAULT ('Something')
The problem with your code is that the column does not exist until after the ALTER completes, so you can't reference it until then.
I am generating a script for automatically migrating changes from multiple development databases to staging/production. Basically, it takes a bunch of change-scripts and merges them into a single script, wrapping each script in an IF whatever BEGIN ... END statement.
However, some of the scripts require a GO statement so that, for instance, the SQL parser knows about a new column after it's created.
ALTER TABLE dbo.EMPLOYEE
ADD EMP_IS_ADMIN BIT NOT NULL
GO -- Necessary, or next line will generate "Unknown column: EMP_IS_ADMIN"
UPDATE dbo.EMPLOYEE SET EMP_IS_ADMIN = whatever
However, once I wrap that in an IF block:
IF whatever
BEGIN
ALTER TABLE dbo.EMPLOYEE ADD EMP_IS_ADMIN BIT NOT NULL
GO
UPDATE dbo.EMPLOYEE SET EMP_IS_ADMIN = whatever
END
It fails because I am sending a BEGIN with no matching END. However, if I remove the GO it complains again about an unknown column.
Is there any way to create and update the same column within a single IF block?
I had the same problem and finally managed to solve it using SET NOEXEC.
IF not whatever
BEGIN
SET NOEXEC ON;
END
ALTER TABLE dbo.EMPLOYEE ADD EMP_IS_ADMIN BIT NOT NULL
GO
UPDATE dbo.EMPLOYEE SET EMP_IS_ADMIN = whatever
SET NOEXEC OFF;
GO is not SQL - it is simply a batch separator used in some MS SQL tools.
If you don't use that, you need to ensure the statements are executed separately - either in different batches or by using dynamic SQL for the population (thanks @gbn):
IF whatever
BEGIN
ALTER TABLE dbo.EMPLOYEE ADD EMP_IS_ADMIN BIT NOT NULL;
EXEC ('UPDATE dbo.EMPLOYEE SET EMP_IS_ADMIN = whatever')
END
You could try sp_executesql, splitting the contents between each GO statement into a separate string to be executed, as demonstrated in the example below. There is also a @statementNo variable to track which statement is being executed, for easy debugging of where an exception occurred. The line numbers will be relative to the beginning of the statement that caused the error.
BEGIN TRAN
DECLARE @statementNo INT
BEGIN TRY
    IF 1=1
    BEGIN
        SET @statementNo = 1
        EXEC sp_executesql
            N'ALTER TABLE dbo.EMPLOYEE
                  ADD EMP_IS_ADMIN BIT NOT NULL'
        SET @statementNo = 2
        EXEC sp_executesql
            N'UPDATE dbo.EMPLOYEE
                  SET EMP_IS_ADMIN = 1'
        SET @statementNo = 3
        EXEC sp_executesql
            N'UPDATE dbo.EMPLOYEE
                  SET EMP_IS_ADMIN = 1x'
    END
END TRY
BEGIN CATCH
    PRINT 'Error occurred on line ' + cast(ERROR_LINE() as varchar(10))
        + ' of statement # ' + cast(@statementNo as varchar(10))
        + ': ' + ERROR_MESSAGE()
    -- error occurred, so rollback the transaction
    ROLLBACK
END CATCH
-- if we were successful, we should still have a transaction, so commit it
IF @@TRANCOUNT > 0
    COMMIT
You can also easily execute multi-line statements, as demonstrated in the example above, by simply wrapping them in single quotes ('). Don't forget to escape any single quotes contained inside the string with a double single-quote ('') when generating the scripts.
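For example, a small sketch of that escaping when building the string by hand (EMP_NAME and the value are placeholders of my own; passing the value as a parameter to sp_executesql instead avoids the problem entirely):
DECLARE @value NVARCHAR(100) = N'O''Brien';
DECLARE @sql NVARCHAR(MAX)
    = N'UPDATE dbo.EMPLOYEE SET EMP_NAME = N'''   -- EMP_NAME is a placeholder column
    + REPLACE(@value, N'''', N'''''')             -- double up embedded single quotes
    + N''' WHERE EMP_IS_ADMIN = 1;';
EXEC sp_executesql @sql;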
You can enclose the statements in BEGIN and END instead of the GO in between:
IF COL_LENGTH('Employees','EMP_IS_ADMIN') IS NULL --Column does not exist
BEGIN
BEGIN
ALTER TABLE dbo.Employees ADD EMP_IS_ADMIN BIT
END
BEGIN
UPDATE EMPLOYEES SET EMP_IS_ADMIN = 0
END
END
(Tested on Northwind database)
Edit: (Probably tested on SQL2012)
I ultimately got it to work by replacing every instance of GO on its own line with
END
GO
---Automatic replacement of GO keyword, need to recheck IF conditional:
IF whatever
BEGIN
This is greatly preferable to wrapping every group of statements in a string, but is still far from ideal. If anyone finds a better solution, post it and I'll accept it instead.
You may try this solution:
if exists(
SELECT...
)
BEGIN
PRINT 'NOT RUN'
RETURN
END
-- reached only if the condition above is not true
ALTER...
GO
UPDATE...
GO
I have used RAISERROR in the past for this
IF NOT whatever BEGIN
RAISERROR('YOU''RE ALL SET, and sorry for the error!', 20, -1) WITH LOG
END
ALTER TABLE dbo.EMPLOYEE ADD EMP_IS_ADMIN BIT NOT NULL
GO
UPDATE dbo.EMPLOYEE SET EMP_IS_ADMIN = whatever
You can incorporate GOTO and label statements to skip over code, thus leaving the GO keywords intact.
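A sketch of one way to read that (the condition is a placeholder; each batch needs its own label because labels do not span GO, and a label should be followed by a statement):
IF NOT (1 = 1) GOTO SkipBatch1;   -- placeholder condition
ALTER TABLE dbo.EMPLOYEE ADD EMP_IS_ADMIN BIT NOT NULL DEFAULT 0;
SkipBatch1:
PRINT 'batch 1 done';
GO
IF NOT (1 = 1) GOTO SkipBatch2;   -- re-check, since nothing survives the GO
UPDATE dbo.EMPLOYEE SET EMP_IS_ADMIN = 1;
SkipBatch2:
PRINT 'batch 2 done';
GO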
I am creating a script that will be run against an MS SQL Server. This script will run multiple statements and needs to be transactional: if one of the statements fails, the overall execution is stopped and any changes are rolled back.
I am having trouble creating this transactional model when issuing ALTER TABLE statements to add columns to a table and then updating the newly added column. In order to access the newly added column right away, I use a GO command to execute the ALTER TABLE statement, and then call my UPDATE statement. The problem I am facing is that I cannot issue a GO command inside an IF statement. The IF statement is important within my transactional model. This is a sample of the script I am trying to run. Also notice that issuing a GO command discards the @errorCode variable, so it would need to be declared again further down in the code before being used (this is not shown in the code below).
BEGIN TRANSACTION
DECLARE @errorCode INT
SET @errorCode = @@ERROR
-- **********************************
-- * Settings
-- **********************************
IF @errorCode = 0
BEGIN
BEGIN TRY
ALTER TABLE Color ADD [CodeID] [uniqueidentifier] NOT NULL DEFAULT ('{00000000-0000-0000-0000-000000000000}')
GO
END TRY
BEGIN CATCH
SET @errorCode = @@ERROR
END CATCH
END
IF @errorCode = 0
BEGIN
BEGIN TRY
UPDATE Color
SET CodeID= 'B6D266DC-B305-4153-A7AB-9109962255FC'
WHERE [Name] = 'Red'
END TRY
BEGIN CATCH
SET @errorCode = @@ERROR
END CATCH
END
-- **********************************
-- * Check @errorCode to issue a COMMIT or a ROLLBACK
-- **********************************
IF @errorCode = 0
BEGIN
COMMIT
PRINT 'Success'
END
ELSE
BEGIN
ROLLBACK
PRINT 'Failure'
END
So what I would like to know is how to go around this problem, issuing ALTER TABLE statements to add a column and then updating that column, all within a script executing as a transactional unit.
GO is not a T-SQL command. It is a batch delimiter. The client tool (SSMS, sqlcmd, osql, etc.) uses it to effectively cut the file at each GO and send the individual batches to the server. So obviously you cannot use GO inside an IF, nor can you expect variables to span scope across batches.
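A tiny illustration of the variable-scope point (the variable name is just a placeholder):
DECLARE @x INT = 1;
GO
PRINT @x;  -- fails in the second batch: Must declare the scalar variable "@x".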
Also, you cannot correctly catch exceptions without checking XACT_STATE() to ensure the transaction is not doomed.
Using GUIDs for IDs is always at least suspicious.
Using NOT NULL constraints and providing a default 'guid' like '{00000000-0000-0000-0000-000000000000}' also cannot be correct.
Updated:
Separate the ALTER and UPDATE into two batches.
Use sqlcmd extensions to break the script on error. This is supported by SSMS when SQLCMD mode is on and by sqlcmd itself, and it is trivial to support in client libraries too: dbutilsqlcmd.
Use XACT_ABORT to force errors to interrupt the batch. This is frequently used in maintenance scripts (schema changes). Stored procedures and application logic scripts in general use TRY-CATCH blocks instead, but with proper care: Exception handling and nested transactions.
example script:
:on error exit
set xact_abort on;
go
begin transaction;
go
if columnproperty(object_id('Code'), 'ColorId', 'AllowsNull') is null
begin
alter table Code add ColorId uniqueidentifier null;
end
go
update Code
set ColorId = '...'
where ...
go
commit;
go
Only a successful script will reach the COMMIT. Any error will abort the script and rollback.
I used COLUMNPROPERTY to check for column existence; you could use any method you like instead (e.g. look up sys.columns).
Orthogonal to Remus's comments, what you can do is execute the update in an sp_executesql.
ALTER TABLE [Table] ADD [Xyz] NVARCHAR(256);
DECLARE @sql NVARCHAR(2048) = 'UPDATE [Table] SET [Xyz] = ''abcd'';';
EXEC sys.sp_executesql @query = @sql;
We've needed to do this when creating upgrade scripts. Usually we just use GO but it has been necessary to do things conditionally.
I almost agree with Remus but you can do this with SET XACT_ABORT ON and XACT_STATE
Basically
SET XACT_ABORT ON will abort each batch on error and ROLLBACK
Each batch is separated by GO
Execution jumps to the next batch on error
Use XACT_STATE() to test whether the transaction is still valid
Tools like Red Gate SQL Compare use this technique
Something like:
SET XACT_ABORT ON
GO
BEGIN TRANSACTION
GO
IF COLUMNPROPERTY(OBJECT_ID('Color'), 'CodeID', 'ColumnId') IS NULL
ALTER TABLE Color ADD CodeID [uniqueidentifier] NULL
GO
IF XACT_STATE() = 1
UPDATE Color
SET CodeID= 'B6D266DC-B305-4153-A7AB-9109962255FC'
WHERE [Name] = 'Red'
GO
IF XACT_STATE() = 1
COMMIT TRAN
--else would be rolled back
I've also removed the default. No value = NULL for GUID values. It's meant to be unique: don't try and set every row to all zeros because it will end in tears...
Have you tried it without the GO?
Normally you should not mix table changes and data changes in the same script.
Another alternative, if you don't want to split the code into separate batches, is to use EXEC to create a nested scope/batch, as here.
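Roughly, reusing the tables from the question (a sketch only; the column-existence check is just one option for the condition):
IF COLUMNPROPERTY(OBJECT_ID('Color'), 'CodeID', 'ColumnId') IS NULL
BEGIN
    ALTER TABLE Color ADD CodeID uniqueidentifier NULL;
    EXEC (N'UPDATE Color
            SET CodeID = ''B6D266DC-B305-4153-A7AB-9109962255FC''
            WHERE [Name] = ''Red'';');
END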
I'd like to call Update ... Set ... Where ... to update a field as soon as that evil ERP process changes the value of another.
I'm running MS SQL.
I can't test, but I guess it's a trigger like this:
CREATE TRIGGER TriggerName ON TableName FOR UPDATE AS
IF UPDATE(ColumnUpdatedByERP)
BEGIN
UPDATE ...
END
-- Edit: a better version, thanks for the comment, Tomalak
CREATE TRIGGER TriggerName ON TableName FOR UPDATE AS
DECLARE @oldValue VARCHAR(100)
DECLARE @newValue VARCHAR(100)
IF UPDATE(ColumnUpdatedByERP)
BEGIN
SELECT @oldValue = (SELECT ColumnUpdatedByERP FROM Deleted)
SELECT @newValue = (SELECT ColumnUpdatedByERP FROM Inserted)
IF @oldValue <> @newValue
BEGIN
UPDATE ...
END
END
You could use a trigger to update the other field.
Edit: I guess that may depend on what SQLesque database you are running.
You want to use a trigger, but I would be very wary of the bug in the selected answer. See Brent Ozar's well-written post http://www.brentozar.com/archive/2009/01/triggers-need-to-handle-multiple-records/ on handling multiple records.
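Along those lines, a set-based sketch that handles multi-row updates (TableName, SomeKey, and OtherColumn are placeholders, and NULL transitions are ignored for brevity):
CREATE TRIGGER TriggerName ON TableName FOR UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    IF UPDATE(ColumnUpdatedByERP)
    BEGIN
        UPDATE t
        SET    t.OtherColumn = i.ColumnUpdatedByERP        -- placeholder target column
        FROM   TableName AS t
        JOIN   inserted  AS i ON i.SomeKey = t.SomeKey     -- SomeKey is a placeholder key
        JOIN   deleted   AS d ON d.SomeKey = i.SomeKey
        WHERE  i.ColumnUpdatedByERP <> d.ColumnUpdatedByERP;
    END
END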