SQL "Column Not Found": column added earlier in the procedure

I'm using SQL Server 2008. I have a stored procedure with code that looks like this:
if not exists (select column_name from INFORMATION_SCHEMA.columns
where table_name = 'sample_table' and column_name = 'sample_column')
BEGIN
ALTER TABLE Sample_Table
ADD Sample_Column NVARCHAR(50)
END
Update dbo.Sample_Table
SET Sample_Column = '1'
When I execute, I get a "Column Not Found" error because the column doesn't originally exist in Sample_Table - it's added in the procedure. What's the correct way to get around this?
My workaround (below) is to wrap the UPDATE statement in an EXEC statement, so that it is compiled and executed only after the ALTER TABLE step has run. But is there a better method?
EXEC ('
Update dbo.Sample_Table
SET Sample_Column = ''1'' ')

If you do not really like your workaround, the only other option seems to be separating your DDL logic from your DML logic: one SP checks for/creates the column (and other columns too, as necessary), and another SP sets the value(s).
On the other hand, it looks like you are using your UPDATE statement merely as a means of providing a default value for the newly created column. If that is the case, you might consider an entirely different solution: creating a DEFAULT constraint (no need for the UPDATE statement). Here:
if not exists (select column_name from INFORMATION_SCHEMA.columns
where table_name = 'sample_table' and column_name = 'sample_column')
BEGIN
ALTER TABLE Sample_Table
ADD Sample_Column NVARCHAR(50) NOT NULL
CONSTRAINT DF_SampleTable_SampleColumn DEFAULT ('1');
ALTER TABLE Sample_Table
ALTER COLUMN Sample_Column NVARCHAR(50) NULL;
END
The second ALTER TABLE command is there only to drop the NOT NULL restriction, since it seems you didn't mean for your column to have it. If you are fine with NOT NULL, just scrap the second ALTER TABLE.
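If the code runs as a deployment script rather than inside a single stored procedure, the DDL/DML separation mentioned above can also be done with batches. A minimal sketch using the question's names (GO splits compilation, so the UPDATE is only parsed after the batch that adds the column has run; inside one procedure GO isn't available, which is why the two-SP or dynamic-SQL route is needed there):

```sql
IF NOT EXISTS (SELECT 1 FROM INFORMATION_SCHEMA.COLUMNS
               WHERE TABLE_NAME = 'Sample_Table'
                 AND COLUMN_NAME = 'Sample_Column')
BEGIN
    ALTER TABLE dbo.Sample_Table ADD Sample_Column NVARCHAR(50);
END
GO
-- Separate batch: compiled only after the batch above has executed,
-- so the new column is visible to the parser.
UPDATE dbo.Sample_Table SET Sample_Column = '1';
GO
```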

Related

SQL Server : removes null constraint when changing the column's datatype, Oracle does not

I was reviewing something for a project and noticed that in SQL Server, modifying the data type of a column removes an existing NOT NULL constraint. I compared the same operation in Oracle and noticed that the NULL check is not removed there when the data type changes.
My question: is there a reason why SQL Server does not preserve the NULL check unless the ALTER statement explicitly declares the column NOT NULL? After some googling, I couldn't really find an answer. Maybe this is controlled by a SQL Server setting that is off?
If there isn't such a setting, there is presumably a good reason, which I can't see, for why this occurs.
Here is the SQL I was using to compare:
-- SQL Server
CREATE TABLE TestTable (Name varchar(50) NOT NULL);
-- Does not allow null
SELECT COLUMNPROPERTY(OBJECT_ID('dbo.TestTable', 'U'), 'Name', 'AllowsNull');
ALTER TABLE TestTable ALTER COLUMN Name varchar(250);
-- Allows null now
SELECT COLUMNPROPERTY(OBJECT_ID('dbo.TestTable', 'U'), 'Name', 'AllowsNull');
DROP TABLE TestTable;
-- Oracle
CREATE TABLE MYSCHEMA.TestTable (Name VARCHAR2(50) NOT NULL);
select nullable from all_tab_columns where owner = 'MYSCHEMA' and table_name = 'TESTTABLE' and column_name = 'NAME';
ALTER TABLE MYSCHEMA.TestTable MODIFY Name VARCHAR2(250);
select nullable from all_tab_columns where owner = 'MYSCHEMA' and table_name = 'TESTTABLE' and column_name = 'NAME';
Drop Table MYSCHEMA.TestTable;
Environment:
SQL Server 2017
Oracle 12c
Both running in Docker on Linux.
NULL may be the default.
From https://learn.microsoft.com/en-us/sql/t-sql/statements/alter-table-transact-sql?view=sql-server-ver15
When you create or alter a table with the CREATE TABLE or ALTER TABLE
statements, the database and session settings influence and possibly
override the nullability of the data type that's used in a column
definition. Be sure that you always explicitly define a column as NULL
or NOT NULL for noncomputed columns.
Did you try this?
ALTER TABLE TestTable ALTER COLUMN Name varchar(250) NOT NULL;
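A sketch of the behavior the quoted documentation describes (this assumes the common client defaults; most drivers issue SET ANSI_NULL_DFLT_ON ON, so a column altered without explicit nullability ends up nullable):

```sql
-- Session setting that decides nullability when NULL/NOT NULL is omitted.
SET ANSI_NULL_DFLT_ON ON;

CREATE TABLE TestTable (Name varchar(50) NOT NULL);

-- Nullability omitted: the session default applies, so the column
-- becomes nullable - this is the behavior observed in the question.
ALTER TABLE TestTable ALTER COLUMN Name varchar(250);

-- Nullability stated explicitly: preserved regardless of settings.
ALTER TABLE TestTable ALTER COLUMN Name varchar(250) NOT NULL;
```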

SQL Server - Check column if exists >> rename and change type

SQL Server:
Check if a column exists:
If True: change/modify the column name and data type
If False: create it
Schema name: Setup
Code:
IF EXISTS (SELECT 1 FROM sys.columns
WHERE Name = N'bitIntialBalance'
AND Object_ID = Object_ID(N'Setup.LeaveVacationsSubType'))
BEGIN
ALTER TABLE [Setup].[LeaveVacationsSubType]
ALTER COLUMN intIntialBalance INT NULL;
EXEC sp_RENAME 'Setup.LeaveVacationsSubType.bitIntialBalance', 'intIntialBalance', 'COLUMN';
--ALTER TABLE [Setup].[LeaveVacationsSubType] MODIFY [intIntialBalance] INT; not working
END
GO
IF NOT EXISTS(SELECT 1 FROM sys.columns
WHERE Name = N'intIntialBalance'
AND Object_ID = Object_ID(N'Setup.LeaveVacationsSubType'))
BEGIN
ALTER TABLE [Setup].[LeaveVacationsSubType]
ADD intIntialBalance INT NULL;
END
GO
If I guess correctly, the problem is that the query plan is made for the whole batch, and SQL Server also checks that it can actually perform all the operations, even those inside an IF statement. That's why you get an error, even if in reality that statement would never be executed.
One way to get around this issue is to make those statements dynamic, something like this:
execute ('ALTER TABLE [Setup].[LeaveVacationsSubType] ALTER COLUMN [intIntialBalance] INT')
(Note that T-SQL uses ALTER COLUMN; MODIFY is Oracle/MySQL syntax, which is also why the commented-out line in the question fails.)
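A sketch of the question's first batch rewritten this way. Two things are assumed here: the rename has to happen before the type change (since ALTER COLUMN references the new name, the original ordering would fail), and the type change runs as dynamic SQL so the batch compiles even though the renamed column doesn't exist at parse time:

```sql
IF EXISTS (SELECT 1 FROM sys.columns
           WHERE name = N'bitIntialBalance'
             AND object_id = OBJECT_ID(N'Setup.LeaveVacationsSubType'))
BEGIN
    -- Rename first, then change the type.
    EXEC sp_rename 'Setup.LeaveVacationsSubType.bitIntialBalance',
                   'intIntialBalance', 'COLUMN';
    -- Dynamic, so the parser doesn't reject the not-yet-existing name.
    EXEC sp_executesql
        N'ALTER TABLE [Setup].[LeaveVacationsSubType] ALTER COLUMN intIntialBalance INT NULL';
END
GO
```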

How to write update to add column with value for existing

I have a table Members with existing data. I want to add a non-nullable bit column called 'IsOnlineUser', and I want all existing rows set to false. I have a set of scripts that run each time I deploy, so I need a check to see if the table already has the column.
The first SQL I tried was
DECLARE @ColumnExists int;
SET @ColumnExists = (SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'Member' AND COLUMN_NAME = 'IsOnlineUser');
IF (@ColumnExists = 0)
BEGIN
ALTER TABLE Member ADD IsOnlineUser bit NULL;
UPDATE Member SET IsOnlineUser= 0;
ALTER TABLE Member ALTER COLUMN IsOnlineUser bit NOT NULL;
END
GO
But that gives me
Invalid column name 'IsOnlineUser'.
Presumably this is because the UPDATE cannot see the newly created column, so I thought a GO between the two statements would help, and tried the following:
DECLARE @ColumnExists int;
SET @ColumnExists = (SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'Member' AND COLUMN_NAME = 'IsOnlineUser');
IF (@ColumnExists = 0)
BEGIN
ALTER TABLE Member ADD IsOnlineUser bit NULL;
END
GO
IF (@ColumnExists = 0)
BEGIN
UPDATE Member SET IsOnlineUser= 0;
ALTER TABLE Member ALTER COLUMN IsOnlineUser bit NOT NULL;
END
GO
However, this says
Must declare the scalar variable "@ColumnExists".
Presumably this is because the GO stops me from accessing the variable in the second batch.
It seems like a fairly common use case, so I assume I'm just missing something. Any help would be much appreciated.
You could add the column as not null with a default constraint:
alter table Member add IsOnlineUser bit not null default 0;
Optionally, you can give the constraint a specific name at the same time like so:
alter table member
add IsOnlineUser bit not null
constraint df_Member_IsOnlineUser default 0;
To simplify your if, you can skip the variable and use not exists() like so:
if not exists (
select 1
from information_schema.columns
where table_name = 'Member'
and column_name = 'IsOnlineUser'
)
begin
alter table member
add IsOnlineUser bit not null
constraint df_Member_IsOnlineUser default 0;
end;
If you just want to make the existing code work while maintaining your current logic, you can execute SQL strings instead of inline code with exec or sp_executesql.
...
if (@ColumnExists = 0)
begin
exec sp_executesql N'alter table Member add IsOnlineUser bit null';
exec sp_executesql N'update Member set IsOnlineUser= 0;';
exec sp_executesql N'alter table Member alter column IsOnlineUser bit not null;';
end;
go
You can check if the column does not exist and add it if needed. Put a GO after that, and run the UPDATE and ALTER in the next batch. It will run correctly in both cases, whether the column exists or not. No need for variables.
IF NOT EXISTS (SELECT * FROM sys.columns WHERE name = 'IsOnlineUser' AND object_id = OBJECT_ID('Member'))
ALTER TABLE Member ADD IsOnlineUser bit NULL;
GO
UPDATE Member SET IsOnlineUser= 0;
ALTER TABLE Member ALTER COLUMN IsOnlineUser bit NOT NULL;
GO
Note: I prefer the use of SQL Server's sys views over the ANSI INFORMATION_SCHEMA views, and wanted to show that alternative, but that's just me. You can keep INFORMATION_SCHEMA if you like.
Edit/PS: Variables exist only within a single batch. After a GO, you can't reference variables declared before it.
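A minimal demonstration of that batch scoping (run as one script in SSMS; the second PRINT fails):

```sql
DECLARE @x int = 1;
PRINT @x;   -- works: same batch as the DECLARE
GO
PRINT @x;   -- error: Must declare the scalar variable "@x"
```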

Add column to existing table and default value to another column without dynamic sql

For Sql Server 2005 and 2008 I want to check if a column already exists on a given table and create it if it doesn't. This new column should have a default value of an ExistingColumn. Currently I need to use dynamic sql to fill the new column because sql server will complain of a syntax error.
Here is the current sql server code:
IF NOT EXISTS (SELECT TOP 1 1 FROM sys.columns WHERE [name] = N'NewColumn' AND OBJECT_ID = OBJECT_ID(N'ExistingTable'))
BEGIN
ALTER TABLE [dbo].[ExistingTable] ADD [NewColumn] VARCHAR(50) NULL;
exec sp_executesql N'UPDATE [dbo].[ExistingTable] SET NewColumn = ExistingColumn'
ALTER TABLE [dbo].[ExistingTable] ALTER COLUMN [NewColumn] VARCHAR(50) NOT NULL
END
GO
Is there any other way to solve this problem without resorting to dynamic sql?
Since you're creating the column regardless, you could do two separate batches.
IF NOT EXISTS (SELECT TOP 1 1 FROM sys.columns WHERE [name] = N'NewColumn' AND OBJECT_ID = OBJECT_ID(N'ExistingTable'))
BEGIN
ALTER TABLE [dbo].[ExistingTable] ADD [NewColumn] VARCHAR(50) NULL;
END
GO
IF EXISTS (SELECT TOP 1 1 FROM sys.columns WHERE [name] = N'NewColumn' AND OBJECT_ID = OBJECT_ID(N'ExistingTable'))
BEGIN
IF EXISTS (SELECT 1 FROM [dbo].[ExistingTable] WHERE NewColumn IS NULL)
BEGIN
UPDATE [dbo].[ExistingTable] SET NewColumn = ExistingColumn
ALTER TABLE [dbo].[ExistingTable] ALTER COLUMN [NewColumn] VARCHAR(50) NOT NULL
END
END
GO
SQL Server parses your statement before your ALTER runs and says "Hey, no such column." The parser doesn't understand IF and other branching, so it can't follow the sequence of events when you mix DDL and DML, or predict which branches will actually be taken at runtime.
Deferred name resolution allows you to access objects that don't exist yet, but not columns that don't exist yet on objects that do.
So, dynamic SQL seems like the way you'll have to do it.
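A small illustration of the deferred-name-resolution distinction, with hypothetical object names. The first CREATE succeeds even though its table is missing; the second fails at creation time with "Invalid column name", because the table exists and its columns are checked:

```sql
-- Succeeds: dbo.NotYetCreated doesn't exist, so resolution is deferred
-- until the procedure is executed.
CREATE PROCEDURE dbo.UsesMissingTable AS
    SELECT * FROM dbo.NotYetCreated;
GO
CREATE TABLE dbo.Existing (a int);
GO
-- Fails at CREATE time: the table exists, so its columns are validated,
-- and MissingColumn isn't one of them.
CREATE PROCEDURE dbo.UsesMissingColumn AS
    SELECT MissingColumn FROM dbo.Existing;
GO
```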

How can I set all columns' default value equal to null in PostgreSQL

I would like to set the default value for every column in a number of tables equal to Null. I can view the default constraint under information_schema.columns.column_default. When I try to run
update information_schema.columns set column_default = Null where table_name = '[table]'
it throws "ERROR: cannot update a view HINT: You need an unconditional ON UPDATE DO INSTEAD rule."
What is the best way to go about this?
You need to run an ALTER TABLE statement for each column. Never try to do something like that by manipulating system tables (even if you find the correct one; INFORMATION_SCHEMA only contains views over the real system catalogs).
But you can generate all needed ALTER TABLE statements based on the data in the information_schema views:
SELECT 'ALTER TABLE '||table_name||' ALTER COLUMN '||column_name||' SET DEFAULT NULL;'
FROM information_schema.columns
WHERE table_name = 'foo';
Save the output as a SQL script and then run that script (don't forget to commit the changes)
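If you want to run the generated statements in one go instead of saving a script, a sketch using an anonymous PL/pgSQL block with dynamic EXECUTE (the table name 'foo' is assumed, as above; format's %I quotes identifiers safely):

```sql
DO $$
DECLARE
    col record;
BEGIN
    FOR col IN
        SELECT table_schema, table_name, column_name
        FROM information_schema.columns
        WHERE table_name = 'foo'
    LOOP
        EXECUTE format('ALTER TABLE %I.%I ALTER COLUMN %I SET DEFAULT NULL',
                       col.table_schema, col.table_name, col.column_name);
    END LOOP;
END
$$;
```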