I have a big database that I need to normalize. One of the tables contains a field of type integer that only ever holds 0 and 1 values, so it is a good candidate for conversion to the bit type. But when I try to save the change in SQL Server Management Studio, it tells me I can't do it. I also have many nvarchar fields that should be converted to int, and float fields that should be converted to int as well.
Moreover, I need to create migration scripts for all the changes so I can update the real database without losing data. Does anybody know a useful utility for this?
EDIT: It tells me that I can't update the table without dropping it, and I want to update the table without losing any data.
SQL Server version: 2014
-- Create one temporary column
ALTER TABLE [dbo].[Demo2]
ADD tempId int
GO
-- Copy the data into the temporary column
UPDATE [dbo].[Demo2] SET tempId = Id
-- Drop the column you want to modify
ALTER TABLE [dbo].[Demo2]
DROP COLUMN Id
GO
-- Re-create that column with the bit type
ALTER TABLE [dbo].[Demo2]
ADD Id bit
GO
-- Copy the data back
UPDATE [dbo].[Demo2] SET Id = tempId
-- Drop the temporary column
ALTER TABLE [dbo].[Demo2]
DROP COLUMN tempId
GO
Here's how to add a new column to a table, copy the old column's values into it, and then remove the old column:
CREATE TABLE #test
(
    inttest int
)

INSERT [#test] ([inttest])
VALUES (0)

INSERT [#test] ([inttest])
VALUES (1)

ALTER TABLE [#test] ADD bittest bit
UPDATE [#test] SET bittest = inttest
ALTER TABLE [#test] DROP COLUMN [inttest]

SELECT * FROM [#test] [T]
To generate a migration script you don't need a special utility; SSMS does it pretty well.
Right-click the table in SSMS Object Explorer and choose Design from the context menu. Change the type of the column. Then, in the Table Designer main menu, choose Generate Change Script. Save the generated script to a file, review it, and make sure you understand every line in it before you run it on a production system. Adjust the script if needed.
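For orientation, the script SSMS generates for this kind of type change usually rebuilds the table rather than altering it in place. A heavily simplified sketch of its shape, borrowing the Demo2 table from the answer above (the real generated script also handles SET options, constraints, and locking hints), might look roughly like this:
BEGIN TRANSACTION
CREATE TABLE dbo.Tmp_Demo2 (Id bit NOT NULL)            -- new table with the target type
INSERT INTO dbo.Tmp_Demo2 (Id)
SELECT CONVERT(bit, Id) FROM dbo.Demo2                  -- copy the existing data across
DROP TABLE dbo.Demo2
EXECUTE sp_rename N'dbo.Tmp_Demo2', N'Demo2', 'OBJECT'
COMMIT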
On the other hand, to change the column type from int to bit you can use the ALTER TABLE statement:
ALTER TABLE dbo.TableName
ALTER COLUMN ColumnName bit NOT NULL
Before running this, check that the actual int values really are only 0 and 1 (and that the column has no NULLs if you add NOT NULL), for example with a query like the one below.
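A quick sanity check along those lines, as a sketch only; dbo.TableName and ColumnName are placeholders, not names from the question:
-- Any rows returned here would block a clean conversion to bit NOT NULL
SELECT ColumnName, COUNT(*) AS Cnt
FROM dbo.TableName
WHERE ColumnName NOT IN (0, 1) OR ColumnName IS NULL
GROUP BY ColumnName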
After reading all the comments and posts I found a solution: building a procedure that converts the given table and column as required. So I wrote this procedure:
IF EXISTS (SELECT * FROM dbo.sysobjects WHERE id = OBJECT_ID(N'IntToBit') AND OBJECTPROPERTY(id, N'IsProcedure') = 1)
    DROP PROCEDURE IntToBit
GO
IF OBJECT_ID('convertion_table', 'U') IS NOT NULL
    DROP TABLE dbo.convertion_table;
GO
CREATE TABLE dbo.convertion_table
(
    bitTypeColumn bit NOT NULL DEFAULT 0,
    intTypeColumn integer
)
GO
CREATE PROCEDURE IntToBit
    @table nvarchar(150),
    @column nvarchar(150)
AS
BEGIN
    DECLARE @sql nvarchar(4000)
    SELECT @sql = '
    -- copy the data to the temp table
    INSERT INTO convertion_table (bitTypeColumn)
    SELECT ' + @column + '
    FROM ' + @table + '
    -- drop the column you want to modify
    ALTER TABLE ' + @table + '
    DROP COLUMN ' + @column + '
    -- re-create that column with the bit type
    ALTER TABLE ' + @table + '
    ADD ' + @column + ' bit NOT NULL DEFAULT(0)
    -- copy the data back
    INSERT INTO ' + @table + '(' + @column + ')
    SELECT bitTypeColumn
    FROM convertion_table
    -- clear the temp table
    --DELETE bitTypeColumn FROM convertion_table
    '
    EXEC sp_executesql @sql
END
GO
and then call it, passing the table and column name:
EXEC dbo.IntToBit @table = 'tbl_SystemUsers', @column = 'intUseLogin';
Special thanks to Chris K and Hitesh Thakor
Simply use a T-SQL script to modify the table rather than using the designer:
ALTER TABLE YourTableNameHere
ALTER COLUMN YourColumnNameHere INT
If you are using SQL Server, you might want to generate a script for the table before altering it, so that you don't lose any data and can simply restore everything using that script.
Related
Is there a way to clone the table definition from an existing table and recreate it as a table variable?
DECLARE @TempTable1 TABLE (ID INT, Description VARCHAR(256))
I need to recreate a set of tables with the same columns and definitions without repeating the DECLARE ... TABLE statement.
This is possible in MySQL, as below:
CREATE TABLE TempTable1 LIKE TempTableMain;
Is it possible to do this in Microsoft SQL Server?
Please note that the actual scenario contains more than 60 columns in the @TempTable and I need to create more than 10 instances from the original table.
I am not talking about data insertion or SELECTing from another table as below. I need to create the table definition.
DECLARE @TempTable TABLE(ID INT, Description VARCHAR(100))
INSERT INTO @TempTable
VALUES (1, 'Test1'), (1, 'Test1');
SELECT *
INTO #TempTable2
FROM @TempTable
SELECT * FROM #TempTable2
Create a user-defined type with the columns of your table, let's say like this:
CREATE TYPE MyTableType AS TABLE (ID INT, Description VARCHAR(256));
And then declare your table variables using this type:
DECLARE @Table1 MyTableType;
DECLARE @Table2 MyTableType;
DECLARE @Table3 MyTableType;
SQL Server Management Studio gives you the option to create a SQL script for an already existing table.
Right-click your table -> Script Table as -> CREATE To -> New Query Editor Window
This way you don't have to write out the whole query every single time.
You could even create a stored procedure that takes the name of the table to be created as an argument and run it from a WHILE loop, as sketched below.
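A rough sketch of that idea, under assumptions of my own: the procedure name CloneTableDefinition and the table names TempTableMain / TempTableCopyN are purely illustrative, and SELECT ... INTO ... WHERE 1 = 0 is used to copy the column definitions without copying rows (it does not copy constraints or indexes):
CREATE PROCEDURE dbo.CloneTableDefinition
    @SourceTable sysname,
    @NewTable    sysname
AS
BEGIN
    -- Build and run a SELECT ... INTO that copies only the column definitions
    DECLARE @sql nvarchar(max) =
        N'SELECT * INTO ' + QUOTENAME(@NewTable) +
        N' FROM ' + QUOTENAME(@SourceTable) +
        N' WHERE 1 = 0;';
    EXEC sp_executesql @sql;
END
GO
-- Example: create ten empty copies of TempTableMain in a loop
DECLARE @i int = 1, @name sysname;
WHILE @i <= 10
BEGIN
    SET @name = N'TempTableCopy' + CAST(@i AS nvarchar(10));
    EXEC dbo.CloneTableDefinition @SourceTable = N'TempTableMain', @NewTable = @name;
    SET @i = @i + 1;
END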
You can perform the following command:
SELECT * INTO #MyTable_tmp FROM MyTable
Then modify your MyTable and copy your data back in. Another approach I've seen is to create a new table called MyTable_Tmp (not a temp table), which will become your new table.
Then copy your data across, performing any migrations you need, drop the original table, and rename MyTable_Tmp, as sketched below.
When you run SELECT * INTO #MyTable_tmp FROM MyTable, SQL Server creates a new temporary table called #MyTable_tmp that matches each column and data type from your select clause. In this case we are selecting *, so it will match MyTable. This only creates the columns; it doesn't copy defaults, constraints, indexes or anything else.
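A rough sketch of that rename approach, under assumed names: dbo.MyTable with an Id column and an int column Flag being migrated to bit; adapt the column list and types to your own schema:
-- Build the replacement table with the migrated types (a real table, not a temp table)
SELECT Id,
       CAST(Flag AS bit) AS Flag   -- example type migration; Flag is an assumed column
INTO dbo.MyTable_Tmp
FROM dbo.MyTable
-- Re-create any constraints, defaults and indexes on dbo.MyTable_Tmp here, then swap:
DROP TABLE dbo.MyTable
EXEC sp_rename N'dbo.MyTable_Tmp', N'MyTable'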
If you are using table variables, it means you don't need them for long, as they are "forgotten" once the script completes.
So the easiest approach, in my opinion, is a construct like this:
IF OBJECT_ID('tempdb.dbo.#tmpTable', 'U') IS NOT NULL
DROP TABLE #tmpTable;
SELECT * INTO #tmpTable FROM MyPrimaryTable
It creates a temporary table exactly like yours; if you want an empty table, you can just use:
SELECT * INTO #tmpTable FROM MyPrimaryTable WHERE 1 = 0
The temporary table will then have exactly the same schema as your primary table.
You can apply this as many times as you need (creating as many temporary tables as you need).
You could use regular tables instead of temporary tables as well.
If you want to re-create a table after dropping the existing one, you can use the query below.
/*
Create brands table
*/
-- Old block of code
IF EXISTS (SELECT * FROM sys.objects
WHERE object_id = OBJECT_ID(N'[TOY].[BRANDS]') AND type in (N'U'))
DROP TABLE [TOY].[BRANDS]
GO
-- New block of code (SQL Server 2016 and later)
DROP TABLE IF EXISTS [TOY].[BRANDS]
GO
-- Add new table
CREATE TABLE TOY.BRANDS
(
ID INT NOT NULL,
NAME VARCHAR(20) NULL
)
GO
-- Load the table with data
INSERT INTO TOY.BRANDS (ID, NAME) VALUES
(1, 'Ford'),
(2, 'Chevy'),
(3, 'Dodge'),
(4, 'Plymouth'),
(5, 'Oldsmobile'),
(6, 'Lincoln'),
(7, 'Mercury');
GO
I have a database in which all the tables have a column named ID. I need to alter all tables in the database and add identity to these columns. Is it possible with some query, or do I have to do it manually?
Thank you very much.
Unfortunately, in SQL Server you cannot add the identity property to an existing column. You need to drop the existing column and then create a new one with this property. You can automate this task by querying the system tables and using dynamic SQL. But if you already have data in the ID column, this makes things trickier, because you need to preserve the existing data.
Here is a script for a single table; automating this for all tables in the database using dynamic SQL will be kinda tricky...
Table Test_table has 2 columns: id and val.
-- Move data to temp storage
SELECT ID,
VAL
INTO #temp_table
FROM dbo.test_table
-- Remove data from original table
DELETE
FROM dbo.test_table
-- Drop and Create ID column
ALTER TABLE dbo.test_table
DROP COLUMN ID
ALTER TABLE dbo.test_table
ADD ID int IDENTITY(1,1)
-- Move data back to original table
SET IDENTITY_INSERT dbo.test_table ON
INSERT INTO dbo.test_table (ID, VAL)
SELECT ID, VAL
FROM #temp_table
DECLARE @MaxID int
SELECT @MaxID = MAX(ID) + 1
FROM dbo.test_table
SET IDENTITY_INSERT dbo.test_table OFF
-- Reseed IDENTITY property
DBCC CHECKIDENT ('dbo.test_table', RESEED, @MaxID)
There is no way to do this for all tables in one statement. Here's what I'd do: use T-SQL to generate a (big) script that performs all the changes, then review and run that script manually.
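A hedged sketch of that generation step only (not a complete solution): list every user table that has an ID column and emit one line per table; the real generated script would contain the full copy/drop/re-add/reseed block shown above for each table.
SELECT 'PRINT ''TODO: rebuild ID as IDENTITY on '
       + QUOTENAME(SCHEMA_NAME(t.schema_id)) + '.' + QUOTENAME(t.name) + ''';' AS GeneratedLine
FROM sys.tables AS t
JOIN sys.columns AS c
    ON c.object_id = t.object_id
WHERE c.name = 'ID'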
You can add the identity property to existing columns without data movement using SWITCH.
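As a minimal sketch of that SWITCH trick, reusing the dbo.test_table example above and assuming its ID column is NOT NULL and VAL is varchar(100); for SWITCH to succeed, the new table's column definitions must match the existing table exactly (apart from the IDENTITY property):
-- New table with the same columns, but with the IDENTITY property on ID
CREATE TABLE dbo.test_table_new
(
    ID int IDENTITY(1,1) NOT NULL,
    VAL varchar(100) NULL
)
-- Metadata-only move of all rows into the new table
ALTER TABLE dbo.test_table SWITCH TO dbo.test_table_new
DROP TABLE dbo.test_table
EXEC sp_rename N'dbo.test_table_new', N'test_table'
-- The identity current value starts at the seed, so correct it from the existing data
DBCC CHECKIDENT ('dbo.test_table', RESEED)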
I want to change varchar to varbinary(max) in SQL Server with this query:
ALTER TABLE [dbo].[Attachments]
ALTER COLUMN [Content] varbinary(max) NOT NULL
but this throws the following exception:
Implicit conversion from data type varchar to varbinary(max) is not allowed. Use the CONVERT function to run this query
What should I change in this situation?
Are you sure you want varbinary(max)? If so, I believe you need to do this in steps:
ALTER TABLE Attachments
ADD Content2 varbinary(max)
UPDATE Attachments
SET Content2 = CONVERT(varbinary(MAX),Content)
ALTER TABLE Attachments
DROP COLUMN Content
EXEC sp_rename 'Attachments.Content2', 'Content', 'COLUMN'
Depending on the nature of the table, it might be faster to convert it via a select into:
SELECT Content = CAST(Content AS VARBINARY(MAX))
,other fields
INTO NewTable
FROM OldTable
Then drop the old table and rename the new one:
DROP TABLE OldTable
GO
SP_RENAME 'NewTable', 'OldTable'
You need to stage the process:
ALTER TABLE [dbo].[Attachments]
ADD [TempContent] varbinary(max)
go
UPDATE Attachments SET TempContent = CAST(Content AS varbinary(MAX))
go
ALTER TABLE [dbo].[Attachments]
DROP COLUMN [Content]
go
EXEC sp_rename 'Attachments.TempContent', 'Content', 'COLUMN'
go
You can also do this in SQL Server Management Studio, and if you fire up Profiler it will show you the code it used (always helpful).
Use this method when inserting the file into the database:
SqlCommand cmd = new SqlCommand("Insert into tblJobSeeker Values('" + txtUserName.Text + "',@Data)", con);
cmd.Parameters.AddWithValue("@Data", Filesize);
In our database there is a table that was created with ANSI_NULLS OFF. We have now created a view using this table, and we want to add a clustered index to this view.
While creating the clustered index, it shows an error saying the index can't be created because ANSI_NULLS is OFF for this particular table.
This table contains a large amount of data, so I want to change this option to ON without losing any data.
Is there any way to alter the table to modify this option? Please give your suggestions.
This was cross-posted on Database Administrators, so I might as well post my answer from there here too, to help future searchers.
It can be done as a metadata only change (i.e. without migrating all the data to a new table) using ALTER TABLE ... SWITCH.
Example code below
/*Create table with option off*/
SET ANSI_NULLS OFF;
CREATE TABLE dbo.YourTable (X INT)
/*Add some data*/
INSERT INTO dbo.YourTable VALUES (1),(2),(3)
/*Confirm the bit is set to 0*/
SELECT uses_ansi_nulls, *
FROM sys.tables
WHERE object_id = object_id('dbo.YourTable')
GO
BEGIN TRY
BEGIN TRANSACTION;
/*Create new table with identical structure but option on*/
SET ANSI_NULLS ON;
CREATE TABLE dbo.YourTableNew (X INT)
/*Metadata only switch*/
ALTER TABLE dbo.YourTable SWITCH TO dbo.YourTableNew;
DROP TABLE dbo.YourTable;
EXECUTE sp_rename N'dbo.YourTableNew', N'YourTable','OBJECT';
/*Confirm the bit is set to 1*/
SELECT uses_ansi_nulls, *
FROM sys.tables
WHERE object_id = object_id('dbo.YourTable')
/*Data still there!*/
SELECT *
FROM dbo.YourTable
COMMIT TRANSACTION;
END TRY
BEGIN CATCH
IF XACT_STATE() <> 0
ROLLBACK TRANSACTION;
PRINT ERROR_MESSAGE();
END CATCH;
WARNING: when your table contains an IDENTITY column you need to reseed the IDENTITY value.
The SWITCH TO will reset the seed of the identity column and if you do not have a UNIQUE or PRIMARY KEY constraint on the identity (e.g. when using CLUSTERED COLUMNSTORE index in SQL 2014) you won't notice it right away.
You need to use DBCC CHECKIDENT ('dbo.YourTable', RESEED, [reseed value]) to correctly set the seed value again.
Unfortunately, there is no way to do it without recreating the table. You need to create a new table with ANSI_NULLS ON and copy all the data into it.
It should be something like:
SET ANSI_NULLS ON;
CREATE TABLE new_MyTBL (
....
)
-- stop all processes changing your data at this point
SET IDENTITY_INSERT new_MyTBL ON
INSERT new_MyTBL (...) -- including IDENTITY field
SELECT ... -- including IDENTITY field
FROM MyTBL
SET IDENTITY_INSERT new_MyTBL OFF
-- alter/drop WITH SCHEMABINDING objects at this point
EXEC sp_rename @objname = 'MyTBL', @newname = 'old_MyTBL'
EXEC sp_rename @objname = 'new_MyTBL', @newname = 'MyTBL'
-- alter/create WITH SCHEMABINDING objects at this point
-- re-enable your processes
DROP TABLE old_MyTBL -- do that when you are sure that system works OK
If there are any depending objects, they will work with the new table as soon as you rename it. But if some of them are WITH SCHEMABINDING, you need to DROP and CREATE them manually, as sketched below.
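One way to find those schema-bound dependents before you start, as a sketch (MyTBL is the example table name from above):
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS SchemaName,
       OBJECT_NAME(m.object_id) AS ObjectName
FROM sys.sql_modules AS m
JOIN sys.sql_expression_dependencies AS d
    ON d.referencing_id = m.object_id
WHERE m.is_schema_bound = 1
  AND d.referenced_id = OBJECT_ID('dbo.MyTBL')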
I tried the SWITCH option recommended above but was unable to RESEED the identity. I could not find out why.
I used the following alternative approach instead:
1. Create a database snapshot for the database that contains the table (see the sketch after these steps).
2. Script the table definition of the table you intend to update.
3. Delete the table that you intend to update (make sure the database snapshot was created successfully).
4. Change SET ANSI_NULLS from OFF to ON in the script obtained in step 2 and run the updated script. The table is now recreated.
5. Populate your table with the data from the database snapshot:
SET IDENTITY_INSERT TABLE_NAME ON
INSERT INTO TABLE_NAME (PK, col1, etc.)
SELECT PK, col1, etc.
FROM [Database_Snapshot].dbo.TABLE_NAME
SET IDENTITY_INSERT TABLE_NAME OFF
6. Migrate the non-clustered indexes manually (script them from the database snapshot).
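A hedged sketch of step 1; the database name, logical data file name, and snapshot file path are placeholders that must match your own database's files:
-- Creates a read-only, point-in-time snapshot of MyDb (names and path are placeholders)
CREATE DATABASE MyDb_Snapshot
ON ( NAME = MyDb_Data, FILENAME = 'C:\Snapshots\MyDb_Data.ss' )
AS SNAPSHOT OF MyDb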
Using the above:
I did not have to worry about constraints and keys since table/constraint names always remain the same (I do not need to rename anything)
I have a backup of my data (the snapshot) which I can rely on to double check that nothing is missing.
I do not need to reseed the identity
I realize that deleting the table may not always be straightforward if the table is referenced by other tables. That was not the case for me in this instance; I was lucky.
I have this SQL change script that runs as part of my NAnt-orchestrated DB creation or update:
SET XACT_ABORT ON
BEGIN TRANSACTION
PRINT 'Change MyColumn column to MyNewColumn column in MyTable table'
IF EXISTS (SELECT *
FROM sys.columns
WHERE Name = 'MyColumn' AND Object_id = OBJECT_ID('[dbo].[MyTable]'))
BEGIN
PRINT '-> Exists, change it'
/* NOTE THE NEXT LINE */
SET @Value = (SELECT MyColumn FROM [dbo].[MyTable])
ALTER TABLE [dbo].[MyTable]
DROP CONSTRAINT DF_MyTable_MyColumn
ALTER TABLE [dbo].[MyTable]
DROP COLUMN MyColumn
ALTER TABLE [dbo].[MyTable]
ADD MyNewColumn nvarchar(20) NULL
ALTER TABLE [dbo].[MyTable]
ADD CONSTRAINT DF_MyTable_MyNewColumn DEFAULT ('') FOR MyNewColumn
PRINT '-> Add values back into table'
SET @Dynamic_Sql = 'UPDATE [dbo].[MyTable] SET MyNewColumn = ''' + @Value + ''''
EXEC(@Dynamic_Sql)
PRINT '-> Alter to NOT NULL'
ALTER TABLE [dbo].[MyTable]
ALTER COLUMN MyNewColumn nvarchar(20) NOT NULL
END
ELSE
BEGIN
PRINT '-> Does not exist, skip it'
END
I have already run this update script before and made the changes to the DB (so MyColumn no longer exists). But now I have a new script that comes after this one, and my "build" fails on this line of this script with:
Msg 207, Level 16, State 1, Line 15
Invalid column name 'MyColumn'
where Line 15 is the FROM sys.columns line. But it is actually complaining about the line inside the IF statement where I have put the NOTE comment. Why would this be the behaviour? Of course the column name is invalid if it no longer exists.
Do you include the GO batch separator after you create all of your columns? If not, the columns won't be created by the time your first query runs, because the query parser parses it all at the same time -- at parse time, the column really doesn't exist.
By adding the GO batch separator, you force it to parse the portions of the query which use your newly created columns after the columns are actually created.
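A toy illustration of the batch boundary this answer is describing; the table and column names here are invented for the example:
CREATE TABLE dbo.BatchDemo (Id int)
GO
ALTER TABLE dbo.BatchDemo ADD NewCol nvarchar(20) NULL
GO  -- without this separator the UPDATE below fails to compile with "Invalid column name 'NewCol'"
UPDATE dbo.BatchDemo SET NewCol = N''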
The problem (as Dave Markle alludes to, so feel free to accept his answer) is that SQL Server parses the entire section of the script. It sees that you're referring to MyColumn and that column doesn't exist, so it gives you the error. It doesn't matter that it's within an IF statement.
You can test it easily with this script:
CREATE TABLE dbo.Test (my_id int)
GO
IF (1=0)
SELECT blah FROM Test
If I can find a way to defer the parsing, I'll update this answer, but other than using dynamic SQL I don't think that you can.
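For what it's worth, a hedged sketch of what that dynamic SQL route could look like for the script in the question (the nvarchar(20) type for @Value is an assumption taken from the new column's definition):
DECLARE @Value nvarchar(20)
IF EXISTS (SELECT *
           FROM sys.columns
           WHERE Name = 'MyColumn' AND Object_id = OBJECT_ID('[dbo].[MyTable]'))
BEGIN
    -- The reference to MyColumn lives inside a string, so it is only
    -- compiled if and when this branch actually runs.
    EXEC sp_executesql
        N'SELECT @Value = MyColumn FROM [dbo].[MyTable];',
        N'@Value nvarchar(20) OUTPUT',
        @Value = @Value OUTPUT;
    -- ...then drop/re-add the column and write @Value back, as in the original script
END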
EDIT:
Here's one possible solution. I didn't go through Martin's link yet, but that may be another.
CREATE FUNCTION dbo.Get_my_id ()
RETURNS INT
AS
BEGIN
DECLARE @my_id INT
SELECT @my_id = blah FROM dbo.Test
RETURN @my_id
END
GO
CREATE TABLE dbo.Test (my_id INT)
GO
DECLARE @my_id INT
IF (1=0)
SELECT @my_id = dbo.Get_my_id()
GO
BTW, if your table has more than one row in it, you realize that the value of your variable cannot be predicted, correct?