In a nutshell, I have a temp table which stores a range of data. The number of rows could be dynamic depending on when this query is run. I'm then trying to alter the table and add an identity which I want to start at a defined value that I have chosen.
Whatever I try, when viewing the data the identity column always starts at 1 and increments from there, not from the value I specified.
--At this point of the code the data is already in the table!
Set @existingNumber =
(
--Get a number from a table
)
Set @existingNumber = @existingNumber + 1
Alter table #myTable
Add testID bigint Identity
DBCC CHECKIDENT (#myTable, RESEED, @existingNumber)
Is there a way that I can simply restart the Identity column for pre existing values?
I have tried creating the identity along with the table, but that causes another problem due to how our company framework works, unfortunately.
EDIT:
Using SQL Management Studio
Your @existingNumber variable is not being set correctly.
The example below works fine:
declare @existingNumber int = 1000
create table #test (col1 varchar(20))
alter table #test add testId bigint Identity
dbcc checkident (#test, reseed, @existingNumber)
insert into #test values ('test record')
select * from #test
drop table #test
I have the following table:
CREATE TABLE Seq2 (val INT NOT NULL IDENTITY);
How do I populate this table, given that I tried this:
INSERT INTO Seq2(val) VALUES (1)
and got the following error:
Cannot insert explicit value for identity column in table 'Seq2' when
IDENTITY_INSERT is set to OFF.
Having such a table seems completely pointless, if I must say. If the table has only an IDENTITY column then it effectively holds no meaning, so there's no point in it being there.
That being said, if you did have such a table, you can INSERT values into the IDENTITY using DEFAULT VALUES:
INSERT INTO dbo.Seq2
DEFAULT VALUES;
INSERT INTO dbo.Seq2
DEFAULT VALUES;
With a new table, this would create rows with the values 1 and 2.
If you want to explicitly INSERT values into the table, then you're better off removing the IDENTITY property. Considering this is a new table, just DROP it and recreate it without the IDENTITY property:
DROP TABLE dbo.Seq2;
GO
CREATE TABLE Seq2 (val INT NOT NULL);
Having a table with a single IDENTITY column whose values you're then going to define yourself really is pointless. Either don't use IDENTITY and define the values, or use IDENTITY and let SQL Server handle it.
SET IDENTITY_INSERT Seq2 ON
INSERT INTO Seq2(val)VALUES (1)
SET IDENTITY_INSERT Seq2 OFF
Simply enable IDENTITY_INSERT for the table. That looks like this:
SET IDENTITY_INSERT Seq2 ON
INSERT INTO Seq2(val) VALUES (1)
SET IDENTITY_INSERT Seq2 OFF
Keep in mind:
It can only be enabled on one table at a time. If you try to enable it on a second table while it is still enabled on a first table, SQL Server will generate an error.
When it is enabled on a table you must specify a value for the identity column.
The user issuing the statement must own the object, be a system administrator (sysadmin role), be the database owner (dbo), or be a member of the db_ddladmin role in order to run the command.
SELECT SCOPE_IDENTITY() -- to get the last identity value generated in the same session and scope
SELECT @@IDENTITY -- to get the last identity value generated in the session, irrespective of scope
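To illustrate the difference, here is a minimal sketch using hypothetical tables TableA and TableB (not part of the question), where a trigger on TableA also generates an identity value in TableB in the same session:
CREATE TABLE dbo.TableA (id INT IDENTITY(1,1), val VARCHAR(10));
CREATE TABLE dbo.TableB (id INT IDENTITY(100,1), val VARCHAR(10));
GO
CREATE TRIGGER trgA ON dbo.TableA FOR INSERT AS
    INSERT INTO dbo.TableB (val) SELECT val FROM inserted;
GO
INSERT INTO dbo.TableA (val) VALUES ('x');
SELECT SCOPE_IDENTITY(); -- 1: the identity generated in this scope (TableA)
SELECT @@IDENTITY;       -- 100: the last identity in the session (TableB, via the trigger)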
I'm attempting to create a 'history' table that gets updated every time a row on the source table is updated.
Here's the (SQL Server) code I'm using to create the history table:
DROP TABLE eventGroup_History
SELECT
CAST(NULL AS UNIQUEIDENTIFIER) AS NewId,
CAST(NULL AS varchar(255)) AS DoneBy,
CAST(NULL AS varchar(255)) AS Operation,
CAST(NULL AS datetime) AS DoneAt,
*
INTO
eventGroup_History
FROM
eventGroup
WHERE
1 = 0
GO
ALTER TABLE eventGroup_History
ALTER COLUMN NewId UNIQUEIDENTIFIER NOT NULL
go
ALTER TABLE eventGroup_History
ADD PRIMARY KEY (NewId)
GO
ALTER TABLE eventGroup_History
ADD CONSTRAINT DF_eventGroup_History_NewId DEFAULT NewSequentialId() FOR NewId
GO
The trigger is created like this:
drop trigger eventGroup_LogUpdate
go
create trigger eventGroup_LogUpdate
on dbo.eventGroup
for update
as
declare @Now as DateTime = GetDate()
set nocount on
insert into eventGroup_History
select @Now, SUser_SName(), 'update-deleted', *
from deleted
insert into eventGroup_History
select SUser_SName(), 'update-inserted', @Now, *
from inserted
go
exec sp_settriggerorder @triggername = 'eventGroup_LogUpdate', @order = 'last', @stmttype = 'update'
But when I update a row in SQL Server Management Studio, I get a message:
The data in row 2 was not committed.
Error Source: .Net SqlClient Data Provider.
Error Message: Conversion failed when converting from a character string to uniqueidentifier.
I think that the trigger is attempting to insert the SUser_SName() value as the first column of the row, but that is the PK NewId.
There are no other uniqueidentifier columns in the table.
If I add row from the SQL Management Studio's edit grid, the row gets added without me having to specify the NewId value.
So, why is the SQL Server trigger attempting to populate NewId with first item in the INSERT INTO clause rather than skipping it to let the normal IDENTITY operation provide a value?
(And how do I stop this happening so that the trigger works?)
Because the automatic skipping only applies to IDENTITY columns - a GUID column set with the NewSequentialId() constraint behaves similarly to IDENTITY in many ways but not this one.
You can achieve what you are looking for by specifying the columns for the INSERT explicitly.
If you're going to use a default value on your NewId column, you need to explicitly list the column names in the INSERT statements. By default, SQL Server will insert the columns in the order they're listed in the SELECT, unless you give it enough information to do otherwise. Listing out the columns explicitly is a best practice, one way or the other, in order to avoid just this sort of unanticipated result.
So your statements will end up looking like this:
INSERT INTO eventGroup_History
(
DoneBy,
Operation,
DoneAt,
<All the other columns that are masked by the *>
)
SELECT....
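For instance, assuming hypothetically that eventGroup had just the columns eventGroupId and name (the real columns are not listed in the question), the corrected insert for the deleted rows would be a sketch like this, with NewId omitted so its NewSequentialId() default fills it in:
insert into eventGroup_History (DoneBy, Operation, DoneAt, eventGroupId, name)
select SUser_SName(), 'update-deleted', @Now, eventGroupId, name
from deleted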
I have a big database and I need to normalize it. One of the tables contains a field of type integer, but it only ever holds the values 1 and 0, so there is good reason to convert this field to the bit type. But when I try to save the change in SQL Server Management Studio it tells me that I can't. I also have many fields of types like nvarchar or float that should be converted to int.
Moreover, I need to create migration scripts for all the changes so I can update the real database without losing data. Maybe somebody knows a useful utility for this?
EDIT: It tells me that I can't update the table without dropping it, and I want to update the table without losing any data.
SQL Server version: 2014
--Create a temp column
Alter Table [dbo].[Demo2]
Add tempId int
GO
--Copy the data into the temp column
Update [dbo].[Demo2] set tempId=Id
--Drop the column you want to modify
Alter Table [dbo].[Demo2]
Drop Column Id
GO
--Create that column again with the bit type
Alter Table [dbo].[Demo2]
Add Id bit
GO
--Copy the data back
Update [dbo].[Demo2] set Id=tempId
--Drop the temp column
Alter Table [dbo].[Demo2]
Drop Column tempId
GO
Here's how to add a new column to a table, set that column to the values of the old column, and then remove the old column:
CREATE TABLE #test
(
    inttest int
)
Insert [#test] ( [inttest] )
Values ( 0 )
Insert [#test] ( [inttest] )
Values ( 1 )
Alter Table [#test] Add bittest bit
Update [#test] Set bittest=inttest
Alter Table [#test] Drop Column [inttest]
SELECT * FROM [#test] [T]
To generate a migration script you don't need a special utility; SSMS does it pretty well.
Right-click the table in the SSMS Object Explorer and choose Design from the context menu. Change the type of the column. Then, from the Table Designer main menu, choose Generate Change Script. Save the generated script to a file, review it, and make sure you understand every line before you run it on a production system. Adjust the script if needed.
On the other hand, to change the column type from int to bit you can use the ALTER TABLE statement:
ALTER TABLE dbo.TableName
ALTER COLUMN ColumnName bit NOT NULL
Before running this you should check that actual int values are indeed only 0 and 1.
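A quick sanity check, sketched here with the same placeholder table and column names, could be:
-- Any rows returned here would block a clean int-to-bit NOT NULL conversion
SELECT COUNT(*) AS offending_rows
FROM dbo.TableName
WHERE ColumnName NOT IN (0, 1)
   OR ColumnName IS NULL;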
After reading all the comments and posts I found a solution: building a procedure that converts the passed table and column as required. So I wrote this procedure:
IF EXISTS (select * from dbo.sysobjects where id = object_id(N'IntToBit') and OBJECTPROPERTY(id, N'IsProcedure') = 1)
DROP PROCEDURE IntToBit
GO
IF OBJECT_ID('convertion_table', 'U') IS NOT NULL
DROP TABLE dbo.convertion_table;
go
CREATE TABLE dbo.convertion_table
(
bitTypeColumn bit NOT NULL DEFAULT 0,
intTypeColumnt integer
)
go
CREATE procedure IntToBit
@table nvarchar(150),
@column nvarchar(150)
AS
begin
DECLARE @sql nvarchar(4000)
SELECT @sql ='
--Copy the data to the temp table
INSERT INTO convertion_table (bitTypeColumn)
SELECT '+@column +'
FROM ' +@table+'
--Drop the column you want to modify
Alter Table ' +@table+'
Drop Column '+@column +'
--Create that column again with the bit type
Alter Table ' +@table+'
Add '+@column +' bit NOT NULL DEFAULT(0)
--Copy the data back
INSERT INTO '+@table+'('+@column+')
SELECT bitTypeColumn
FROM convertion_table
--Clear the temp table
--DELETE bitTypeColumn FROM convertion_table
'
exec sp_executesql @sql
end
GO
and then call it, passing the table and column name:
exec dbo.IntToBit @table = 'tbl_SystemUsers', @column='intUseLogin';
Special thanks to Chris K and Hitesh Thakor
Simply use a T-SQL script to modify the table rather than using the designer:
ALTER TABLE YourTableNameHere
ALTER COLUMN YourColumnNameHere INT
If you are using SQL Server, then you might want to generate a script for the table before altering it so that you don't lose any data, and you can simply restore everything using the script.
I have a database in which all the tables have a column named ID. I need to alter all the tables in the database and add identity to these columns. Is it possible with some query, or do I have to do it manually?
Thank you very much.
Unfortunately, in SQL Server you cannot add the identity property to existing columns. You need to drop the existing column, then create a new one with this property. You can automate this task by querying the system tables and using dynamic SQL. But if you already have some data in the ID column this will make things more tricky, because you need to preserve the existing data.
Here is a script for a single table; automating this for all tables in the database using dynamic SQL will be kinda tricky...
Table Test_table has 2 columns: id and val.
-- Move data to temp storage
SELECT ID,
VAL
INTO #temp_table
FROM dbo.test_table
-- Remove data from original table
DELETE
FROM dbo.test_table
-- Drop and Create ID column
ALTER TABLE dbo.test_table
DROP COLUMN ID
ALTER TABLE dbo.test_table
ADD ID int IDENTITY(1,1)
-- Move data back to original table
SET IDENTITY_INSERT dbo.test_table ON
INSERT INTO dbo.test_table (ID, VAL)
SELECT ID, VAL
FROM #temp_table
DECLARE @MaxID int
SELECT @MaxID = MAX(ID) + 1
FROM dbo.test_table
SET IDENTITY_INSERT dbo.test_table OFF
-- Reseed IDENTITY property
DBCC CHECKIDENT ('dbo.test_table', RESEED, @MaxID)
There is no way to do this for all tables. Here's what I'd do: Use T-SQL to generate a (big) script that performs all the changes, then manually run that script.
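As a rough sketch of that idea (assuming every table really does have an int column called ID and, as above, that dropping the column is acceptable because the existing values do not need to be preserved), a metadata query like this could print the statements for review:
SELECT 'ALTER TABLE ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + ' DROP COLUMN ID; '
     + 'ALTER TABLE ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + ' ADD ID int IDENTITY(1,1);'
FROM sys.tables AS t
JOIN sys.schemas AS s ON s.schema_id = t.schema_id
JOIN sys.columns AS c ON c.object_id = t.object_id AND c.name = 'ID'
WHERE c.is_identity = 0; -- skip tables where ID is already an identity
If the existing ID values must be preserved, each generated block would instead need to follow the copy-out/copy-back pattern shown in the script above.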
You can add the identity property to existing columns without data movement using SWITCH.
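A minimal sketch of the SWITCH approach, assuming a non-partitioned table shaped like the dbo.test_table example above, with ID already NOT NULL (identity columns cannot be nullable); the staging table must match the column definitions, nullability, and filegroup exactly:
-- Staging table: identical schema, but with the IDENTITY property on ID
CREATE TABLE dbo.test_table_staging
(
    ID  int IDENTITY(1,1) NOT NULL,
    VAL varchar(10) NULL
);
-- Metadata-only operation: all rows move without any data being copied
ALTER TABLE dbo.test_table SWITCH TO dbo.test_table_staging;
DROP TABLE dbo.test_table;
EXEC sp_rename 'dbo.test_table_staging', 'test_table';
-- Continue numbering after the existing maximum ID
DBCC CHECKIDENT ('dbo.test_table', RESEED);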
The following statement can reset the seed:
DBCC CHECKIDENT ('TableName', RESEED, 1)
but this time I have to change its increment.
Or you can use SQL Server Management Studio:
Using this approach will most likely recreate the table.
Hope this helps
ALTER TABLE MyCustomers
ALTER COLUMN CustId IDENTITY (200, 2)
Code from ms-help://MS.VSCC.v90/MS.MSDNQTR.v90.en/ssmprog3/html/5719d3e4-14db-4073-bed7-d08f39416a39.htm
We can't change a column's identity to increment by 2 on each entry. The easiest method is to create another table with IDENTITY(1,2), move the data to that table, and then drop the actual table. Please go through the script below.
Let TableA be our actual table.
CREATE TABLE TableB(col1 INT IDENTITY (1,2) NOT NULL, col2 VARCHAR(10) NULL);
INSERT INTO TableB SELECT col2 FROM TableA;
DROP TABLE TableA;
sp_rename TableB, TableA;
You can reset the auto increment to a higher value by:
Setting IDENTITY_INSERT on
Running an insert statement with the ID set to the value you want to continue from
Setting IDENTITY_INSERT off
Deleting the row created (assuming it was only inserted to bump the seed)
For example, table 'People' with last ID = 10 can be set to continue from 400,000 by:
SET IDENTITY_INSERT People ON
insert into People(Id, name) values (400000, 'Bob')
SET IDENTITY_INSERT People OFF
Delete from People where ID = 400000