SQL Query creating copy of old SQL entry with two values changed [duplicate]

This question already has answers here:
Quickest way to clone row in SQL
(5 answers)
Closed 8 years ago.
I have a table with around 50 to 60 columns (and counting), and I would like to know whether I can create a generic INSERT ... SELECT query to copy one row, but with two columns changed.
More specifically, I want to fetch one global config from the table configs and insert it back into configs, with the flag global set to false and a new auto-increment id.
Something like:
INSERT INTO configs
(SELECT TOP 1 * FROM configs WHERE global=1)
UPDATE global=0, id=?
(And of course the new auto-increment id should be returned to me, since I have to update the user's profile.)

Here is a fully functional solution with a demonstration of how it works. I'm assuming you are completing this action inside a stored procedure. I basically clone the current global=1 row into a temp table, then drop off the IDENTITY column so you can use SELECT * to reinsert the record. By using SELECT *, you will not have to update this whenever the column count increases.
-- setup demonstration with two sample columns of data
CREATE TABLE #configs (ID INT IDENTITY(100,1), [Global] INT, ColA CHAR(2), ColB VARCHAR(2));
-- fill with values
SET NOCOUNT ON;
INSERT #configs VALUES (1,'AA','BB');
INSERT #configs VALUES (1,'CC','DD');
INSERT #configs VALUES (1,'EF','GH');
SET NOCOUNT OFF;
-- This is the target ID we are working with
DECLARE @CloneID INT = 100;
-- Examine the ID
SELECT * FROM #configs WHERE ID=@CloneID;
-- This work should be completed in a transaction
BEGIN TRANSACTION;
-- copy current "global=1" record into a temp table and change its value to 0
SELECT * INTO #temp FROM #configs WHERE ID=@CloneID AND [Global]=1;
UPDATE #temp SET [Global]=0;
-- drop off the IDENTITY column so we can select it into main table again
ALTER TABLE #temp DROP COLUMN [ID];
-- copy the old "global=1" record back into main table, its value has been changed
INSERT #configs SELECT * FROM #temp;
COMMIT;
-- Examine
SELECT * FROM #configs;
-- cleanup
DROP TABLE #temp;
DROP TABLE #configs;
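To also get the new auto-increment id the question asks for, SCOPE_IDENTITY() can be read right after the clone insert. A minimal sketch, assuming the INSERT above added exactly one row in this scope:
-- Immediately after "INSERT #configs SELECT * FROM #temp;" and before COMMIT:
DECLARE @NewID INT = SCOPE_IDENTITY();
SELECT @NewID AS NewConfigID; -- return this to update the user's profile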

Related

Create a string from inserted.id via trigger

I'm creating a trigger on insert that will create a new record in another table using the id from the newly inserted row.
I need to format the id before I insert it into the other table. This is my code so far... I'm having problems creating the formatted @PseudoID.
CREATE TRIGGER OnInsertCreateUnallocatedPseudo
ON tblTeams
AFTER INSERT
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
SET NOCOUNT ON;
-- Insert statements for trigger here
DECLARE @PseudoID NVARCHAR(50), @tmID NVARCHAR(10)
SELECT @tmID = CONVERT(nvarchar(10), inserted.tmID) FROM inserted
-- NEED SOME CODE TO CREATE A PADDED OUT PseudoID e.g.
-- if @tmID = '7' then @PseudoID = 'PSU0007'
-- if @tmID = '25' then @PseudoID = 'PSU0025'
INSERT INTO [dbo].[tblUsersPseudo].....
END
You can't assign it to a variable within the trigger since there could be multiple rows, but you could do something like this to insert into the other table.
LEFT('PSU0000',LEN('PSU0000')-LEN(CONVERT(nvarchar(10),inserted.tmID))) + CONVERT(nvarchar(10),inserted.tmID)
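For example, applied set-based inside the trigger (a sketch; PseudoID is a hypothetical column name, since the question elides the tblUsersPseudo column list):
-- Sketch: pad every inserted tmID in one set-based statement.
INSERT INTO [dbo].[tblUsersPseudo] (PseudoID)
SELECT LEFT('PSU0000', LEN('PSU0000') - LEN(CONVERT(nvarchar(10), i.tmID)))
     + CONVERT(nvarchar(10), i.tmID)
FROM inserted AS i;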
You can't assume that Inserted has only a single row, you have to treat it as a table with 0-N rows and carry out set-based operations on it (rather than procedural).
FORMAT will format your existing id into the new format you require.
INSERT INTO [dbo].[tblUsersPseudo] (id, col1, col2, ...)
SELECT FORMAT(id,'PSU0000')
, col1, col2
FROM Inserted;
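FORMAT requires SQL Server 2012 or later; in a custom numeric format string the 0 placeholders pad the number and other characters pass through literally, so for example:
SELECT FORMAT(7, 'PSU0000'); -- returns 'PSU0007'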

Cloning a table definition to a table variable in SQL Server

Is there a way to clone the table definition from an existing table and recreate as a table variable?
DECLARE @TempTable1 TABLE (ID INT, Description VARCHAR(256))
I need to recreate a set of tables with the same number of columns and definitions without repeating the DECLARE TABLE statement.
This process is available in MySQL, as below:
CREATE TABLE TempTable1 LIKE TempTableMain;
Is it possible to do this in Microsoft SQL Server?
Please note that the actual scenario contains more than 60 columns in the table variable, and I need to create more than 10 instances of it from the original table.
I am not talking about data insertion or SELECTing from another table, as below. I need to create the table definition.
DECLARE @TempTable TABLE(ID INT, Description VARCHAR(100))
INSERT INTO @TempTable
VALUES (1, 'Test1'), (1, 'Test1');
SELECT *
INTO #TempTable2
FROM @TempTable
SELECT * FROM #TempTable2
Create a user-defined type with the columns of your table, let's say like this:
CREATE TYPE MyTableType AS TABLE (ID INT, Description VARCHAR(256));
And then declare your table variables using this type:
DECLARE @Table1 MyTableType;
DECLARE @Table2 MyTableType;
DECLARE @Table3 MyTableType;
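Each variable then behaves like an ordinary table variable, for example:
INSERT INTO @Table1 (ID, Description) VALUES (1, 'First row');
SELECT * FROM @Table1;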
SQL Server Management Studio gives you the option to generate a SQL script that creates an already existing table.
Right click your table -> Script Table as -> CREATE To -> New Query Editor Window
This way you don't have to write out the whole query every single time.
You could even create a stored procedure which takes the name of the table to be created as an argument, and run it from a WHILE loop.
You can perform the following command:
SELECT * INTO #MyTable_tmp FROM MyTable
Then modify MyTable, and copy your data back in. Another approach I've seen is to create a new table called MyTable_Tmp (not a temp table), which will become your new table.
Then copy your data, doing any migrations you need. Then drop the original table and rename MyTable_Tmp to MyTable (a sketch follows below).
When you run SELECT * INTO #MyTable FROM MyTable, SQL Server creates a new temporary table called #MyTable that matches each column and data type from your SELECT clause. Since we are selecting * here, it will match MyTable. This only creates the columns; it doesn't copy defaults, constraints, indexes, or anything else.
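A sketch of that side-by-side swap (assuming nothing else, such as foreign keys or views, depends on MyTable):
-- Build the replacement table, applying any column changes in the SELECT.
SELECT * INTO MyTable_Tmp FROM MyTable;
-- Swap it in; sp_rename keeps the data just copied.
DROP TABLE MyTable;
EXEC sp_rename 'MyTable_Tmp', 'MyTable';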
If you are using table variables, it means you don't need them for a long period of time, as they are "forgotten" after every script completes.
So the easiest approach, in my opinion, is to use a construct like this:
IF OBJECT_ID('tempdb.dbo.#tmpTable', 'U') IS NOT NULL
DROP TABLE #tmpTable;
SELECT * INTO #tmpTable FROM MyPrimaryTable
This creates a temporary table exactly like yours; if you want an empty table, you can just use:
SELECT * INTO #tmpTable FROM MyPrimaryTable WHERE 1 = 0
Then the temporary table will have exactly the same schema as your primary table.
You can apply this as many times as you need (create as many temporary tables as you need).
You could use regular tables instead of temporary tables as well.
If you want to re-create the table after dropping the existing one, you can use the query below.
/*
Create brands table
*/
-- Old block of code
IF EXISTS (SELECT * FROM sys.objects
WHERE object_id = OBJECT_ID(N'[TOY].[BRANDS]') AND type in (N'U'))
DROP TABLE [TOY].[BRANDS]
GO
-- New block of code
DROP TABLE IF EXISTS [TOY].[BRANDS]
GO
-- Add new table
CREATE TABLE TOY.BRANDS
(
ID INT NOT NULL,
NAME VARCHAR(20) NULL
)
GO
-- Load the table with data
INSERT INTO TOY.BRANDS (ID, NAME) VALUES
(1, 'Ford'),
(2, 'Chevy'),
(3, 'Dodge'),
(4, 'Plymouth'),
(5, 'Oldsmobile'),
(6, 'Lincoln'),
(7, 'Mercury');
GO

Insert Values from Table Variable into already EXISTING Temp Table

I'm successfully inserting values from a table variable into a new (not yet existing) temp table. I have no issues when inserting a small number of rows (e.g. 10,000), but when inserting a lot of rows (e.g. 30,000) into the table variable it throws the error "Server ran out of memory and external resources".
To work around the issue:
I split my (60,000) table variable rows into small batches (e.g. 10,000 each), thinking I could insert the new data into the already existing temp table, but I'm getting this error message:
There is already an object named '##TempTable' in the database.
My code is:
USE MyDataBase;
Go
Declare @@TableVariable TABLE
(
[ID] bigint PRIMARY KEY,
[BLD_ID] int NOT NULL
-- 25 more columns
)
Insert Into @@TableVariable VALUES
(1,25),
(2,30)
-- 61,000 more rows
Select * Into #TempTable From @@TableVariable;
Select Count(*) From #TempTable;
The problem is that SELECT INTO wants to create the destination table, so on the second run you get the error.
First you have to create the #TempTable:
/* this creates the temp table copying the @TableVariable structure */
Select *
Into #TempTable
From @TableVariable
where 1=0;
Now you can loop through your batches and call this insert as many times as you want:
insert Into #TempTable
Select * From @TableVariable;
Pay attention that #TempTable is different from ##TempTable ( # = Local, ## = Global ) and remember to drop it when you have finished.
Also, you should NOT use @@ for your table variable; use only @TableVariable.
I hope this helps.
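To illustrate the batch loop, a sketch (assuming ID is a dense ascending key; the batch size is hypothetical):
-- Copy from the table variable into the pre-created temp table, one slice at a time.
Declare @BatchSize bigint = 10000, @FromID bigint = 1, @MaxID bigint;
Select @MaxID = Max(ID) From @TableVariable;
While @FromID <= @MaxID
Begin
    Insert Into #TempTable
    Select * From @TableVariable
    Where ID Between @FromID And @FromID + @BatchSize - 1;
    Set @FromID = @FromID + @BatchSize;
End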

Insert into a temporary table and update another table in one SQL query (Oracle)

Here's what I'm trying to do:
1) Insert into a temp table some values from an original table
INSERT INTO temp_table SELECT id FROM original WHERE status='t'
2) Update the original table
UPDATE original SET valid='t' WHERE status='t'
3) Select based on a join between the two tables
SELECT * FROM original, temp_table WHERE temp_table.id = original.id
Is there a way to combine steps 1 and 2?
You can combine the steps by doing the update in PL/SQL and using the RETURNING clause to get the updated ids into a PL/SQL table.
EDIT:
If you still need to do the final query, you can still use this method to insert into the temp_table; although depending on what that last query is for, there may be other ways of achieving what you want. To illustrate:
DECLARE
  TYPE id_table_t IS TABLE OF original.id%TYPE INDEX BY PLS_INTEGER;
  id_table id_table_t;
BEGIN
  UPDATE original SET valid='t' WHERE status='t'
  RETURNING id BULK COLLECT INTO id_table;
  FORALL i IN 1..id_table.COUNT
    INSERT INTO temp_table
    VALUES (id_table(i));
END;
/
SELECT * FROM original, temp_table WHERE temp_table.id = original.id;
No, DML statements can not be mixed.
There's a MERGE statement, but it's only for operations on a single table.
Maybe create a TRIGGER which fires after inserting into temp_table and updates the original table.
Create a cursor holding the values from the insert and then loop through the cursor, updating the table. No need to create the temp table in the first place.
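A sketch of that idea in PL/SQL (assuming temp_table was only needed to know which ids were touched):
BEGIN
  FOR r IN (SELECT id FROM original WHERE status = 't') LOOP
    UPDATE original SET valid = 't' WHERE id = r.id;
    -- use r.id directly here instead of querying a temp table later
  END LOOP;
END;
/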
You can combine steps 1 and 2 using a MERGE statement and DML error logging. Select twice as many rows, update half of them, and force the other half to fail and then be inserted into an error log that you can use as your temporary table.
The solution below assumes that you have a primary key constraint on ID, but there are other ways you could force a failure.
Although I think this is pretty cool, I would recommend you not use it. It looks very weird, has some strange issues (the inserts into TEMP_TABLE are auto-committed), and is probably very slow.
--Create ORIGINAL table for testing.
--Primary key will be intentionally violated later.
create table original (id number, status varchar2(10), valid varchar2(10)
,primary key (id));
--Create TEMP_TABLE as error log. There will be some extra columns generated.
begin
dbms_errlog.create_error_log(dml_table_name => 'ORIGINAL'
,err_log_table_name => 'TEMP_TABLE');
end;
/
--Test data
insert into original values(1, 't', null);
insert into original values(2, 't', null);
insert into original values(3, 's', null);
commit;
--Update rows in ORIGINAL and also insert those updated rows to TEMP_TABLE.
merge into original original1
using
(
--Duplicate the rows. Only choose rows with the relevant status.
select id, status, valid, rownumber
from original
cross join
(select 1 rownumber from dual union all select 2 rownumber from dual)
where status = 't'
) original2
on (original1.id = original2.id and original2.rownumber = 1)
--Only match half the rows, those with rownumber = 1.
when matched then update set valid = 't'
--The other half will be inserted. Inserting ID causes a PK error and will
--insert the data into the error table, TEMP_TABLE.
when not matched then insert(original1.id, original1.status, original1.valid)
values(original2.id, original2.status, original2.valid)
log errors into temp_table reject limit 999999999;
--Expected: ORIGINAL rows 1 and 2 have VALID = 't'.
--TEMP_TABLE has the two original values for ID 1 and 2.
select * from original;
select * from temp_table;

Insert into Table select result set from stored procedure but column count is not same

I need something like this, which of course does not work:
insert into Table1
(
Id,
Value
)
select Id, value from
(
exec MySPReturning10Columns
)
I wanted to populate Table1 from the result set returned by MySPReturning10Columns. Here the SP returns 10 columns and the table has just 2 columns.
The following way works as long as the table and the SP's result set have the same number of columns, but in my case they do not:
INSERT INTO TableWith2Columns
EXEC usp_MySPReturning2Columns;
Also, I want to avoid adding "." as a linked server just to make OPENQUERY and OPENROWSET work.
Is there a way to avoid having to define the table structure for the temp table (all columns with data types and lengths)? Something like a CTE.
You could use a temporary table as a go-between:
insert into #TempTable exec MySP
insert into Table1 (id, value) select id, value from #TempTable
You could solve the problem in two steps: insert from the stored procedure into a temporary table, then do an insert selecting just the columns you want from the temporary table.
Information on temporary tables: http://www.sqlteam.com/article/temporary-tables
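Note that INSERT ... EXEC requires the destination temp table to already exist with a column list matching the procedure's result set. A sketch, with hypothetical column names and types standing in for the 10 real ones:
-- #SpResult must declare all 10 columns the SP returns, with matching types.
CREATE TABLE #SpResult
(
    Id int,
    Value varchar(50)
    -- ... the remaining 8 columns go here
);
INSERT INTO #SpResult EXEC MySPReturning10Columns;
INSERT INTO Table1 (Id, Value)
SELECT Id, Value FROM #SpResult;
DROP TABLE #SpResult;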
-- Well, declare a temp table or a table var, depending on the number of rows expected
-- from the SP. This table will be basically the result set of your SP.
DECLARE @spResult AS TABLE
(
ID INT,
VALUE FLOAT,
....
);
-- Get the result set of the SP into the table variable.
INSERT @spResult EXEC STORED_PROC;
-- Now you can query the SP's result set for ID and VALUE:
INSERT Table1 (ID, VALUE)
SELECT ID, VALUE FROM @spResult;
You don't need to create a temporary table; you can do it with a single query by creating a temporary view like this:
with tempView as EXEC MySPReturning10Columns
insert into Table1 select id, value from tempView
The temporary view disappears as soon as the statement finishes executing.