I'm debugging a stored procedure by running various pieces of the code manually. The problem is the code creates a lot of temp tables, so I have to add a lot of "DROP TABLE #name" to the start of my query in order for it to work for multiple runs.
Is there a way to drop all temp tables before my query runs?
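The kind of guard I keep prepending for each table looks like this (the table name is just an example):
IF OBJECT_ID('tempdb..#WorkTable') IS NOT NULL DROP TABLE #WorkTable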
You can find the solution in an old post here ...
I would suggest first making sure that it only deletes the tables you want to delete, by printing the prepared query:
declare @sql nvarchar(max)
select @sql = isnull(@sql+';', '') + 'drop table ' + quotename(name)
from tempdb..sysobjects
where name like '##%'
select @sql
And then, if all the table names look good to you, run it with EXEC:
EXEC( @sql )
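Note that local temp tables (single #) appear in tempdb under padded names with a unique suffix, which is why this pattern only targets ## names; you can list them with a read-only query like:
select name from tempdb..sysobjects where name like '#[^#]%'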
Below is the test I did for this. It worked because my temp tables have names starting with '##', and in my case the pattern didn't pull in any other global temp tables.
declare @sqlCreate nvarchar(max)
declare @sqldrop nvarchar(max)
--Creating temp tables
set @sqlCreate = 'create table ##tempTable1( c int, b int, a nvarchar(50) ) ; create table ##tempTable2( c int, b int, a nvarchar(50) )'
EXEC( @sqlCreate )
--Preparing the drop statement
select @sqldrop = isnull(@sqldrop+';', '') + 'drop table ' + quotename(name)
from tempdb..sysobjects
where name like '##%'
--Making sure that only my temp tables are returned
select @sqldrop
--Executing the drop query
EXEC(@sqldrop)
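As an aside, if you are on SQL Server 2016 or later and only have a handful of known names, DROP TABLE IF EXISTS shortens the per-table guards (the names here are just the ones from the test above):
DROP TABLE IF EXISTS ##tempTable1, ##tempTable2;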
Related
I am getting a temp table with dynamically generated columns (say columns A, B, C, D, etc.) from another source.
Now I have this temp table with its generated columns in hand, and I have to write a stored procedure that uses it.
So my stored procedure is like:
create proc someproc()
as
begin
Insert into #searchtable
select isnull(#temp.*,0.00)
End
Now #searchtable is a table created by me to store the temp table's columns. The problem arises when I want to apply ISNULL to the #temp columns: the source may send 3 columns one time and 4 columns the next time, so the column list changes.
Since it is dynamically generated, I cannot reference each column by name like below:
isnull(column1,0.00)
isnull(column2,0.00)
I need to cover every generated column and substitute 0.00 when the value is null.
I tried the below, but it does not work:
isnull(##temp.*,0.00),
Try dynamic SQL, fetching the column names for your dynamic table from [database].INFORMATION_SCHEMA.COLUMNS (tempdb in the case of temp tables):
--Get the column names for your dynamic table and add the ISNULL check:
DECLARE @COLS VARCHAR(MAX) = ''
SELECT @COLS = @COLS + ', ISNULL(' + COLUMN_NAME + ', 0.00) AS ' + COLUMN_NAME
FROM tempdb.INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME LIKE '#temp[_]%' -- Dynamic table (here, a temporary table)
DECLARE @COLNAMES VARCHAR(MAX) = STUFF(@COLS, 1, 1, '')
--Build your INSERT command:
DECLARE @cmd VARCHAR(MAX) = '
INSERT INTO #temp1
SELECT ' + @COLNAMES + ' FROM #temp'
--Execute:
EXEC (@cmd)
Hope I understood your comment right:
CREATE PROCEDURE someproc
AS
IF OBJECT_ID(N'tempdb..#searchtable') IS NOT NULL DROP TABLE #searchtable
IF OBJECT_ID(N'tempdb..#temp') IS NOT NULL
BEGIN
DECLARE @sql nvarchar(max),
@cols nvarchar(max)
SELECT @cols = (
SELECT ',COALESCE('+QUOTENAME([name])+',0.00) as '+QUOTENAME([name])
FROM tempdb.sys.columns
WHERE [object_id] = OBJECT_ID(N'tempdb..#temp')
FOR XML PATH('')
)
SELECT @sql = N'SELECT '+STUFF(@cols,1,1,'')+' INTO #searchtable FROM #temp'
EXEC sp_executesql @sql
END
This SP checks whether the #temp table exists. If it does, it takes all the column names from tempdb's sys.columns and builds a string like ,COALESCE([Column1],0.00) as [Column1], etc. Then we build a dynamic SQL query like:
SELECT COALESCE([Column1],0.00) as [Column1] INTO #searchtable FROM #temp
and execute it. The query result is stored in #searchtable.
Notes: use COALESCE instead of ISNULL, and sp_executesql instead of a direct EXEC; both are good practice.
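For example, sp_executesql lets you pass parameters instead of concatenating values into the string (a minimal sketch; the table and parameter names are made up):
DECLARE @sql nvarchar(max) = N'SELECT * FROM dbo.Orders WHERE CustomerId = @CustomerId';
EXEC sp_executesql @sql, N'@CustomerId int', @CustomerId = 42;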
I'm currently shifting roles at my job and trying to teach myself some SQL skills.
Scenario: I'm in charge of one database - 10 tables with 10 primary keys. Every month, our code team publishes updates to the tables. I am supposed to drop the tables and generate scripts to create the updated tables.
Rather than just drop the old tables and stored procedures, I want to rename my current tables to preserve the structure/data for whatever reason.
In my database, I have an additional table called "TableUpdateList" with one column, "TableName", and 10 rows, each row containing the name of an updated table (Row 1 = TableName1, Row 2 = TableName2, Row 3 = TableName3).
I would like to be able to "loop" through the TableUpdateList Table and insert each value into a set of SQL statements.
For Example, here are the SQL statements I want to run:
--drop the previous backup table
IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES where TABLE_NAME = '*TableName1*'+'_Old') DROP TABLE TableName1_Old
-- rename the current tables to _old
EXEC sp_rename *TableName1*, TableName1_Old;
I'm trying to find a way to loop through the column of my TableUpdateList table and run the above two statements, filling in the italicized parts with the value present in each row.
Just taking a wild stab, because I think that in order to get an answer here you have to try something, so here is my pseudo-code:
Declare #TableNames as List
For i in #TableNames
IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES where TABLE_NAME = '*i*'+'_Old') DROP TABLE TableName1_Old
-- rename the current tables to _old
EXEC sp_rename *i*, TableName1_Old;
Oi, thanks in advance for any help or a point in the right direction to where I could do some further reading about the above online.
You can use sp_executesql with a cursor for this type of work. Here is what I think you need.
Test objects:
CREATE TABLE TableName1 ( ID INT )
GO
CREATE TABLE TableName2 ( ID INT )
GO
CREATE TABLE TableNames ( Name NVARCHAR(MAX) )
GO
INSERT INTO TableNames
VALUES ( 'TableName1' ),
( 'TableName2' )
Script itself:
DECLARE @name NVARCHAR(MAX) ,
@dropStatement NVARCHAR(MAX),
@renameStatement NVARCHAR(MAX)
DECLARE cur CURSOR FAST_FORWARD READ_ONLY
FOR
SELECT Name
FROM dbo.TableNames
OPEN cur
FETCH NEXT FROM cur INTO @name
WHILE @@FETCH_STATUS = 0
BEGIN
IF EXISTS ( SELECT *
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_NAME = @name + '_Old' )
BEGIN
SET @dropStatement = 'DROP TABLE ' + @name + '_Old'
EXEC sp_executesql @dropStatement
END
SET @renameStatement = 'sp_rename ' + @name + ', ' + @name + '_Old';
EXEC sp_executesql @renameStatement
FETCH NEXT FROM cur INTO @name
END
CLOSE cur
DEALLOCATE cur
After this you should add TableName1 and TableName2 again.
Cursors should be avoided whenever possible. Here is a set-based alternative:
--Preparing a script which checks whether the old table exists. If it does,
--it drops the old table.
--e.g. first the value 'Table1' is found in the TableUpdateList table.
--Then, Table1_Old is dropped and Table1 is renamed to Table1_Old.
SELECT 'DROP TABLE ' + b.name + '_Old; EXEC sp_rename ''' + b.name + ''', ''' + b.name + '_Old'';' AS [Action]
INTO #Action
FROM INFORMATION_SCHEMA.TABLES A JOIN TableUpdateList B ON A.TABLE_NAME = b.NAME + '_Old'
DECLARE @sql VARCHAR(8000)
SELECT @sql = COALESCE(@sql + ' ', '') + [Action]
FROM #Action
select @sql
--EXEC (@sql)
First verify the value of the variable @sql, then uncomment the last line to execute the code.
SQL fiddle
I need some help
My goal:
I have a dynamic query in a variable @query in a stored procedure, and it works perfectly. Now I have to perform more operations on the table returned by executing @query, within the same stored procedure, e.g. add more columns to the returned table and add new data from further queries.
My problem:
I am not able to get the table returned from (EXEC sp_executesql) into a variable (@TempTable). A further problem is that the number of columns returned is not known in advance (it is dynamic).
Steps should be like this:
Declare TempTable
TempTable = EXEC sp_executesql
Add new columns to TempTable
Fill more data
Kindly guide me
This can be achieved using global temp tables, like below:
DECLARE @sql NVARCHAR(1000)
DECLARE @Column NVARCHAR(1000)
SET @Column = 'YourColumnList' -- id, name etc., built by you dynamically.
IF( Object_id('tempdb..##IntermediateTable') IS NOT NULL )
DROP TABLE ##IntermediateTable
SET @sql = '
SELECT ' + @Column + '
Into ##IntermediateTable
FROM YourTable
WHERE id = 123
'
EXEC sp_executesql @sql
IF( Object_id('tempdb..#temptable') IS NOT NULL )
DROP TABLE #temptable
SELECT *
INTO #temptable
FROM ##IntermediateTable
IF( Object_id('tempdb..##IntermediateTable') IS NOT NULL )
DROP TABLE ##IntermediateTable
SELECT *
FROM #temptable --resulting temptable to use. alter it or do whatever desired.
I was trying to SELECT ... INTO a temp table #TempTable inside sp_executesql.
I am not sure whether the insert succeeded, but the Messages tab shows (359 row(s) affected); does that mean it was inserted successfully?
The script is below:
DECLARE @Sql NVARCHAR(MAX);
SET @Sql = 'select distinct Coloum1,Coloum2 into #TempTable
from SPCTable with(nolock)
where Convert(varchar(10), Date_Tm, 120) Between @Date_From And @Date_To';
SET @Sql = 'DECLARE @Date_From VARCHAR(10);
DECLARE @Date_To VARCHAR(10);
SET @Date_From = '''+CONVERT(VARCHAR(10),DATEADD(d,DATEDIFF(d,0,GETDATE()),0)-1,120)+''';
SET @Date_To = '''+CONVERT(VARCHAR(10),DATEADD(d,DATEDIFF(d,0,GETDATE()),0)-1,120)+''';
'+ @Sql;
EXECUTE sp_executesql @Sql;
After executing it, the Messages tab shows (359 row(s) affected).
But when I then try to select the data from #TempTable:
Select * From #TempTable;
it returns:
Msg 208, Level 16, State 0, Line 2
Invalid object name '#TempTable'.
I suspect only the SELECT part is working and the insert into the temp table is not. How can I fix it?
Using a global temporary table in this scenario could cause problems as the table would exist between sessions and may result in some problems using the calling code asynchronously.
A local temporary table can be used if it is defined before calling sp_executesql, e.g.:
CREATE TABLE #tempTable(id int);
execute sp_executesql N'INSERT INTO #tempTable SELECT myId FROM myTable';
SELECT * FROM #tempTable;
A local temporary table (#table_name) is visible in the current session only; global temporary tables (##table_name) are visible in all sessions. Both live until their session is closed.
sp_executesql creates its own session (maybe the word "scope" would be better), so that's why this happens.
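A quick way to see the scoping difference (a sketch; the table names are arbitrary):
-- created in the outer scope: visible inside the dynamic batch
CREATE TABLE #outer (id int);
EXEC sp_executesql N'INSERT INTO #outer VALUES (1)';
SELECT * FROM #outer;  -- returns the row
-- created inside the dynamic batch: gone once it returns
EXEC sp_executesql N'SELECT 1 AS id INTO #inner';
SELECT * FROM #inner;  -- fails: Invalid object name '#inner'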
In your @sql string, don't SELECT INTO #TempTable. Instead, run your SELECT statement without the INTO.
Finally, insert the results into a table variable like so:
INSERT INTO @tmpTbl EXEC sp_executesql @sql
Also, you'll need to declare the table variable if you use this approach:
DECLARE @tmpTbl TABLE (
-- define your columns here...
)
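Put together, a minimal sketch of this approach (assuming the dynamic query returns an int and an nvarchar column; all of the names here are made up):
DECLARE @tmpTbl TABLE (id int, name nvarchar(50));
DECLARE @sql nvarchar(max) = N'SELECT id, name FROM dbo.SomeTable';
-- INSERT ... EXEC captures the result set of the dynamic query
INSERT INTO @tmpTbl (id, name)
EXEC sp_executesql @sql;
SELECT * FROM @tmpTbl;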
Your temp table created in the dynamic SQL is out of scope in the non-dynamic part.
Look here for how to deal with this: A bit about sql server's local temp tables
Temporary tables only live as long as the connection that creates them. I would expect that you're unintentionally issuing the select on a separate connection. You can test this by momentarily doing your insert into a non-temporary table and seeing if your data is there. If that is the case you can go back to your original solution and just be sure to pass the connection object to your select.
declare @sql nvarchar(1000)
set @sql = N'select * into #t from table;'
set @sql = @sql + N'select * from #t;'
execute sp_executesql @sql
This worked for me
declare @sql nvarchar(max)
create table #temp ( listId int, Name nvarchar(200))
set @sql = 'SELECT top 10 ListId, Name FROM [V12-ListSelector].[dbo].[List]'
insert into #temp
exec sp_executesql @sql
select * from #temp
drop table #temp
To work around this issue, run a CREATE TABLE #TEMPTABLE statement first to create an empty temp table before calling sp_executesql, then run the INSERT INTO #TEMPTABLE inside sp_executesql. This works, and it is how I get around this problem, since in my setup all queries are normally run via sp_executesql.
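A minimal sketch of that pattern (the column and table names are made up):
-- create the temp table in the outer scope first
CREATE TABLE #TEMPTABLE (id int, name nvarchar(50));
-- the dynamic batch can then insert into it
DECLARE @sql nvarchar(max) = N'INSERT INTO #TEMPTABLE (id, name) SELECT id, name FROM dbo.SomeTable';
EXEC sp_executesql @sql;
SELECT * FROM #TEMPTABLE;  -- still visible here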
This one worked for me:
DECLARE @Query as NVARCHAR(MAX);
SET @Query = N'SELECT * FROM MyTable';
SET @Query = N'SELECT * INTO dbo.TempTable FROM (' + @Query + ') MAIN;';
EXEC sp_executesql @Query;
SELECT * INTO #TempTable FROM dbo.TempTable;
DROP TABLE dbo.TempTable;
SELECT * FROM #TempTable;
Note: from SQL Server 2012 onwards, sys.dm_exec_describe_first_result_set() can be used to build a temporary table in the right shape to INSERT INTO, as it gives you the column names and types that will be returned from your dynamic SELECT or EXEC, so you can build dynamic SQL to ALTER a temporary table into the shape you need.
DECLARE @strSQL NVarChar(max) = 'EXEC [YourSP] @dtAsAt=''2022-11-09'', @intParameter2=42'
--*** Build temporary table: create it with a dummy column, add columns dynamically
--*** using sys.dm_exec_describe_first_result_set(), then drop the dummy column
DROP TABLE IF EXISTS #tblResults;
CREATE TABLE #tblResults ([zz] INT);
DECLARE @strUpdateSQL NVarChar(max);
SELECT @strUpdateSQL = STRING_AGG( CONCAT( 'ALTER TABLE #tblResults ADD ',
QUOTENAME([name]), ' ',
[system_type_name], ';')
, ' ') WITHIN GROUP (ORDER BY [column_ordinal])
FROM sys.dm_exec_describe_first_result_set (@strSQL, NULL, 0)
SET @strUpdateSQL += 'ALTER TABLE #tblResults DROP COLUMN [zz];'
EXEC (@strUpdateSQL);
--*** Now we have #tblResults in the right shape to insert into, and use afterwards
INSERT INTO #tblResults EXEC (@strSQL);
SELECT * FROM #tblResults;
--*** And tidy up
DROP TABLE IF EXISTS #tblResults;
I am writing a query to pivot table elements where the column names are generated dynamically.
SET @query = N'SELECT STUDENT_ID, ROLL_NO, TITLE, STUDENT_NAME, EXAM_NAME, '+
@cols +
' INTO ##FINAL
FROM
(
SELECT *
FROM #AVERAGES
UNION
SELECT *
FROM #MARKS
UNION
SELECT *
FROM #GRACEMARKS
UNION
SELECT *
FROM #TOTAL
) p
PIVOT
(
MAX([MARKS])
FOR SUBJECT_ID IN
( '+
@cols +' )
) AS FINAL
ORDER BY STUDENT_ID ASC, DISPLAYORDER ASC, EXAM_NAME ASC;'
EXECUTE(@query)
select * from ##FINAL
This query works properly in my local database, but it doesn't work in SQL Azure since global temp tables are not allowed there.
If I change ##FINAL to #FINAL in my local database, it gives me the error
Invalid object name '#FINAL'.
How can I resolve this issue?
Okay, after saying I didn't think it could be done, I might have a way. It's ugly though. Hopefully, you can play with the below sample and adapt it to your query (without having your schema and data, it's too tricky for me to attempt to write it):
declare @cols varchar(max)
set @cols = 'object_id,schema_id,parent_object_id'
--Create a temp table with the known columns
create table #Boris (
ID int IDENTITY(1,1) not null
)
--Alter the temp table to add the varying columns. Thankfully, they're all ints.
--For unknown types, varchar(max) may be more appropriate, and will hopefully convert
declare @tempcols varchar(max)
set @tempcols = @cols
while LEN(@tempcols) > 0
begin
declare @col varchar(max)
set @col = CASE WHEN CHARINDEX(',',@tempcols) > 0 THEN SUBSTRING(@tempcols,1,CHARINDEX(',',@tempcols)-1) ELSE @tempcols END
set @tempcols = CASE WHEN LEN(@col) = LEN(@tempcols) THEN '' ELSE SUBSTRING(@tempcols,LEN(@col)+2,10000000) END
declare @sql1 varchar(max)
set @sql1 = 'alter table #Boris add [' + @col + '] int null'
exec (@sql1)
end
declare @sql varchar(max)
set @sql = 'insert into #Boris (' + @cols + ') select ' + @cols + ' from sys.objects'
exec (@sql)
select * from #Boris
drop table #Boris
The key is to create the temp table in the outer scope; then inner scopes (code running within EXEC statements) have access to the same temp table. The above worked on SQL Server 2008, but I don't have an Azure instance to play with, so it is not tested there.
If you create a temp table, it's visible from dynamic SQL executed in your SPID; if you create the table inside dynamic SQL, it's not visible outside of that.
There is a workaround. You can create a stub table and alter it in your dynamic sql. It requires a bit of string manipulation but I've used this technique to generate dynamic datasets for tsqlunit.
CREATE TABLE #t1
(
DummyCol int
)
EXEC(N'ALTER TABLE #t1 ADD foo INT')
EXEC ('insert into #t1(DummyCol, foo)
VALUES(1,2)')
EXEC ('ALTER TABLE #t1 DROP COLUMN DummyCol')
select * from #t1