I need to store a dynamic SQL result into a temporary table #Temp.
The dynamic SQL query result comes from a pivot, so the number of columns varies (it is not fixed).
SET @Sql = N'SELECT ' + @Cols + ' FROM
(
SELECT ResourceKey, ResourceValue
FROM LocaleStringResources where StateId ='
+ LTRIM(RTRIM(@StateID)) + ' AND FormId =' + LTRIM(RTRIM(@FormID))
+ ' AND CultureCode =''' + LTRIM(RTRIM(@CultureCode)) + '''
) x
pivot
(
max(ResourceValue)
for ResourceKey IN (' + @Cols + ')
) p ;'
--@Cols => column names, which vary in number
Now I need to insert the dynamic SQL result into a #Temp table and use that #Temp table with another existing table to perform joins or other operations.
(The #Temp table has to exist afterwards so it can be used with the other existing tables.)
How can I insert the result of a dynamic SQL query into a temporary table?
Thanks
Please try the query below.
SET @Sql = N'SELECT ' + @Cols + '
into ##TempTable
FROM
(
SELECT ResourceKey, ResourceValue
FROM LocaleStringResources where StateId ='
+ LTRIM(RTRIM(@StateID)) + ' AND FormId =' + LTRIM(RTRIM(@FormID))
+ ' AND CultureCode =''' + LTRIM(RTRIM(@CultureCode)) + '''
) x
pivot
(
max(ResourceValue)
for ResourceKey IN (' + @Cols + ')
) p ;'
You can then use the ##TempTable for further operations.
However, do not forget to drop ##TempTable at the end of your query; because it is a global temporary table, you will get an error if you run the query again while it still exists.
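One safe way to drop it is to check for its existence first (the same pattern appears further down this page):
IF OBJECT_ID('tempdb..##TempTable') IS NOT NULL
DROP TABLE ##TempTable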
As was answered in (https://social.msdn.microsoft.com/Forums/sqlserver/en-US/144f0812-b3a2-4197-91bc-f1515e7de4b9/not-able-to-create-hash-table-inside-stored-proc-through-execute-spexecutesql-strquery?forum=sqldatabaseengine),
you need to create a #Temp table in advance:
CREATE TABLE #Temp(columns definition);
It seems the task is impossible if you know nothing about the dynamic list of columns in advance. But most likely you do know something.
You do know the types of the dynamic columns, because they come from the PIVOT. Most likely you also know the maximum possible number of dynamic columns. Even if you don't, SQL Server has a limit of 1,024 columns per (non-wide) table and a limit of 8,060 bytes per row (http://msdn.microsoft.com/en-us/library/ms143432.aspx). So you can create the #Temp table in advance with the maximum possible number of columns and use only some of them (make all the columns NULLable).
So, CREATE TABLE will look like this (instead of int use your type):
CREATE TABLE #Temp(c1 int NULL, c2 int NULL, c3 int NULL, ..., c1024 int NULL);
Yes, the column names in #Temp will not be the same as in @Cols. That should be OK for your processing.
You have a list of columns in your @Cols variable. You generate this list somewhere in external code, so when @Cols is built you know how many columns there are. At that moment you should also be able to generate a second list of columns that matches the definition of #Temp. Something like:
SET @TempCols = N'c1, c2, c3, c4, c5';
The number of columns in @TempCols should be the same as the number of columns in @Cols. Then your dynamic SQL would look like this (I have added INSERT INTO #Temp (' + @TempCols + ') in front of your code):
SET @Sql = N'INSERT INTO #Temp (' + @TempCols + N') SELECT ' + @Cols + N' FROM
(
SELECT ResourceKey, ResourceValue
FROM LocaleStringResources where StateId ='
+ LTRIM(RTRIM(@StateID)) + ' AND FormId =' + LTRIM(RTRIM(@FormID))
+ ' AND CultureCode =''' + LTRIM(RTRIM(@CultureCode)) + '''
) x
pivot
(
max(ResourceValue)
for ResourceKey IN (' + @Cols + ')
) p ;'
Then you execute your dynamic SQL:
EXEC (@Sql) or EXEC sp_executesql @Sql
And then do other processing using the #Temp table and temp column names c1, c2, c3, ...
MSDN says:
A local temporary table created in a stored procedure is dropped
automatically when the stored procedure is finished.
You can also DROP the #Temp table explicitly, like this:
IF OBJECT_ID('tempdb..#Temp') IS NOT NULL
DROP TABLE #Temp
All this T-SQL code (CREATE TABLE, EXEC, ...your custom processing..., DROP TABLE) would naturally be inside the stored procedure.
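Put together, a rough skeleton of such a stored procedure could look like this (a sketch only: the procedure name, parameter types and the int column type are placeholders, and @Cols, @TempCols and @Sql are built as described above):
CREATE PROCEDURE dbo.ProcessPivotedResources      -- hypothetical name
@StateID varchar(10), @FormID varchar(10), @CultureCode varchar(10)
AS
BEGIN
-- 1. Pre-create #Temp with the maximum number of NULLable columns (use your real type)
CREATE TABLE #Temp(c1 int NULL, c2 int NULL, c3 int NULL /* ... up to c1024 */)

-- 2. Build @Cols, @TempCols and @Sql as described above, then run the INSERT
DECLARE @Cols nvarchar(max), @TempCols nvarchar(max), @Sql nvarchar(max)
-- ... build the strings here ...
EXEC sp_executesql @Sql

-- 3. ...your custom processing, joining #Temp to other tables...

-- 4. Optional explicit cleanup (the table is dropped automatically when the procedure ends)
IF OBJECT_ID('tempdb..#Temp') IS NOT NULL
DROP TABLE #Temp
END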
An alternative to creating a temporary table is to use a subquery (derived table):
select t1.name, t1.lastname from (select * from table) t1
where "select * from table" is your dynamic query, which returns a result you can use as the temp table t1, as shown in the example.
IF OBJECT_ID('tempdb..##TempTable') IS NOT NULL DROP TABLE ##TempTable
CREATE TABLE ##TempTable (TmpCol CHAR(1))
DECLARE @SQL NVARCHAR(max) = 'IF OBJECT_ID(''tempdb..##TempTable'') IS NOT NULL DROP TABLE ##TempTable
SELECT * INTO ##TempTable from [MyTableName]'
EXEC sp_executesql @SQL
SELECT Alias.* FROM ##TempTable as Alias
IF OBJECT_ID('tempdb..##TempTable') IS NOT NULL DROP TABLE ##TempTable
Here is a step-by-step solution for your problem.
Check whether your temporary tables exist, and drop them if they do.
IF OBJECT_ID('tempdb..#temp') IS NOT NULL
DROP TABLE #temp
IF OBJECT_ID('tempdb..##abc') IS NOT NULL
DROP TABLE ##abc
Store your main query result in the first temp table (this step is for simplicity and readability).
SELECT *
INTO #temp
FROM (SELECT ResourceKey, ResourceValue
      FROM LocaleStringResources
      WHERE StateId = @StateID AND FormId = @FormID
        AND CultureCode = @CultureCode) AS S
Write the query below to create your pivot and store the result in another temp table.
DECLARE @str NVARCHAR(1000)
DECLARE @sql NVARCHAR(1000)
SELECT @str = COALESCE(@str+',', '') + ResourceKey FROM #temp
SET @sql = N'select * into ##abc from (select ' + @str + ' from (SELECT ResourceKey, ResourceValue FROM #temp) as A
Pivot
(
max(ResourceValue)
for ResourceKey in (' + @str + ')
)as pvt) as B'
Execute the query below to get the pivot result into your next temp table, ##abc.
EXECUTE sp_executesql @sql
Now you can use ##abc as a table wherever you want, like:
select * from ##abc
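Since ##abc holds a single pivoted row, one simple way to combine it with an existing table is a cross join (dbo.SomeOtherTable is a placeholder name):
SELECT o.*, p.*
FROM dbo.SomeOtherTable AS o   -- hypothetical existing table
CROSS JOIN ##abc AS p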
Hope this will help you.
Related
I'm new to SQL and I'm trying to create an SSRS report.
I found this code on the internet to create an SSRS report and it works well for me. However, I need to adjust the code so I can also get the values from the selected column.
USE [project]
GO
/****** Object: StoredProcedure [dbo].[Report] Script Date: 26-1-2020 01:19:45 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER procedure [dbo].[Report]
@SchemaName VARCHAR(128)='sys',
@TableName VARCHAR(128)='columns',
@ColumnList VARCHAR(MAX)='object_id,column_id,name,max_length,system_type_id'
AS
BEGIN
DECLARE @ColumnNames VARCHAR(MAX)
DECLARE @ColumnNamesVAR VARCHAR(MAX)
--drop ##Temp_Data Table
IF OBJECT_ID('tempdb..##Temp_Data') IS NOT NULL
DROP TABLE ##Temp_Data
--drop ##Temp_Data_Final Table
IF OBJECT_ID('tempdb..##Temp_Data_Final') IS NOT NULL
DROP TABLE ##Temp_Data_Final
--drop #Temp_Columns Table
IF OBJECT_ID('tempdb..#Temp_Columns') IS NOT NULL
DROP TABLE #Temp_Columns
Create table #ColumnList (Data NVARCHAR(MAX))
insert into #ColumnList values (@ColumnList)
--convert all column list to VARCHAR(1000) for unpivot
;with Cte_ColumnList as (
SELECT
'['+LTRIM(RTRIM(m.n.value('.[1]','varchar(8000)')))+']' AS ColumnList
FROM
(
SELECT CAST('<XMLRoot><RowData>' + REPLACE(Data,',','</RowData><RowData>')
+ '</RowData></XMLRoot>' AS XML) AS x
FROM #ColumnList
)t
CROSS APPLY x.nodes('/XMLRoot/RowData')m(n))
,CTE_ColumnListVarchar as
(Select 'CAST('+ColumnList+' as VARCHAR(1000)) AS '+ColumnList AS ColumnListVAR,ColumnList from Cte_ColumnList)
SELECT @ColumnNamesVAR = COALESCE(@ColumnNamesVAR + ', ', '') + ColumnListVAR,
@ColumnNames = COALESCE(@ColumnNames + ', ', '') + ColumnList
FROM CTE_ColumnListVarchar
--Insert data into ##Temp_Data Table
DECLARE @SQL NVARCHAR(MAX)
DECLARE @TempTbleSQL NVARCHAR(MAX)
SET @TempTbleSQL='Select ROW_NUMBER()
OVER (order by (Select 1)) AS R,'+@ColumnNames +' into ##Temp_Data from ['+@SchemaName+'].['+@TableName+']'
--Print @TempTbleSQL
EXEC(@TempTbleSQL)
SET @SQL='
select
R,columnname,value into ##Temp_Data_Final from
(select R,'+@ColumnNamesVAR+' from ##Temp_Data )u
unpivot
(value for columnname in ('+@ColumnNames+'))v'
--Print @SQL
EXEC(@SQL)
Select * From ##Temp_Data_Final
END
So, now I can select the schema, table and column, but I don't know how to get a drop-down list of the values in the selected column.
And one more thing: how can I deploy this report to a web form? Or is there any way to create dynamic SQL with cascading parameters where I can select schema, table, column and values?
Please, somebody help me with this, it is really important.
Here I can choose the schema, then the table and the column. So I want to extend the code to get another drop-down list with the values of the selected column.
I also used the following datasets for each parameter:
--ds_schema
SELECT NAME AS schemaname FROM sys.schemas
WHERE NAME not in (
'guest',
'information_schema',
'sys',
'db_owner',
'db_accessadmin',
'db_securityadmin',
'db_ddladmin',
'db_backupoperator',
'db_datareader',
'db_datawriter',
'db_denydatareader',
'db_denydatawriter')
----DSTables
Select Distinct Table_Name as TableName from INFORMATION_SCHEMA.TABLES
where TABLE_SCHEMA=@SchemaName
order by Table_Name
----DS_Columns
Select COLUMN_NAME as ColumnName from INFORMATION_SCHEMA.COLUMNS
where TABLE_SCHEMA=@SchemaName
and TABLE_NAME=@TableName
To get a list of values in a column you need to build a SQL statement and then execute it.
As you have your parameters, you can do something like this:
SET @SQL = 'SELECT DISTINCT ' + QUOTENAME(@ColumnName) + ' FROM ' + QUOTENAME(@SchemaName) + '.' + QUOTENAME(@TableName) + ' ORDER BY ' + QUOTENAME(@ColumnName)
EXEC (@SQL)
Notes:
This gives a DISTINCT list of values and also sorts them using the ORDER BY clause; just edit the SET @SQL = line to adjust the query that is executed.
I've used QUOTENAME() to put square brackets around the schema, table and column names, e.g. SELECT DISTINCT [myColumnName] FROM .....
You can add PRINT @SQL at the end to see the generated SQL if you like.
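If you want to expose this as its own SSRS dataset for a cascading values parameter, it could be wrapped in a small stored procedure; a sketch (the procedure name and the parameter it would feed are assumptions):
CREATE PROCEDURE dbo.DS_ColumnValues      -- hypothetical dataset procedure
@SchemaName VARCHAR(128),
@TableName VARCHAR(128),
@ColumnName VARCHAR(128)
AS
BEGIN
DECLARE @SQL NVARCHAR(MAX)
SET @SQL = 'SELECT DISTINCT ' + QUOTENAME(@ColumnName) + ' FROM ' + QUOTENAME(@SchemaName) + '.' + QUOTENAME(@TableName) + ' ORDER BY ' + QUOTENAME(@ColumnName)
EXEC sp_executesql @SQL
END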
I am using SQL Server 2012. I have a table with 90 columns. I am trying to select only the columns that contain data. After searching, I used the following procedure:
1- Get all column counts using one SELECT query
2- Pivot the result table into a temp table
3- Create the SELECT query
4- Execute this query
Here is the query I used:
DECLARE @strTablename varchar(100) = 'dbo.MyTable'
DECLARE @strQuery varchar(max) = ''
DECLARE @strSecondQuery varchar(max) = 'SELECT '
DECLARE @strUnPivot as varchar(max) = ' UNPIVOT ([Count] for [Column] IN ('
CREATE TABLE ##tblTemp([Column] varchar(50), [Count] Int)
SELECT @strQuery = ISNULL(@strQuery,'') + 'Count([' + name + ']) as [' + name + '] ,' from sys.columns where object_id = object_id(@strTablename) and is_nullable = 1
SELECT @strUnPivot = ISNULL(@strUnPivot,'') + '[' + name + '] ,' from sys.columns where object_id = object_id(@strTablename) and is_nullable = 1
SET @strQuery = 'SELECT [Column],[Count] FROM ( SELECT ' + SUBSTRING(@strQuery,1,LEN(@strQuery) - 1) + ' FROM ' + @strTablename + ') AS p ' + SUBSTRING(@strUnPivot,1,LEN(@strUnPivot) - 1) + ')) AS unpvt '
INSERT INTO ##tblTemp EXEC (@strQuery)
SELECT @strSecondQuery = @strSecondQuery + '[' + [Column] + '],' from ##tblTemp WHERE [Count] > 0
DROP TABLE ##tblTemp
SET @strSecondQuery = SUBSTRING(@strSecondQuery,1,LEN(@strSecondQuery) - 1) + ' FROM ' + @strTablename
EXEC (@strSecondQuery)
The problem is that this query is TOO SLOW. Is there a better way to achieve this?
Notes:
The table has only one clustered index on the primary key column ID and no other indexes.
The table is not editable.
The table contains a very large amount of data.
The query takes about 1 minute to execute.
Thanks in advance.
I do not know if this is faster, but you might use one trick: FOR XML AUTO will omit columns without content:
DECLARE @tbl TABLE(col1 INT,col2 INT,col3 INT);
INSERT INTO @tbl VALUES (1,2,NULL),(1,NULL,NULL),(NULL,NULL,NULL);
SELECT *
FROM @tbl AS tbl
FOR XML AUTO
This is the result: col3 is missing...
<tbl col1="1" col2="2" />
<tbl col1="1" />
<tbl />
Knowing this, you could find the list of columns that contain at least one non-NULL value, like this:
DECLARE @ColList VARCHAR(MAX)=
STUFF
(
(
SELECT DISTINCT ',' + Attr.value('local-name(.)','nvarchar(max)')
FROM
(
SELECT
(
SELECT *
FROM @tbl AS tbl
FOR XML AUTO,TYPE
) AS TheXML
) AS t
CROSS APPLY t.TheXML.nodes('/tbl/@*') AS A(Attr)
FOR XML PATH('')
),1,1,''
);
SELECT @ColList
The content of @ColList is now col1,col2. You can place this string in a dynamically created SELECT.
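For illustration, the resulting column list could be plugged into dynamic SQL roughly like this (dbo.MyTable stands in for the real 90-column table):
DECLARE @sql NVARCHAR(MAX) = N'SELECT ' + @ColList + N' FROM dbo.MyTable;';
EXEC sp_executesql @sql;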
UPDATE: Hints
It would be very clever to replace the SELECT * with a column list created from INFORMATION_SCHEMA.COLUMNS, excluding all non-nullable columns and - if needed and possible - types which contain very large data (BLOBs).
UPDATE 2: Performance
I don't know what your very large data actually means... I just tried this on a table with about 500,000 rows (with SELECT *) and it returned correctly in less than one minute. Hope this is fast enough...
Try using this condition:
where @columnname IS NOT NULL AND @columnname <> ' '
I am getting a temp table with dynamically generated columns, let's say columns A, B, C, D etc., from another source.
So now I have a temp table whose columns were generated elsewhere, and I have to write a stored procedure that uses this temp table.
My stored procedure looks like this:
create proc someproc()
as
begin
Insert into #searchtable
select isnull(#temp.*,0.00)
End
Now, #searchtable is a table created by me to store the temp table's columns. The problem arises when I want to apply ISNULL to the #temp columns: the source may send 3 columns one time and 4 columns the next time; it changes.
Since the columns are dynamically generated I cannot name each column and write something like:
isnull(column1,0.00)
isnull(column2,0.00)
I have to take all the generated columns and, if a value is empty, use 0.00.
I tried the following, but it is not working:
isnull(##temp.*,0.00),
Try with dynamic code by fetching the column names for your dynamic table from [database].INFORMATION_SCHEMA.COLUMNS (tempdb in the case of a temporary table):
--Get the column names for your dynamic table and add the ISNULL check:
DECLARE @COLS VARCHAR(MAX) = ''
SELECT @COLS = @COLS + ', ISNULL(' + COLUMN_NAME + ', 0.00) AS ' + COLUMN_NAME
FROM tempdb.INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME LIKE '#temp[_]%' -- Dynamic Table (here, Temporary table)
DECLARE @COLNAMES VARCHAR(MAX) = STUFF(@COLS, 1, 1, '')
--Build your Insert Command:
DECLARE @cmd VARCHAR(MAX) = '
INSERT INTO #temp1
SELECT ' + @COLNAMES + ' FROM #temp'
--Execute:
EXEC (@cmd)
Hope I understood your comment right:
CREATE PROCEDURE someproc
AS
IF OBJECT_ID(N'tempdb..#searchtable') IS NOT NULL DROP TABLE #searchtable
IF OBJECT_ID(N'tempdb..#temp') IS NOT NULL
BEGIN
DECLARE @sql nvarchar(max),
@cols nvarchar(max)
SELECT @cols = (
SELECT ',COALESCE('+QUOTENAME([name])+',0.00) as '+QUOTENAME([name])
FROM tempdb.sys.columns
WHERE [object_id] = OBJECT_ID(N'tempdb..#temp')
FOR XML PATH('')
)
SELECT @sql = N'SELECT '+STUFF(@cols,1,1,'')+' INTO #searchtable FROM #temp'
EXEC sp_executesql @sql
END
This SP checks whether the #temp table exists. If it exists, it takes all the column names from tempdb's sys.columns and builds a string like ,COALESCE([Column1],0.00) as [Column1], etc. Then we build a dynamic SQL query like:
SELECT COALESCE([Column1],0.00) as [Column1] INTO #searchtable FROM #temp
And execute it. The query result is stored in #searchtable.
Notes: use COALESCE instead of ISNULL, and sp_executesql instead of a direct EXEC; it is good practice.
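As a side note on why sp_executesql is preferred: it allows parameters to be passed instead of concatenated into the string. A tiny illustration (dbo.SomeTable and the parameter are placeholders):
DECLARE @sql NVARCHAR(MAX) = N'SELECT * FROM dbo.SomeTable WHERE Id = @Id';
EXEC sp_executesql @sql, N'@Id int', @Id = 42;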
I'm working with MS SQL Server 2008. I'm trying to create a stored procedure to merge (perhaps) several rows of data (answers) into a single row on the target table(s). This uses a 'table_name' field and a 'column_name' field from the answers table. The data looks something like this:
answers table
--------------
id int
table_name varchar
column_name varchar
answer_value varchar
So, the target table for the insert/update would come from 'table_name'. Each row from the answers table would fill one column on the target table.
table_name_1 table
--------------
id int
column_name_1 varchar
column_name_2 varchar
column_name_3 varchar
etc...
Note, there can be many target tables (variable from answers table: table_name_1, table_name_2, table_name_3, etc.) that insert into many columns (column_name_1...2...3) on each target table.
I thought about using a WHILE statement to loop through the answers table. This could build a variable holding the insert/update statement(s) for the target tables, which would then be executed somehow. I also noticed that MERGE looks like it might help with this problem (select/update/insert), but my MS SQL stored procedure experience is very limited. Could someone suggest a strategy or solution for this problem?
Note 6/23/2014: I'm considering using a single Merge statement, but I'm not sure it is possible.
I'm probably missing something, but the basic idea to solve the problem is to use meta-programming, like a dynamic pivot.
In this particular case there is another layer that makes the solution more difficult: the results need to be produced in separate executions (one per target table) instead of being grouped together.
The backbone of a possible solution is:
DECLARE @cols AS NVARCHAR(MAX)
DECLARE @query AS NVARCHAR(MAX)
--using a cursor on SELECT DISTINCT table_name FROM answers iterate:
--*Cursor Begin Here*
--mock variable for the first value of the cursor
DECLARE @table AS NVARCHAR(MAX) = 't1'
-- Column list
SELECT @cols = STUFF((SELECT distinct
',' + QUOTENAME(column_name)
FROM answers with (nolock)
WHERE table_name = @table
FOR XML PATH(''), TYPE
).value('.', 'NVARCHAR(MAX)')
, 1, 1, '')
--Query definition
SET @query = '
SELECT ' + @cols + '
INTO ' + @table + '
FROM (SELECT column_name, answer_value
FROM answers
WHERE table_name = ''' + @table + ''') b
PIVOT (MAX(answer_value) FOR column_name IN (' + @cols + ' )) p '
--select @query
EXEC sp_executesql @query
--select to verify the execution
--SELECT * FROM t1
--*Cursor End Here*
SQLFiddle Demo
The cursor definition is omitted, because I'm not sure if it'll work on SQLFiddle
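For completeness, the cursor hinted at in the comments could be sketched roughly like this (it simply wraps the per-table logic above; @cols and @query are rebuilt on every iteration):
DECLARE @table NVARCHAR(MAX)

DECLARE table_cursor CURSOR LOCAL FAST_FORWARD FOR
SELECT DISTINCT table_name FROM answers

OPEN table_cursor
FETCH NEXT FROM table_cursor INTO @table

WHILE @@FETCH_STATUS = 0
BEGIN
-- build @cols and @query for the current @table exactly as shown above, then:
EXEC sp_executesql @query

FETCH NEXT FROM table_cursor INTO @table
END

CLOSE table_cursor
DEALLOCATE table_cursor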
In addition to the template for a dynamic pivot, the column list is filtered by the target table name, and in the query definition there is a SELECT ... INTO instead of a plain SELECT.
This script does not account for tables that already exist in the database; if that's a possibility, the query can be split in two:
SET @query = '
SELECT TOP 0 ' + @cols + '
INTO ' + @table + '
FROM (SELECT column_name, answer_value
FROM answers
WHERE table_name = ''' + @table + ''') b
PIVOT (MAX(answer_value) FOR column_name IN (' + @cols + ' )) p '
to create the table without data, if needed, and
SET @query = '
INSERT INTO ' + @table + '(' + @cols + ')
SELECT ' + @cols + '
FROM (SELECT column_name, answer_value
FROM answers
WHERE table_name = ''' + @table + ''') b
PIVOT (MAX(answer_value) FOR column_name IN (' + @cols + ' )) p '
or a MERGE to insert/update the values in the table.
Another possibility would be to DROP and recreate every table.
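A minimal sketch of that drop-and-recreate step, following the IF OBJECT_ID pattern used elsewhere on this page:
SET @query = 'IF OBJECT_ID(''' + @table + ''') IS NOT NULL DROP TABLE ' + @table + ';'
EXEC sp_executesql @query
-- then run the SELECT ... INTO version of the pivot shown above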
The approach I took to this complex problem:
Create several temporary tables to work with your data
Select and populate the temporary tables with the data
Use dynamic pivoting to pivot the rows into one row
Use a CURSOR with WHILE loop for multiple table entries
SET @query with the dynamically built MERGE statement
EXECUTE(@query)
Drop temporary tables
I am writing a query to pivot table elements where the column names are generated dynamically.
SET @query = N'SELECT STUDENT_ID, ROLL_NO, TITLE, STUDENT_NAME, EXAM_NAME, '+
@cols +
' INTO ##FINAL
FROM
(
SELECT *
FROM #AVERAGES
UNION
SELECT *
FROM #MARKS
UNION
SELECT *
FROM #GRACEMARKS
UNION
SELECT *
FROM #TOTAL
) p
PIVOT
(
MAX([MARKS])
FOR SUBJECT_ID IN
( '+
@cols +' )
) AS FINAL
ORDER BY STUDENT_ID ASC, DISPLAYORDER ASC, EXAM_NAME ASC;'
EXECUTE(@query)
select * from ##FINAL
This query works properly in my local database, but it doesn't work in SQL Azure, since global temp tables are not allowed there.
Now if I change ##FINAL to #FINAL in my local database, it gives me the error:
Invalid object name '#FINAL'.
How can I resolve this issue?
Okay, after saying I didn't think it could be done, I might have a way. It's ugly though. Hopefully, you can play with the below sample and adapt it to your query (without having your schema and data, it's too tricky for me to attempt to write it):
declare @cols varchar(max)
set @cols = 'object_id,schema_id,parent_object_id'
--Create a temp table with the known columns
create table #Boris (
ID int IDENTITY(1,1) not null
)
--Alter the temp table to add the varying columns. Thankfully, they're all ints.
--for unknown types, varchar(max) may be more appropriate, and will hopefully convert
declare @tempcols varchar(max)
set @tempcols = @cols
while LEN(@tempcols) > 0
begin
declare @col varchar(max)
set @col = CASE WHEN CHARINDEX(',',@tempcols) > 0 THEN SUBSTRING(@tempcols,1,CHARINDEX(',',@tempcols)-1) ELSE @tempcols END
set @tempcols = CASE WHEN LEN(@col) = LEN(@tempcols) THEN '' ELSE SUBSTRING(@tempcols,LEN(@col)+2,10000000) END
declare @sql1 varchar(max)
set @sql1 = 'alter table #Boris add [' + @col + '] int null'
exec (@sql1)
end
declare @sql varchar(max)
set @sql = 'insert into #Boris (' + @cols + ') select ' + @cols + ' from sys.objects'
exec (@sql)
select * from #Boris
drop table #Boris
The key is to create the temp table in the outer scope; inner scopes (code running within EXEC statements) then have access to the same temp table. The above worked on SQL Server 2008, but I don't have an Azure instance to play with, so it is not tested there.
If you create a temp table, it's visible from dynamic SQL executed in your SPID; if you create the table in dynamic SQL, it's not visible outside of that.
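A quick sketch that shows this scoping rule in action (the table names here are just for illustration):
CREATE TABLE #outer (i int)
EXEC (N'INSERT INTO #outer VALUES (1)')   -- works: #outer is visible inside the dynamic batch
EXEC (N'SELECT * FROM #outer')            -- works for the same reason

EXEC (N'CREATE TABLE #inner (i int)')
-- SELECT * FROM #inner                   -- would fail: #inner disappeared when the dynamic batch ended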
There is a workaround. You can create a stub table and alter it in your dynamic SQL. It requires a bit of string manipulation, but I've used this technique to generate dynamic datasets for tsqlunit.
CREATE TABLE #t1
(
DummyCol int
)
EXEC(N'ALTER TABLE #t1 ADD foo INT')
EXEC ('insert into #t1(DummyCol, foo)
VALUES(1,2)')
EXEC ('ALTER TABLE #t1 DROP COLUMN DummyCol')
select * from #t1