I am working on a reporting project based in SQL, but I have restricted access to the DB: I can only run SELECT queries and insert the data I retrieve into temp tables/table variables. I cannot create or execute stored procedures or functions of any kind.
The query I am running is meant to pool together all engineers and the different key skills they have, so that we can later see which skills each engineer has, or which engineers fall under a certain skill.
To this end, I am trying to create a table variable/temp table with a flexible structure, based on values obtained earlier in the same query.
For example:
1st Output:
Adam
Brad
Julio
Martinez
2nd Output (skills separated by white space):
VOIP
TTS
DBA
Exchange
Server
Create a temp table/table variable that uses the 1st output as rows and the 2nd output as columns, or vice versa. I will then populate this new table according to different values in the main DB.
Please advise how this can be done, or provide any other solution to this problem.
Thank you
I believe you can.
First of all, you need to create a temp table with a dynamic structure based on a query. It can be done like this:
Declare a script template:
Set @ScriptTmpl = 'Alter table #tempTable Add [?] varchar(100);'
Build a script that will add the columns you need, based on a query:
Select @TableScript = @TableScript + Replace(@ScriptTmpl, '?',
ColumnName) From ... Where ...
Then execute the script and fill your new table with values from the second query.
UPD:
Here is a full sample of dynamic temporary table creation. I used a global temporary table in my sample:
declare @scriptTemplate nvarchar(MAX)
declare @script nvarchar(MAX)
declare @tableTemplate nvarchar(MAX)

SET @tableTemplate = 'create table ##tmptable (?)'
SET @scriptTemplate = '? nvarchar(500),'
SET @script = ''

IF OBJECT_ID('tempdb..##tmptable') IS NOT NULL
    Drop table ##tmptable

-- build one column definition per matching row
Select @script = @script + Replace(@scriptTemplate, '?', [Name])
From Account
Where name like 'ES_%'

-- trim the trailing comma and wrap the column list in the CREATE TABLE template
SET @script = LEFT(@script, LEN(@script) - 1)
SET @script = Replace(@tableTemplate, '?', @script)

Select @script   -- inspect the generated statement
exec(@script)

Select * from ##tmptable
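The sample above only creates the empty table; the "fill your new table with values" step has to go through dynamic SQL as well, because the column names are not known until runtime. A minimal sketch of that step, where SomeSourceView is a placeholder for whatever query produces one value per generated column:

declare @cols nvarchar(MAX) = ''
declare @insert nvarchar(MAX)

-- rebuild the same column list that was used to create ##tmptable
select @cols = @cols + quotename([Name]) + ','
from Account
where name like 'ES_%'

set @cols = LEFT(@cols, LEN(@cols) - 1)

-- SomeSourceView is hypothetical; substitute the real source query here
set @insert = 'insert into ##tmptable (' + @cols + ') select ' + @cols + ' from SomeSourceView'

exec(@insert)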
Firstly, you may be able to achieve what you want through pivots, rather than temporary tables.
Secondly, if you really want to create a table with column name "Adam Brad", the solution is dynamic SQL, which you may not be able to do based on your permissions.
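A hedged sketch of the pivot idea, assuming a hypothetical EngineerSkills table with Engineer and Skill columns (both names are placeholders, not from the question). The skill list still has to be built dynamically, so this avoids the dynamically shaped temp table but not dynamic SQL itself:

declare @cols nvarchar(MAX), @sql nvarchar(MAX)

-- build a bracketed, comma-separated list of the distinct skills
select @cols = STUFF((
        select ',' + QUOTENAME(Skill)
        from (select distinct Skill from EngineerSkills) s
        for xml path('')), 1, 1, '')

set @sql = '
select Engineer, ' + @cols + '
from (select Engineer, Skill, 1 as HasSkill from EngineerSkills) src
pivot (max(HasSkill) for Skill in (' + @cols + ')) p'

exec sp_executesql @sql

Each engineer comes back as one row, with a 1 in the column for each skill he has and NULL otherwise.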
Is there a way to create a dynamic temp table? The SQL code below declares a variable @tic. I am planning to insert the contents of table1 into the temp table #df, so instead of giving #df directly, I am passing it as a variable. But the code below is not successful. Can anyone help me here?
declare @tic as varchar(100) = 'df'
select *
into '#' + @tic from (
select * from [dbo].[table1])
select * from #df
Is there a way? Well, I think of the answer as "yes and no and maybe".
As far as I know, there is no way to do this using a local temporary table. As Stu explains in the comment, you would need dynamic SQL to define the table name and then the table would not be visible in the outer scope, because it is a local temporary table.
The "yes" is because one type of temporary table are global temporary tables. These are tables that persist across different scopes. And they are defined using ## instead of # as the prefix. So this works:
declare @tic as varchar(100) = 'df'
declare @sql nvarchar(max);
set @sql = 'select * into ##' + @tic + ' from table1';
select @sql;
exec sp_executesql @sql;
select * from ##df;
(Here is a db<>fiddle.)
The "maybe" is because I'm quite skeptical that you really need this. Dynamic table names are rarely useful in SQL systems, precisely because they depend on dynamic SQL. Introducing dynamic names into SQL (whether columns or tables) is dangerous, both because of the danger of SQL injection and also because it can introduce hard-to-debug syntax errors.
If you are trying to solve a real problem, there might be alternative approaches that are better suited to SQL Server.
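For example, if the goal is simply to stage the contents of table1 somewhere they can be queried, a fixed local temp table name avoids dynamic SQL entirely. A minimal sketch (table1 is taken from the question; the temp table name is arbitrary):

-- no dynamic name needed: stage the data under a fixed temp table name
select *
into #staging
from [dbo].[table1]

select * from #staging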
I have a requirement.
Using SSIS, I am importing data from a flat file/Excel file into my staging table. From the staging table I need to filter the data and transfer it to different databases over different linked servers, i.e. for California I have dbCalifornia on Server A, for Taxes I have dbTaxes on Server B, and so on.
I need to read a config table and redirect the data accordingly, i.e. if the column value = CALI, insert the data into dbCalifornia.tblA; if the column value = TAX, insert the data into dbTaxes.tblA. I am trying to use the server name and database name as variables (because I am reading these from the config table), i.e.
INSERT INTO [@server].[@database].[DBO].[BASIC]
But I am getting an error.
I am not an expert DBA; please suggest how I can implement this scenario.
TIA
You can do it using dynamic SQL like this:
declare @server varchar(100), @database varchar(100);

DECLARE @sql varchar(8000) = 'INSERT INTO [' + @server + '].[' + @database + '].[DBO].[BASIC]' +
    '(EmpID, EmployeeID, ADDR1, ADDR2, ADDR3, ADDR4, TELNUM, MARRIED, LNAME, MNAME, FNAME,' +
    ' SEX, EMAIL, COUNTRYCODE, CITIZEN) ' +
    'select EmpID, EmployeeID, ADDR1, ADDR2, ADDR3, ADDR4, TELNUM, MARRIED, LNAME, MNAME, FNAME,
     SEX, EMAIL, COUNTRYCODE, CITIZEN from dbo.myExcelTable where state = ''' + @database + '''';

exec(@sql);
I don't understand what the 100 variables you use in your insert are; didn't you say
if column value =CALI insert data in dbCalifornia.tblA, for column
value =TAX insert data in dbTaxes.tblA
?
So you just need to filter your table using the @database value and insert those rows into the corresponding table.
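A hedged sketch of how that config-driven routing could look, assuming a hypothetical dbo.ConfigTable with ServerName, DatabaseName and StateCode columns, and a hypothetical dbo.StagingTable as the source (all of those names are placeholders; the linked servers must already exist):

declare @server sysname, @database sysname, @state varchar(10), @sql nvarchar(MAX)

declare cfg cursor local fast_forward for
    select ServerName, DatabaseName, StateCode from dbo.ConfigTable

open cfg
fetch next from cfg into @server, @database, @state

while @@FETCH_STATUS = 0
begin
    -- build a fully qualified four-part insert for this config row
    set @sql = 'insert into ' + quotename(@server) + '.' + quotename(@database) + '.dbo.tblA ' +
               'select * from dbo.StagingTable where State = @state'

    exec sp_executesql @sql, N'@state varchar(10)', @state = @state

    fetch next from cfg into @server, @database, @state
end

close cfg
deallocate cfg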
Normally, when reading a table from another server and DB, I use the linked server, user and table like this:
Select * from Link..User.Table;
After the linked server name I have to use 2 dots. I don't know if this helps, since this is for reading from an Oracle database.
It looks like #temp tables created using dynamic SQL via the EXECUTE string method have a different scope and can't be referenced by "fixed" SQL in the same stored procedure.
However, I can reference a temp table created by one dynamic SQL statement in a subsequent dynamic SQL statement, but it seems that a stored procedure does not return a query result to a calling client unless the SQL is fixed.
A simple 2 table scenario:
I have 2 tables. Let's call them Orders and Items. Orders has a primary key of OrderId and Items has a primary key of ItemId. Items.OrderId is the foreign key identifying the parent Order. An Order can have 1 to n Items.
I want to be able to provide a very flexible "query builder" type interface to the user, to allow the user to select which Items he wants to see. The filter criteria can be based on fields from the Items table and/or from the parent Orders table. If an Item meets the filter conditions, including any condition on the parent Order if one exists, the Item should be returned by the query, as well as the parent Order.
Usually, I suppose, most people would construct a join between the Items table and the parent Orders table. I would like to perform 2 separate queries instead: one to return all of the qualifying Items and the other to return all of the distinct parent Orders. The reasoning is twofold, and you may or may not agree.
The first reason is that I need to query all of the columns in the parent Orders table, and if I did a single query joining Orders to Items, I would be repeating the Order information many times. Since there are typically a large number of Items per Order, I'd like to avoid this because it would result in much more data being transferred to a fat client. Instead, as mentioned, I would like to return the two tables individually in a dataset and use the two tables within it to populate custom Order and child Items client objects. (I don't know enough about LINQ or Entity Framework yet; I build my objects by hand.) The second reason I would like to return two tables instead of one is that I already have another procedure that returns all of the Items for a given OrderId along with the parent Order, and I would like to use the same 2-table approach so I can reuse the client code that populates my custom Order and Items objects from the 2 datatables returned.
What I was hoping to do was this:
Construct a dynamic SQL string on the client which joins the Orders table to the Items table and filters appropriately on each table, as specified by the custom filter created in the WinForms fat-client app. The SQL built on the client would have looked something like this:
TempSQL = "
INSERT INTO #ItemsToQuery (OrderId, ItemId)
SELECT Items.OrderId, Items.ItemId
FROM
Orders, Items
WHERE
Orders.OrderID = Items.OrderId AND
/* Some unpredictable Order filters go here */
AND
/* Some unpredictable Items filters go here */
"
Then, I would call a stored procedure,
CREATE PROCEDURE GetItemsAndOrders(@tempSql as text)
AS
Execute (@tempSQL) -- to create the #ItemsToQuery table
SELECT * FROM Items WHERE Items.ItemId IN (SELECT ItemId FROM #ItemsToQuery)
SELECT * FROM Orders WHERE Orders.OrderId IN (SELECT DISTINCT OrderId FROM #ItemsToQuery)
The problem with this approach is that the #ItemsToQuery table, since it was created by dynamic SQL, is inaccessible from the following 2 static SQL statements, and if I change the static SQL to dynamic, no results are passed back to the fat client.
Three workarounds come to mind, but I'm looking for a better one:
1) The first SQL could be performed by executing the dynamically constructed SQL from the client. The results could then be passed as a table to a modified version of the above stored procedure. I am familiar with passing table data as XML. If I did this, the stored proc could insert the data into a temporary table using static SQL; that table, because it was not created by dynamic SQL, could then be queried without issue. (I could also investigate passing the new table type parameter instead of XML.) However, I would like to avoid passing up potentially large lists to a stored procedure.
2) I could perform all the queries from the client.
The first would be something like this:
SELECT Items.* FROM Orders, Items WHERE Orders.OrderId = Items.OrderId AND (dynamic filter)
SELECT Orders.* FROM Orders, Items WHERE Orders.OrderId = Items.OrderId AND (dynamic filter)
This still provides me with the ability to reuse my client sided object-population code because the Orders and Items continue to be returned in two different tables.
I have a feeling, too, that I might have some options using a table data type within my stored proc, but that is also new to me and I would appreciate a little bit of spoon feeding on that one.
If you even scanned this far in what I wrote, I am surprised, but if so, I would appreciate any of your thoughts on how best to accomplish this.
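On the table data type mentioned above: a minimal sketch of the table-valued-parameter route, assuming the Orders/Items schema from the question (the type and procedure names here are invented for illustration):

-- one-time setup: a table type to carry the qualifying item keys
CREATE TYPE dbo.ItemKeyList AS TABLE (OrderId int, ItemId int);
GO

CREATE PROCEDURE dbo.GetItemsAndOrdersByKeys
    @ItemsToQuery dbo.ItemKeyList READONLY
AS
BEGIN
    SELECT * FROM Items  WHERE ItemId IN (SELECT ItemId FROM @ItemsToQuery);
    SELECT * FROM Orders WHERE OrderId IN (SELECT DISTINCT OrderId FROM @ItemsToQuery);
END
GO

The client would run the dynamically built filter query itself, load the (OrderId, ItemId) pairs into a DataTable, and pass it as @ItemsToQuery; the concern about shipping a large key list to the server still applies.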
You need to create your table first; then it will be available in the dynamic SQL.
This works:
CREATE TABLE #temp3 (id INT)
EXEC ('insert #temp3 values(1)')
SELECT *
FROM #temp3
This will not work:
EXEC (
'create table #temp2 (id int)
insert #temp2 values(1)'
)
SELECT *
FROM #temp2
In other words:
Create temp table
Execute proc
Select from temp table
Here is complete example:
CREATE PROC prTest2 @var VARCHAR(100)
AS
EXEC (@var)
GO
CREATE TABLE #temp (id INT)
EXEC prTest2 'insert #temp values(1)'
SELECT *
FROM #temp
1st Method - Enclose multiple statements in the same Dynamic SQL Call:
DECLARE @DynamicQuery NVARCHAR(MAX)
SET @DynamicQuery = 'Select * into #temp from (select * from tablename) alias
select * from #temp
drop table #temp'
EXEC sp_executesql @DynamicQuery
2nd Method - Use Global Temp Table:
(Careful, you need to take extra care with global temp tables.)
IF OBJECT_ID('tempdb..##temp2') IS NULL
BEGIN
EXEC (
'create table ##temp2 (id int)
insert ##temp2 values(1)'
)
SELECT *
FROM ##temp2
END
Don't forget to drop the ##temp2 object manually once you're done with it:
IF (OBJECT_ID('tempdb..##temp2') IS NOT NULL)
BEGIN
DROP Table ##temp2
END
Note: don't use method 2 if you don't know the full structure of the database.
I had the same issue that @Muflix mentioned. When you don't know the columns being returned, or they are generated dynamically, what I've done is create a global temp table with a unique id and then delete it when I'm done with it, which looks something like what's shown below:
DECLARE @DynamicSQL NVARCHAR(MAX)
DECLARE @DynamicTable VARCHAR(255) = 'DynamicTempTable_' + CONVERT(VARCHAR(36), NEWID())
DECLARE @DynamicColumns NVARCHAR(MAX)

--Get "@DynamicColumns", example: SET @DynamicColumns = '[Column1], [Column2]'

SET @DynamicSQL = 'SELECT ' + @DynamicColumns + ' INTO [##' + @DynamicTable + ']' +
    ' FROM [dbo].[TableXYZ]'

EXEC sp_executesql @DynamicSQL

SET @DynamicSQL = 'IF OBJECT_ID(''tempdb..##' + @DynamicTable + ''' , ''U'') IS NOT NULL ' +
    ' BEGIN DROP TABLE [##' + @DynamicTable + '] END'

EXEC sp_executesql @DynamicSQL
Certainly not the best solution, but this seems to work for me.
I would strongly suggest you have a read through http://www.sommarskog.se/arrays-in-sql-2005.html
Personally, I like the approach of passing a comma-delimited text list, then parsing it with a text-to-table function and joining to it. The temp table approach can work if you create the table first in the connection, but it feels a bit messier.
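A minimal sketch of that delimited-list idea, using the Orders table from the question. On SQL Server 2016 and later the built-in STRING_SPLIT can do the parsing; on older versions you would substitute one of the user-defined splitters from the article above:

DECLARE @ids varchar(MAX) = '3,7,42';   -- comma-delimited key list passed in from the client

-- join the parsed list to the real table instead of building an IN clause dynamically
SELECT o.*
FROM Orders o
JOIN STRING_SPLIT(@ids, ',') s
    ON o.OrderId = CAST(s.value AS int);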
Result sets from dynamic SQL are returned to the client. I have done this quite a lot.
You're right about issues with sharing data through temp tables and variables and things like that between the SQL and the dynamic SQL it generates.
I think in trying to get your temp table working, you have probably got some things confused, because you can definitely get data from a SP which executes dynamic SQL:
USE SandBox
GO
CREATE PROCEDURE usp_DynTest(@table_type AS VARCHAR(255))
AS
BEGIN
DECLARE @sql AS VARCHAR(MAX) = 'SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = ''' + @table_type + ''''
EXEC (@sql)
END
GO
EXEC usp_DynTest 'BASE TABLE'
GO
EXEC usp_DynTest 'VIEW'
GO
DROP PROCEDURE usp_DynTest
GO
Also:
USE SandBox
GO
CREATE PROCEDURE usp_DynTest(@table_type AS VARCHAR(255))
AS
BEGIN
DECLARE @sql AS VARCHAR(MAX) = 'SELECT * INTO #temp FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = ''' + @table_type + '''; SELECT * FROM #temp;'
EXEC (@sql)
END
GO
EXEC usp_DynTest 'BASE TABLE'
GO
EXEC usp_DynTest 'VIEW'
GO
DROP PROCEDURE usp_DynTest
GO
I need some help with a problem. Our company has a vendor that delivers a database to us. Inside that database, the vendor has a table with a lot of T-SQL scripts. What I want to do is the following: I want to run a select to find the script, then execute the script and store the result in a variable or temp table. I cannot alter the vendor's script, so I need the result in something I can manipulate. Another problem is that I don't know how many columns the result will have, so it has to be flexible: one script has 5 columns, the next script has 8, and so on.
Example:
DECLARE @SQL nvarchar(MAX) = ( Select distinct script_details
from scripttable where .......)
This gives me the script I want to use; then I use
EXEC(@SQL)
to execute the script.
My problem is then getting the result of this into a variable or a table.
I have tried to create a temp table like this:
create table #TmpTblSP (col1 varchar(MAX),col2 varchar(MAX),col3 varchar(MAX),col4 varchar(MAX),col5 varchar(MAX),col6 varchar(MAX),col7 varchar(MAX),col8 varchar(MAX),col9 varchar(MAX),col10 varchar(MAX),col11 varchar(MAX),col12 varchar(MAX))
then
insert into #TmpTblSP
EXEC(@SQL)
This gives me the following error:
Msg 213, Level 16, State 7, Line 1
Column name or number of supplied values does not match table definition.
But if I know how many columns there are and specify them in the insert, it works:
insert into #TmpTblSP(Col1,Col2,Col3)
EXEC(@SQL)
But here you see my problem: I don't know how many columns there are in each script. I could write one insert for every script the vendor has, but that would be a lot; there are around 3,000 scripts in that table and they change them often.
You could try something like:
DECLARE @SQL nvarchar(MAX) = (
Select distinct script_details
into #temptbl
from scripttable where .......
);
EXEC(@SQL);
If you don't know how many columns your @SQL returns, then the only solution is to use SELECT INTO. I use it in this way:
DECLARE @QRY nvarchar(MAX) = ( Select distinct script_details
from scripttable where .......)
SET @sql = 'SELECT * into ' + @temptablename + ' FROM (' + @qry + ') A '
It gives some flexibility.
Remember that it is easy to check the structure of a table created this way in the system catalog views, so you can build another @SQL from this information if needed.
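For instance, once the dynamic name resolves to a global temp table (say ##results, a placeholder name), its structure can be read back from tempdb's catalog views; a minimal sketch:

-- inspect the columns of the dynamically created table
SELECT c.name, t.name AS type_name, c.max_length
FROM tempdb.sys.columns c
JOIN tempdb.sys.types t ON t.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID('tempdb..##results');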
It is also recommended to split the SELECT INTO into 2 parts.
The first:
SELECT INTO ......... WHERE 1=2
The second:
INSERT INTO ... SELECT ......
Creating the table takes locks while it runs, so it is good to create it as quickly as possible and then insert into it.
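A minimal sketch of that two-step pattern (##results is a placeholder name, and the literal in @qry stands in for the vendor script):

DECLARE @qry nvarchar(MAX) = 'SELECT 1 AS col1, 2 AS col2';  -- placeholder for the vendor script
DECLARE @sql nvarchar(MAX);

-- step 1: create the empty table quickly; WHERE 1 = 2 copies only the structure
SET @sql = 'SELECT * INTO ##results FROM (' + @qry + ') A WHERE 1 = 2';
EXEC(@sql);

-- step 2: load the data in a separate statement
SET @sql = 'INSERT INTO ##results SELECT * FROM (' + @qry + ') A';
EXEC(@sql);

SELECT * FROM ##results;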
I am getting
Statement 'SELECT INTO' is not supported in this version of SQL Server
in SQL Azure for the query below, inside a stored procedure:
DECLARE @sql NVARCHAR(MAX)
    ,@sqlSelect NVARCHAR(MAX) = ''
    ,@sqlFrom NVARCHAR(MAX) = ''
    ,@sqlTempTable NVARCHAR(MAX) = '#itemSearch'
    ,@sqlInto NVARCHAR(MAX) = ''
    ,@params NVARCHAR(MAX)

SET @sqlSelect = 'SELECT
    IT.ITEMNR
    ,IT.USERNR
    ,IT.ShopNR
    ,IT.ITEMID'

SET @sqlFrom = ' FROM dbo.ITEM AS IT'
SET @sqlInto = ' INTO ' + @sqlTempTable + ' ';

IF (@cityId > 0)
BEGIN
    SET @sqlFrom = @sqlFrom +
        ' INNER JOIN dbo.CITY AS CI2
            ON CI2.CITYID = @cityId'

    SET @sqlSelect = @sqlSelect +
        ',CI2.LATITUDE AS CITYLATITUDE
        ,CI2.LONGITUDE AS CITYLONGITUDE'
END

SELECT @params = N'@cityId int'

SET @sql = @sqlSelect + @sqlInto + @sqlFrom

EXEC sp_executesql @sql, @params, @cityId
I have around 50,000 records, so I decided to use a temp table, but I was surprised to see this error.
How can I achieve the same thing in SQL Azure?
Edit: this blog post http://blogs.msdn.com/b/sqlazure/archive/2010/05/04/10007212.aspx suggests CREATE-ing a regular table inside the stored procedure for storing data instead of a temp table. Is that safe under concurrency? Will it hurt performance?
Adding some points taken from http://blog.sqlauthority.com/2011/05/28/sql-server-a-quick-notes-on-sql-azure/
Each table must have a clustered index; tables without a clustered index are not supported.
Each connection can use a single database; multiple databases in a single transaction are not supported.
'USE DATABASE' cannot be used in Azure.
Global temp tables (or temp objects) are not supported.
As there is no concept of a cross-database connection, linked servers are not available in Azure at this moment.
SQL Azure is a shared environment, and because of that there is no concept of a Windows login.
Always drop tempdb objects once they are no longer needed, as they create pressure on tempdb.
During bulk insert, use the batch size option to limit the number of rows per batch; this limits the usage of transaction log space.
Avoid unnecessary grouping or blocking ORDER BY operations, as they lead to high memory usage.
SELECT INTO is one of the many things that you can unfortunately not perform in SQL Azure.
What you'd have to do is first create the temporary table, then perform the insert. Something like:
CREATE TABLE #itemSearch (ITEMNR INT, USERNR INT, ShopNR INT, ITEMID INT)
INSERT INTO #itemSearch
SELECT IT.ITEMNR, IT.USERNR, IT.ShopNR ,IT.ITEMID
FROM dbo.ITEM AS IT
The new Azure DB Update preview has this problem resolved:
The V12 preview enables you to create a table that has no clustered
index. This feature is especially helpful for its support of the T-SQL
SELECT...INTO statement which creates a table from a query result.
http://azure.microsoft.com/en-us/documentation/articles/sql-database-preview-whats-new/
Create the table using the # prefix, e.g. create table #itemsearch, then use insert into. The scope of the temp table is limited to the session, so there will be no concurrency problems.
Well, as we all know, a SQL Azure table must have a clustered index; that is why SELECT INTO fails when copying data from one table into another.
If you want to migrate data, you must first create a table with the same structure and then execute an INSERT INTO ... SELECT statement.
For a temporary table, which is prefixed with #, you don't need to create an index.
How do I create the index and execute the insert into for the temp table?
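A minimal sketch of both steps, reusing the #itemSearch columns from the answer above (the index name and choice of key column are arbitrary):

-- a local temp table does not require a clustered index on SQL Azure,
-- but one can still be added explicitly if it helps later queries
CREATE TABLE #itemSearch (ITEMNR INT, USERNR INT, ShopNR INT, ITEMID INT);
CREATE CLUSTERED INDEX IX_itemSearch_ITEMNR ON #itemSearch (ITEMNR);

-- then load it with a plain INSERT ... SELECT
INSERT INTO #itemSearch (ITEMNR, USERNR, ShopNR, ITEMID)
SELECT IT.ITEMNR, IT.USERNR, IT.ShopNR, IT.ITEMID
FROM dbo.ITEM AS IT;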