Send query as parameter to SQL function - sql

I want to create a SQL table-valued function that will receive a query as a parameter through my API. In my function I want to execute that query. The query will be a SELECT statement.
This is what I have done so far and what I want to achieve, but it is not the correct way to do so.
CREATE FUNCTION CUSTOM_EXPORT_RESULTS (
    @query varchar(max),
    @guid uniqueidentifier,
    @tableName varchar(200))
RETURNS TABLE
AS
RETURN
(
    -- Execute query into a table
    SELECT *
    INTO @tableName
    FROM (
        EXEC(@query)
    )
)
GO
Please suggest the correct way!

Try this one -
CREATE PROCEDURE dbo.sp_CUSTOM_EXPORT_RESULTS
      @query NVARCHAR(MAX) = 'SELECT * FROM dbo.test'
    , @guid UNIQUEIDENTIFIER
    , @tableName VARCHAR(200) = 'test2'
AS BEGIN
    SELECT @query =
        REPLACE(@query,
            'FROM',
            'INTO [' + @tableName + '] FROM')
    DECLARE @SQL NVARCHAR(MAX)
    SELECT @SQL = '
IF OBJECT_ID (N''' + @tableName + ''') IS NOT NULL
    DROP TABLE [' + @tableName + ']
' + @query
    PRINT @SQL
    EXEC sys.sp_executesql @SQL
    RETURN 0
END
GO
Output -
IF OBJECT_ID (N'test2') IS NOT NULL
DROP TABLE [test2]
SELECT * INTO [test2] FROM dbo.test
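For completeness, a call might look like this (the GUID is generated here only because the parameter has no default; the procedure does not actually reference @guid):
DECLARE @g UNIQUEIDENTIFIER = NEWID();
EXEC dbo.sp_CUSTOM_EXPORT_RESULTS
      @query = 'SELECT * FROM dbo.test'
    , @guid = @g
    , @tableName = 'test2';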

What I see in your question is encapsulation of:
taking a dynamic SQL expression
executing it to fill a parametrized table
Why do you want to have such an encapsulation?
First, this can have a negative impact on your database performance. Please read this on EXEC() and sp_executesql(). I hope your SP won't be called from multiple parts of your application, because this WILL get you into trouble, at least performance-wise.
Another thing is: how and where are you constructing your SQL? Obviously you do it somewhere else, and it seems it's constructed manually. If we're talking about a contemporary application, there are a lot of OR/M solutions for this, and manual construction of T-SQL at runtime should always be avoided if possible. Not to mention that EXEC does not guard you against any form of SQL injection attack. However, if all of this is part of some database administration T-SQL bundle, forget this paragraph.
At the end, if you simply want to load a new table from some existing table (or part of it) as part of some administration task in T-SQL, consider issuing a SELECT ... INTO ... statement. This will create the new target table structure for you (omitting indexes and constraints) and copy the data. SELECT INTO will outperform INSERT INTO ... SELECT because SELECT INTO can be minimally logged.
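As a rough illustration of that pattern (the table and column names here are invented), SELECT ... INTO looks like this:
-- Creates dbo.Orders_Archive with the same column structure as dbo.Orders
-- (indexes and constraints are not copied) and copies the matching rows.
SELECT *
INTO dbo.Orders_Archive
FROM dbo.Orders
WHERE OrderDate < '2019-01-01';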
I hope this will get you (and others) at least a bit on the right track.

You can use a stored procedure as well (a function cannot execute dynamic SQL, so this is written as a procedure); here is the code that you can try.
CREATE PROCEDURE CUSTOM_EXPORT_RESULTS
(
    @query varchar(max),
    @guid uniqueidentifier,
    @tableName varchar(200)
)
AS
BEGIN
    DECLARE @strQuery nvarchar(max)
    -- Rewrite the SELECT so its results are written into the target table
    SET @strQuery = REPLACE(@query, 'FROM', 'INTO ' + @tableName + ' FROM')
    EXEC sp_executesql @strQuery
END
GO
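To illustrate what the REPLACE does, assuming the placeholder values below, the comment shows the statement that would actually be executed. Note that REPLACE rewrites every occurrence of FROM, so a query containing subqueries or the word FROM inside a string literal would be mangled:
DECLARE @query nvarchar(max) = 'SELECT * FROM dbo.SourceTable';
DECLARE @tableName varchar(200) = 'ExportResults';
SELECT REPLACE(@query, 'FROM', 'INTO ' + @tableName + ' FROM');
-- Result: SELECT * INTO ExportResults FROM dbo.SourceTable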

Related

Select into tables dynamically with variables

I have some code to create tables based on a set of dates I define.
For example, I have 5 dates, and they aren't consecutive. For each of these dates, I want to create a table, and I am currently using a SELECT INTO.
I am having to do this 5 times, even though the only things changing are the name of the new table and the date. Is there a way to do this more elegantly?
I started writing some code, but I am struggling to get it to loop through all the dates I want. The way I have written it currently, it only works if I edit the date at the start.
DECLARE @MyDate DATE;
DECLARE @TableName VARCHAR(50);
SET @MyDate = '2019-01-01';
SET @TableName = 'Table1';
SELECT *
into @TableName
FROM Original_Table
WHERE Query_Date = @MyDate;
Is this a one time thing or do you have to do this on a regular basis?
If it's the first, then I would just do it and get it over with.
If it's the latter, then I suspect something is very wrong with the way that system is designed - but assuming that can't be changed, you can create a stored procedure that will do this using dynamic SQL.
Something like this can get you started:
CREATE PROCEDURE dbo.CreateTableBasedOnDate
(
    @MyDate DATE,
    -- sysname is a system data type for identifiers: a non-nullable nvarchar(128).
    @TableName sysname
)
AS
-- 200 is long enough. Yes, I did the math.
DECLARE @Sql nvarchar(200) =
    -- Note: I'm not convinced that quotename is enough to protect you from sql injection.
    -- You should be very careful about who is allowed to execute this procedure.
    N'SELECT * into '+ QUOTENAME(@TableName) +N'
FROM Original_Table
WHERE Query_Date = @MyDate;';
-- When dealing with dynamic SQL, Print is your best friend.
-- Remark this row and unremark the next only once you've verified you get the correct SQL.
PRINT @Sql;
--EXEC sp_executesql @Sql, N'@MyDate DATE', @MyDate
GO
Usage:
EXEC CreateTableBasedOnDate '2018-01-01', 'zohar';
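The question mentions several non-consecutive dates; one way to drive the procedure for a whole list (the dates and table names below are invented, and this assumes the EXEC line inside the procedure has been uncommented) is a small loop over a table variable:
DECLARE @Dates TABLE (RunDate DATE, TableName sysname);
INSERT INTO @Dates (RunDate, TableName) VALUES
    ('2019-01-01', 'Table1'),
    ('2019-02-15', 'Table2'),
    ('2019-03-07', 'Table3');

DECLARE @d DATE, @t sysname;
DECLARE date_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT RunDate, TableName FROM @Dates;
OPEN date_cursor;
FETCH NEXT FROM date_cursor INTO @d, @t;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC dbo.CreateTableBasedOnDate @MyDate = @d, @TableName = @t;
    FETCH NEXT FROM date_cursor INTO @d, @t;
END
CLOSE date_cursor;
DEALLOCATE date_cursor;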
Use dynamic SQL:
DECLARE @MyDate DATE, @TableName varchar(50);
SET @MyDate = '2019-01-01';
SET @TableName = 'Table1';
DECLARE @sql NVARCHAR(4000);
DECLARE @params NVARCHAR(4000);
SELECT @sql = N'
SELECT *
INTO ' + QUOTENAME(@TableName) + '
FROM Original_Table
WHERE Query_Date = @MyDate;';
SELECT @params = N'@MyDate DATE';
EXEC sys.sp_executesql @sql, @params, @MyDate=@MyDate
Note that dynamic SQL can be dangerous, as it opens up a path for SQL injection. It's fine if you are just using it in your own local scripts, but take care if you e.g. wrap this in a procedure that is more widely accessible.
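One hedged way to reduce that risk, sketched here under the assumption that the table name should always be a plain one-part identifier, is to reject anything else before concatenating (the sample value is a placeholder, and the RETURN assumes this sits inside a procedure or batch):
DECLARE @TableName sysname = 'Table1';  -- would normally be the parameter value
IF @TableName IS NULL
   OR @TableName LIKE '%[^A-Za-z0-9_]%'  -- allow only simple identifier characters
BEGIN
    RAISERROR('Invalid table name.', 16, 1);
    RETURN;
END
-- QUOTENAME should still be applied when the dynamic SQL string is built.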
I would use dynamic SQL, although I would add another variable for the schema:
DECLARE
    @MyDate nvarchar(50) = '2019-01-01',
    @Schema nvarchar(50) = 'dbo',
    @TableName nvarchar(250) = 'Table1',
    @SQL nvarchar(500);
SET @SQL = '
SELECT *
INTO ' + QUOTENAME(@Schema) + '.' + QUOTENAME(@TableName) + '
FROM Original_Table
WHERE Query_Date = ''' + @MyDate + ''';
'
--print @SQL
EXEC(@SQL)
You can use the print statement to see how the SQL will look before executing this properly. You may also want to look at adding this as a stored procedure.

How to use a variable in "Select [some calculations] insert into @NameOfTheTableInThisVariable"?

I have a procedure in which there are calculations being done and the final result is inserted into a permanent table. I want to remove the permanent table, and I cannot use a temp table either. So I want to use a dynamic table name, which is stored in a variable:
Current scenario:
Insert into xyz_table
Select col1,col2,sum(col3)
from BaseTable
(In reality, there are a lot of columns and a lot of calculations.)
What I want:
Select col1,col2,sum(col3) into @DynamicTableName
from BaseTable
where the name of the table would be dynamic in nature i.e.,
@DynamicTableName = 'xyz ' + cast(convert(date,getdate()) as nvarchar)+' '+convert(nvarchar(5),getdate(),108)
It will have date and time in its name every time the procedure is run.
I want to use this name in the "Select * into statement"
How can I achieve this?
I tried it with some short code, but since my procedure has a lot of calculations and UNIONs, I cannot use that code for this. Any help would be appreciated.
declare @tablename nvarchar(30) = 'xyz ' + cast(convert(date,getdate()) as nvarchar) + ' ' + convert(nvarchar(5),getdate(),108)
declare @SQL_Statement nvarchar(100)
declare @SQL_Statement2 nvarchar(100)
declare @dropstatement nvarchar(100)
SET @SQL_Statement = N'SELECT * Into [' + @tablename + '] FROM dimBranch'
print @SQL_Statement
EXECUTE sp_executesql @SQL_Statement
SET @SQL_Statement = N'select * from [' + @tablename + ']'
print @SQL_Statement
EXECUTE sp_executesql @SQL_Statement
set @dropstatement = 'DROP TABLE [' + @tablename + ']'
PRINT @dropstatement
exec sp_executesql @dropstatement
The reason why I want this is that I use this procedure in an ETL job as well as in an SSRS report. If someone runs the package and the SSRS report at the same time, incorrect or weird data gets stored in the table. Therefore I need a dynamic table name with the date and time in it.
You can't parameterize an identifier in SQL, only a value
--yes
select * from table where column = @value
--no
select * from @tablename where @columnname = @value
The only thing you can do to make these things dynamic is to build a SQL string and execute it dynamically, but your code is already doing that with sp_executesql.
More telling is your complaint at the bottom of your question that the procedure causes problems when it is invoked simultaneously. Perhaps you should consider using local table variables for the temporary data the report uses, rather than pushing data back into the database:
DECLARE @temp TABLE(id INT, name varchar(100));
INSERT INTO @temp SELECT personid, firstname FROM person;
-- work with temp data
select count(*) from @temp;
-- when @temp goes out of scope it is lost;
-- no other procedure invoked simultaneously can access this procedure's @temp
Consider a local temp table, which is automatically session scoped without the need for dynamic SQL. For example:
SELECT *
INTO #YourTempTable
FROM dimBranch;
The local temp table will automatically be dropped when the proc completes, so there is no need for an explicit drop in the proc code.
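A minimal sketch of that pattern, borrowing the column and table names from the question, might look like this:
CREATE PROCEDURE dbo.BuildReportData
AS
BEGIN
    -- Each caller gets its own copy of #ReportData, so concurrent
    -- ETL and SSRS executions do not collide.
    SELECT col1, col2, SUM(col3) AS total
    INTO #ReportData
    FROM BaseTable
    GROUP BY col1, col2;

    SELECT * FROM #ReportData;
    -- #ReportData is dropped automatically when the procedure ends.
END
GO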

Need to dynamically check for columns in other databases

My application runs over several databases, and it needs to be able to check from one to see if a column exists in the other. Unfortunately, I won't know the name of the second database until runtime, so it needs to be dynamic. Also, it has to do this in multiple places, so ideally I'd like to make it into a function, but this gives me problems because functions won't run dynamic SQL.
This is the (non-working) function I wrote.....
CREATE FUNCTION [dbo].[fn_checkcolexists] (
    @dbname VARCHAR(100)
    ,@tablename VARCHAR(100)
    ,@colname VARCHAR(100)
)
RETURNS BIT
AS
BEGIN
    DECLARE @sqlstring NVARCHAR(2000)
    SET @sqlstring = 'select @retVal = 1 from ' + @dbname + '.sys.columns cols inner join yodata_dev_load.sys.tables tabs
    on cols.object_ID=tabs.object_ID where cols.name=''' + @colname + ''' and tabs.name=''' + @tablename + ''''
    DECLARE @retVal INT
    EXEC sp_executesql @sqlstring
        ,N'@retVal int output'
        ,@retVal OUTPUT
    RETURN @retVal
END
Has anyone got any suggestions how I can accomplish this? I can't find a way to access the column information for every database. Does this information exist in the system databases anywhere?
Alternatively, can I create some sort of synonym for the other database?
Edit: How to find column names for all tables in all databases in SQL Server isn't an ideal solution, because it also relies on dynamic SQL, so I couldn't use this as a function
Use a stored procedure and use one of these approaches.
One method is to use the undocumented sp_msforeachdb:
EXEC sp_msforeachdb 'SELECT table_catalog FROM ?.INFORMATION_SCHEMA.COLUMNS
where table_name=''your_table'' and column_name=''your_column_name'''
or simulate it
declare @sql varchar(max), @table_name varchar(100)
select @sql='', @table_name='your_table'
select @sql=@sql+ 'SELECT table_catalog
FROM '+name+'.INFORMATION_SCHEMA.COLUMNS
where table_name='''+@table_name+''' and
column_name=''your_column_name''
' from sys.databases
exec(@sql)
I think I've got the solution I was after. I am using COL_LENGTH, which seems to do the job. You can specify a dbname to it, and even pass that as a parameter, and it returns NULL if the column does not exist.
eg
declare @dbname varchar(200)='dbname'
select COL_LENGTH(@dbname + '.dbo.tablename','columnname')
if this returns a null, the column doesn't exist
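Since COL_LENGTH does not need dynamic SQL, it can also be wrapped in a function along the lines of the one in the question. This is only a sketch (the function name is invented and it assumes the table lives in the dbo schema):
CREATE FUNCTION [dbo].[fn_checkcolexists_collength] (
    @dbname VARCHAR(100)
    ,@tablename VARCHAR(100)
    ,@colname VARCHAR(100)
)
RETURNS BIT
AS
BEGIN
    -- COL_LENGTH returns NULL when the database, table or column does not exist.
    RETURN CASE
        WHEN COL_LENGTH(@dbname + '.dbo.' + @tablename, @colname) IS NOT NULL THEN 1
        ELSE 0
    END
END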
Many thanks to all the contributors to this thread.
Hope this works for you
CREATE FUNCTION [dbo].[fn_checkcolexists]
(
    @dbname VARCHAR(100)
    ,@tablename VARCHAR(100)
    ,@colname VARCHAR(100)
)
RETURNS INT
AS
BEGIN
    DECLARE @RECCOUNT INT = 0
    -- Note: INFORMATION_SCHEMA.COLUMNS only covers the current database, so this
    -- only finds the column when @dbname is the database the function lives in.
    SELECT @RECCOUNT = COUNT(*) FROM information_schema.columns WHERE TABLE_CATALOG = @dbname AND COLUMN_NAME = @colname AND TABLE_NAME = @tablename
    RETURN @RECCOUNT
END
GO

Create temp table from provided variable column names

I want to create a temporary table, in which the columns will be those which I provide as parameter, separated by a delimiter.
For example, if the column names are id, name, and address, the resulting table should have exactly those column headers. The next time, the number and names of the columns could vary.
Any help in this regard?
Try this :-
CREATE PROCEDURE GenerateTempTable
    @tableName as nvarchar(max),
    @Col1 as nvarchar(255),
    @Col2 as nvarchar(255)
AS
BEGIN
    Declare @sql nvarchar(max)
    set @sql='CREATE TABLE #'+ @tableName + '
    ('+ @Col1 + ' nvarchar(255),'+
    @Col2 + ' nvarchar(255)
    )'
    -- Select @sql Check the DDL
    EXECUTE sp_executesql @sql,
        N'@tableName nvarchar(max),@Col1 nvarchar(255),@Col2 nvarchar(255)',
        @tableName = @tableName,@Col1=@Col1,@Col2=@Col2
END
The problem with the above query is that the temp table is created inside the dynamic SQL block, so it cannot be accessed after that block. To access the table outside that scope, you need to create a global temp table (##).
Edit :-
An example with Global Temp Tables and static table name
ALTER PROCEDURE GenerateTable
    @Col1 as nvarchar(255),
    @Col2 as nvarchar(255)
AS
BEGIN
    Declare @sql nvarchar(max)
    If object_id('tempdb..##TempTable') is not null
        Drop table ##TempTable
    set @sql='CREATE TABLE ##TempTable
    ('+ @Col1 + ' nvarchar(255),'+
    @Col2 + ' nvarchar(255)
    )'
    -- Select @sql Check the DDL
    EXECUTE sp_executesql @sql,
        N'@Col1 nvarchar(255),@Col2 nvarchar(255)',
        @Col1=@Col1,@Col2=@Col2
END
To execute the SP, the SQL is:
Declare @tableName varchar(max),
    @Col1 varchar(70),
    @Col2 varchar(70)
Exec GenerateTable @Col1='ColA',@Col2='ColB'
Edit 2:-
If you are sure that the number of parameters won't exceed some fixed count (say 5), then you can create 5 parameters with defaults. Check this link for further details.
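A rough sketch of that idea follows (the procedure and parameter names are invented; columns that are not supplied are simply skipped, and the global temp table name is not sanitised):
CREATE PROCEDURE GenerateTempTableFlexible
    @tableName nvarchar(128),
    @Col1 nvarchar(255) = NULL,
    @Col2 nvarchar(255) = NULL,
    @Col3 nvarchar(255) = NULL,
    @Col4 nvarchar(255) = NULL,
    @Col5 nvarchar(255) = NULL
AS
BEGIN
    DECLARE @cols nvarchar(max) = '';
    -- Append a column definition for every parameter that was supplied.
    SELECT @cols = @cols +
        CASE WHEN c.name IS NULL THEN '' ELSE QUOTENAME(c.name) + ' nvarchar(255),' END
    FROM (VALUES (@Col1), (@Col2), (@Col3), (@Col4), (@Col5)) AS c(name);

    IF @cols = ''
    BEGIN
        RAISERROR('At least one column name must be supplied.', 16, 1);
        RETURN;
    END

    -- Trim the trailing comma and build the global temp table DDL.
    SET @cols = LEFT(@cols, LEN(@cols) - 1);
    DECLARE @sql nvarchar(max) = 'CREATE TABLE ##' + @tableName + ' (' + @cols + ')';
    EXEC sp_executesql @sql;
END
GO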
Could you not build a table out of a distinct list from wherever these "dynamic field names" live, and then push that in as a string list? For example, I built a table with colors, got a field of names, and now push it into a string that can be used to build out the table headers; there is no limit to the quantity:
SELECT @Fields = coalesce(@Fields + ',', '') + convert(varchar(50),[name])
FROM #TempCols
WHERE column_id > 1
ORDER BY column_id
where column_id is just a windowed ROW_NUMBER().
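A hedged continuation of that idea (the sample value of @Fields and the temp table name are made up):
DECLARE @Fields varchar(max) = 'id,name,address';  -- would come from the SELECT above
DECLARE @ddl nvarchar(max) =
    'CREATE TABLE ##DynamicHeader (' +
    REPLACE(@Fields, ',', ' nvarchar(255), ') + ' nvarchar(255))';
EXEC sp_executesql @ddl;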
I don't agree with the notion that it's never possible. There is always a way; we may not see it now, but there is always a method that can be nested or abused to bend a rule to what we need.

How to BULK INSERT a file into a *temporary* table where the filename is a variable?

I have some code like this that I use to do a BULK INSERT of a data file into a table, where the data file and table name are variables:
DECLARE @sql AS NVARCHAR(1000)
SET @sql = 'BULK INSERT ' + @tableName + ' FROM ''' + @filename + ''' WITH (CODEPAGE=''ACP'', FIELDTERMINATOR=''|'')'
EXEC (@sql)
This works fine for standard tables, but now I need to do the same sort of thing to load data into a temporary table (for example, #MyTable). But when I try this, I get the error:
Invalid Object Name: #MyTable
I think the problem is due to the fact that the BULK INSERT statement is constructed on the fly and then executed using EXEC, and that #MyTable is not accessible in the context of the EXEC call.
The reason that I need to construct the BULK INSERT statement like this is that I need to insert the filename into the statement, and this seems to be the only way to do that. So, it seems that I can either have a variable filename, or use a temporary table, but not both.
Is there another way of achieving this - perhaps by using OPENROWSET(BULK...)?
UPDATE:
OK, so what I'm hearing is that BULK INSERT & temporary tables are not going to work for me. Thanks for the suggestions, but moving more of my code into the dynamic SQL part is not practical in my case.
Having tried OPENROWSET(BULK...), it seems that that suffers from the same problem, i.e. it cannot deal with a variable filename, and I'd need to construct the SQL statement dynamically as before (and thus not be able to access the temp table).
So, that leaves me with only one option which is to use a non-temp table and achieve process isolation in a different way (by ensuring that only one process can be using the tables at any one time - I can think of several ways to do that).
It's annoying. It would have been much more convenient to do it the way I originally intended. Just one of those things that should be trivial, but ends up eating a whole day of your time...
You could always construct the #temp table in dynamic SQL. For example, right now I guess you have been trying:
CREATE TABLE #tmp(a INT, b INT, c INT);
DECLARE @sql NVARCHAR(1000);
SET @sql = N'BULK INSERT #tmp ...' + @variables;
EXEC master.sys.sp_executesql @sql;
SELECT * FROM #tmp;
This makes it tougher to maintain (readability) but gets by the scoping issue:
DECLARE @sql NVARCHAR(MAX);
SET @sql = N'CREATE TABLE #tmp(a INT, b INT, c INT);
BULK INSERT #tmp ...' + @variables + ';
SELECT * FROM #tmp;';
EXEC master.sys.sp_executesql @sql;
EDIT 2011-01-12
In light of how my almost 2-year old answer was suddenly deemed incomplete and unacceptable, by someone whose answer was also incomplete, how about:
CREATE TABLE #outer(a INT, b INT, c INT);
DECLARE @sql NVARCHAR(MAX);
SET @sql = N'SET NOCOUNT ON;
CREATE TABLE #inner(a INT, b INT, c INT);
BULK INSERT #inner ...' + @variables + ';
SELECT * FROM #inner;';
INSERT #outer EXEC master.sys.sp_executesql @sql;
It is possible to do everything you want. Aaron's answer was not quite complete.
His approach is correct, up to creating the temporary table in the inner query. Then, you need to insert the results into a table in the outer query.
The following code snippet reads the lines of a file and inserts them into the table variable @Lines:
declare @fieldsep char(1) = ',';
declare @recordsep char(1) = char(10);
declare @Lines table (
    line varchar(8000)
);
declare @sql varchar(8000) = '
create table #tmp (
    line varchar(8000)
);
bulk insert #tmp
from '''+@filename+'''
with (FirstRow = 1, FieldTerminator = '''+@fieldsep+''', RowTerminator = '''+@recordsep+''');
select * from #tmp';
insert into @Lines
exec(@sql);
select * from @Lines
Sorry to dig up an old question but in case someone stumbles onto this thread and wants a quicker solution.
Bulk inserting an unknown-width file with \n row terminators into a temp table that is created outside of the EXEC statement:
DECLARE @SQL VARCHAR(8000)
IF OBJECT_ID('TempDB..#BulkInsert') IS NOT NULL
BEGIN
    DROP TABLE #BulkInsert
END
CREATE TABLE #BulkInsert
(
    Line VARCHAR(MAX)
)
SET @SQL = 'BULK INSERT #BulkInsert FROM ''##FILEPATH##'' WITH (ROWTERMINATOR = ''\n'')'
EXEC (@SQL)
SELECT * FROM #BulkInsert
Further support that dynamic SQL within an EXEC statement has access to temp tables outside of the EXEC statement. http://sqlfiddle.com/#!3/d41d8/19343
DECLARE @SQL VARCHAR(8000)
IF OBJECT_ID('TempDB..#BulkInsert') IS NOT NULL
BEGIN
DROP TABLE #BulkInsert
END
CREATE TABLE #BulkInsert
(
Line VARCHAR(MAX)
)
INSERT INTO #BulkInsert
(
Line
)
SELECT 1
UNION SELECT 2
UNION SELECT 3
SET @SQL = 'SELECT * FROM #BulkInsert'
EXEC (@SQL)
Further support, written for MSSQL2000 http://technet.microsoft.com/en-us/library/aa175921(v=sql.80).aspx
Example at the bottom of the link
DECLARE @cmd VARCHAR(1000), @ExecError INT
CREATE TABLE #ErrFile (ExecError INT)
SET @cmd = 'EXEC GetTableCount ' +
    '''pubs.dbo.authors''' +
    'INSERT #ErrFile VALUES(@@ERROR)'
EXEC(@cmd)
SET @ExecError = (SELECT * FROM #ErrFile)
SELECT @ExecError AS '@@ERROR'
http://msdn.microsoft.com/en-us/library/ms191503.aspx
I would advise creating a table with a unique name before bulk inserting.
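A minimal sketch of that idea (the file path and base table name are placeholders); the uniquely named permanent table stands in for the temp table and is dropped at the end:
DECLARE @sql NVARCHAR(MAX);
-- Build a unique table name per run, e.g. BulkLoad_20190101_123456_ab12cd.
DECLARE @tableName sysname =
    'BulkLoad_' + REPLACE(REPLACE(REPLACE(CONVERT(varchar(19), GETDATE(), 120), '-', ''), ':', ''), ' ', '_')
    + '_' + LEFT(REPLACE(CONVERT(varchar(36), NEWID()), '-', ''), 6);

SET @sql = 'CREATE TABLE ' + QUOTENAME(@tableName) + ' (Line VARCHAR(MAX));
BULK INSERT ' + QUOTENAME(@tableName) + ' FROM ''C:\data\myfile.txt'' WITH (ROWTERMINATOR = ''\n'');
SELECT * FROM ' + QUOTENAME(@tableName) + ';
DROP TABLE ' + QUOTENAME(@tableName) + ';';
EXEC (@sql);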