Need to dynamically check for columns in other databases - sql

My application runs over several databases, and it needs to be able to check from one to see if a column exists in the other. Unfortunately, I won't know the name of the second database until runtime, so it needs to be dynamic. Also, it has to do this in multiple places, so ideally I'd like to make it into a function, but this gives me problems because functions won't run dynamic SQL.
This is the (non-working) function I wrote.....
CREATE FUNCTION [dbo].[fn_checkcolexists] (
    @dbname VARCHAR(100)
    ,@tablename VARCHAR(100)
    ,@colname VARCHAR(100)
)
RETURNS BIT
AS
BEGIN
    DECLARE @sqlstring NVARCHAR(2000)

    SET @sqlstring = 'select @retVal = 1 from ' + @dbname + '.sys.columns cols inner join yodata_dev_load.sys.tables tabs
        on cols.object_ID = tabs.object_ID where cols.name = ''' + @colname + ''' and tabs.name = ''' + @tablename + ''''

    DECLARE @retVal INT

    EXEC sp_executesql @sqlstring
        ,N'@retVal int output'
        ,@retVal OUTPUT

    RETURN @retVal
END
Has anyone got any suggestions how I can accomplish this? I can't find a way to access the column information for every database. Does this information exist in the system databases anywhere?
Alternatively, can I create some sort of synonym for the other database?
Edit: "How to find column names for all tables in all databases in SQL Server" isn't an ideal solution, because it also relies on dynamic SQL, so I couldn't use it in a function.

Use a stored procedure and one of these approaches.
One method is to use the undocumented sp_msforeachdb:
EXEC sp_msforeachdb 'SELECT table_catalog FROM ?.INFORMATION_SCHEMA.COLUMNS
where table_name=''your_table'' and column_name=''your_column_name'''
or simulate it
declare @sql varchar(max), @table_name varchar(100)

select @sql = '', @table_name = 'your_table'

select @sql = @sql + 'SELECT table_catalog
FROM ' + name + '.INFORMATION_SCHEMA.COLUMNS
where table_name = ''' + @table_name + ''' and
column_name = ''your_column_name'';
' from sys.databases

exec(@sql)

I think I've got the solution I was after. I am using COL_LENGTH, which seems to do the job. You can specify a dbname to it, even pass that as a parameter, and it returns NULL if the column does not exist.
e.g.
declare @dbname varchar(200) = 'dbname'
select COL_LENGTH(@dbname + '.dbo.tablename', 'columnname')
If this returns NULL, the column doesn't exist.
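Since the original goal was a function, a minimal sketch of wrapping COL_LENGTH in one might look like the following (the function name is illustrative, and it assumes the dbo schema and that COL_LENGTH behaves the same inside a UDF as in the ad-hoc query above):
CREATE FUNCTION [dbo].[fn_colexists_collength] (
    @dbname VARCHAR(200)
    ,@tablename VARCHAR(200)
    ,@colname VARCHAR(200)
)
RETURNS BIT
AS
BEGIN
    -- COL_LENGTH returns NULL when the column (or the table/database) cannot be resolved
    RETURN CASE
        WHEN COL_LENGTH(@dbname + '.dbo.' + @tablename, @colname) IS NULL THEN 0
        ELSE 1
    END
END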
Many thanks for all the contributors to this thread

Hope this works for you
CREATE FUNCTION [dbo].[fn_checkcolexists]
(
    @dbname VARCHAR(100)
    ,@tablename VARCHAR(100)
    ,@colname VARCHAR(100)
)
RETURNS INT
AS
BEGIN
    DECLARE @RECCOUNT INT = 0

    SELECT @RECCOUNT = COUNT(*)
    FROM information_schema.columns
    WHERE TABLE_CATALOG = @dbname
        AND COLUMN_NAME = @colname
        AND TABLE_NAME = @tablename

    RETURN @RECCOUNT
END
GO
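A quick usage example (all names illustrative); bear in mind that information_schema.columns without a database prefix only describes the current database, so TABLE_CATALOG will only ever match the database the function lives in:
-- Returns a non-zero count when the column exists in the current database
SELECT dbo.fn_checkcolexists('MyDatabase', 'Customers', 'CustomerName') AS ColCount;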


Select into tables dynamically with variables

I have some code to create tables based on a set of dates I define.
For example, I have 5 dates, and they aren't consecutive. For each of these dates I want to create a table, and I am currently using a SELECT INTO.
I am having to do this 5 times, even though the only things changing are the name of the new table and the date. Is there a more elegant way to do this?
I started writing some code, but I am struggling to get it to loop through all the dates I want. The way I have written it currently, it only works if I edit the date at the start.
DECLARE @MyDate DATE;
SET @MyDate = '2019-01-01';
SET @TableName = 'Table1';

SELECT *
INTO @TableName
FROM Original_Table
WHERE Query_Date = @MyDate;
Is this a one-time thing, or do you have to do this on a regular basis?
If it's the former, then I would just do it and get it over with.
If it's the latter, then I suspect something is very wrong with the way that system is designed - but assuming that can't be changed, you can create a stored procedure that will do this using dynamic SQL.
Something like this can get you started:
CREATE PROCEDURE dbo.CreateTableBasedOnDate
(
    @MyDate DATE,
    -- sysname is a system data type for identifiers: a non-nullable nvarchar(128).
    @TableName sysname
)
AS

-- 200 is long enough. Yes, I did the math.
DECLARE @Sql nvarchar(200) =
-- Note: I'm not convinced that quotename is enough to protect you from sql injection.
-- You should be very careful about which users are allowed to execute this procedure.
N'SELECT * into ' + QUOTENAME(@TableName) + N'
FROM Original_Table
WHERE Query_Date = @MyDate;';

-- When dealing with dynamic SQL, PRINT is your best friend.
-- Comment out this line and uncomment the next only once you've verified you get the correct SQL.
PRINT @Sql;

--EXEC sp_executesql @Sql, N'@MyDate Date', @MyDate

GO
Usage:
EXEC CreateTableBasedOnDate '2018-01-01', 'zohar';
Use dynamic SQL:
DECLARE @MyDate DATE, @TableName varchar(50);
SET @MyDate = '2019-01-01';
SET @TableName = 'Table1';

DECLARE @sql NVARCHAR(4000);
DECLARE @params NVARCHAR(4000);

SELECT @sql = N'
SELECT *
INTO ' + QUOTENAME(@TableName) + '
FROM Original_Table
WHERE Query_Date = @MyDate;';

SELECT @params = N'@MyDate DATE';

EXEC sys.sp_executesql @sql, @params, @MyDate = @MyDate
Note that dynamic SQL can be dangerous, as it opens up a path for SQL injection. It's fine if you are just using it in your own local scripts, but take care if you, for example, wrap this in a procedure that is more widely accessible.
I would also use dynamic SQL, although I would add another variable for the schema:
DECLARE
    @MyDate nVarchar(50) = '2019-01-01',
    @Schema nVarchar(50) = 'dbo',
    @TableName nVarchar(250) = 'Table1',
    @SQL nVarchar(500);

SET @SQL = '
SELECT *
INTO ' + QUOTENAME(@Schema) + '.' + QUOTENAME(@TableName) + '
FROM Original_Table
WHERE Query_Date = ''' + @MyDate + ''';
'

--print @SQL
EXEC(@SQL)
You can use the print statement to see how the SQL will look before executing this properly. You may also want to look at adding this as a stored procedure.

SQL Query all Databases

I need to run the following query on many databases. I have over 100 databases, and I don't want to pull up each database and run the query one at a time.
The User table only exists in the Database#_Account databases.
If the query is run as-is, it errors out because Database#_Admin does not have a User table.
(EXAMPLE Database List)
Database:
---------------------
MASTER
Model
msdb
tempdb
Database1_Account
Database1_Admin
Database2_Account
Database2_Admin
Database3_Account
Database3_Admin
Query:
EXEC sp_MsForEachDb @command1 = SELECT "?" as DatabaseName, *
FROM ?.User
WHERE Name = "John" AND "?" LIKE "%_Account"
sp_MsForEachDb is still an undocumented procedure and is subject to change at any time. I would use a cursor for something like this.
Here is a working template to get you started:
DECLARE @tsql nvarchar(max)
DECLARE @dbname varchar(500)

DECLARE MyCur CURSOR STATIC FORWARD_ONLY FOR
    SELECT [name]
    FROM sys.databases
    WHERE [name] NOT IN ('tempdb')

OPEN MyCur

WHILE (1=1)
BEGIN
    FETCH NEXT FROM MyCur INTO @dbname

    IF @@FETCH_STATUS <> 0
        BREAK

    SET @tsql = 'use ' + @dbname + ' SELECT * FROM sys.tables'
    EXEC sp_executesql @tsql
END

CLOSE MyCur;
DEALLOCATE MyCur;
You need to pass the command as an nvarchar literal, not as a query.
You need to use the correct nomenclature. You've left out the schema name. It's Database.Schema.Table, not Database.Table. I'm assuming all tables use the default dbo schema.
Write the query to test if the table exists before executing. Easiest way to do that is with IF OBJECT_ID(N'TableName') IS NOT NULL.
Avoid double quotes. They're normally field identifiers like square brackets are, so they're potentially ambiguous when used for varchar literals.
Try:
EXEC sp_MsForEachDb @command1 = N'IF OBJECT_ID(N''?.dbo.User'') IS NOT NULL SELECT ''?'' as DatabaseName, * FROM ?.dbo.User WHERE Name = ''John'' AND ''?'' LIKE ''%_Account'''
Here's the query I use to do a while loop to iterate through Databases.
Just put your code where it says PUT CODE HERE.
SET NOCOUNT ON

DECLARE @Database TABLE (DbName SYSNAME)
DECLARE @DbName AS SYSNAME

SET @DbName = ''

INSERT INTO @Database (DbName)
SELECT NAME
FROM master.dbo.sysdatabases
WHERE NAME <> 'tempdb'
ORDER BY NAME ASC

WHILE @DbName IS NOT NULL
BEGIN
    SET @DbName = (
        SELECT MIN(DbName)
        FROM @Database
        WHERE DbName > @DbName
    )

    /*
    PUT CODE HERE
    EXAMPLE PRINT Database Name
    */
    PRINT @DbName
END
To create a list of users that match certain conditions you can modify this script.
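For instance, here is a sketch of plugging the question's check into that loop: restrict to the _Account databases, test for the User table, and collect matches into a temp table (the User table and Name column come from the question, everything else is illustrative).
-- Before the loop:
DECLARE @Sql NVARCHAR(MAX)
IF OBJECT_ID('tempdb..#Results') IS NOT NULL DROP TABLE #Results
CREATE TABLE #Results (DbName SYSNAME, Name VARCHAR(200))   -- Name length here is a guess

-- In place of PRINT @DbName inside the loop:
IF @DbName LIKE '%[_]Account'
BEGIN
    SET @Sql = N'IF OBJECT_ID(N''' + QUOTENAME(@DbName) + N'.dbo.[User]'') IS NOT NULL
        INSERT INTO #Results (DbName, Name)
        SELECT ''' + @DbName + N''', Name
        FROM ' + QUOTENAME(@DbName) + N'.dbo.[User]
        WHERE Name = ''John'';'
    EXEC sp_executesql @Sql
END

-- After the loop:
SELECT * FROM #Results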

How to use table as variable in stored procedure

There is this query that I keep using over and over:
SELECT column_name, count(column_name) FROM table_name GROUP by column_name ORDER BY COUNT(column_name) DESC
I use this to check which different values there are in a column and how often they occur.
Because I use this query so often, and it repeats the same column_name four times, I thought: why not make a stored procedure:
CREATE PROCEDURE countcv @table_name VARCHAR(50), @column_name VARCHAR(50)
AS
BEGIN
    SELECT @column_name, COUNT(@column_name) FROM @table_name GROUP BY @column_name ORDER BY COUNT(@column_name)
END
Here is where I get stuck; I cannot manage to get a variable table name:
Must declare the table variable "@table_name"
I believe that @Julien Vavasseur and @Dark Knight have already addressed your question.
However, I would like to add that SQL Server 2008 introduced table-valued parameters, which let you pass a table-type variable to a stored procedure. e.g.
Assuming you have a table by the name tblTest with the below columns
ID INT,
Name VARCHAR(50)
Step 1: Declare a new table User Defined Type
CREATE TYPE tblTestType AS TABLE
(
ID INT,
Name VARCHAR(50)
)
Step 2: Create a STORED PROCEDURE that has tblTestType as parameter
CREATE PROCEDURE countcv
(
    @tblName tblTestType READONLY
)
AS
INSERT INTO tblTest (ID, Name)
SELECT ID, Name
FROM @tblName;
Then you can use a DataTable (if you are using C#) and pass it as a parameter to the stored procedure (you can find an example in the link I provided).
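For reference, calling it straight from T-SQL with a table-valued parameter looks something like this (values are just examples):
DECLARE @rows tblTestType;

INSERT INTO @rows (ID, Name)
VALUES (1, 'Alice'), (2, 'Bob');

-- TVPs are passed readonly, so the procedure can read but not modify @rows
EXEC countcv @tblName = @rows;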
There is no way to do it directly. You need to use a dynamic SQL approach. Assuming you pass correct table and column names, the below should work.
CREATE PROCEDURE countcv @table_name VARCHAR(50), @column_name VARCHAR(50)
AS
BEGIN
    DECLARE @SQL nvarchar(max)

    SET @SQL = 'SELECT ' + @column_name + ', COUNT(' + @column_name + ')
    FROM ' + @table_name + '
    GROUP BY ' + @column_name + '
    ORDER BY COUNT(' + @column_name + ')'

    EXEC sp_executesql @SQL
END
If you want to do something like this, you must use dynamic SQL:
CREATE PROCEDURE countcv @table_name sysname, @column_name sysname
AS
BEGIN
    DECLARE @sql nvarchar(max)

    SET @sql = 'SELECT ' + QUOTENAME(@column_name) + ', COUNT(' + QUOTENAME(@column_name) + ')
    FROM ' + QUOTENAME(@table_name) + '
    GROUP BY ' + QUOTENAME(@column_name) + ' ORDER BY COUNT(' + QUOTENAME(@column_name) + ')'

    EXEC sp_executesql @sql
END
Use sysname as the data type for column and table names (it is the built-in data type for object names, an alias for nvarchar(128)).
Use QUOTENAME to add delimiters to column and table names.
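Usage is then simply (table and column names illustrative):
-- Counts how often each distinct Country value appears in the Customers table
EXEC countcv @table_name = 'Customers', @column_name = 'Country';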

Send query as parameter to SQL function

I want to create a SQL table-valued function that will receive a query as a parameter through my API. In my function I want to execute that query. The query will be a SELECT statement.
This is what I have done so far and what I want to achieve, but it is not the correct way to do it.
CREATE FUNCTION CUSTOM_EXPORT_RESULTS (
    @query varchar(max),
    @guid uniqueidentifier,
    @tableName varchar(200))
RETURNS TABLE
AS
RETURN
(
    -- Execute query into a table
    SELECT *
    INTO @tableName
    FROM (
        EXEC(@query)
    )
)
GO
Please suggest the correct way!
Try this one -
CREATE PROCEDURE dbo.sp_CUSTOM_EXPORT_RESULTS
    @query NVARCHAR(MAX) = 'SELECT * FROM dbo.test'
    , @guid UNIQUEIDENTIFIER
    , @tableName VARCHAR(200) = 'test2'
AS BEGIN

    SELECT @query =
        REPLACE(@query,
            'FROM',
            'INTO [' + @tableName + '] FROM')

    DECLARE @SQL NVARCHAR(MAX)

    SELECT @SQL = '
    IF OBJECT_ID (N''' + @tableName + ''') IS NOT NULL
        DROP TABLE [' + @tableName + ']
    ' + @query

    PRINT @SQL
    EXEC sys.sp_executesql @SQL

    RETURN 0

END
GO
Output -
IF OBJECT_ID (N'test2') IS NOT NULL
DROP TABLE [test2]
SELECT * INTO [test2] FROM dbo.test
What I see in your question is encapsulation of:
taking a dynamic SQL expression
executing it to fill a parametrized table
Why do you want to have such an encapsulation?
First, this can have a negative impact on your database performance. Please read up on EXEC() and sp_executesql(). I hope your SP won't be called from multiple parts of your application, because this WILL get you into trouble, at least performance-wise.
Another thing is: how and where are you constructing your SQL? Obviously you do it somewhere else, and it seems it's manually created. If we're talking about a contemporary application, there are a lot of ORM solutions for this, and manual construction of T-SQL at runtime should always be avoided if possible. Not to mention that EXEC does not guard you against any form of SQL injection attack. However, if all of this is part of some database administration T-SQL bundle, forget this paragraph.
In the end, if you simply want to load a new table from some existing table (or part of it) as part of some administration task in T-SQL, consider issuing a SELECT ... INTO ... This will create the new target table structure for you (omitting indexes and constraints) and copy the data. SELECT INTO will outperform INSERT INTO SELECT because SELECT INTO gets minimally logged.
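As a concrete illustration of that last point (table names are made up), a plain SELECT ... INTO creates and fills the target table in one statement, with no dynamic SQL involved:
-- Creates dbo.Orders_2019 with the structure implied by the SELECT list
-- (indexes and constraints are not copied) and loads the matching rows
SELECT *
INTO dbo.Orders_2019
FROM dbo.Orders
WHERE OrderDate >= '2019-01-01';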
I hope this will get you (and others) at least a bit on the right track.
You can use a stored procedure as well; here is code you can try.
CREATE PROCEDURE CUSTOM_EXPORT_RESULTS
(
    @query varchar(max),
    @guid uniqueidentifier,
    @tableName varchar(200)
)
AS
BEGIN
    DECLARE @strQuery nvarchar(max)

    -- Rewrite the query so it selects into the target table
    SET @strQuery = REPLACE(@query, 'FROM', 'INTO ' + @tableName + ' FROM')

    EXEC sp_executesql @strQuery
END
GO

Create temp table from provided variable column names

I want to create a temporary table whose columns are the ones I provide as a parameter, separated by a delimiter.
For example, if the column names are id, name, and address, the resulting table should contain exactly those columns with those header names. Next time, the number and names of the columns could vary.
Any help in this regard?
Try this :-
CREATE PROCEDURE GenerateTempTable
    @tableName as nvarchar(max),
    @Col1 as nvarchar(255),
    @Col2 as nvarchar(255)
AS
BEGIN
    Declare @sql nvarchar(max)

    set @sql = 'CREATE TABLE #' + @tableName + '
    (' + @Col1 + ' nvarchar(255),' +
        @Col2 + ' nvarchar(255)
    )'

    -- Select @sql Check the DDL
    EXECUTE sp_executesql @sql,
        N'@tableName nvarchar(max), @Col1 nvarchar(255), @Col2 nvarchar(255)',
        @tableName = @tableName, @Col1 = @Col1, @Col2 = @Col2
END
The problem with the above is that the temp table is created inside the dynamic SQL batch, so it cannot be accessed after that batch ends. To access the table outside that scope, you need to create a global temp table (##).
Edit :-
An example with Global Temp Tables and static table name
ALTER PROCEDURE GenerateTable
    @Col1 as nvarchar(255),
    @Col2 as nvarchar(255)
AS
BEGIN
    Declare @sql nvarchar(max)

    If object_id('tempdb..##TempTable') is not null
        Drop table ##TempTable

    set @sql = 'CREATE TABLE ##TempTable
    (' + @Col1 + ' nvarchar(255),' +
        @Col2 + ' nvarchar(255)
    )'

    -- Select @sql Check the DDL
    EXECUTE sp_executesql @sql,
        N'@Col1 nvarchar(255), @Col2 nvarchar(255)',
        @Col1 = @Col1, @Col2 = @Col2
END
To execute the SP, the SQL is:
Declare @tableName varchar(max),
    @Col1 varchar(70),
    @Col2 varchar(70)

Exec GenerateTable @Col1 = 'ColA', @Col2 = 'ColB'
Edit 2:-
If you are sure that the number of parameters won't exceed some maximum (say 5), then you can create 5 parameters with defaults. Check this link for further details.
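A rough sketch of that idea (names illustrative; the global-temp-table caveat above still applies): five column parameters, all but the first defaulting to NULL, and only the supplied ones become columns.
CREATE PROCEDURE GenerateTableUpTo5Cols
    @Col1 nvarchar(255),
    @Col2 nvarchar(255) = NULL,
    @Col3 nvarchar(255) = NULL,
    @Col4 nvarchar(255) = NULL,
    @Col5 nvarchar(255) = NULL
AS
BEGIN
    DECLARE @sql nvarchar(max)

    IF OBJECT_ID('tempdb..##TempTable') IS NOT NULL
        DROP TABLE ##TempTable

    -- ISNULL drops the fragment entirely when the parameter was not supplied
    SET @sql = 'CREATE TABLE ##TempTable (' + QUOTENAME(@Col1) + ' nvarchar(255)'
        + ISNULL(',' + QUOTENAME(@Col2) + ' nvarchar(255)', '')
        + ISNULL(',' + QUOTENAME(@Col3) + ' nvarchar(255)', '')
        + ISNULL(',' + QUOTENAME(@Col4) + ' nvarchar(255)', '')
        + ISNULL(',' + QUOTENAME(@Col5) + ' nvarchar(255)', '')
        + ')'

    EXEC sp_executesql @sql
END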
Could you not build a table out of a distinct list from wherever these "Dynamic Field Names" live, then push that in as a string list? Like this: I built a temp table of names (colours, in my test) and now push them into a single string that can be used to build out the table headers... no limit to quantity...
SELECT @Fields = coalesce(@Fields + ',', '') + convert(varchar(50), [name])
FROM #TempCols
WHERE column_id > 1
ORDER BY column_id
Here column_id is just a windowed ROW_NUMBER...
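Once @Fields holds the comma-separated list, it can be dropped into a dynamic CREATE TABLE in the same way as the earlier answers (a sketch, with every column given the same illustrative nvarchar(255) type):
DECLARE @sql nvarchar(max)

-- Append the type after every name by replacing each comma in the list
SET @sql = 'CREATE TABLE ##TempTable ('
    + REPLACE(@Fields, ',', ' nvarchar(255), ') + ' nvarchar(255))'

EXEC sp_executesql @sql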
I don't agree with the notion that it's never possible. There is always a way; we may not see it now, but there is always a method that can be nested or abused to bend any rule to what we need.