Call procedure for each row without using cursors and loops? - sql

I need to apply a procedure to every record's NVARCHAR(MAX) field in a table. The procedure receives a large string and splits it into several shorter strings (fewer than 100 chars each), returning them as a result set. These strings will be inserted into a different table (each in its own row).
How can I apply this procedure in a set-based fashion to the whole table, so that I can insert the results into another table?
I've found some similar questions on SO, but they didn't need to use the INSERT INTO construct, which rules out UDFs and TVFs for me. EDIT: functions do not support DML statements, and I wanted to use INSERT INTO inside the function.
Alternatively, is there a set-based way of using a stored procedure? SELECT sproc(Text) FROM Table didn't work.

I am not sure of your exact logic to split the string, but if possible you can make your split function an inline TVF (here's one I made earlier):
CREATE FUNCTION dbo.Split(@StringToSplit NVARCHAR(MAX), @Delimiter NCHAR(1))
RETURNS TABLE
AS
RETURN
(
    SELECT  Position = Number,
            Value = SUBSTRING(@StringToSplit, Number, CHARINDEX(@Delimiter, @StringToSplit + @Delimiter, Number) - Number)
    FROM    (   SELECT TOP (LEN(@StringToSplit) + 1) Number = ROW_NUMBER() OVER(ORDER BY a.object_id)
                FROM sys.all_objects a
            ) n
    WHERE   SUBSTRING(@Delimiter + @StringToSplit + @Delimiter, n.Number, 1) = @Delimiter
);
Then you can simply use this in your insert statement by using cross apply with the TVF:
DECLARE @T1 TABLE (ID INT IDENTITY, TextToSplit NVARCHAR(MAX) NOT NULL);
DECLARE @T2 TABLE (T1ID INT NOT NULL, Position INT NOT NULL, SplitText NVARCHAR(MAX) NOT NULL);

INSERT @T1 (TextToSplit)
VALUES ('This is a test'), ('This is Another Test');

INSERT @T2 (T1ID, Position, SplitText)
SELECT  t1.ID, s.Position, s.Value
FROM    @T1 t1
        CROSS APPLY dbo.Split(t1.TextToSplit, N' ') s;

SELECT *
FROM @T2;

Related

How do I pass a list as a parameter in a stored procedure?

Looking to pass a list of user IDs to return a list of names. I have a plan to handle the output names (with a COALESCE something or other), but I'm trying to find the best way to pass in the list of user IDs.
The guts of my sproc will look something like this:
create procedure [dbo].[get_user_names]
@user_id_list, --which would equal a list of incoming ID numbers like (5,44,72,81,126)
@username varchar (30) output
as
select last_name+', '+first_name
from user_mstr
where user_id in @user_id_list
Passing the values for @user_id_list is my main concern here.
The preferred method for passing an array of values to a stored procedure in SQL Server is to use table-valued parameters.
First you define the type like this:
CREATE TYPE UserList AS TABLE ( UserID INT );
Then you use that type in the stored procedure:
create procedure [dbo].[get_user_names]
@user_id_list UserList READONLY,
@username varchar (30) output
as
select last_name+', '+first_name
from user_mstr
where user_id in (SELECT UserID FROM @user_id_list)
So before you call that stored procedure, you fill a table variable:
DECLARE @UL UserList;
INSERT @UL VALUES (5),(44),(72),(81),(126)
And finally call the SP:
EXEC dbo.get_user_names @UL, @username OUTPUT;
As far as I can tell, there are three main contenders: table-valued parameters, a delimited list string, and a JSON string.
Since SQL Server 2016, you can use the built-in STRING_SPLIT if you want the delimited route: https://learn.microsoft.com/en-us/sql/t-sql/functions/string-split-transact-sql
That is probably the easiest and most straightforward approach.
Also since 2016, JSON can be passed as an nvarchar and used with OPENJSON: https://learn.microsoft.com/en-us/sql/t-sql/functions/openjson-transact-sql
That is probably best if you have a more structured data set to pass whose schema may vary significantly.
TVPs, it seems, used to be the canonical way to pass more structured parameters, and they are still good if you need that structure, explicitness, and basic value/type checking. They can be a little more cumbersome on the consumer side, though. If you don't have 2016+, this is probably the default/best option.
I think it's a trade-off between these concrete considerations and your preference for being explicit about the structure of your parameters: even if you have 2016+, you may prefer to explicitly state the type/schema of the parameter rather than pass a string and parse it somehow.
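If you want to see the JSON route in action, here is a minimal sketch under the question's schema (user_mstr and its columns come from the question; everything else is illustrative). The ID list is passed as an nvarchar containing a JSON array and expanded with OPENJSON:
DECLARE @user_id_list nvarchar(max) = N'[5,44,72,81,126]';

SELECT last_name + ', ' + first_name
FROM user_mstr
WHERE user_id IN (SELECT CONVERT(int, [value])
                  FROM OPENJSON(@user_id_list));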
In Azure SQL Database, Azure SQL Data Warehouse, and SQL Server from 2016 onwards, you can use STRING_SPLIT to achieve a similar result to what was described by @sparrow.
Recycling code from @sparrow:
WHERE user_id IN (SELECT value FROM STRING_SPLIT(@user_id_list, ','))
A simple and effective way of accepting a list of values into a stored procedure.
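Putting it together, a minimal sketch of the whole procedure using STRING_SPLIT (the user_mstr table and columns come from the question; the procedure name is illustrative and the OUTPUT parameter is omitted for brevity):
CREATE PROCEDURE dbo.get_user_names_split
    @user_id_list varchar(max)
AS
BEGIN
    SET NOCOUNT ON;

    SELECT last_name + ', ' + first_name AS full_name
    FROM user_mstr
    WHERE user_id IN (SELECT value FROM STRING_SPLIT(@user_id_list, ','));
END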
You can try this:
create procedure [dbo].[get_user_names]
@user_id_list varchar(2000), -- You can use any max length
@username varchar (30) output
as
select last_name+', '+first_name
from user_mstr
where user_id in (Select Item from dbo.SplitString( @user_id_list, ',') )
And here is the user defined function for SplitString:
Create FUNCTION [dbo].[SplitString]
(
    @Input NVARCHAR(MAX),
    @Character CHAR(1)
)
RETURNS @Output TABLE (
    Item NVARCHAR(1000)
)
AS
BEGIN
    DECLARE @StartIndex INT, @EndIndex INT
    SET @StartIndex = 1

    -- Make sure the input ends with the delimiter so the last item is captured
    IF RIGHT(@Input, 1) <> @Character
    BEGIN
        SET @Input = @Input + @Character
    END

    WHILE CHARINDEX(@Character, @Input) > 0
    BEGIN
        SET @EndIndex = CHARINDEX(@Character, @Input)

        INSERT INTO @Output(Item)
        SELECT SUBSTRING(@Input, @StartIndex, @EndIndex - 1)

        SET @Input = SUBSTRING(@Input, @EndIndex + 1, LEN(@Input))
    END

    RETURN
END
I solved this problem as follows:
In C#, I built a string variable:
string userId = "";
I put my list's items into this variable, separated by ','. For example, in C#:
userId = "5,44,72,81,126";
and sent it to SQL Server:
SqlParameter param = cmd.Parameters.AddWithValue("@user_id_list", userId);
I created a separate function in SQL Server to convert the received list (its type is NVARCHAR(MAX)) into a table:
CREATE FUNCTION dbo.SplitInts
(
    @List      VARCHAR(MAX),
    @Delimiter VARCHAR(255)
)
RETURNS TABLE
AS
    RETURN ( SELECT Item = CONVERT(INT, Item) FROM
        ( SELECT Item = x.i.value('(./text())[1]', 'varchar(max)')
          FROM ( SELECT [XML] = CONVERT(XML, '<i>'
                 + REPLACE(@List, @Delimiter, '</i><i>') + '</i>').query('.')
               ) AS a CROSS APPLY [XML].nodes('i') AS x(i) ) AS y
        WHERE Item IS NOT NULL
    );
In the main stored procedure, I consume the incoming list with the command below:
SELECT user_id = Item FROM dbo.SplitInts(@user_id_list, ',');
This works perfectly for me; I hope this example solves the problem for many users.
Step 1
Create the table types in SQL Server like this:
Create TYPE dbo.tblNames
AS TABLE
(
[Name] nvarchar(max)
);
go
create TYPE dbo.tblNamesWithCols
AS TABLE
(
[Name] nvarchar(max)
);
go
Step 2
Create the stored procedure with table-type parameters like this:
create proc syTest
@VarTbleNameList AS dbo.tblNames READONLY,
@VarTbleNameColsList AS dbo.tblNamesWithCols READONLY,
@VarWhereQuery nvarchar(max)
as
begin
......
...... End
Calling the stored procedure with parameters
DECLARE @VarTbleList AS dbo.tblNames
INSERT INTO @VarTbleList
VALUES ( 'tblEmployes' )
INSERT INTO @VarTbleList
VALUES ( 'tblDepartments' )
INSERT INTO @VarTbleList
VALUES ( 'tblCities' )

DECLARE @VarTbleColList AS dbo.tblNamesWithCols
INSERT INTO @VarTbleColList
VALUES ( 'tblEmployes.EmployeId as empId;' )
INSERT INTO @VarTbleColList
VALUES ( 'tblEmployes.EmployeName as empName;' )
INSERT INTO @VarTbleColList
VALUES ( 'tblDepartments.DepartmentName as deptName;' )
INSERT INTO @VarTbleColList
VALUES ( 'tblDepartments.DepartmentId as deptId;' )

EXECUTE syTest @VarTbleList, @VarTbleColList, @VarWhereQuery = 'test'
You can use this simple 'inline' method to construct a string_list_type parameter (works in SQL Server 2014):
declare @p1 dbo.string_list_type
insert into @p1 values(N'myFirstString')
insert into @p1 values(N'mySecondString')
Example use when executing a stored proc:
exec MyStoredProc @MyParam=@p1
Check the code below; it works for me:
@ManifestNoList VARCHAR(MAX)
WHERE
(
    ManifestNo IN (SELECT value FROM dbo.SplitString(@ManifestNoList, ','))
)
The proper way is to create a user-defined table type:
CREATE TYPE [dbo].[IntArray] AS TABLE
(
[ID] [INT] NULL
)
Then you can use this custom data type:
CREATE OR ALTER PROCEDURE [dbo].[sp_GetUserNames]
(
    @userIds [IntArray] READONLY
)
AS
BEGIN
    SET NOCOUNT ON;

    SELECT
        "Name" = u.LastName + ', ' + u.FirstName
    FROM dbo.[User] u
    JOIN @userIds uid ON u.Id = uid.ID;
END
Usage:
DECLARE @result TABLE
(
    Name NVARCHAR(MAX)
);

DECLARE @ids IntArray;
INSERT INTO @ids (ID)
SELECT x.userId FROM dbo.sometable x;

INSERT INTO @result
EXECUTE [dbo].[sp_GetUserNames] @userIds = @ids;

SELECT * FROM @result;
Maybe you could use:
select last_name+', '+first_name
from user_mstr
where ',' + @user_id_list + ',' like '%,' + convert(nvarchar, user_id) + ',%'

Passing a variable into an IN clause within a SQL function? [duplicate]

Possible Duplicate:
Parameterizing an SQL IN clause?
I have a SQL function whereby I need to pass a list of IDs in, as a string, into:
WHERE ID IN (@MyList)
I have looked around and most of the answers are either where the SQL is built within C# and they loop through and call AddParameter, or the SQL is built dynamically.
My SQL function is fairly large and so building the query dynamically would be rather tedious.
Is there really no way to pass in a string of comma-separated values into the IN clause?
My variable being passed in is representing a list of integers so it would be:
"1,2,3,4,5,6,7" etc
Here is a slightly more efficient way to split a list of integers. First, create a numbers table if you don't already have one. This will create a table with 1,000,000 unique integers (you may need more or fewer):
;WITH x AS
(
SELECT TOP (1000000) Number = ROW_NUMBER() OVER
(ORDER BY s1.[object_id])
FROM sys.all_objects AS s1 CROSS JOIN sys.all_objects AS s2
ORDER BY s1.[object_id]
)
SELECT Number INTO dbo.Numbers FROM x;
CREATE UNIQUE CLUSTERED INDEX n ON dbo.Numbers(Number);
Then a function:
CREATE FUNCTION [dbo].[SplitInts_Numbers]
(
    @List      NVARCHAR(MAX),
    @Delimiter NVARCHAR(255)
)
RETURNS TABLE
WITH SCHEMABINDING
AS
    RETURN
    (
        SELECT Item = CONVERT(INT, SUBSTRING(@List, Number,
            CHARINDEX(@Delimiter, @List + @Delimiter, Number) - Number))
        FROM dbo.Numbers
        WHERE Number <= CONVERT(INT, LEN(@List))
          AND SUBSTRING(@Delimiter + @List, Number, 1) = @Delimiter
    );
You can compare the performance to an iterative approach here:
http://sqlfiddle.com/#!3/960d2/1
To avoid the numbers table, you can also try an XML-based version of the function - it is more compact but less efficient:
CREATE FUNCTION [dbo].[SplitInts_XML]
(
    @List      VARCHAR(MAX),
    @Delimiter CHAR(1)
)
RETURNS TABLE
WITH SCHEMABINDING
AS
    RETURN ( SELECT Item = CONVERT(INT, Item) FROM (
        SELECT Item = x.i.value('(./text())[1]', 'int') FROM (
            SELECT [XML] = CONVERT(XML, '<i>' + REPLACE(@List, @Delimiter, '</i><i>')
            + '</i>').query('.') ) AS a CROSS APPLY [XML].nodes('i') AS x(i)) AS y
        WHERE Item IS NOT NULL
    );
Anyway once you have a function you can simply say:
WHERE ID IN (SELECT Item FROM dbo.SplitInts_Numbers(@MyList, ','));
Passing a string directly into the IN clause is not possible. However, if you are providing the list as a string to a stored procedure, for example, you can use the following dirty method.
First, create this function:
CREATE FUNCTION [dbo].[fnNTextToIntTable] (@Data NTEXT)
RETURNS
    @IntTable TABLE ([Value] INT NULL)
AS
BEGIN
    DECLARE @Ptr int, @Length int, @v nchar, @vv nvarchar(10)
    SELECT @Length = (DATALENGTH(@Data) / 2) + 1, @Ptr = 1

    WHILE (@Ptr < @Length)
    BEGIN
        SET @v = SUBSTRING(@Data, @Ptr, 1)
        IF @v = ','
        BEGIN
            INSERT INTO @IntTable (Value) VALUES (CAST(@vv AS int))
            SET @vv = NULL
        END
        ELSE
        BEGIN
            SET @vv = ISNULL(@vv, '') + @v
        END
        SET @Ptr = @Ptr + 1
    END

    -- If the last number was not followed by a comma, add it to the result set
    IF @vv IS NOT NULL
        INSERT INTO @IntTable (Value) VALUES (CAST(@vv AS int))

    RETURN
END
(Note: this is not my original code, but thanks to versioning systems here at my place of work, I have lost the header comment linking to the source.)
Then use it like so:
SELECT *
FROM tblMyTable
INNER JOIN fnNTextToIntTable(@MyList) AS List ON tblMyTable.ID = List.Value
Or, as in your question:
SELECT *
FROM tblMyTable
WHERE ID IN ( SELECT Value FROM fnNTextToIntTable(@MyList) )

Comma-separated value insertion In SQL Server 2005

How can I insert values from a comma-separated input parameter with a stored procedure?
For example:
exec StoredProcedureName 17,'127,204,110,198',7,'162,170,163,170'
You can see that I have two comma-separated value lists in the parameter list. Both will have the same number of values: if the first has 5 comma-separated values, then the second one also has 5 comma-separated values.
127 and 162 are related
204 and 170 are related
...and same for the others.
How can I insert these two values?
One comma-separated value is inserted, but how do I insert two?
Have a look at something like this (full example):
DECLARE @Inserts TABLE(
        ID INT,
        Val1 INT,
        Val2 INT,
        Val3 INT
)

DECLARE @Param1 INT,
        @Param2 VARCHAR(100),
        @Param3 INT,
        @Param4 VARCHAR(100)

SELECT  @Param1 = 17,
        @Param2 = '127,204,110,198',
        @Param3 = 7,
        @Param4 = '162,170,163,170'

DECLARE @Table1 TABLE(
        ID INT IDENTITY(1,1),
        Val INT
)

DECLARE @Table2 TABLE(
        ID INT IDENTITY(1,1),
        Val INT
)

DECLARE @textXML XML

SELECT  @textXML = CAST('<d>' + REPLACE(@Param2, ',', '</d><d>') + '</d>' AS XML)
INSERT INTO @Table1
SELECT  T.split.value('.', 'nvarchar(max)') AS data
FROM    @textXML.nodes('/d') T(split)

SELECT  @textXML = CAST('<d>' + REPLACE(@Param4, ',', '</d><d>') + '</d>' AS XML)
INSERT INTO @Table2
SELECT  T.split.value('.', 'nvarchar(max)') AS data
FROM    @textXML.nodes('/d') T(split)

INSERT INTO @Inserts
SELECT  @Param1,
        t1.Val,
        @Param3,
        t2.Val
FROM    @Table1 t1 INNER JOIN
        @Table2 t2 ON t1.ID = t2.ID

SELECT  *
FROM    @Inserts
You need a way to split and process the string in TSQL, there are many ways to do this. This article covers the PROs and CONs of just about every method:
"Arrays and Lists in SQL Server 2005 and Beyond, When Table Value Parameters Do Not Cut it" by Erland Sommarskog
You need to create a split function. This is how a split function can be used:
SELECT
*
FROM YourTable y
INNER JOIN dbo.yourSplitFunction(@Parameter) s ON y.ID=s.Value
I prefer the number table approach to split a string in TSQL but there are numerous ways to split strings in SQL Server, see the previous link, which explains the PROs and CONs of each.
For the Numbers Table method to work, you need to do this one time table setup, which will create a table Numbers that contains rows from 1 to 10,000:
SELECT TOP 10000 IDENTITY(int,1,1) AS Number
INTO Numbers
FROM sys.objects s1
CROSS JOIN sys.objects s2
ALTER TABLE Numbers ADD CONSTRAINT PK_Numbers PRIMARY KEY CLUSTERED (Number)
Once the Numbers table is set up, create this split function:
CREATE FUNCTION [dbo].[FN_ListToTableRows]
(
     @SplitOn char(1)       --REQUIRED, the character to split the @List string on
    ,@List    varchar(8000) --REQUIRED, the list to split apart
)
RETURNS TABLE
AS
RETURN
(
    ----------------
    --SINGLE QUERY-- --this will return empty rows, and row numbers
    ----------------
    SELECT
         ROW_NUMBER() OVER(ORDER BY number) AS RowNumber
        ,LTRIM(RTRIM(SUBSTRING(ListValue, number+1, CHARINDEX(@SplitOn, ListValue, number+1)-number - 1))) AS ListValue
    FROM (
             SELECT @SplitOn + @List + @SplitOn AS ListValue
         ) AS InnerQuery
        INNER JOIN Numbers n ON n.Number < LEN(InnerQuery.ListValue)
    WHERE SUBSTRING(ListValue, number, 1) = @SplitOn
);
GO
You can now easily split a CSV string into a table and join on it. To accomplish your task, set up a test table to insert into:
create table YourTable (col1 int, col2 int)
then create your procedure:
CREATE PROCEDURE StoredProcedureName
(
     @Params1 int
    ,@Array1  varchar(8000)
    ,@Params2 int
    ,@Array2  varchar(8000)
)
AS
INSERT INTO YourTable
    (col1, col2)
SELECT
    a1.ListValue, a2.ListValue
FROM dbo.FN_ListToTableRows(',',@Array1) a1
    INNER JOIN dbo.FN_ListToTableRows(',',@Array2) a2 ON a1.RowNumber=a2.RowNumber
GO
test it out:
exec StoredProcedureName 17,'127,204,110,198',7,'162,170,163,170'
select * from YourTable
OUTPUT:
(4 row(s) affected)
col1 col2
----------- -----------
127 162
204 170
110 163
198 170
(4 row(s) affected)
This may not be an answer to your question, but I thought I'd let you know that there is a better way to pass related values (in table form) to a stored procedure: XML. You can build the XML string in your app (just as a regular string) and pass it to the stored procedure as a parameter. You can then use the following syntax to get it into a table. In this way you can pass an entire table as a parameter to a stored procedure.
--Parameters
@param1 int,
@Budgets xml,
@Param2 int
-- @Budgets = '<Values><Row><Val1>127</Val1><Val2>162</Val2></Row> <Row><Val1>204</Val1><Val2>170</Val2></Row></Values>'
SELECT @param1 as Param1,
    x.query('Val1').value('.','int') as Val1,
    @Param2 as Param2,
    x.query('Val2').value('.','int') as Val2
into #NewTable
FROM @Budgets.nodes('/Values/Row') x1(x)

How to split space delimited field into rows in SQL Server?

I found this function which returns three rows for the following query:
select * from dbo.split('1 2 3',' ')
However, I need to use values from a field instead of '1 2 3'.
I tried:
select * from dbo.split(select top 1 myfield from mytable,' ')
But it fails saying incorrect syntax.
It doesn't have to use the function above, so feel free to recommend another function or different way to go about it. To clarify, I only need to parse the values from a single row of a single field.
You need to apply the split(myfield) function to each row in mytable. When the split function is a table-valued function, the correct answer is the APPLY operator:
The APPLY operator allows you to
invoke a table-valued function for
each row returned by an outer table
expression of a query.
So the answer must be:
select *
from mytable
cross apply dbo.split(myfield, ' ');
Example:
create table mytable (myfield varchar(10));
insert into mytable (myfield) values ('1 2 3');
go
create function split (@list varchar(max), @delimiter char(1))
returns @shards table (value varchar(8000))
with schemabinding
as
begin
    declare @i int;
    set @i = 0;
    while @i <= len(@list)
    begin
        declare @n int;
        set @n = charindex(@delimiter, @list, @i);
        if 0 = @n
        begin
            set @n = len(@list);
        end
        insert into @shards (value)
        values (substring(@list, @i, @n-@i+1));
        set @i = @n+1;
    end
    return;
end
go
select *
from mytable
cross apply dbo.split(myfield, ' ');
Have you tried
SELECT dbo.split(myfield, ' ') AS x FROM mytable
EXEC sp_dbcmptlevel 'YOUR_DB_NAME', 90;
This should fix the problem with Remus's otherwise incompatible code. I just looked into my own DB and it was set to level 80, which means it supports <= SQL Server 2000. After applying the statement above, the code runs and works perfectly.
Now I just need to find out what relies on SQL 2000 and breaks in SQL 2005... AHH!
This MSDN link will help you determine whether your fn/usp/app layers will be negatively impacted: http://msdn.microsoft.com/en-us/library/bb510680.aspx
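As a side note, on SQL Server 2008 and later the supported way to check and change the compatibility level is the sys.databases catalog view and ALTER DATABASE (sp_dbcmptlevel is deprecated); a minimal sketch, with the database name as a placeholder:
-- Check the current compatibility level
SELECT name, compatibility_level
FROM sys.databases
WHERE name = 'YOUR_DB_NAME';

-- Raise it to the SQL Server 2005 level (90)
ALTER DATABASE [YOUR_DB_NAME] SET COMPATIBILITY_LEVEL = 90;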
Try
select * from dbo.split((select top 1 myfield from mytable),' ')
put the UDF around your column, example
SELECT dbo.split(myfield, ' ') as SplitValue
FROM mytable

Define variable to use with IN operator (T-SQL)

I have a Transact-SQL query that uses the IN operator. Something like this:
select * from myTable where myColumn in (1,2,3,4)
Is there a way to define a variable to hold the entire list "(1,2,3,4)"? How should I define it?
declare @myList {data type}
set @myList = (1,2,3,4)
select * from myTable where myColumn in @myList
DECLARE @MyList TABLE (Value INT)
INSERT INTO @MyList VALUES (1)
INSERT INTO @MyList VALUES (2)
INSERT INTO @MyList VALUES (3)
INSERT INTO @MyList VALUES (4)

SELECT *
FROM MyTable
WHERE MyColumn IN (SELECT Value FROM @MyList)
DECLARE @mylist TABLE (Id int)
INSERT INTO @mylist
SELECT id FROM (VALUES (1),(2),(3),(4),(5)) AS tbl(id)

SELECT * FROM Mytable WHERE theColumn IN (select id from @mylist)
There are a few ways to tackle dynamic CSV lists for TSQL queries:
1) Using an inner select
SELECT * FROM myTable WHERE myColumn in (SELECT id FROM myIdTable WHERE id > 10)
2) Using dynamically concatenated TSQL
DECLARE @sql nvarchar(max)
declare @list varchar(256)
select @list = '1,2,3'
SELECT @sql = 'SELECT * FROM myTable WHERE myColumn in (' + @list + ')'
exec sp_executesql @sql
3) A possible third option is table variables. If you have SQL Server 2005 you can use a table variable. If you're on SQL Server 2008 you can even pass whole table variables in as a parameter to stored procedures and use them in a join or as a subselect in the IN clause.
DECLARE @list TABLE (Id INT)

INSERT INTO @list(Id)
SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4

SELECT
    *
FROM
    myTable
    JOIN @list l ON myTable.myColumn = l.Id

SELECT
    *
FROM
    myTable
WHERE
    myColumn IN (SELECT Id FROM @list)
Use a function like this:
CREATE function [dbo].[list_to_table] (@list varchar(4000))
returns @tab table (item varchar(100))
begin
    if CHARINDEX(',',@list) = 0 or CHARINDEX(',',@list) is null
    begin
        insert into @tab (item) values (@list);
        return;
    end

    declare @c_pos int;
    declare @n_pos int;
    declare @l_pos int;

    set @c_pos = 0;
    set @n_pos = CHARINDEX(',',@list,@c_pos);

    while @n_pos > 0
    begin
        insert into @tab (item) values (SUBSTRING(@list,@c_pos+1,@n_pos - @c_pos-1));
        set @c_pos = @n_pos;
        set @l_pos = @n_pos;
        set @n_pos = CHARINDEX(',',@list,@c_pos+1);
    end;

    insert into @tab (item) values (SUBSTRING(@list,@l_pos+1,4000));

    return;
end;
Instead of using IN, you make an inner join with the table returned by the function:
select * from table_1 where id in ('a','b','c')
becomes
select * from table_1 a inner join [dbo].[list_to_table] ('a,b,c') b on (a.id = b.item)
In an unindexed 1M record table the second version took about half the time...
I know this is old now, but with TSQL in SQL Server 2016 and later you can use STRING_SPLIT:
DECLARE @InList varchar(255) = 'This;Is;My;List';

WITH InList (Item) AS (
    SELECT value FROM STRING_SPLIT(@InList, ';')
)
SELECT *
FROM [Table]
WHERE [Item] IN (SELECT Item FROM InList)
Starting with SQL Server 2016 you can use STRING_SPLIT and do this:
declare @myList nvarchar(MAX)
set @myList = '1,2,3,4'
select * from myTable where myColumn in (select value from STRING_SPLIT(@myList,','))
DECLARE @myList TABLE (Id BIGINT);
INSERT INTO @myList(Id) VALUES (1),(2),(3),(4);
select * from myTable where myColumn in (select Id from @myList)
Please note that for long lists or production systems this approach is not recommended, as it may be much slower than a simple IN operator like someColumnName in (1,2,3,4) (tested with an 8000+ item list).
A slight improvement on @LukeH's answer (there is no need to repeat the INSERT INTO) and @realPT's answer (no need to have the SELECT):
DECLARE @MyList TABLE (Value INT)
INSERT INTO @MyList VALUES (1),(2),(3),(4)

SELECT * FROM MyTable
WHERE MyColumn IN (SELECT Value FROM @MyList)
No, there is no such type. But there are some choices:
Dynamically generated queries (sp_executesql)
Temporary tables
Table-type variables (closest thing that there is to a list)
Create an XML string and then convert it to a table with the XML functions (really awkward and roundabout, unless you have an XML to start with)
None of these are really elegant, but that's the best there is; a quick sketch of the temporary-table option follows below.
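A minimal sketch of that temporary-table option, reusing the question's table and column names (the values are illustrative):
CREATE TABLE #myList (Id int PRIMARY KEY);
INSERT INTO #myList (Id) VALUES (1), (2), (3), (4);

SELECT *
FROM myTable
WHERE myColumn IN (SELECT Id FROM #myList);

DROP TABLE #myList;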
If you want to do this without using a second table, you can do a LIKE comparison with a CAST:
DECLARE @myList varchar(15)
SET @myList = ',1,2,3,4,'

SELECT *
FROM myTable
WHERE @myList LIKE '%,' + CAST(myColumn AS varchar(15)) + ',%'
If the field you're comparing is already a string then you won't need to CAST.
Surrounding both the column match and each unique value in commas will ensure an exact match. Otherwise, a value of 1 would be found in a list containing ',4,2,15,'
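To illustrate that last point (a throwaway example, not from the original answer): without the surrounding commas, a search for 1 would match the 15 in the list:
DECLARE @myList varchar(15) = ',4,2,15,';

-- Naive check: matches, because '15' contains '1'
SELECT CASE WHEN @myList LIKE '%' + '1' + '%' THEN 'match' ELSE 'no match' END AS naive_check;

-- Comma-wrapped check: no match, because ',1,' does not appear in the list
SELECT CASE WHEN @myList LIKE '%,' + '1' + ',%' THEN 'match' ELSE 'no match' END AS exact_check;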
As no one has mentioned it before, starting with SQL Server 2016 you can also use JSON arrays and OPENJSON (Transact-SQL):
declare @filter nvarchar(max) = '[1,2]'

select *
from dbo.Test as t
where
    exists (select * from openjson(@filter) as tt where tt.[value] = t.id)
You can test it in a SQL Fiddle demo.
You can also cover more complicated cases with json easier - see Search list of values and range in SQL using WHERE IN clause with SQL variable?
This one uses PATINDEX to match ids from a table to a non-digit delimited integer list.
-- Given a string @myList containing character delimited integers
-- (supports any non digit delimiter)
DECLARE @myList VARCHAR(MAX) = '1,2,3,4,42'

SELECT * FROM [MyTable]
WHERE
    -- When the Id is at the leftmost position
    -- (nothing to its left and anything to its right after a non digit char)
    PATINDEX(CAST([Id] AS VARCHAR)+'[^0-9]%', @myList)>0
    OR
    -- When the Id is at the rightmost position
    -- (anything to its left before a non digit char and nothing to its right)
    PATINDEX('%[^0-9]'+CAST([Id] AS VARCHAR), @myList)>0
    OR
    -- When the Id is between two delimiters
    -- (anything to its left and right after two non digit chars)
    PATINDEX('%[^0-9]'+CAST([Id] AS VARCHAR)+'[^0-9]%', @myList)>0
    OR
    -- When the Id is equal to the list
    -- (if there is only one Id in the list)
    CAST([Id] AS VARCHAR)=@myList
Notes:
when casting as varchar and not specifying byte size in parentheses the default length is 30
% (wildcard) will match any string of zero or more characters
^ (wildcard) not to match
[^0-9] will match any non digit character
PATINDEX is a T-SQL function that returns the position of a pattern in a string (a quick check of the pattern logic follows below)
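A quick illustrative check of the pattern logic (the sample list is made up):
DECLARE @myList VARCHAR(MAX) = '1,2,3,4,42';

-- 42 is at the rightmost position: preceded by a non digit character
SELECT PATINDEX('%[^0-9]' + '42', @myList) AS rightmost_match;                 -- greater than zero: found

-- 4 only matches when wrapped in delimiters, so it does not match the 4 in 42
SELECT PATINDEX('%[^0-9]' + '4' + '[^0-9]%', @myList) AS between_delimiters;   -- greater than zero: found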
DECLARE @StatusList varchar(MAX);
SET @StatusList = '1,2,3,4';

DECLARE @Status SYS_INTEGERS;
INSERT INTO @Status
SELECT Value
FROM dbo.SYS_SPLITTOINTEGERS_FN(@StatusList, ',');

SELECT Value From @Status;
Most of these seem to focus on separating-out each INT into its own parenthetical, for example:
(1),(2),(3), and so on...
That isn't always convenient. Especially since, many times, you already start with a comma-separated list, for example:
(1,2,3,...) and so on...
In these situations, you may care to do something more like this:
DECLARE @ListOfIds TABLE (DocumentId INT);

INSERT INTO @ListOfIds
SELECT Id FROM [dbo].[Document] WHERE Id IN (206,235,255,257,267,365)

SELECT * FROM @ListOfIds
I like this method because, more often than not, I am trying to work with IDs that should already exist in a table.
My experience with a commonly proposed technique offered here,
SELECT * FROM Mytable WHERE myColumn IN (select id from @mylist)
is that it induces a major performance degradation if the primary data table (Mytable) includes a very large number of records. Presumably, that is because the IN operator’s list-subquery is re-executed for every record in the data table.
I’m not seeing any offered solution here that provides the same functional result by avoiding the IN operator entirely. The general problem isn’t a need for a parameterized IN operation, it’s a need for a parameterized inclusion constraint. My favored technique for that is to implement it using an (inner) join:
DECLARE @myList varchar(50) /* BEWARE: if too small, no error, just missing data! */
SET @myList = '1,2,3,4'

SELECT *
FROM myTable
JOIN STRING_SPLIT(@myList,',') MyList_Tbl
ON myColumn = MyList_Tbl.Value
It is so much faster because the generation of the constraint-list table (MyList_Tbl) is executed only once for the entire query execution. Typically, for large data sets, this technique executes at least five times faster than the functionally equivalent parameterized IN operator solutions, like those offered here.
I think you'll have to declare a string and then execute that SQL string.
Have a look at sp_executeSQL
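For reference, a minimal sketch of that dynamic-SQL route, echoing the question's table and column names (note that concatenating a raw list into SQL is open to injection, so only do this with trusted input):
DECLARE @myList nvarchar(100) = N'1,2,3,4';
DECLARE @sql nvarchar(max) =
    N'select * from myTable where myColumn in (' + @myList + N')';

EXEC sp_executesql @sql;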