Using data within an extracted text string for an inner join - sql

Newbie so please be easy on me.
I have a select statement which returns a text string:
select [column name] from [table] where [column name] like '%dog%'
Simple enough; it returns results much like this:
random text dog '123' more random text
random text dog '345' more random text
random text dog '723' more random text ...
I am looking to extract the 123, 345, 723 part of the text string, which I can sort of do using:
Declare @Text Varchar(100);
Set @Text = 'random text dog ''123'' more random text';
Select Left(Substring(@Text, Patindex('%''%', @Text) + 1, Len(@Text) - Patindex('%''%', @Text))
,Patindex('%''%', Substring(@Text, Patindex('%''%', @Text) + 1, Len(@Text) - Patindex('%''%', @Text)))- 1) 'Lookup Index'
I am then wanting to use the result as part of an inner join to another table, to return a result along the lines of:
Lookup Index Colour
123 Blue
345 Green
723 Orange
I just can't seem to tie it all together, so any help is much appreciated.
Thanking you all in advance.

Here's an option for you.
Load the parsed data into a temp table, then join that back to your lookup table. It would be easier and would probably perform better.
Have a look at this. There's an example with some test data all using temp tables. You'd obviously need to adjust for your specific tables and requirements:
--This will be our table that has the data you want to parse
CREATE TABLE #TestData (
TextColumn nvarchar(1000)
)
--lookup table for colours
CREATE TABLE #LookUp(
LookUpIndex INT
,Colour NVARCHAR(100)
)
--Use a temp table to load the parsed data from #TestData
CREATE TABLE #TestDataParse (
LookUpIndex INT
,TextColumn NVARCHAR(1000)
)
--Load our test data
INSERT INTO [#TestData] ([TextColumn])
VALUES('random text dog ''123'' more random text')
,('random text dog ''345'' more random text')
,('random text dog ''723'' more random text')
--populate our lookup table
INSERT INTO [#LookUp] (
[LookUpIndex]
, [Colour]
)
VALUES(123, 'Blue')
,(345, 'Green')
,(723 , 'Orange')
--Now parse the LookUp number out and load that to our temp table
INSERT INTO [#TestDataParse] (
[LookUpIndex]
, [TextColumn]
)
SELECT
Left(Substring(TextColumn, Patindex('%''%', TextColumn) + 1, Len(TextColumn) - Patindex('%''%', TextColumn))
,Patindex('%''%', Substring(TextColumn, Patindex('%''%', TextColumn) + 1, Len(TextColumn) - Patindex('%''%', TextColumn)))- 1) AS LookUpIndex
,[TextColumn]
FROM [#TestData]
--Now we can join that back to the lookup table
SELECT *
FROM [#TestDataParse] [a]
INNER JOIN [#LookUp] [b]
ON [b].[LookUpIndex] = [a].[LookUpIndex];
--Example done
--Drop our #temp tables
DROP TABLE [#LookUp]
DROP TABLE [#TestData]
DROP TABLE [#TestDataParse]
Or you could skip loading into a temp table and use a subquery to simplify the join back to your lookup table, something like:
SELECT * FROM (
SELECT
Left(Substring(TextColumn, Patindex('%''%', TextColumn) + 1, Len(TextColumn) - Patindex('%''%', TextColumn))
,Patindex('%''%', Substring(TextColumn, Patindex('%''%', TextColumn) + 1, Len(TextColumn) - Patindex('%''%', TextColumn)))- 1) AS LookUpIndex
,[TextColumn]
FROM [#TestData] a
) AS tb
INNER JOIN [#LookUp] b ON [b].[LookUpIndex] = [tb].[LookUpIndex]
I'd test each one and see which performs the best for you.

This is a pain but you can do:
select t.col, left(v.dogv, charindex('''', v.dogv) - 1)
from (values ('random text dog ''123'' more random text')) t(col) cross apply
(values (stuff(col, 1, charindex('dog', col) + 4, ''))
) v(dogv)
where col like '%dog%';
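Applied to your actual table and joined back to the lookup table, it might look something like the sketch below ([YourTable], [TextColumn] and [LookupTable] are placeholders for your names):
-- Sketch only: placeholder table and column names; the extracted text is
-- implicitly converted to match the lookup table's numeric key
select l.LookUpIndex, l.Colour
from [YourTable] t cross apply
     (values (stuff(t.[TextColumn], 1, charindex('dog', t.[TextColumn]) + 4, ''))) v(dogv)
     inner join [LookupTable] l
        on l.LookUpIndex = left(v.dogv, charindex('''', v.dogv) - 1)
where t.[TextColumn] like '%dog%';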

Related

Split string and display below other column data using SQL Server [duplicate]

I have a table that looks like this:
ProductId, Color
"1", "red, blue, green"
"2", null
"3", "purple, green"
And I want to expand it to this:
ProductId, Color
1, red
1, blue
1, green
2, null
3, purple
3, green
What's the easiest way to accomplish this? Is it possible without a loop in a proc?
Take a look at this function. I've done similar tricks to split and transpose data in Oracle. Loop over the data, inserting the decoded values into a temp table. The convenient thing is that MS will let you do this on the fly, while Oracle requires an explicit temp table.
MS SQL Split Function
Better Split Function
Edit by author:
This worked great. Final code looked like this (after creating the split function):
select p.productid, colortable.items as color
from product p
cross apply split(p.color, ',') as colortable
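The linked split function itself isn't reproduced above. As a rough sketch of what a compatible one could look like (the items column name follows the usage shown; the body here is an assumption, not the linked article's code):
-- Hedged sketch of a split() table-valued function matching the usage above
CREATE FUNCTION dbo.split (@list VARCHAR(8000), @delim CHAR(1))
RETURNS @result TABLE (items VARCHAR(8000))
AS
BEGIN
    DECLARE @pos INT
    SET @pos = CHARINDEX(@delim, @list)
    WHILE @pos > 0
    BEGIN
        INSERT INTO @result (items) VALUES (LTRIM(RTRIM(LEFT(@list, @pos - 1))))
        SET @list = SUBSTRING(@list, @pos + 1, LEN(@list))
        SET @pos = CHARINDEX(@delim, @list)
    END
    INSERT INTO @result (items) VALUES (LTRIM(RTRIM(@list)))
    RETURN
END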
based on your tables:
create table test_table
(
ProductId int
,Color varchar(100)
)
insert into test_table values (1, 'red, blue, green')
insert into test_table values (2, null)
insert into test_table values (3, 'purple, green')
create a new table like this:
CREATE TABLE Numbers
(
Number int not null primary key
)
that has rows containing values 1 to 8000 or so.
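One quick way to populate it, if you don't already have a numbers table handy (any method of generating 1 to 8000 works):
-- One possible way to fill Numbers with 1..8000
INSERT INTO Numbers (Number)
SELECT TOP (8000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
FROM sys.all_objects a CROSS JOIN sys.all_objects b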
this will return what you want:
EDIT
Here is a much better query, slightly modified from the great answer from @Christopher Klein:
I added the LTRIM() so that spaces in the color list ("red, blue, green") are handled properly; his solution requires no spaces ("red,blue,green"). Also, I prefer to use my own Numbers table and not use master.dbo.spt_values, which allows the removal of one derived table too.
SELECT
ProductId, LEFT(PartialColor, CHARINDEX(',', PartialColor + ',')-1) as SplitColor
FROM (SELECT
t.ProductId, LTRIM(SUBSTRING(t.Color, n.Number, 200)) AS PartialColor
FROM test_table t
LEFT OUTER JOIN Numbers n ON n.Number<=LEN(t.Color) AND SUBSTRING(',' + t.Color, n.Number, 1) = ','
) t
EDIT END
SELECT
ProductId, Color --,number
FROM (SELECT
ProductId
,CASE
WHEN LEN(List2)>0 THEN LTRIM(RTRIM(SUBSTRING(List2, number+1, CHARINDEX(',', List2, number+1)-number - 1)))
ELSE NULL
END AS Color
,Number
FROM (
SELECT ProductId,',' + Color + ',' AS List2
FROM test_table
) AS dt
LEFT OUTER JOIN Numbers n ON (n.Number < LEN(dt.List2)) OR (n.Number=1 AND dt.List2 IS NULL)
WHERE SUBSTRING(List2, number, 1) = ',' OR List2 IS NULL
) dt2
ORDER BY ProductId, Number, Color
here is my result set:
ProductId Color
----------- --------------
1 red
1 blue
1 green
2 NULL
3 purple
3 green
(6 row(s) affected)
which is the same order you want...
You can try this out; it doesn't require any additional functions:
declare @t table (col1 varchar(10), col2 varchar(200))
insert @t
select '1', 'red,blue,green'
union all select '2', NULL
union all select '3', 'green,purple'
select col1, left(d, charindex(',', d + ',')-1) as e from (
select *, substring(col2, number, 200) as d from @t col1 left join
(select distinct number from master.dbo.spt_values where number between 1 and 200) col2
on substring(',' + col2, number, 1) = ',') t
I arrived at this question 10 years after the post.
SQL Server 2016 added the STRING_SPLIT function.
Using that, this can be written as below.
declare @product table
(
ProductId int,
Color varchar(max)
);
insert into @product values (1, 'red, blue, green');
insert into @product values (2, null);
insert into @product values (3, 'purple, green');
select
p.ProductId as ProductId,
ltrim(split_table.value) as Color
from @product p
outer apply string_split(p.Color, ',') as split_table;
Fix your database if at all possible. Comma delimited lists in database cells indicate a flawed schema 99% of the time or more.
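For reference, a normalized version of the example above would store one color per row; a sketch (table and constraint names are illustrative):
-- Illustrative normalized schema: one color per row instead of a CSV column
CREATE TABLE Product (
    ProductId INT NOT NULL PRIMARY KEY
)
CREATE TABLE ProductColor (
    ProductId INT NOT NULL REFERENCES Product (ProductId),
    Color VARCHAR(50) NOT NULL,
    PRIMARY KEY (ProductId, Color)
)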
I would create a CLR table-valued function for this:
http://msdn.microsoft.com/en-us/library/ms254508(VS.80).aspx
The reason for this is that CLR code is going to be much better at parsing apart the strings (computational work) and can pass that information back as a set, which is what SQL Server is really good at (set management).
The CLR function would return a series of records based on the parsed values (and the input id value).
You would then use a CROSS APPLY on each element in your table.
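Usage would look roughly like this (dbo.SplitCLR and its Value column are hypothetical names for the registered CLR function, not an existing API):
-- Hypothetical usage: dbo.SplitCLR is a placeholder name for the registered CLR TVF
SELECT t.ProductId, s.Value AS Color
FROM test_table t
CROSS APPLY dbo.SplitCLR(t.Color, ',') AS s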
Just convert your columns into xml and query it. Here's an example.
select
a.value('.', 'varchar(42)') c
from (select cast('<r><a>' + replace(@CSV, ',', '</a><a>') + '</a></r>' as xml) x) t1
cross apply x.nodes('//r/a') t2(a)
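Applied per row of the test_table above instead of a single @CSV variable, that might look like this sketch (rows with a NULL Color are filtered out up front to keep it simple):
-- Sketch: the same XML trick applied row by row
select t.ProductId, ltrim(a.value('.', 'varchar(42)')) as Color
from (select ProductId, Color from test_table where Color is not null) t
cross apply (select cast('<r><a>' + replace(t.Color, ',', '</a><a>') + '</a></r>' as xml) x) c
cross apply c.x.nodes('//r/a') n(a)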
Why not use dynamic SQL for this purpose, something like this (adapt to your needs):
DECLARE @dynSQL VARCHAR(max)
SET @dynSQL = 'insert into DestinationTable(field) values'
select @dynSQL = @dynSQL + '(''' + REPLACE(Color,',',''',''') + '''),' from [Table]
SET @dynSql = LEFT(@dynSql,LEN(@dynSql) -1) -- delete the last comma
exec (@dynSql)
One advantage is that you can use it on any SQL Server version

T-SQL LIKE condition on comma-separated list

Is it possible to write a LIKE condition in T-SQL to match a comma-separated list which includes wildcards against a string? Let me explain further with an example:
Say you have the following comma-separated list of URLs in a field:
'/, /news/%, /about/'
Now here's some examples of strings I'd like to match with the string above:
'/'
'/news/'
'/news/2/'
'/about/'
And here's some strings which would not match:
'/contact/'
'/about/me/'
I've achieved this in the past by writing a split function and then doing a like on each one. However I'm trying to get my query to work in SQL Server CE which doesn't support functions.
In case you are wondering here's how I achieved it using the split function:
SELECT Widgets.Id
FROM Widgets
WHERE (SELECT COUNT(*) FROM [dbo].[Split](Urls, ',') WHERE @Input LIKE Data) > 0
And here's the split function:
CREATE FUNCTION [dbo].[Split]
(
@RowData NVARCHAR(MAX),
@Separator NVARCHAR(MAX)
)
RETURNS @RtnValue TABLE
(
[Id] INT IDENTITY(1,1),
[Data] NVARCHAR(MAX)
)
AS
BEGIN
DECLARE @Iterator INT
SET @Iterator = 1
DECLARE @FoundIndex INT
SET @FoundIndex = CHARINDEX(@Separator, @RowData)
WHILE (@FoundIndex > 0)
BEGIN
INSERT INTO @RtnValue ([Data])
SELECT Data = LTRIM(RTRIM(SUBSTRING(@RowData, 1, @FoundIndex - 1)))
SET @RowData = SUBSTRING(@RowData, @FoundIndex + DATALENGTH(@Separator) / 2, LEN(@RowData))
SET @Iterator = @Iterator + 1
SET @FoundIndex = CHARINDEX(@Separator, @RowData)
END
INSERT INTO @RtnValue ([Data])
SELECT Data = LTRIM(RTRIM(@RowData))
RETURN
END
I'd appreciate it if someone could help. Thanks
I can think of several options:
Use a session-keyed table: delete rows matching the current spid, insert desired rows with the current spid, read from the table in the SP, delete from the table (again). A sketch follows after this list.
Make your client submit a query with many OR ... LIKE ... clauses.
Write an SP that does the same thing as your function and returns a recordset. INSERT YourTable EXEC SP @Strings and you are done!
Use the numbers-table-charindex-into-string inside of a derived table method of splitting the string.
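For option 1, a rough sketch of the idea (SplitValues and its columns are made-up names, and on CE, where @@SPID isn't available, a client-supplied session key would stand in for it):
-- Illustrative only: a shared table holding split values, keyed by the caller's spid
CREATE TABLE SplitValues (Spid INT NOT NULL, Data NVARCHAR(4000) NOT NULL)

DELETE SplitValues WHERE Spid = @@SPID -- clear any leftovers
INSERT SplitValues (Spid, Data) VALUES (@@SPID, '/') -- one insert per parsed item
INSERT SplitValues (Spid, Data) VALUES (@@SPID, '/news/%')

SELECT COUNT(*) FROM SplitValues WHERE Spid = @@SPID AND @Input LIKE Data

DELETE SplitValues WHERE Spid = @@SPID -- clean up when done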
Example
Let me flesh this out a little for you with an example combining ideas #3 and #4. Of course, your code for your function could be adapted, too.
Build a separate Numbers table. Here is example creation script:
--Numbers Table with 8192 elements (keeping it small for CE)
CREATE TABLE Numbers (
N smallint NOT NULL CONSTRAINT PK_Numbers PRIMARY KEY CLUSTERED
);
INSERT Numbers VALUES (1);
WHILE @@ROWCOUNT < 4096
INSERT Numbers SELECT N + (SELECT Max(N) FROM Numbers) FROM Numbers;
The SP:
CREATE PROCEDURE dbo.StringSplitRowset
@String varchar(8000)
AS
SELECT Substring(@String, l.StartPos, l.Chars) Item
FROM (
SELECT
S.StartPos,
IsNull(NullIf(CharIndex(',', @String, S.StartPos), 0) - S.StartPos, 8000)
FROM (
SELECT 1 UNION ALL
SELECT N.N + 1 FROM Numbers N WHERE Substring(@String, N.N, 1) = ','
) S (StartPos)
) L (StartPos, Chars);
And usage, easy as pie:
DECLARE @String varchar(8000);
SET @String = 'abc,def,ghi,jkl';
CREATE TABLE #Split (S varchar(8000));
INSERT #Split EXEC dbo.StringSplitRowset @String;
SELECT * FROM #Split;
Result:
abc
def
ghi
jkl
And finally, if you don't want to build a numbers table, you can use this SP. I think you will find that one of these two SPs performs well enough for you. There are other implementations of string splitting that could work as well.
ALTER PROCEDURE dbo.StringSplitRowset
@String varchar(8000)
AS
SELECT Substring(@String, l.StartPos, l.Chars) Item
FROM (
SELECT
S.StartPos,
IsNull(NullIf(CharIndex(',', @String, S.StartPos), 0) - S.StartPos, 8000)
FROM (
SELECT 1 UNION ALL
SELECT N.N + 1
FROM (
SELECT A.A * 4096 + B.B * 1024 + C.C * 256 + D.D * 64 + E.E * 16 + F.F * 4 + G.G N
FROM
(SELECT 0 UNION ALL SELECT 1) A (A),
(SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3 UNION ALL SELECT 4) G (G),
(SELECT 0 UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3) F (F),
(SELECT 0 UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3) E (E),
(SELECT 0 UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3) D (D),
(SELECT 0 UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3) C (C),
(SELECT 0 UNION ALL SELECT 1 UNION ALL SELECT 2 UNION ALL SELECT 3) B (B)
) N (N)
WHERE Substring(@String, N.N, 1) = ','
) S (StartPos)
) L (StartPos, Chars)
Any SQL writer serious about understanding some of the performance implications of splitting strings in different ways ought to see Aaron Bertrand's blog post on splitting strings.
Also, any serious SQL Server database student ought to see Erland Sommarskog's How to Share Data between Stored Procedures.
Will SQL Server CE let you split with XML functions and use CROSS APPLY? If so, you could do something like this:
SELECT DISTINCT T1.id
FROM (
SELECT id, CAST(('<X>'+
REPLACE(REPLACE(urls,' ',''),',','</X><X>')+
'</X>'
) AS xml
) as URLsXML
FROM dbo.Widgets
) AS T1
CROSS APPLY(
SELECT N.value('.', 'varchar(50)') AS URLPattern
FROM URLsXML.nodes('X') AS S(N)
) AS T2
WHERE @Input LIKE T2.URLPattern
UPDATE: I just checked. It looks like SQL Server CE doesn't support the XML data type or CROSS APPLY.
I think you will have to populate another table with ID's and Patterns to join against.
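Roughly sketched (WidgetUrlPatterns and its columns are hypothetical; its rows would be the Urls field split out ahead of time):
-- Hypothetical pre-split pattern table: one row per widget URL pattern
CREATE TABLE WidgetUrlPatterns (WidgetId INT NOT NULL, UrlPattern NVARCHAR(256) NOT NULL)

SELECT DISTINCT w.Id
FROM Widgets w
INNER JOIN WidgetUrlPatterns p ON p.WidgetId = w.Id
WHERE @Input LIKE p.UrlPattern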

SQL Server query with multiple values in one column relating to another column

Situation: This table holds the relation information between a Documents table and a Users table. Certain users need to review or approve documents (Type). I would like to get all of the reviewers on one line if needed. So if three users review document 1, then a row would have 346, 394, 519 as the value, since those are the reviewers.
Table: xDocumentsUsers
DocID..UserID....Type...
1........386......approver
1........346......reviewer
1........394......reviewer..
1........519......reviewer..
4........408......reviewer..
5........408......reviewer..
6........408......reviewer..
7........386......approver..
7........111......readdone..
7........346......reviewer..
8........386......approver..
8........346......reviewer..
9........386......approver..
9........346......reviewer..
10.......386......approver..
11.......386......approver..
11......346......reviewer..
12......386......approver..
12......346......reviewer..
13......386......approver..
13......346......reviewer..
14......386......approver..
14......346......reviewer..
15......386......approver
So desired result would be...
DocID..UserID................Type...
1........386....................approver
1........346,394,519......reviewer.
4........408....................reviewer..
5........408....................reviewer..
6........408....................reviewer..
7........386....................approver..
7........111....................readdone..
7........346....................reviewer..
8........386....................approver..
8........346....................reviewer..
9........386....................approver..
9........346....................reviewer..
10......386....................approver..
11......386....................approver..
11......346....................reviewer..
12......386....................approver..
12......346....................reviewer..
13......386....................approver..
13......346....................reviewer..
14......386....................approver..
14......346....................reviewer..
15......386....................approver
The FOR XML PATH is a great solution. You need to be aware, though, that it will convert any special characters in the inner SELECT's result set into their XML equivalent - i.e., & will become &amp; in the XML result set. You can easily revert back to the original character by using the REPLACE function around the inner result set. To borrow from astander's previous example, it would look like this (note that the SELECT used as the 1st argument to the REPLACE function is enclosed in parentheses):
--Concat
SELECT t.ID,
REPLACE((SELECT tIn.Val + ','
FROM @Table tIn
WHERE tIn.ID = t.ID
FOR XML PATH('')), '&amp;', '&')
FROM @Table t
GROUP BY t.ID
Have a look at
Emulating MySQL’s GROUP_CONCAT() Function in SQL Server 2005
Is there a way to create a SQL Server function to “join” multiple rows from a subquery into a single delimited field?
A simple example is
DECLARE @Table TABLE(
ID INT,
Val VARCHAR(50)
)
INSERT INTO @Table (ID,Val) SELECT 1, 'A'
INSERT INTO @Table (ID,Val) SELECT 1, 'B'
INSERT INTO @Table (ID,Val) SELECT 1, 'C'
INSERT INTO @Table (ID,Val) SELECT 2, 'B'
INSERT INTO @Table (ID,Val) SELECT 2, 'C'
--Concat
SELECT t.ID,
(
SELECT tIn.Val + ','
FROM @Table tIn
WHERE tIn.ID = t.ID
FOR XML PATH('')
)
FROM @Table t
GROUP BY t.ID
Does this help?
SELECT DISTINCT DocID
, [Type]
, (SELECT CAST(UserID AS VARCHAR(MAX)) + ', '
FROM [xDocumentsUsers]
WHERE (DocID = x1.DocID)
AND ([Type] = x1.[Type])
FOR XML PATH ('')
) AS [UserIDs]
FROM [xDocumentsUsers] AS x1

SQL: break up one row into many (normalization)

I am in the middle of upgrading from a poorly designed legacy database to a new database. In the old database there is tableA with fields Id and Commodities. Id is the primary key and contains an int, and Commodities contains a comma-delimited list.
TableA:
id | commodities
1135 | fish,eggs,meat
1127 | flour,oil
In the new database, I want tableB to be in the form id, commodity where each commodity is a single item from the comma delimited list in tableA.
TableB:
id | commodity
1135 | fish
1135 | eggs
1135 | meat
1127 | flour
1127 | oil
I have a function, functionA, that when given an id, a list, and a delimiter, returns a table with an id and item field. How can I use this function to turn the two fields from tableA into tableB?
(Note: I had trouble figuring out what to title this question. Please feel free to edit the title to make it more accurately reflect the question!)
Here is the function code:
ALTER FUNCTION dbo.functionA
(
@id int,
@List VARCHAR(6000),
@Delim varchar(5)
)
RETURNS
@ParsedList TABLE
(
id int,
item VARCHAR(6000)
)
AS
BEGIN
DECLARE @item VARCHAR(6000), @Pos INT
SET @List = LTRIM(RTRIM(@List))+ @Delim
SET @Pos = CHARINDEX(@Delim, @List, 1)
WHILE @Pos > 0
BEGIN
SET @item = LTRIM(RTRIM(LEFT(@List, @Pos - 1)))
IF @item <> ''
BEGIN
INSERT INTO @ParsedList (id, item)
VALUES (@id, CAST(@item AS VARCHAR(6000)))
END
SET @List = RIGHT(@List, LEN(@List) - @Pos)
SET @Pos = CHARINDEX(@Delim, @List, 1)
END
RETURN
END
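For reference, a function with that signature can be applied row by row with CROSS APPLY (SQL Server 2005 and later); a minimal sketch, assuming TableB's columns are id and commodity as shown above:
-- Sketch: call functionA once per TableA row and insert the split rows into TableB
INSERT INTO TableB (id, commodity)
SELECT f.id, f.item
FROM TableA a
CROSS APPLY dbo.functionA(a.id, a.commodities, ',') AS f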
Here's the link I posted as a comment:
http://www.sqlteam.com/article/parsing-csv-values-into-multiple-rows
You need a way to split and process the string in TSQL, there are many ways to do this. This article covers the PROs and CONs of just about every method:
Arrays and Lists in SQL Server 2000 and Earlier
You need to create a split function. This is how a split function can be used:
SELECT
*
FROM YourTable y
INNER JOIN dbo.yourSplitFunction(@Parameter) s ON y.ID=s.Value
I prefer the number table approach to split a string in TSQL (see the Arrays and Lists in SQL Server 2000 and Earlier link above), but there are numerous ways to split strings in SQL Server; see the previous link, which explains the PROs and CONs of each.
For the Numbers Table method to work, you need to do this one time table setup, which will create a table Numbers that contains rows from 1 to 10,000:
SELECT TOP 10000 IDENTITY(int,1,1) AS Number
INTO Numbers
FROM sys.objects s1
CROSS JOIN sys.objects s2
ALTER TABLE Numbers ADD CONSTRAINT PK_Numbers PRIMARY KEY CLUSTERED (Number)
Once the Numbers table is set up, create this split function:
CREATE FUNCTION inline_split_me (@SplitOn char(1),@param varchar(7998)) RETURNS TABLE AS
RETURN(SELECT substring(@SplitOn + @param + ',', Number + 1,
charindex(@SplitOn, @SplitOn + @param + @SplitOn, Number + 1) - Number - 1)
AS Value
FROM Numbers
WHERE Number <= len(@SplitOn + @param + @SplitOn) - 1
AND substring(@SplitOn + @param + @SplitOn, Number, 1) = @SplitOn)
GO
You can now easily split a CSV string into a table and join on it:
select * from dbo.inline_split_me(';','1;22;333;4444;;') where LEN(Value)>0
OUTPUT:
Value
----------------------
1
22
333
4444
(4 row(s) affected)
To make your new table, use this:
--set up tables:
create table TableA (id int, commodities varchar(8000))
INSERT TableA VALUES (1135,'fish,eggs,meat')
INSERT TableA VALUES (1127,'flour,oil')
Create table TableB (id int, commodities varchar(8000))
--populate TableB
INSERT TableB
(id, commodities)
SELECT
a.id,c.value
FROM TableA a
CROSS APPLY dbo.inline_split_me(',',a.commodities) c
--show tableB contents:
select * from TableB
OUTPUT:
id commodities
----------- -------------
1135 fish
1135 eggs
1135 meat
1127 flour
1127 oil
(5 row(s) affected)
EDIT after Conrad Frix's comment about SQL Server 2000 not supporting CROSS APPLY:
this will do the same:
INSERT TableB
(id, commodities)
SELECT
a.id,NullIf(SubString(',' + a.commodities + ',' , number , CharIndex(',' , ',' + a.commodities + ',' , number) - number) , '')
FROM TableA a
INNER JOIN Numbers n ON 1=1
WHERE SubString(',' + a.commodities + ',' , number - 1, 1) = ','
AND CharIndex(',' , ',' + a.commodities + ',' , number) - number > 0
AND number <= Len(',' + a.commodities + ',')
and is based on the code from the link in the answer by @Rup. It basically removes the function call and does the split in the main query (using a similar Numbers table split), so no need for a CROSS APPLY.
You write a SQL batch that loops through table A and inserts into table b the results of your function call.
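A rough sketch of such a batch (a cursor is shown for illustration; it assumes the functionA and TableB column names from the question):
-- Sketch: loop over TableA, calling dbo.functionA for each row
DECLARE @id INT, @commodities VARCHAR(6000)
DECLARE c CURSOR FOR SELECT id, commodities FROM TableA
OPEN c
FETCH NEXT FROM c INTO @id, @commodities
WHILE @@FETCH_STATUS = 0
BEGIN
    INSERT INTO TableB (id, commodity)
    SELECT id, item FROM dbo.functionA(@id, @commodities, ',')
    FETCH NEXT FROM c INTO @id, @commodities
END
CLOSE c
DEALLOCATE c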
Call me lazy, but I'd pull the combined rows out of the database, split them, then reinsert the split rows. This kind of thing seems kind of unnatural for SQL...
SSIS has a pretty handy Unpivot transform if that's available to you.
create table Project (ProjectId int, Description varchar(50));
insert into Project values (1, 'Chase tail, change directions');
insert into Project values (2, 'ping-pong ball in clothes dryer');
create table ProjectResource (ProjectId int, ResourceId int, Name varchar(15));
insert into ProjectResource values (1, 1, 'Adam');
insert into ProjectResource values (1, 2, 'Kerry');
insert into ProjectResource values (1, 3, 'Tom');
insert into ProjectResource values (2, 4, 'David');
insert into ProjectResource values (2, 5, 'Jeff');
-- a bit of SQL magic involving XML and voila
SELECT *,
(SELECT Name + ' ' AS [text()]
FROM ProjectResource pr
WHERE pr.ProjectId = p.ProjectId
FOR XML PATH ('')) AS ResourceList
FROM Project p
The output of this will be :
ProjectId Description ResourceList
1 Chase tail, change directions Adam Kerry Tom
2 ping-pong ball in clothes dryer David Jeff
