Split column data and insert - SQL Server stored procedures

I have a table with a few hundred thousand rows. The data format is an index (int) and a words column (nvarchar(1000)). The words string is made up of a collection of words separated by a space, e.g. word1 word2 word3. I want to read the word table and create a dictionary. In terms of pseudo code this is what I want:
INSERT INTO dictionary (dictionaryword)
SELECT splitBySpace(words) FROM word;
This is simple enough to code in Java or C#, but I have found the system takes a long time to process the data. In other processing, the benefit of running SQL to handle the query (i.e. not processing the data in C# or Java) has been huge.
I want to create a stored procedure which reads the words, splits them, and then creates the dictionary. I have seen various split procedures which are a little complex, e.g. https://dba.stackexchange.com/questions/21078/t-sql-table-valued-function-to-split-a-column-on-commas, but I could not see how to re-code this for the task of reading the whole table, splitting the words, and inserting them.
Does anyone have any sample code to split the column data and then insert it, which can be wholly implemented in SQL for reasons of efficiency?

Here is the solution.
DDL:
create table sentence(t varchar(100))
insert into sentence values
('Once upon a time in America'),
('Eyes wide shut')
DML:
select distinct ca.d as words from sentence s
cross apply(select split.a.value('.', 'varchar(100)') as d
from
(select cast('<x>' + REPLACE(s.t, ' ', '</x><x>') + '</x>' as xml) as d) as a
cross apply d.nodes ('/x') as split(a)) ca
Output:
words
a
America
Eyes
in
Once
shut
time
upon
wide
Fiddle http://sqlfiddle.com/#!6/54dff/4
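If you are on SQL Server 2016 or later (database compatibility level 130+), the whole task from the question can also be done in one statement with the built-in STRING_SPLIT function; table and column names below are taken from the question, and the DISTINCT is optional:
INSERT INTO dictionary (dictionaryword)
SELECT DISTINCT s.value
FROM word w
CROSS APPLY STRING_SPLIT(w.words, ' ') AS s;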

I suggest you use a stored procedure like this:
CREATE PROCEDURE spSplit
    @words nvarchar(max),
    @delimiter varchar(1) = ' '
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @sql nvarchar(max)
    SELECT @sql = 'SELECT ''' + REPLACE(@words, @delimiter, ''' As res UNION ALL SELECT ''') + ''''
    --or, to remove duplicates: SELECT @sql = 'SELECT ''' + REPLACE(@words, @delimiter, ''' As res UNION SELECT ''') + ''''
    EXEC(@sql)
END
GO
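For example (the sample words are just placeholders), calling it returns one row per word in a column named res:
EXEC spSplit @words = N'word1 word2 word3', @delimiter = ' ';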
This stored procedure returns a result set that you can consume in an INSERT INTO ... EXEC statement. Alternatively, this version does the insert itself (note it reuses the name spSplit, so drop the first version or rename one of them):
CREATE PROCEDURE spSplit
    @words nvarchar(max) = 'a bc lkj weu 234 , sdsd 3 and 3 & test',
    @delimiter varchar(1) = ' ',
    @destTable nvarchar(255),
    @destColumn nvarchar(255)
AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @sql nvarchar(max)
    SELECT @sql = 'INSERT INTO [' + @destTable + '] ([' + @destColumn + ']) SELECT res FROM ('
    SELECT @sql = @sql + 'SELECT ''' + REPLACE(@words, @delimiter, ''' As res UNION ALL SELECT ''') + ''''
    SELECT @sql = @sql + ') DT WHERE res NOT IN (SELECT [' + @destColumn + '] FROM [' + @destTable + '])'
    EXEC(@sql)
END
GO
This stored procedure will do the insert without inserting duplicates.
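For example, to populate the dictionary table from the question (table and column names taken from the question):
EXEC spSplit @words      = N'word1 word2 word3',
             @delimiter  = ' ',
             @destTable  = N'dictionary',
             @destColumn = N'dictionaryword';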

Related

How can I extend the code to show a drop-down list of values from the selected column in an SSRS report

I'm new to SQL and I'm trying to create an SSRS report.
I found this code on the internet to create the SSRS report and it works well for me. However, I need to adjust the code so I can also get the values from the selected column.
USE [project]
GO
/****** Object: StoredProcedure [dbo].[Report] Script Date: 26-1-2020 01:19:45 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER procedure [dbo].[Report]
@SchemaName VARCHAR(128)='sys',
@TableName VARCHAR(128)='columns',
@ColumnList VARCHAR(MAX)='object_id,column_id,name,max_length,system_type_id'
AS
BEGIN
DECLARE @ColumnNames VARCHAR(MAX)
DECLARE @ColumnNamesVAR VARCHAR(MAX)
--drop ##Temp_Data Table
IF OBJECT_ID('tempdb..##Temp_Data') IS NOT NULL
DROP TABLE ##Temp_Data
--drop ##Temp_Data_Final Table
IF OBJECT_ID('tempdb..##Temp_Data_Final') IS NOT NULL
DROP TABLE ##Temp_Data_Final
--drop #Temp_Columns Table
IF OBJECT_ID('tempdb..#Temp_Columns') IS NOT NULL
DROP TABLE #Temp_Columns
Create table #ColumnList (Data NVARCHAR(MAX))
insert into #ColumnList values (@ColumnList)
--convert all column list to VARCHAR(1000) for unpivot
;with Cte_ColumnList as (
SELECT
'['+LTRIM(RTRIM(m.n.value('.[1]','varchar(8000)')))+']' AS ColumnList
FROM
(
SELECT CAST('<XMLRoot><RowData>' + REPLACE(Data,',','</RowData><RowData>')
+ '</RowData></XMLRoot>' AS XML) AS x
FROM #ColumnList
)t
CROSS APPLY x.nodes('/XMLRoot/RowData')m(n))
,CTE_ColumnListVarchar as
(Select 'CAST('+ColumnList+' as VARCHAR(1000)) AS '+ColumnList AS ColumnListVAR,ColumnList from Cte_ColumnList)
SELECT @ColumnNamesVAR = COALESCE(@ColumnNamesVAR + ', ', '') + ColumnListVAR,
@ColumnNames = COALESCE(@ColumnNames + ', ', '') + ColumnList
FROM CTE_ColumnListVarchar
--Insert data into ##Temp_Data Table
DECLARE @SQL NVARCHAR(MAX)
DECLARE @TempTbleSQL NVARCHAR(MAX)
SET @TempTbleSQL='Select ROW_NUMBER()
OVER (order by (Select 1)) AS R,'+@ColumnNames +' into ##Temp_Data from ['+@SchemaName+'].['+@TableName+']'
--Print @TempTbleSQL
EXEC(@TempTbleSQL)
SET @SQL='
select
R,columnname,value into ##Temp_Data_Final from
(select R,'+@ColumnNamesVAR+' from ##Temp_Data )u
unpivot
(value for columnname in ('+@ColumnNames+'))v'
--Print @SQL
EXEC(@SQL)
Select * From ##Temp_Data_Final
END
So now I can select the schema, table and column, but I don't know how to get a drop-down list of the values in the selected column.
And one more thing: how can I deploy this report to a web form? Or is there another way to create dynamic SQL with cascading parameters where I can select schema, table, column and values?
Please, somebody help me with this, it's really important.
Here I can choose the schema, then the table and the column. I want to extend the code so I also get a drop-down list with the values of the selected column.
I also used the following datasets for each parameter:
--ds_schema
SELECT NAME AS schemaname FROM sys.schemas
WHERE NAME not in (
'guest',
'information_schema',
'sys',
'db_owner',
'db_accessadmin',
'db_securityadmin',
'db_ddladmin',
'db_backupoperator',
'db_datareader',
'db_datawriter',
'db_denydatareader',
'db_denydatawriter')
----DSTables
Select Distinct Table_Name as TableName from INFORMATION_SCHEMA.TABLES
where TABLE_SCHEMA=@SchemaName
order by Table_Name
----DS_Columns
Select COLUMN_NAME as ColumnName from INFORMATION_SCHEMA.COLUMNS
where TABLE_SCHEMA=@SchemaName
and TABLE_NAME=@TableName
To get a list of values in a column you need to build a SQL statement then execute it.
As you have your parameters you can do something like this...
SET @SQL = 'SELECT DISTINCT ' + QUOTENAME(@ColumnName) + ' FROM ' + QUOTENAME(@SchemaName) + '.' + QUOTENAME(@TableName) + ' ORDER BY ' + QUOTENAME(@ColumnName)
EXEC (@SQL)
Notes:
This gives a DISTINCT list of values and also sorts them using the ORDER BY clause; just edit the SET @SQL = line to adjust the query that is executed.
I've used QUOTENAME() to put square brackets around the schema, table and column names, e.g. SELECT DISTINCT [myColumnName] FROM .....
You can add PRINT @SQL at the end to see the generated SQL if you like.
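A hedged sketch of wrapping this into a stored procedure that the new parameter's dataset (e.g. a DS_Values dataset) could call; the procedure name is an assumption, the parameters mirror the ones you already have:
CREATE PROCEDURE dbo.GetColumnValues
    @SchemaName SYSNAME,
    @TableName  SYSNAME,
    @ColumnName SYSNAME
AS
BEGIN
    SET NOCOUNT ON;
    -- Build the identifier-safe dynamic query and return the distinct values
    DECLARE @SQL NVARCHAR(MAX) =
        'SELECT DISTINCT ' + QUOTENAME(@ColumnName) +
        ' FROM ' + QUOTENAME(@SchemaName) + '.' + QUOTENAME(@TableName) +
        ' ORDER BY ' + QUOTENAME(@ColumnName);
    EXEC (@SQL);
END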

Creating "Not Exists"/Insert Into Statements - Stumped by nothing (NULL)

I need a script that generates insert statements, but with a check so data isn't inserted if it already exists. It will be run periodically on parallel systems where different data is added, but we want the tables to stay in sync. I have the basic idea and borrowed parts of the code, but I get a syntax error I'm having trouble solving.
I'm basing my code on the code Param Yadav showed at Converting Select results into Insert script - SQL Server, but I need to check for data already in the table. (I need to add more "bells & whistles" later, but I'm taking this step by step.)
My own main addition is the @NOT_EXISTS part, which should go in the WHERE clause of the NOT EXISTS check. If I replace it with a plain WHERE 0=1 I get no syntax error, which indicates the error is in my @NOT_EXISTS string.
Edit: Yesterday I thought I had an answer to my own question, but when running on real data I saw that some strings are too long for QUOTENAME, so I have to handle those quotation marks "manually" (concatenations in the script) instead...
SET NOCOUNT ON
DECLARE @CSV_COLUMN VARCHAR(MAX),
@QUOTED_DATA VARCHAR(MAX),
@NOT_EXISTS VARCHAR(MAX),
@SQL_KOD VARCHAR(MAX),
@TABLE_NAME VARCHAR(MAX),
@FILTER_CONDITION VARCHAR(MAX)='',
@FIRST_COL INT,
@LAST_COL INT
/* INPUT DATA */
SELECT @TABLE_NAME = 'WorkflowError'
SELECT @FIRST_COL = 2
SELECT @LAST_COL = 4
/* */
SELECT @CSV_COLUMN=STUFF
(
(
SELECT ',['+ NAME +']' FROM sys.all_columns
WHERE OBJECT_ID=OBJECT_ID(@TABLE_NAME) AND
is_identity!=1 FOR XML PATH('')
),1,1,''
)
--SELECT @CSV_COLUMN
SELECT @QUOTED_DATA=STUFF
(
(
SELECT ' ISNULL(QUOTENAME('+NAME+','+QUOTENAME('''','''''')+'),'+'''NULL'''+')+'','''+'+' FROM sys.all_columns
WHERE OBJECT_ID=OBJECT_ID(@TABLE_NAME) AND
is_identity!=1 FOR XML PATH('')
),1,1,''
)
SELECT @QUOTED_DATA=SUBSTRING(@QUOTED_DATA,1,LEN(@QUOTED_DATA)-5)
SELECT @QUOTED_DATA
SELECT @NOT_EXISTS=STUFF
(
(
SELECT ' ['+ COLUMN_NAME +']=', 'ISNULL(QUOTENAME('+COLUMN_NAME+','+QUOTENAME('''','''''')+'),'+'''NULL'''+') AND '
FROM information_schema.columns
WHERE table_name = @TABLE_NAME AND
ordinal_position BETWEEN @FIRST_COL AND @LAST_COL
FOR XML PATH('')
),1,1,''
)
SELECT @NOT_EXISTS=SUBSTRING(@NOT_EXISTS,1,LEN(@NOT_EXISTS)-4)
SELECT @NOT_EXISTS
--SELECT @NOT_EXISTS=' 0=1 '
SELECT @SQL_KOD='SELECT ''
IF NOT EXISTS(SELECT 1
FROM ' + @TABLE_NAME + ' WHERE ' + @NOT_EXISTS + ')
BEGIN
INSERT INTO '+@TABLE_NAME+'('+@CSV_COLUMN+')
VALUES('''+'+'+@QUOTED_DATA+'+'+''')
END
GO '''+' Insert_Scripts
FROM '+@TABLE_NAME + @FILTER_CONDITION
SELECT @SQL_KOD
EXECUTE (@SQL_KOD)
GO
[stackoverflow won't let me post code unless it's formatted, but then the strings below won't be as they are created in the script...]
When I do SELECT @NOT_EXISTS=' 0=1 ' I get an INSERT line for each row in my table:
IF NOT EXISTS(SELECT 1 FROM WorkflowError WHERE 0=1 )
BEGIN
INSERT INTO WorkflowError([TargetSystem],[ErrorCode],[ErrorText],[RetryMaxCount],[RetryStrategyName],[ErrorDescription])
VALUES('EttLiv','800','Value cannot be null. Parameter name: source','0',NULL,'Value cannot be null. Parameter name: source')
END
GO
With my @NOT_EXISTS code the @SQL_KOD string becomes this:
SELECT 'IF NOT EXISTS(SELECT 1 FROM WorkflowError
WHERE [TargetSystem]=ISNULL(QUOTENAME(TargetSystem,''''),'NULL'))
BEGIN
INSERT INTO WorkflowError([TargetSystem],[ErrorCode],[ErrorText],[RetryMaxCount],[RetryStrategyName],[ErrorDescription])
VALUES('+ISNULL(QUOTENAME(TargetSystem,''''),'NULL')+','
+ ISNULL(QUOTENAME(ErrorCode,''''),'NULL')+','
+ ISNULL(QUOTENAME(ErrorText,''''),'NULL')+','
+ ISNULL(QUOTENAME(RetryMaxCount,''''),'NULL')+','
+ ISNULL(QUOTENAME(RetryStrategyName,''''),'NULL')+','
+ ISNULL(QUOTENAME(ErrorDescription,''''),'NULL')+')
END
GO ' Insert_Scripts FROM WorkflowError
However, trying to execute that @SQL_KOD line just gives:
Msg 156, Level 15, State 1, Line 3
Incorrect syntax near the keyword 'NULL'.
...and I can't figure out where I went wrong, whether it's in my thinking or just a misplaced quotation mark...
Where do you expect @SQL_KOD to get its values from? If you are retrieving your values for TargetSystem / ErrorCode / ... / ErrorDescription from somewhere outside of your insert statement, I would expect a FROM clause. If you want to input variables, you are missing both the definition of the variables and the @ sign in front of the variable name.
As far as keeping quotes happy: try writing your code with QUOTED_IDENTIFIER OFF - you can create the entire @SQL_KOD variable by writing between double quotes ("), and single quotes then behave like normal quotation marks.
A very basic re-write of your code could be something as follows:
SET QUOTED_IDENTIFIER OFF
DECLARE @SQL_KOD VARCHAR(MAX)
SET @SQL_KOD =
"DECLARE @WorkFlowError TABLE ([TargetSystem] NVARCHAR(200),[ErrorCode] NVARCHAR(200))
IF NOT EXISTS ( SELECT 1 FROM @WorkFlowError )
BEGIN
INSERT INTO @WorkFlowError ([TargetSystem],[ErrorCode])
SELECT ISNULL(QUOTENAME([TargetSystem],''''),'NULL')
, ISNULL(QUOTENAME([ErrorCode],''''),'NULL')
FROM (
SELECT [TargetSystem]='Foo'
, [ErrorCode]='Bar'
) src
END";
I originally used QUOTENAME as in the Param Yadav script I borrowed from, but that function can't handle long strings: it doesn't complain, it just returns NULL if the string is too long. The script is now less readable (long runs of quotation marks) but it works.
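A quick way to see that limit, by the way (QUOTENAME takes a sysname, i.e. at most 128 characters, and silently returns NULL for anything longer):
SELECT QUOTENAME(REPLICATE('a', 128), '''') AS fits,     -- returns the quoted string
       QUOTENAME(REPLICATE('a', 129), '''') AS too_long; -- returns NULL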
SET NOCOUNT ON
DECLARE @CSV_COLUMN VARCHAR(MAX),
@QUOTED_DATA VARCHAR(MAX),
@NOT_EXISTS VARCHAR(MAX),
@SQL_KOD VARCHAR(MAX),
@TABLE_NAME VARCHAR(MAX),
@FILTER_CONDITION VARCHAR(MAX),
@FIRST_COL INT,
@LAST_COL INT
/* INPUT DATA */
SELECT @TABLE_NAME = 'WorkflowError'
SELECT @FIRST_COL = 2
SELECT @LAST_COL = 4
SELECT @FILTER_CONDITION = ''
/* */
SELECT @CSV_COLUMN=STUFF
(
(
SELECT ',['+ NAME +']' FROM sys.all_columns
WHERE OBJECT_ID=OBJECT_ID(@TABLE_NAME) AND
is_identity!=1 FOR XML PATH('')
),1,1,''
)
SELECT @QUOTED_DATA=STUFF
(
(
SELECT ' ISNULL('''''''' + REPLACE('+NAME+','''''''','''''''''''') + '''''''','''+'NULL'''+''+')+'',''+'
FROM sys.all_columns
WHERE OBJECT_ID=OBJECT_ID(@TABLE_NAME) AND
is_identity!=1 FOR XML PATH('')
),1,1,''
)
SELECT @QUOTED_DATA=SUBSTRING(@QUOTED_DATA,1,LEN(@QUOTED_DATA)-5)
SELECT @NOT_EXISTS=STUFF
(
(
SELECT ' ['+ COLUMN_NAME +']='' + ', 'ISNULL('''''''' + REPLACE('+COLUMN_NAME+','''''''','''''''''''') + '''''''','''+'NULL'''+''+')+'' AND '
FROM information_schema.columns
WHERE table_name = @TABLE_NAME AND
ordinal_position BETWEEN @FIRST_COL AND @LAST_COL
FOR XML PATH('')
),1,1,''
)
SELECT @NOT_EXISTS=SUBSTRING(@NOT_EXISTS,1,LEN(@NOT_EXISTS)-6)
SELECT @SQL_KOD='SELECT ''IF NOT EXISTS(SELECT 1 FROM ' + @TABLE_NAME + ' WHERE ' + @NOT_EXISTS + ' + ' + ''') BEGIN INSERT INTO '+@TABLE_NAME+'('+@CSV_COLUMN+')VALUES('''+'+'+@QUOTED_DATA+'+'+''') END '''+' Insert_Scripts FROM ' + @TABLE_NAME + ' ' + @FILTER_CONDITION
EXECUTE (@SQL_KOD)
SET NOCOUNT OFF

How to find all strings in between commas in SQL Server

I want to display a string in a table format as shown below:
For a string like 'hi,is,1,question,thanks,.,.,n'
I need this result:
column1 column2 column3 column4 ..... column
hi is 1 question ..... n
DECLARE @string VARCHAR(MAX);
SET @string = 'hi,is,1,question,thanks,.,.,n';
DECLARE @SQL VARCHAR(MAX);
SET @SQL = 'SELECT ''' + REPLACE(@string, ',', ''',''') + '''';
EXEC (@SQL);
Result:
Add SELECT ' at the beginning and ' at the end of the string
Replace every , with ',' inside the string
So the string 'hi,is,1,question,thanks,.,.,n' is replaced by SELECT 'hi','is','1','question','thanks','.','.','n'
Execute it as a SQL query
PS: If you want to use it on a column you will have to combine it with a CURSOR
Update
DECLARE @table TABLE
(
ID INT IDENTITY,
string VARCHAR(MAX)
);
INSERT INTO @table
VALUES
('This,is,a,string,,n,elements,..');
INSERT INTO @table
VALUES
('And,one,more');
INSERT INTO @table
VALUES
('Ugly,but,works,,,Yay!,..,,,10,11,12,13,14,15,16,17,18,19,..');
SELECT * FROM @table
DECLARE @string_to_split VARCHAR(MAX);
DECLARE @sql_query_to_execute VARCHAR(MAX);
DECLARE @max_elements INT, @id INT, @i INT;
SET @i = 1;
DECLARE string_cursor CURSOR FOR SELECT ID, string FROM @table;
SELECT @max_elements = MAX(LEN(string) - LEN(REPLACE(string, ',', ''))) + 1 -- Find max number of elements
FROM @table;
IF OBJECT_ID('tempdb..##my_temp_table_for_splitted_columns') <> 0 -- Create new temp table with valid amount of columns
DROP TABLE ##my_temp_table_for_splitted_columns;
SET @sql_query_to_execute = 'create table ##my_temp_table_for_splitted_columns ( ID int,';
WHILE @i <= @max_elements
BEGIN
SET @sql_query_to_execute = @sql_query_to_execute + ' Col' + CAST(@i AS VARCHAR(max)) + ' varchar(25), ';
SET @i = @i + 1;
END;
SELECT @sql_query_to_execute = SUBSTRING(@sql_query_to_execute, 1, LEN(@sql_query_to_execute) - 1) + ')';
EXEC (@sql_query_to_execute);
/* Split string for each row */
OPEN string_cursor;
FETCH NEXT FROM string_cursor
INTO @id,
@string_to_split
WHILE @@FETCH_STATUS = 0
BEGIN
SET @i = LEN(@string_to_split) - LEN(REPLACE(@string_to_split, ',', '')) + 1; -- check amount of columns for current string
WHILE @i < @max_elements
BEGIN
SET @string_to_split = @string_to_split + ','; -- add missing columns
SET @i = @i + 1;
END;
SET @sql_query_to_execute = 'SELECT ' + CAST(@id AS VARCHAR(MAX)) + ',''' + REPLACE(@string_to_split, ',', ''',''') + '''';
INSERT INTO ##my_temp_table_for_splitted_columns --insert result to temp table
EXEC (@sql_query_to_execute);
FETCH NEXT FROM string_cursor
INTO @id,
@string_to_split;
END;
CLOSE string_cursor;
DEALLOCATE string_cursor;
SELECT *
FROM ##my_temp_table_for_splitted_columns;
This is not trivial. You will find a lot of examples of how to split your string into a set of fragments, and a lot of examples of how to pivot a row set into a single row. But - adding quite some difficulty - you have an unknown count of columns. There are three approaches:
Split this and return your set with a known maximum of columns
Use a dynamically created statement and use EXEC. But this will not work in VIEWs or iTVFs, nor will it work against a table.
Instead of a column list you return a generic container like XML
with a known maximum of columns
One example for the first was this
DECLARE @str VARCHAR(1000)='This,is,a,string,with,n,elements,...';
SELECT p.*
FROM
(
SELECT A.[value]
,CONCAT('Column',A.[key]+1) AS ColumnName
FROM OPENJSON('["' + REPLACE(@str,',','","') + '"]') A
) t
PIVOT
(
MAX(t.[value]) FOR ColumnName IN(Column1,Column2,Column3,Column4,Column5,Column6,Column7,Column8,Column9 /*add as many as you need*/)
) p
Hint: My approach to split the string uses OPENJSON, not available before version 2016. But there are many other approaches you'll find easily. It's just an example to show you the combination of a splitter with PIVOT using a running index to build up a column name.
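If you are below version 2016, a minimal sketch of the same split using the XML trick plus ROW_NUMBER() as the running index could look like this (note the row order coming out of .nodes() is not formally guaranteed, so treat it as a sketch):
DECLARE @str VARCHAR(1000)='This,is,a,string,with,n,elements,...';
SELECT CONCAT('Column', ROW_NUMBER() OVER (ORDER BY (SELECT NULL))) AS ColumnName
      ,x.value('.','varchar(1000)') AS [value]
FROM (SELECT CAST('<x>' + REPLACE(@str,',','</x><x>') + '</x>' AS XML)) AS t(d)
CROSS APPLY t.d.nodes('/x') AS s(x);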
Unknown count of columns
And the same example with a dynamically created column list was this:
DECLARE @str VARCHAR(1000)='This,is,a,string,with,n,elements,...';
DECLARE @CountElements INT=LEN(@str)-LEN(REPLACE(@str,',',''))+1;
DECLARE @columnList NVARCHAR(MAX)=
STUFF((
SELECT TOP(@CountElements)
CONCAT(',Column',ROW_NUMBER() OVER(ORDER BY (SELECT 1)))
FROM master..spt_values /*has a lot of rows*/
FOR XML PATH('')
),1,1,'');
DECLARE @Command NVARCHAR(MAX)=
N'SELECT p.*
FROM
(
SELECT A.[value]
,CONCAT(''Column'',A.[key]+1) AS ColumnName
FROM OPENJSON(''["'' + REPLACE(''' + @str + ''','','',''","'') + ''"]'') A
) t
PIVOT
(
MAX(t.[value]) FOR ColumnName IN(' + @columnList + ')
) p;';
EXEC(@Command);
Hint: The statement created is exactly the same as above. But the column list in the pivot's IN is created dynamically. This will work with (almost) any count of words generically.
If you need more help, please use the edit option of your question and provide some more details.
An inlineable approach for a table returning a generic container
If you need this against a table, you might try something along this:
DECLARE @tbl TABLE(ID INT IDENTITY,YourList NVARCHAR(MAX));
INSERT INTO @tbl VALUES('This,is,a,string,with,n,elements,...')
,('And,one,more');
SELECT *
,CAST('<x>' + REPLACE((SELECT t.YourList AS [*] FOR XML PATH('')),',','</x><x>') + '</x>' AS XML) AS Splitted
FROM @tbl t
This will return your list as an XML like
<x>This</x>
<x>is</x>
<x>a</x>
<x>string</x>
<x>with</x>
<x>n</x>
<x>elements</x>
<x>...</x>
You can grab - if needed - each element by its index like here
TheXml.value('/x[1]','nvarchar(max)') AS Element1
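For example, a hedged variant of the query above that pulls the first two elements straight out (same sample table; via CROSS APPLY so the XML cast is available to .value()):
SELECT t.ID
      ,s.Splitted.value('/x[1]','nvarchar(max)') AS Element1
      ,s.Splitted.value('/x[2]','nvarchar(max)') AS Element2
FROM @tbl t
CROSS APPLY (SELECT CAST('<x>' + REPLACE((SELECT t.YourList AS [*] FOR XML PATH('')),',','</x><x>') + '</x>' AS XML)) AS s(Splitted);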

Select non-empty columns using SQL Server

I am using SQL Server 2012. I have a table with 90 columns. I am trying to select only the columns that contain data. After searching, I used the following procedure:
1- Get all column counts in one select query
2- Pivot the result into a temp table
3- Build the select query
4- Execute that query
Here is the query I used:
DECLARE @strTablename varchar(100) = 'dbo.MyTable'
DECLARE @strQuery varchar(max) = ''
DECLARE @strSecondQuery varchar(max) = 'SELECT '
DECLARE @strUnPivot as varchar(max) = ' UNPIVOT ([Count] for [Column] IN ('
CREATE TABLE ##tblTemp([Column] varchar(50), [Count] Int)
SELECT @strQuery = ISNULL(@strQuery,'') + 'Count([' + name + ']) as [' + name + '] ,' from sys.columns where object_id = object_id(@strTablename) and is_nullable = 1
SELECT @strUnPivot = ISNULL(@strUnPivot,'') + '[' + name + '] ,' from sys.columns where object_id = object_id(@strTablename) and is_nullable = 1
SET @strQuery = 'SELECT [Column],[Count] FROM ( SELECT ' + SUBSTRING(@strQuery,1,LEN(@strQuery) - 1) + ' FROM ' + @strTablename + ') AS p ' + SUBSTRING(@strUnPivot,1,LEN(@strUnPivot) - 1) + ')) AS unpvt '
INSERT INTO ##tblTemp EXEC (@strQuery)
SELECT @strSecondQuery = @strSecondQuery + '[' + [Column] + '],' from ##tblTemp WHERE [Count] > 0
DROP TABLE ##tblTemp
SET @strSecondQuery = SUBSTRING(@strSecondQuery,1,LEN(@strSecondQuery) - 1) + ' FROM ' + @strTablename
EXEC (@strSecondQuery)
The problem is that this query is TOO SLOW. Is there a better way to achieve this?
Notes:
The table has only one clustered index, on the primary key column ID, and does not contain any other indexes.
The table is not editable.
The table contains a lot of data.
The query takes about 1 minute to execute.
Thanks in advance.
I do not know if this is faster, but you might use one trick: FOR XML AUTO will omit columns without content:
DECLARE @tbl TABLE(col1 INT,col2 INT,col3 INT);
INSERT INTO @tbl VALUES (1,2,NULL),(1,NULL,NULL),(NULL,NULL,NULL);
SELECT *
FROM @tbl AS tbl
FOR XML AUTO
This is the result: col3 is missing...
<tbl col1="1" col2="2" />
<tbl col1="1" />
<tbl />
Knowing this, you could find the list of columns, which are not NULL in all rows, like this:
DECLARE @ColList VARCHAR(MAX)=
STUFF
(
(
SELECT DISTINCT ',' + Attr.value('local-name(.)','nvarchar(max)')
FROM
(
SELECT
(
SELECT *
FROM @tbl AS tbl
FOR XML AUTO,TYPE
) AS TheXML
) AS t
CROSS APPLY t.TheXML.nodes('/tbl/@*') AS A(Attr)
FOR XML PATH('')
),1,1,''
);
SELECT @ColList
The content of @ColList is now col1,col2. This string you can place in a dynamically created SELECT.
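A minimal sketch of that last step, assuming the column list has been built against the real table (dbo.MyTable from the question) rather than the demo table variable, since EXEC-ed dynamic SQL cannot see a local table variable:
DECLARE @SelectCmd NVARCHAR(MAX) = N'SELECT ' + @ColList + N' FROM dbo.MyTable;';
EXEC sp_executesql @SelectCmd;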
UPDATE: Hints
It would be very clever to replace the SELECT * with a column list created from INFORMATION_SCHEMA.COLUMNS, excluding all not-nullable columns and - if needed and possible - types which contain very large data (BLOBs).
UPDATE2: Performance
I don't know what very large data actually means in your case... I just tried this on a table with about 500,000 rows (with SELECT *) and it returned correctly in less than one minute. Hope this is fast enough...
Try using this condition:
where @columnname IS NOT NULL AND @columnname <> ' '

Converting Select results into Insert script - SQL Server

I have SQL Server 2008, SQL Server Management Studio.
I need to select data from a table in one database and insert into another table in another database.
How can I convert the returned results from my select into INSERT INTO ...?
Clarification from comments: While I believe this could be solved by an INSERT INTO ... SELECT or SELECT INTO, I do need to generate INSERT INTO ... statements.
Here is another method, which may be easier than installing plugins or external tools in some situations:
Do a select [whatever you need] INTO temp.table_name from [... etc ...].
Right-click on the database in the Object Explorer => Tasks => Generate Scripts
Select temp.table_name in the "Choose Objects" screen, click Next.
In the "Specify how scripts should be saved" screen:
Click Advanced, find the "Types of data to Script" property, select "Data only", close the advanced properties.
Select "Save to new query window" (unless you have thousands of records).
Click Next, wait for the job to complete, observe the resulting INSERT statements appear in a new query window.
Use Find & Replace to change all [temp.table_name] to [your_table_name].
drop table [temp.table_name].
In SSMS:
Right click on the database > Tasks > Generate Scripts
Next
Select "Select specific database objects" and check the table you want scripted, Next
Click Advanced > in the list of options, scroll down to the bottom and look for the "Types of data to script" and change it to "Data Only" > OK
Select "Save to new query window" > Next > Next > Finish
All 180 rows now written as 180 insert statements!
Native method:
for example if you have table
Users(Id, name)
You can do this:
select 'insert into Users (Id, name) values (' + cast(Id as varchar(20)) + ', ''' + replace(name, '''', '''''') + ''')' from Users
1- Explanation of the scripts
A) The syntax for inserting data into a table is as below:
Insert into table(col1,col2,col3,col4,col5) -- this part is built by the variable @CSV_COLUMN
values(col1 data in quotes, col2 ... col5 data in quotes) -- this part, the column data in quotes, is built by the variable @QUOTED_DATA
C) To get the above data from an existing table we have to write the select query in such a way that the output is in the form of the script above.
D) Then I concatenated the above variables to create the final script that will generate the insert scripts on execution:
E)
@TEXT='SELECT ''INSERT INTO
'+@TABLE_NAME+'('+@CSV_COLUMN+')VALUES('''+'+'+SUBSTRING(@QUOTED_DATA,1,LEN(@QUOTED_DATA)-5)+'+'+''')'''+' Insert_Scripts FROM '+@TABLE_NAME + @FILTER_CONDITION
F) And finally executed the above query: EXECUTE(@TEXT)
G) The QUOTENAME() function is used to wrap the column data inside quotes.
H) ISNULL is used because if any row has NULL data for any column the concatenation fails and returns NULL; to avoid that I used ISNULL.
I) And I created the stored procedure sp_generate_insertscripts for this.
1- Just put in the table name for which you want the insert script.
2- Add a filter condition if you want specific results.
----------Final Procedure To generate Script------
CREATE PROCEDURE sp_generate_insertscripts
(
@TABLE_NAME VARCHAR(MAX),
@FILTER_CONDITION VARCHAR(MAX)=''
)
AS
BEGIN
SET NOCOUNT ON
DECLARE @CSV_COLUMN VARCHAR(MAX),
@QUOTED_DATA VARCHAR(MAX),
@TEXT VARCHAR(MAX)
SELECT @CSV_COLUMN=STUFF
(
(
SELECT ',['+ NAME +']' FROM sys.all_columns
WHERE OBJECT_ID=OBJECT_ID(@TABLE_NAME) AND
is_identity!=1 FOR XML PATH('')
),1,1,''
)
SELECT @QUOTED_DATA=STUFF
(
(
SELECT ' ISNULL(QUOTENAME('+NAME+','+QUOTENAME('''','''''')+'),'+'''NULL'''+')+'','''+'+' FROM sys.all_columns
WHERE OBJECT_ID=OBJECT_ID(@TABLE_NAME) AND
is_identity!=1 FOR XML PATH('')
),1,1,''
)
SELECT @TEXT='SELECT ''INSERT INTO '+@TABLE_NAME+'('+@CSV_COLUMN+')VALUES('''+'+'+SUBSTRING(@QUOTED_DATA,1,LEN(@QUOTED_DATA)-5)+'+'+''')'''+' Insert_Scripts FROM '+@TABLE_NAME + @FILTER_CONDITION
--SELECT @CSV_COLUMN AS CSV_COLUMN,@QUOTED_DATA AS QUOTED_DATA,@TEXT TEXT
EXECUTE (@TEXT)
SET NOCOUNT OFF
END
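For example (the table name and filter are just placeholders; note the filter starts with a space because it is concatenated directly onto the table name):
EXEC sp_generate_insertscripts 'dbo.Customers', ' WHERE Country = ''Germany'''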
SSMS Toolpack (which is FREE as in beer) has a variety of great features - including generating INSERT statements from tables.
Update: for SQL Server Management Studio 2012 (and newer), SSMS Toolpack is no longer free, but requires a modest licensing fee.
It's possible to do via Visual Studio SQL Server Object Explorer.
You can click "View Data" from context menu for necessary table, filter results and save result as script.
Using Visual Studio, do the following:
Create a project of type SQL Server --> SQL Server Database Project
Open the SQL Server Object Explorer (Ctrl-\, Ctrl-S)
Add a SQL Server by right-clicking on the SQL SERVER icon and selecting ADD NEW SERVER
Navigate down to the table you are interested in
Right click --> VIEW DATA
Click the top left cell to highlight everything (Ctrl-A doesn't seem to work)
Right click --> Script
This is fabulous. I have tried everything listed above over the years. I know there is a tool out there that will do this and much more, can't think of the name of it. But it is very expensive.
Good luck. I just figured this out. I have not tested it extensively with text fields etc., but it looks like it gets you a long way down the road.
Greg
Create a separate table using into statement
For example
Select * into Test_123 from [dbo].[Employee] where Name like '%Test%'
Go to the Database
Right Click the Database
Click on Generate Script
Select your table
Select the advanced options and set the attribute "Types of data to script" to "Data Only"
Select the option "Open in new query window"
SQL Server will generate the script for you
This is a more versatile solution (that can do a little more than the question asks), and can be used in a query window without having to create a new stored proc - useful in production databases for instance where you don't have write access.
To use the code, please modify according to the in line comments which explain its usage. You can then just run this query in a query window and it will print the INSERT statements you require.
SET NOCOUNT ON
-- Set the ID you wish to filter on here
DECLARE @id AS INT = 123
DECLARE @tables TABLE (Name NVARCHAR(128), IdField NVARCHAR(128), IdInsert BIT, Excluded NVARCHAR(128))
-- Add any tables you wish to generate INSERT statements for here. The fields are as thus:
-- Name: Your table name
-- IdField: The field on which to filter the dataset
-- IdInsert: If the primary key field is to be included in the INSERT statement
-- Excluded: Any fields you do not wish to include in the INSERT statement
INSERT INTO @tables (Name, IdField, IdInsert, Excluded) VALUES ('MyTable1', 'Id', 0, 'Created,Modified')
INSERT INTO @tables (Name, IdField, IdInsert, Excluded) VALUES ('MyTable2', 'Id', 1, 'Created,Modified')
DECLARE @numberTypes TABLE (sysId TINYINT)
-- This will ensure INT and BIT types are not surrounded with quotes in the
-- resultant INSERT statement, but you may need to add more (from sys.types)
INSERT @numberTypes(SysId) VALUES(56),(104)
DECLARE @rows INT = (SELECT COUNT(*) FROM @tables)
DECLARE @cnt INT = 1
DECLARE @results TABLE (Sql NVARCHAR(4000))
WHILE @cnt <= @rows
BEGIN
DECLARE @tablename AS NVARCHAR(128)
DECLARE @idField AS NVARCHAR(128)
DECLARE @idInsert AS BIT
DECLARE @excluded AS NVARCHAR(128)
SELECT
@tablename = Name,
@idField = IdField,
@idInsert = IdInsert,
@excluded = Excluded
FROM (SELECT *, ROW_NUMBER() OVER(ORDER BY (SELECT 1)) AS RowId FROM @tables) t WHERE t.RowId = @cnt
DECLARE @excludedFields TABLE (FieldName NVARCHAR(128))
DECLARE @xml AS XML = CAST(('<X>' + REPLACE(@excluded, ',', '</X><X>') + '</X>') AS XML)
INSERT INTO @excludedFields SELECT N.value('.', 'NVARCHAR(128)') FROM @xml.nodes('X') AS T(N)
DECLARE @setIdentity NVARCHAR(128) = 'SET IDENTITY_INSERT ' + @tablename
DECLARE @execsql AS NVARCHAR(4000) = 'SELECT ''' + CASE WHEN @idInsert = 1 THEN @setIdentity + ' ON' + CHAR(13) ELSE '' END + 'INSERT INTO ' + @tablename + ' ('
SELECT @execsql = @execsql +
STUFF
(
(
SELECT CASE WHEN NOT EXISTS(SELECT * FROM @excludedFields WHERE FieldName = name) THEN ', ' + name ELSE '' END
FROM sys.columns
WHERE object_id = OBJECT_ID('dbo.' + @tablename)
FOR XML PATH('')
), 1, 2, ''
) +
')' + CHAR(13) + 'VALUES (' +
STUFF
(
(
SELECT
CASE WHEN NOT EXISTS(SELECT * FROM @excludedFields WHERE FieldName = name) THEN
''', '' + ISNULL(' +
CASE WHEN EXISTS(SELECT * FROM @numberTypes WHERE SysId = system_type_id) THEN '' ELSE ''''''''' + ' END +
'CAST(' + name + ' AS VARCHAR)' +
CASE WHEN EXISTS(SELECT * FROM @numberTypes WHERE SysId = system_type_id) THEN '' ELSE ' + ''''''''' END +
', ''NULL'') + '
ELSE ''
END
FROM sys.columns
WHERE object_id = OBJECT_ID('dbo.' + @tablename)
FOR XML PATH('')
), 1, 3, ''
) +
''')' + CASE WHEN @idInsert = 1 THEN CHAR(13) + @setIdentity + ' OFF' ELSE '' END +
''' FROM ' + @tablename + ' WHERE ' + @idField + ' = ' + CAST(@id AS VARCHAR)
INSERT @results EXEC (@execsql)
DELETE @excludedFields
SET @cnt = @cnt + 1
END
DECLARE cur CURSOR FOR SELECT Sql FROM @results
OPEN cur
DECLARE @sql NVARCHAR(4000)
FETCH NEXT FROM cur INTO @sql
WHILE @@FETCH_STATUS = 0
BEGIN
PRINT @sql
FETCH NEXT FROM cur INTO @sql
END
CLOSE cur
DEALLOCATE cur
You can choose the 'Results to File' option in SSMS, export your select result to a file, and make your changes in the result file; finally, using BCP (bulk copy), you can insert it into table 1 in database 2.
I think for the bulk insert you have to convert the .rpt file to a .csv file.
Hope it will help.
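A rough sketch of the BCP round trip from the command line (server, database, table and file names are placeholders):
bcp "SELECT col1, col2 FROM Database1.dbo.Table1" queryout C:\temp\table1.csv -c -t, -S MyServer -T
bcp Database2.dbo.Table1 in C:\temp\table1.csv -c -t, -S MyServer -T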
I had a similar problem, but I needed to be able to create an INSERT statement from a query (with filters etc.)
So I created the following procedure:
CREATE PROCEDURE dbo.ConvertQueryToInsert (@input NVARCHAR(max), @target NVARCHAR(max)) AS BEGIN
DECLARE @fields NVARCHAR(max);
DECLARE @select NVARCHAR(max);
-- Get the definition from sys.columns and assemble a string with the fields/transformations for the dynamic query
SELECT
@fields = COALESCE(@fields + ', ', '') + '[' + name +']',
@select = COALESCE(@select + ', ', '') + ''''''' + ISNULL(CAST([' + name + '] AS NVARCHAR(max)), ''NULL'')+'''''''
FROM tempdb.sys.columns
WHERE [object_id] = OBJECT_ID(N'tempdb..'+@input);
-- Run a dynamic query with the fields from @select into a new temp table
CREATE TABLE #ConvertQueryToInsertTemp (strings nvarchar(max))
DECLARE @stmt NVARCHAR(max) = 'INSERT INTO #ConvertQueryToInsertTemp SELECT '''+ @select + ''' AS [strings] FROM '+@input
exec sp_executesql @stmt
-- Output the final insert statement
SELECT 'INSERT INTO ' + @target + ' (' + @fields + ') VALUES (' + REPLACE(strings, '''NULL''', 'NULL') +')' FROM #ConvertQueryToInsertTemp
-- Clean up temp tables
DROP TABLE #ConvertQueryToInsertTemp
SET @stmt = 'DROP TABLE ' + @input
exec sp_executesql @stmt
END
You can then use it by writing the output of your query into a temp table and running the procedure:
-- Example table
CREATE TABLE Dummy (Id INT, Comment NVARCHAR(50), TimeStamp DATETIME)
INSERT INTO Dummy VALUES (1 , 'Foo', GetDate()), (2, 'Bar', GetDate()), (3, 'Foo Bar', GetDate())
-- Run query and procedure
SELECT * INTO #TempTableForConvert FROM Dummy WHERE Id < 3
EXEC dbo.ConvertQueryToInsert '#TempTableForConvert', 'dbo.Dummy'
Note:
This procedure only casts the values to a string which can cause the data to look a bit different. With DATETIME for example the seconds will be lost.
I created the following procedure:
if object_id('tool.create_insert', 'P') is null
begin
exec('create procedure tool.create_insert as');
end;
go
alter procedure tool.create_insert(@schema varchar(200) = 'dbo',
@table varchar(200),
@where varchar(max) = null,
@top int = null,
@insert varchar(max) output)
as
begin
declare @insert_fields varchar(max),
@select varchar(max),
@error varchar(500),
@query varchar(max);
declare @values table(description varchar(max));
set nocount on;
-- Get columns
select @insert_fields = isnull(@insert_fields + ', ', '') + c.name,
@select = case type_name(c.system_type_id)
when 'varchar' then isnull(@select + ' + '', '' + ', '') + ' isnull('''''''' + cast(' + c.name + ' as varchar) + '''''''', ''null'')'
when 'datetime' then isnull(@select + ' + '', '' + ', '') + ' isnull('''''''' + convert(varchar, ' + c.name + ', 121) + '''''''', ''null'')'
else isnull(@select + ' + '', '' + ', '') + 'isnull(cast(' + c.name + ' as varchar), ''null'')'
end
from sys.columns c with(nolock)
inner join sys.tables t with(nolock) on t.object_id = c.object_id
inner join sys.schemas s with(nolock) on s.schema_id = t.schema_id
where s.name = @schema
and t.name = @table;
-- If there's no columns...
if @insert_fields is null or @select is null
begin
set @error = 'There''s no ' + @schema + '.' + @table + ' inside the target database.';
raiserror(@error, 16, 1);
return;
end;
set @insert_fields = 'insert into ' + @schema + '.' + @table + '(' + @insert_fields + ')';
if isnull(@where, '') <> '' and charindex('where', ltrim(rtrim(@where))) < 1
begin
set @where = 'where ' + @where;
end
else
begin
set @where = '';
end;
set @query = 'select ' + isnull('top(' + cast(@top as varchar) + ')', '') + @select + ' from ' + @schema + '.' + @table + ' with (nolock) ' + @where;
insert into @values(description)
exec(@query);
set @insert = isnull(@insert + char(10), '') + '--' + upper(@schema + '.' + @table);
select @insert = @insert + char(10) + @insert_fields + char(10) + 'values(' + v.description + ');' + char(10) + 'go' + char(10)
from @values v
where isnull(v.description, '') <> '';
end;
go
Then you can use it that way:
declare @insert varchar(max),
@part varchar(max),
@start int,
@end int;
set @start = 1;
exec tool.create_insert @schema = 'dbo',
@table = 'customer',
@where = 'id = 1',
@insert = @insert output;
-- Print one line at a time to avoid the maximum 8000 characters problem
while len(@insert) > 0
begin
set @end = charindex(char(10), @insert);
if @end = 0
begin
set @end = len(@insert) + 1;
end;
print substring(@insert, @start, @end - 1);
set @insert = substring(@insert, @end + 1, len(@insert) - @end + 1);
end;
The output would be something like that:
--DBO.CUSTOMER
insert into dbo.customer(id, name, type)
values(1, 'CUSTOMER NAME', 'F');
go
If you just want to get a range of rows, use the @top parameter as below:
declare @insert varchar(max),
@part varchar(max),
@start int,
@end int;
set @start = 1;
exec tool.create_insert @schema = 'dbo',
@table = 'customer',
@top = 100,
@insert = @insert output;
-- Print one line at a time to avoid the maximum 8000 characters problem
while len(@insert) > 0
begin
set @end = charindex(char(10), @insert);
if @end = 0
begin
set @end = len(@insert) + 1;
end;
print substring(@insert, @start, @end - 1);
set @insert = substring(@insert, @end + 1, len(@insert) - @end + 1);
end;
You can use SQL Server Integration Services (SSIS) packages, which are specifically designed for import and export operations.
Visual Studio has a project type for developing these packages if you fully install SQL Server:
Integration Services in Business Intelligence Development Studio
I think it's also possible with ad hoc queries.
You can export the result to an Excel file and then import that file into your DataTable object, or use it as it is, and then import the Excel file into the second database.
Have a look at this link, it can help you a lot:
http://vscontrols.blogspot.com/2010/09/import-and-export-excel-to-sql-server.html
If you are using Oracle (or connecting the application to SQL Server) then Oracle SQL Developer does this for you. Choose 'unload' for a table and follow the options through (untick DDL if you don't want all the table create stuff).
I found this SSMSBoost add-on, which is free and does exactly this among other things. You can right click on the results and select Script data as.
You can use this Q2C.SSMSPlugin, which is free and open source. You can right click and select "Execute Query To Command... -> Query To Insert...". Enjoy)
You can use an INSERT INTO ... SELECT statement to insert the results of a select query into a table. http://www.w3schools.com/sql/sql_insert_into_select.asp
Example:
INSERT INTO Customers (CustomerName, Country)
SELECT SupplierName, Country
FROM Suppliers
WHERE Country='Germany'
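If the target table does not exist yet in the other database, a SELECT ... INTO can create and fill it in one step (the three-part names here are placeholders):
SELECT SupplierName, Country
INTO TargetDb.dbo.GermanSuppliers
FROM SourceDb.dbo.Suppliers
WHERE Country='Germany'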