I have a table with multiple amount columns, and I want to show the total of all those amounts in a final Total column. The table looks somewhat like this:
A_Amt B_Amt C_Amt D_Amt E_Amt F_Amt ...
------------------------------------------------
15 20 25 30 35 40
I have written a query as:
declare @xmlResult xml =
(
    select *
    from Foo
    for xml PATH
);

SELECT Nodes.node.value('sum(*[contains(local-name(.), "_Amt")])', 'decimal(15,2)') AS Total
FROM @xmlResult.nodes('//row') as Nodes(node);
but the result I am getting has only the single Total column, whereas I want all the original columns (A_Amt, etc.) in the resulting table as well.
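If you also want the individual columns back, one option is to read each column out of the XML alongside the computed total. A minimal sketch, assuming the fixed column names shown above:

SELECT Nodes.node.value('(A_Amt)[1]', 'decimal(15,2)') AS A_Amt,
       Nodes.node.value('(B_Amt)[1]', 'decimal(15,2)') AS B_Amt,
       Nodes.node.value('(C_Amt)[1]', 'decimal(15,2)') AS C_Amt,
       -- ...repeat for the remaining *_Amt columns...
       Nodes.node.value('sum(*[contains(local-name(.), "_Amt")])', 'decimal(15,2)') AS Total
FROM @xmlResult.nodes('//row') AS Nodes(node);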
This should be what you need, BUT ATTENTION! You should NOT do this. Aggregate rows should NEVER be fetched together with the "raw" data. This is - in most cases - something your UI should do (or a report...)
declare @table TABLE(ID INT IDENTITY, a INT, b INT, c INT);
insert into @table VALUES(1,1,1),(2,3,4),(5,6,7);

SELECT a,b,c
FROM
(
    SELECT ROW_NUMBER() OVER(ORDER BY t.ID) AS inx
          ,a,b,c
    FROM @table AS t
    UNION SELECT 999999,SUM(a),SUM(b),SUM(c)
    FROM @table
) AS tbl
ORDER BY tbl.inx
I think this is what you are looking for. Try this (replace spt_values with your table):
USE MASTER
GO
declare @lsql nvarchar(max)
declare @lsql2 nvarchar(max)
declare @yourTable nvarchar(255) = 'spt_values'
Select @lsql = isnull(@lsql+'+','') + 'Case When ISNUMERIC('+name+') = 1 Then '+name+' else 0 end' from sys.Columns where Object_id = Object_id(@yourTable)
Print @lsql
SET @lsql2 = 'Select *, '+@lsql+' as Total_allcolumns From '+@yourTable
Exec(@lsql2)
Using Microsoft's system tables is one way to build the dynamic SQL and thus achieve your goal. The code below is what you want, or will at least get you started.
I wasn't sure what output you expected, so I included two outputs (Result1 and Result2). Just use the one you want and discard the other. Given your question, it is probably Result1.
!!You have to write the table name in the script at the place indicated prior to executing it!!
--DISCLAIMER
--It assumes you use SQL Server 2012 (it will probably work on 2005+ with little adjustment).
--It assumes the data is in a table (not a view, for example).
--Changing SQL Server version may break the code, as Microsoft could change the "system views".
--I don't remember exactly, but EXEC may be limited to 4000 characters in a dynamic query (there is a workaround, just look around if you need it).
--So use at your own risk.
DECLARE @objectIDTable INT,
        @AllColumnAdditionStatement NVARCHAR(MAX) = '',
        @TableName NVARCHAR(250) = 'WriteYourTableNameHere', --!!!OVERWRITE THE TABLE NAME HERE
        @Query NVARCHAR(MAX),
        @AllSumStatement NVARCHAR(MAX) = ''

SELECT TOP 1 @objectIDTable = [object_id],
             @AllColumnAdditionStatement = ''
FROM sys.objects
WHERE type_desc = 'USER_TABLE'
  AND name = @TableName

SELECT @AllColumnAdditionStatement = @AllColumnAdditionStatement + 'CONVERT(DECIMAL(18, 4), (CASE WHEN ISNUMERIC(' + name + ') = 1 THEN ISNULL(' + name + ', ''0'') ELSE 0 END))' + ' + ',
       @AllSumStatement = @AllSumStatement + name + 'Total = SUM(CONVERT(DECIMAL(18, 4), (CASE WHEN ISNUMERIC(' + name + ') = 1 THEN ISNULL(' + name + ', ''0'') ELSE 0 END))), ' + CHAR(10)
FROM sys.columns
WHERE object_id = @objectIDTable
  AND name LIKE '%_Amt' --!!!Here is a column filter to sum only the columns ending with _Amt

SELECT @AllColumnAdditionStatement = @AllColumnAdditionStatement + '0', --just too lazy to chop off the last three chars
       @AllSumStatement = @AllSumStatement + 'Total_ = SUM(' + @AllColumnAdditionStatement + ')' + CHAR(10),
       @Query = 'SELECT *,
                 Total_ = ' + @AllColumnAdditionStatement + '
                 FROM ' + @TableName

PRINT (@Query)
/********************************************************************************************/
EXEC (@Query) --or use sp_executesql if you prefer
--Result1 : addition of all selected columns into a total column, with all columns returned as well
/********************************************************************************************/
SELECT @Query = 'SELECT ' + @AllSumStatement + '
                 FROM ' + @TableName
EXEC (@Query) --or use sp_executesql if you prefer
--Result2 : summation of each column individually, plus the summation of all of them into a total column
/********************************************************************************************/
I would like to create a SQL Statement that will return the distinct values of the Code fields in my database, along with the name of the column for the codes and the name of the table on which the column occurs.
I had something like this:
select c.name as 'Col Name', t.name as 'Table Name'
from sys.columns c, sys.tables t
where c.object_id = t.object_id
and c.name like 'CD_%'
It generates the list of columns and tables I want, but obviously doesn't return any of the values for each of the codes in the list.
There are over 100 tables in my database. I could use the above result set and write the query for each one like this:
Select distinct CD_RACE from PERSON
and it will return the values, but it won't return the column and table name, plus I have to do each one individually. Is there any way I can get the value, column name and table name for EACH code in my database?
Any ideas? Thanks...
Just generate your selects and bring in the column and table names as static values. Here's an Oracle version:
select 'select distinct '''||c.column_name||''' as "Col Name", '''||t.table_name||''' as "Table Name", '||c.column_name||' from '||t.table_name||';'
from all_tab_columns c, all_tables t
where c.table_name = t.table_name;
This will give you a bunch of separate statements; you can modify the query a bit to put a UNION between each select if you really want one big query you can execute to get all your code values at once.
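For example, a rough sketch of that union variant (assuming the same all_tab_columns/all_tables views; TO_CHAR is used so all the code values share one type):

select 'select distinct '''||c.column_name||''' as "Col Name", '''||t.table_name||''' as "Table Name", to_char('||c.column_name||') as "Code Value" from '||t.table_name||' union all'
from all_tab_columns c, all_tables t
where c.table_name = t.table_name
and c.column_name like 'CD_%';
-- remove the trailing "union all" from the last generated line before running the result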
Here's an approach for SQL Server, since someone else covered Oracle (and no specific DBMS was mentioned). The following steps are involved:
1. Set up a table to receive the schema, table, column name, and column value (in the example below only a table variable is used)
2. Build the list of SQL commands to execute (accounting for various schemas and names with spaces and such)
3. Run each command, dynamically inserting the values into the table from step 1
4. Output the results from that table
Here is the example:
-- Store the values and source of the values
DECLARE @Values TABLE (
    SchemaName VARCHAR(500),
    TableName VARCHAR(500),
    ColumnName VARCHAR(500),
    ColumnValue VARCHAR(MAX)
)

-- Build list of SQL Commands to run
DECLARE @Commands TABLE (
    Id INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    SchemaName VARCHAR(500),
    TableName VARCHAR(500),
    ColumnName VARCHAR(500),
    SqlCommand VARCHAR(1000)
)

INSERT @Commands
SELECT
    [TABLE_SCHEMA],
    [TABLE_NAME],
    [COLUMN_NAME],
    'SELECT DISTINCT '
        + '''' + [TABLE_SCHEMA] + ''', '
        + '''' + [TABLE_NAME] + ''', '
        + '''' + [COLUMN_NAME] + ''', '
        + '[' + [COLUMN_NAME] + '] '
        + 'FROM [' + [TABLE_SCHEMA] + '].[' + [TABLE_NAME] + ']'
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME LIKE 'CD_%'

-- Loop through commands
DECLARE
    @Sql VARCHAR(1000),
    @Id INT,
    @SchemaName VARCHAR(500),
    @TableName VARCHAR(500),
    @ColumnName VARCHAR(500)

WHILE EXISTS (SELECT * FROM @Commands) BEGIN

    -- Get next set of records
    SELECT TOP 1
        @Id = Id,
        @Sql = SqlCommand,
        @SchemaName = SchemaName,
        @TableName = TableName,
        @ColumnName = ColumnName
    FROM @Commands

    -- Add values for that command
    INSERT @Values
    EXEC (@Sql)

    -- Remove command record
    DELETE @Commands WHERE Id = @Id

END

-- Return the values and sources
SELECT * FROM @Values
My table has all its column names in the Word_Word format, like First_Name, Last_Name, ...
(There are more than 80 columns, and I can't change the column names now.)
So I want to use SELECT * FROM table instead of aliasing every column with AS.
I want to select them with the '_' removed, in one statement. Is there any way I can do it?
Something like REPLACE(columnName, '_', '') in the select statement?
Thanks
You can simply rename the column in your query. For example:
SELECT FIRST_NAME [First Name],
LAST_NAME [Last Name]
FROM UserTable
You can also use the AS keyword but this is optional. Also note that if you don't want to do this on every query you can use this process to create a view with renamed columns. Then you can use SELECT * the way you want to (although this is considered a bad idea for many reasons).
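If you go the view route, a minimal sketch (using the same hypothetical UserTable) could look like this:

CREATE VIEW dbo.UserTableFriendly
AS
SELECT FIRST_NAME AS [First Name],
       LAST_NAME AS [Last Name]
FROM dbo.UserTable
GO
SELECT * FROM dbo.UserTableFriendly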
Best of luck!
Alternative - Map In The Client Code:
One other alternative is to do the mapping in the client code. This solution will depend greatly on your ORM. Most ORMs (such as LINQ or EF) will allow you to remap. If nothing else you could use AutoMapper or similar to rename the columns on the client using convention-based naming.
You can't do this in a single statement unless you're using dynamic SQL. If you're just trying to generate code, you can run a query against Information_Schema and get the info you want ...
DECLARE @MaxColumns INT
DECLARE @TableName VARCHAR(20)
SET @TableName = 'Course'

SELECT @MaxColumns = MAX(ORDINAL_POSITION) FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = @TableName

SELECT Col
FROM
(
    SELECT 0 Num, 'SELECT' Col
    UNION
    SELECT ROW_NUMBER() OVER (PARTITION BY TABLE_NAME ORDER BY ORDINAL_POSITION) Num,
           '    [' + COLUMN_NAME + '] AS [' + REPLACE(COLUMN_NAME, '_', '') + ']' + CASE WHEN ORDINAL_POSITION = @MaxColumns THEN '' ELSE ',' END
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_NAME = @TableName
    UNION
    SELECT @MaxColumns + 1 Num, 'FROM ' + @TableName
) s
ORDER BY Num
The question intrigued me and I did find one way. It gets the job done, but if you just want to give a lot of aliases one time in one query, I wouldn't recommend it.
First I made a stored procedure that extracts all the column names and gives them an alias without '_'.
USE [DataBase]
GO

IF OBJECT_ID('usp_AlterColumnDisplayName', 'P') IS NOT NULL
    DROP PROCEDURE usp_AlterColumnDisplayName
GO

CREATE PROCEDURE usp_AlterColumnDisplayName
    @TableName VARCHAR(50),
    @ret NVARCHAR(MAX) OUTPUT
AS
    SELECT @ret = @ret + [Column name]
    FROM
    (
        SELECT ([name] + ' AS ' + '[' + REPLACE([name], '_', ' ') + '], ') [Column name]
        FROM syscolumns
        WHERE id =
        (   SELECT id
            FROM sysobjects
            WHERE type = 'U'
            AND [name] = @TableName
        )
    ) T
GO
Then extract that string and throw it into another string with a query-structure.
Execute that and you are done.
DECLARE @out NVARCHAR(MAX), @DesiredTable VARCHAR(50), @Query NVARCHAR(MAX)
SET @out = ''
SET @DesiredTable = 'YourTable'

EXEC usp_AlterColumnDisplayName
    @TableName = @DesiredTable,
    @ret = @out OUTPUT

SET @out = LEFT(@out, LEN(@out)-1) --Removing trailing ', '
SET @Query = 'Select ' + @out + ' From ' + @DesiredTable + ' WHERE whatever'

EXEC sp_executesql @Query
If you just want to give a lot of aliases at once, without sitting and typing them out for 80+ columns, I would rather suggest doing that with one simple SELECT statement (like the one inside the sp), or in Excel, and then copy-pasting the result into your code.
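For example, a throwaway alias generator you can copy-paste from might look like this (a sketch; substitute your own table name):

SELECT [name] + ' AS [' + REPLACE([name], '_', ' ') + '],'
FROM sys.columns
WHERE [object_id] = OBJECT_ID('dbo.YourTable')
ORDER BY column_id
-- paste the output into a SELECT, trim the final comma, and add FROM dbo.YourTable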
I have two databases, named DB1 and DB2, in SQL Server 2008. These two databases have the same tables and the same table data. However, I want to check whether there are any differences between the data in these tables.
Could anyone help me with a script for this?
select 'T1' T, *
from (
    select *
    from DB1.dbo.Table
    except
    select *
    from DB2.dbo.Table
) as T
union all
select 'T2' T, *
from (
    select *
    from DB2.dbo.Table
    except
    select *
    from DB1.dbo.Table
) as T
ORDER BY 2,3,4, ..., 1 -- make T1 and T2 rows sort next to each other; 2,3,4 are the unique key columns
Test code:
declare @T1 table (ID int)
declare @T2 table (ID int)

insert into @T1 values(1),(2)
insert into @T2 values(2),(3)

select *
from (
    select *
    from @T1
    except
    select *
    from @T2
) as T
union all
select *
from (
    select *
    from @T2
    except
    select *
    from @T1
) as T
Result:
ID
-----------
1
3
Note: it can take a long time to compare big tables. When developing a "tuned" solution or refactoring (which should give the same result as the reference), it may be wise to check simple parameters first, for example:
select count(*) from (
    select count(*) c0, SUM(BINARY_CHECKSUM(*)%1000000) c1 FROM T_REF_TABLE
    -- select 12345 c0, -214365454 c1 -- constant values FROM T_REF_TABLE
    except
    select count(*), SUM(BINARY_CHECKSUM(*)%1000000) FROM T_WORK_COPY
) t
When this returns an empty result, you probably have things under control; once you know the reference values, you can even swap in the commented-out constant-values line to save yet more time on the next check.
I’d really suggest that people who encounter this problem go and find a third party database comparison tool.
Reason – these tools save a lot of time and make the process less error prone.
I’ve used comparison tools from ApexSQL (Diff and Data Diff) but you can’t go wrong with other tools marc_s and Marina Nastenko already pointed out.
If you’re absolutely sure that you are only going to compare tables once then SQL is fine but if you’re going to need this from time to time you’ll be better off with some 3rd party tool.
If you don’t have budget to buy it then just use it in trial mode to get the job done.
I hope new readers will find this useful even though it’s a late answer…
I've done things like this using the CHECKSUM(*) function.
In essence it creates a row-level checksum over all the columns' data; you can then compare the checksum of each row in one table against the corresponding row in the other, using a join, to find rows that are different.
Hope that made sense...
Better with an example....
select *
from
    ( select checksum(*) as chk, userid as k from DB1.dbo.UserAccounts) as t1
    left join
    ( select checksum(*) as chk, userid as k from DB2.dbo.UserAccounts) as t2 on t1.k = t2.k
where t1.chk <> t2.chk
select * from DB1.dbo.Table a inner join DB2.dbo.Table b on b.PrimKey = a.PrimKey
where a.FirstColumn <> b.FirstColumn ...
The CHECKSUM approach that Matt recommended is probably a better way to compare rows than comparing every column individually.
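For example, combining the two ideas, something along these lines (a sketch; PrimKey and the column lists are placeholders):

SELECT a.PrimKey
FROM DB1.dbo.MyTable a
INNER JOIN DB2.dbo.MyTable b ON b.PrimKey = a.PrimKey
WHERE BINARY_CHECKSUM(a.FirstColumn, a.SecondColumn) <> BINARY_CHECKSUM(b.FirstColumn, b.SecondColumn)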
Comparing the two databases in SQL Server: try this query, it may help.
SELECT T.[name] AS [table_name], AC.[name] AS [column_name], TY.[name] AS [system_data_type]
FROM [***Database Name 1***].sys.[tables] AS T
INNER JOIN [***Database Name 1***].sys.[all_columns] AC ON T.[object_id] = AC.[object_id]
INNER JOIN [***Database Name 1***].sys.[types] TY ON AC.[system_type_id] = TY.[system_type_id]
EXCEPT
SELECT T.[name] AS [table_name], AC.[name] AS [column_name], TY.[name] AS [system_data_type]
FROM [***Database Name 2***].sys.[tables] AS T
INNER JOIN [***Database Name 2***].sys.[all_columns] AC ON T.[object_id] = AC.[object_id]
INNER JOIN [***Database Name 2***].sys.[types] TY ON AC.[system_type_id] = TY.[system_type_id]
If the databases are on the same server, use the [DatabaseName].[Owner].[TableName] format when accessing a table that resides in a different database.
Eg: [DB1].[dbo].[TableName]
If the databases are on different servers, look at Creating Linked Servers (SQL Server Database Engine).
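Once a linked server is set up, the remote table can be referenced with a four-part name, for example (a sketch; RemoteSrv is a hypothetical linked server name):

SELECT a.*
FROM [DB1].[dbo].[TableName] a
INNER JOIN [RemoteSrv].[DB2].[dbo].[TableName] b ON b.PrimKey = a.PrimKey
WHERE a.FirstColumn <> b.FirstColumn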
Another solution (non T-SQL): you can use the tablediff utility.
For example if you want to compare two tables (Localitate) from two different servers (ROBUH01 & ROBUH02) you can use this shell command:
C:\Program Files\Microsoft SQL Server\100\COM>tablediff -sourceserver ROBUH01 -sourcedatabase SIM01 -sourceschema dbo -sourcetable Localitate -destinationserver ROBUH02 -destinationschema dbo -destinationdatabase SIM02 -destinationtable Localitate
Results:
Microsoft (R) SQL Server Replication Diff Tool Copyright (c) 2008 Microsoft Corporation User-specified agent parameter values:
-sourceserver ROBUH01
-sourcedatabase SIM01
-sourceschema dbo
-sourcetable Localitate
-destinationserver ROBUH02
-destinationschema dbo
-destinationdatabase SIM02
-destinationtable Localitate
Table [SIM01].[dbo].[Localitate] on ROBUH01 and Table [SIM02].[dbo].[Localitate] on ROBUH02 have 10 differences.
Err         Id
Dest. Only  21433
Dest. Only  21434
Dest. Only  21435
Dest. Only  21436
Dest. Only  21437
Dest. Only  21438
Dest. Only  21439
Dest. Only  21441
Dest. Only  21442
Dest. Only  21443
The requested operation took 9,9472657 seconds.
------------------------------------------------------------------------
If both databases are on the same server, you can check for matching tables by using the following query:
select
fdb.name, sdb.name
from
FIRSTDBNAME.sys.tables fdb
join SECONDDBNAME.sys.tables sdb
on fdb.name = sdb.name -- compare same name tables
order by
1
By listing out the matching tables, you can then compare their column schemas using the sys.columns view.
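For example, a sketch of such a column comparison (using the same FIRSTDBNAME/SECONDDBNAME placeholders):

select t1.name as table_name, c1.name as column_name
from FIRSTDBNAME.sys.tables t1
join FIRSTDBNAME.sys.columns c1 on c1.object_id = t1.object_id
join SECONDDBNAME.sys.tables t2 on t2.name = t1.name
join SECONDDBNAME.sys.columns c2 on c2.object_id = t2.object_id and c2.name = c1.name
where c1.system_type_id <> c2.system_type_id -- type differs
   or c1.max_length <> c2.max_length         -- length differs
order by 1, 2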
Hope this helps you.
In order to compare two databases, I've written the procedures below.
If you want to compare two tables you can use procedure 'CompareTables'. Example :
EXEC master.dbo.CompareTables 'DB1', 'dbo', 'table1', 'DB2', 'dbo', 'table2'
If you want to compare two databases, use the procedure 'CompareDatabases'. Example :
EXEC master.dbo.CompareDatabases 'DB1', 'DB2'
Note: I tried to make the procedures reasonably safe, but they are only meant for testing and debugging.
If you want a complete comparison solution, use a third-party tool (Visual Studio, ...).
USE [master]
GO

create proc [dbo].[CompareDatabases]
    @FirstDatabaseName nvarchar(50),
    @SecondDatabaseName nvarchar(50)
as
begin
    -- Check that databases exist
    if not exists(SELECT name FROM sys.databases WHERE name = @FirstDatabaseName)
        return 0
    if not exists(SELECT name FROM sys.databases WHERE name = @SecondDatabaseName)
        return 0

    declare @result table (TABLE_NAME nvarchar(256))

    SET NOCOUNT ON

    insert into @result EXEC('(Select distinct TABLE_NAME from ' + @FirstDatabaseName + '.INFORMATION_SCHEMA.COLUMNS '
                           + 'Where TABLE_SCHEMA=''dbo'')'
                           + ' intersect '
                           + '(Select distinct TABLE_NAME from ' + @SecondDatabaseName + '.INFORMATION_SCHEMA.COLUMNS '
                           + 'Where TABLE_SCHEMA=''dbo'')')

    DECLARE @TABLE_NAME nvarchar(256)
    DECLARE curseur CURSOR FOR
        SELECT TABLE_NAME FROM @result
    OPEN curseur
    FETCH curseur INTO @TABLE_NAME
    WHILE @@FETCH_STATUS = 0
    BEGIN
        print 'TABLE : ' + @TABLE_NAME
        EXEC master.dbo.CompareTables @FirstDatabaseName, 'dbo', @TABLE_NAME, @SecondDatabaseName, 'dbo', @TABLE_NAME
        FETCH curseur INTO @TABLE_NAME
    END
    CLOSE curseur
    DEALLOCATE curseur

    SET NOCOUNT OFF
end
GO
.
USE [master]
GO

CREATE PROC [dbo].[CompareTables]
    @FirstTABLE_CATALOG nvarchar(256),
    @FirstTABLE_SCHEMA nvarchar(256),
    @FirstTABLE_NAME nvarchar(256),
    @SecondTABLE_CATALOG nvarchar(256),
    @SecondTABLE_SCHEMA nvarchar(256),
    @SecondTABLE_NAME nvarchar(256)
AS
BEGIN
    -- Verify that the first table exists
    DECLARE @table1 nvarchar(256) = @FirstTABLE_CATALOG + '.' + @FirstTABLE_SCHEMA + '.' + @FirstTABLE_NAME
    DECLARE @return_status int
    EXEC @return_status = master.dbo.TableExist @FirstTABLE_CATALOG, @FirstTABLE_SCHEMA, @FirstTABLE_NAME
    IF @return_status = 0
    BEGIN
        PRINT @table1 + ' : Table Not FOUND'
        RETURN 0
    END

    -- Verify that the second table exists
    DECLARE @table2 nvarchar(256) = @SecondTABLE_CATALOG + '.' + @SecondTABLE_SCHEMA + '.' + @SecondTABLE_NAME
    EXEC @return_status = master.dbo.TableExist @SecondTABLE_CATALOG, @SecondTABLE_SCHEMA, @SecondTABLE_NAME
    IF @return_status = 0
    BEGIN
        PRINT @table2 + ' : Table Not FOUND'
        RETURN 0
    END

    -- Compare the two tables
    DECLARE @sql AS NVARCHAR(MAX)
    SELECT @sql = '('
        + '(SELECT ''' + @table1 + ''' as _Table, * FROM ' + @FirstTABLE_CATALOG + '.' + @FirstTABLE_SCHEMA + '.' + @FirstTABLE_NAME + ')'
        + ' EXCEPT '
        + '(SELECT ''' + @table1 + ''' as _Table, * FROM ' + @SecondTABLE_CATALOG + '.' + @SecondTABLE_SCHEMA + '.' + @SecondTABLE_NAME + ')'
        + ')'
        + ' UNION '
        + '('
        + '(SELECT ''' + @table2 + ''' as _Table, * FROM ' + @SecondTABLE_CATALOG + '.' + @SecondTABLE_SCHEMA + '.' + @SecondTABLE_NAME + ')'
        + ' EXCEPT '
        + '(SELECT ''' + @table2 + ''' as _Table, * FROM ' + @FirstTABLE_CATALOG + '.' + @FirstTABLE_SCHEMA + '.' + @FirstTABLE_NAME + ')'
        + ')'
    DECLARE @wrapper AS NVARCHAR(MAX) = 'if exists (' + @sql + ')' + char(10) + ' (' + @sql + ') ORDER BY 2'
    Exec(@wrapper)
END
GO
.
USE [master]
GO

CREATE PROC [dbo].[TableExist]
    @TABLE_CATALOG nvarchar(256),
    @TABLE_SCHEMA nvarchar(256),
    @TABLE_NAME nvarchar(256)
AS
BEGIN
    IF NOT EXISTS(SELECT name FROM sys.databases WHERE name = @TABLE_CATALOG)
        RETURN 0

    declare @result table (TABLE_SCHEMA nvarchar(256), TABLE_NAME nvarchar(256))

    SET NOCOUNT ON
    insert into @result EXEC('Select TABLE_SCHEMA, TABLE_NAME from ' + @TABLE_CATALOG + '.INFORMATION_SCHEMA.COLUMNS')
    SET NOCOUNT OFF

    IF EXISTS(SELECT TABLE_SCHEMA, TABLE_NAME FROM @result
              WHERE TABLE_SCHEMA = @TABLE_SCHEMA AND TABLE_NAME = @TABLE_NAME)
        RETURN 1

    RETURN 0
END
GO
Although this question is about an older version of SQL Server, it is worth adding (as of 2021) that if you use Azure Data Studio, there is an extension called SQL Server Schema Compare that does this for you.
I have SQL Server 2008, SQL Server Management Studio.
I need to select data from a table in one database and insert into another table in another database.
How can I convert the returned results from my select into INSERT INTO ...?
Clarification from comments: while I believe this could be solved by an INSERT INTO ... SELECT or SELECT ... INTO, I do need to generate INSERT INTO ... statements.
Here is another method, which may be easier than installing plugins or external tools in some situations:
Do a SELECT [whatever you need] INTO temp.table_name FROM [... etc ...] (see the sketch at the end of this list).
Right-click on the database in the Object Explorer => Tasks => Generate Scripts
Select temp.table_name in the "Choose Objects" screen, click Next.
In the "Specify how scripts should be saved" screen:
Click Advanced, find the "Types of data to Script" property, select "Data only", close the advanced properties.
Select "Save to new query window" (unless you have thousands of records).
Click Next, wait for the job to complete, observe the resulting INSERT statements appear in a new query window.
Use Find & Replace to change all [temp.table_name] to [your_table_name].
drop table [temp.table_name].
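For reference, step 1 above might look like this (a sketch; the table, columns, and filter are placeholders):

SELECT CustomerId, CustomerName
INTO temp.table_name
FROM dbo.Customers
WHERE Country = 'Germany'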
In SSMS:
Right click on the database > Tasks > Generate Scripts
Next
Select "Select specific database objects" and check the table you want scripted, Next
Click Advanced > in the list of options, scroll down to the bottom and look for the "Types of data to script" and change it to "Data Only" > OK
Select "Save to new query window" > Next > Next > Finish
All 180 rows now written as 180 insert statements!
Native method:
For example, if you have the table
Users(Id, name)
You can do this:
select 'insert into Users (Id, name) values (' + cast(Id as varchar(20)) + ', ''' + name + ''')' from Users
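Note that embedded single quotes and NULLs need handling; a slightly more defensive sketch of the same idea:

select 'insert into Users (Id, name) values ('
       + cast(Id as varchar(20)) + ', '
       + isnull('''' + replace(name, '''', '''''') + '''', 'NULL')
       + ')'
from Users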
1 - Explanation of the script
A) The syntax for inserting data into a table is as below:
   INSERT INTO table(col1, col2, col3, col4, col5)
   -- to build this column list I have used the variable @CSV_COLUMN
   VALUES(col1 data in quotes, col2 data in quotes, ..., col5 data in quotes)
   -- to build this part (the column data wrapped in quotes) I have used the variable @QUOTED_DATA
B) To get the above data from an existing table, we have to write the SELECT query in such a way that its output is in the form of the script above.
C) Then I have concatenated the above variables to create the final script, which will generate the INSERT scripts on execution:
   @TEXT='SELECT ''INSERT INTO '+@TABLE_NAME+'('+@CSV_COLUMN+')VALUES('''+'+'+SUBSTRING(@QUOTED_DATA,1,LEN(@QUOTED_DATA)-5)+'+'+''')'''+' Insert_Scripts FROM '+@TABLE_NAME + @FILTER_CONDITION
D) And finally the above query is executed with EXECUTE(@TEXT).
E) The QUOTENAME() function is used to wrap the column data inside quotes.
F) ISNULL is used because if any row has NULL data for any column, the concatenation fails and returns NULL; ISNULL avoids that.
G) I created the stored procedure sp_generate_insertscripts for this:
   1 - Just put in the table name for which you want the insert script.
   2 - Add a filter condition if you want specific results.
----------Final Procedure To generate Script------
CREATE PROCEDURE sp_generate_insertscripts
(
    @TABLE_NAME VARCHAR(MAX),
    @FILTER_CONDITION VARCHAR(MAX) = ''
)
AS
BEGIN
    SET NOCOUNT ON

    DECLARE @CSV_COLUMN VARCHAR(MAX),
            @QUOTED_DATA VARCHAR(MAX),
            @TEXT VARCHAR(MAX)

    SELECT @CSV_COLUMN = STUFF
    (
        (
            SELECT ',['+ NAME +']' FROM sys.all_columns
            WHERE OBJECT_ID = OBJECT_ID(@TABLE_NAME) AND
                  is_identity != 1 FOR XML PATH('')
        ),1,1,''
    )

    SELECT @QUOTED_DATA = STUFF
    (
        (
            SELECT ' ISNULL(QUOTENAME('+NAME+','+QUOTENAME('''','''''')+'),'+'''NULL'''+')+'','''+'+' FROM sys.all_columns
            WHERE OBJECT_ID = OBJECT_ID(@TABLE_NAME) AND
                  is_identity != 1 FOR XML PATH('')
        ),1,1,''
    )

    SELECT @TEXT = 'SELECT ''INSERT INTO '+@TABLE_NAME+'('+@CSV_COLUMN+')VALUES('''+'+'+SUBSTRING(@QUOTED_DATA,1,LEN(@QUOTED_DATA)-5)+'+'+''')'''+' Insert_Scripts FROM '+@TABLE_NAME + @FILTER_CONDITION

    --SELECT @CSV_COLUMN AS CSV_COLUMN, @QUOTED_DATA AS QUOTED_DATA, @TEXT TEXT
    EXECUTE (@TEXT)

    SET NOCOUNT OFF
END
SSMS Toolpack (which is FREE as in beer) has a variety of great features - including generating INSERT statements from tables.
Update: for SQL Server Management Studio 2012 (and newer), SSMS Toolpack is no longer free, but requires a modest licensing fee.
It's possible to do via Visual Studio SQL Server Object Explorer.
You can click "View Data" from context menu for necessary table, filter results and save result as script.
Using Visual Studio, do the following:
Create a project of type SQL Server --> SQL Server Database Project.
Open the SQL Server Object Explorer (CTRL-\, CTRL-S).
Add a SQL Server by right clicking on the SQL SERVER icon. Select ADD NEW SERVER.
Navigate down to the table you are interested in.
Right click --> VIEW DATA.
Click the top left cell to highlight everything (CTRL-A doesn't seem to work).
Right click --> Script.
This is fabulous. I have tried everything listed above over the years. I know there is a tool out there that will do this and much more, can't think of the name of it. But it is very expensive.
Good luck. I just figured this out. Have not tested it extensively with text fields etc., but it looks like it gets you a long way down the road.
Greg
Create a separate table using an INTO statement.
For example:
Select * into Test_123 from [dbo].[Employee] where Name like '%Test%'
Go to the database.
Right click the database.
Click on Generate Scripts.
Select your table.
Select the advanced options and set the attribute "Types of data to script" to "Data Only".
Select the "Open in new query window" option.
SQL Server will generate the script for you.
This is a more versatile solution (that can do a little more than the question asks), and can be used in a query window without having to create a new stored proc - useful in production databases for instance where you don't have write access.
To use the code, please modify according to the in line comments which explain its usage. You can then just run this query in a query window and it will print the INSERT statements you require.
SET NOCOUNT ON

-- Set the ID you wish to filter on here
DECLARE @id AS INT = 123

DECLARE @tables TABLE (Name NVARCHAR(128), IdField NVARCHAR(128), IdInsert BIT, Excluded NVARCHAR(128))

-- Add any tables you wish to generate INSERT statements for here. The fields are as thus:
-- Name: Your table name
-- IdField: The field on which to filter the dataset
-- IdInsert: If the primary key field is to be included in the INSERT statement
-- Excluded: Any fields you do not wish to include in the INSERT statement
INSERT INTO @tables (Name, IdField, IdInsert, Excluded) VALUES ('MyTable1', 'Id', 0, 'Created,Modified')
INSERT INTO @tables (Name, IdField, IdInsert, Excluded) VALUES ('MyTable2', 'Id', 1, 'Created,Modified')

DECLARE @numberTypes TABLE (sysId TINYINT)

-- This will ensure INT and BIT types are not surrounded with quotes in the
-- resultant INSERT statement, but you may need to add more (from sys.types)
INSERT @numberTypes(SysId) VALUES(56),(104)

DECLARE @rows INT = (SELECT COUNT(*) FROM @tables)
DECLARE @cnt INT = 1

DECLARE @results TABLE (Sql NVARCHAR(4000))

WHILE @cnt <= @rows
BEGIN

    DECLARE @tablename AS NVARCHAR(128)
    DECLARE @idField AS NVARCHAR(128)
    DECLARE @idInsert AS BIT
    DECLARE @excluded AS NVARCHAR(128)

    SELECT
        @tablename = Name,
        @idField = IdField,
        @idInsert = IdInsert,
        @excluded = Excluded
    FROM (SELECT *, ROW_NUMBER() OVER(ORDER BY (SELECT 1)) AS RowId FROM @tables) t WHERE t.RowId = @cnt

    DECLARE @excludedFields TABLE (FieldName NVARCHAR(128))
    DECLARE @xml AS XML = CAST(('<X>' + REPLACE(@excluded, ',', '</X><X>') + '</X>') AS XML)
    INSERT INTO @excludedFields SELECT N.value('.', 'NVARCHAR(128)') FROM @xml.nodes('X') AS T(N)

    DECLARE @setIdentity NVARCHAR(128) = 'SET IDENTITY_INSERT ' + @tablename

    DECLARE @execsql AS NVARCHAR(4000) = 'SELECT ''' + CASE WHEN @idInsert = 1 THEN @setIdentity + ' ON' + CHAR(13) ELSE '' END + 'INSERT INTO ' + @tablename + ' ('

    SELECT @execsql = @execsql +
        STUFF
        (
            (
                SELECT CASE WHEN NOT EXISTS(SELECT * FROM @excludedFields WHERE FieldName = name) THEN ', ' + name ELSE '' END
                FROM sys.columns
                WHERE object_id = OBJECT_ID('dbo.' + @tablename)
                FOR XML PATH('')
            ), 1, 2, ''
        ) +
        ')' + CHAR(13) + 'VALUES (' +
        STUFF
        (
            (
                SELECT
                    CASE WHEN NOT EXISTS(SELECT * FROM @excludedFields WHERE FieldName = name) THEN
                        ''', '' + ISNULL(' +
                        CASE WHEN EXISTS(SELECT * FROM @numberTypes WHERE SysId = system_type_id) THEN '' ELSE ''''''''' + ' END +
                        'CAST(' + name + ' AS VARCHAR)' +
                        CASE WHEN EXISTS(SELECT * FROM @numberTypes WHERE SysId = system_type_id) THEN '' ELSE ' + ''''''''' END +
                        ', ''NULL'') + '
                    ELSE ''
                    END
                FROM sys.columns
                WHERE object_id = OBJECT_ID('dbo.' + @tablename)
                FOR XML PATH('')
            ), 1, 3, ''
        ) +
        ''')' + CASE WHEN @idInsert = 1 THEN CHAR(13) + @setIdentity + ' OFF' ELSE '' END +
        ''' FROM ' + @tablename + ' WHERE ' + @idField + ' = ' + CAST(@id AS VARCHAR)

    INSERT @results EXEC (@execsql)

    DELETE @excludedFields

    SET @cnt = @cnt + 1

END

DECLARE cur CURSOR FOR SELECT Sql FROM @results

OPEN cur
DECLARE @sql NVARCHAR(4000)

FETCH NEXT FROM cur INTO @sql

WHILE @@FETCH_STATUS = 0
BEGIN
    PRINT @sql
    FETCH NEXT FROM cur INTO @sql
END

CLOSE cur
DEALLOCATE cur
You can choose the 'Results to File' option in SSMS, export your select result to a file, make your changes in the result file, and finally, using BCP (bulk copy), insert the data into table 1 in database 2.
I think for the bulk insert you have to convert the .rpt file to a .csv file.
Hope it will help.
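For the BCP step, a hypothetical import command could look like this (server, database, table, and file path are placeholders; -c uses character format and -t, sets a comma field terminator):

bcp DB2.dbo.Table1 in "C:\exports\result.csv" -S MyServer -T -c -t,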
I had a similar problem, but I needed to be able to create an INSERT statement from a query (with filters etc.)
So I created following procedure:
CREATE PROCEDURE dbo.ConvertQueryToInsert (@input NVARCHAR(max), @target NVARCHAR(max)) AS BEGIN
    DECLARE @fields NVARCHAR(max);
    DECLARE @select NVARCHAR(max);

    -- Get the definition from sys.columns and assemble a string with the fields/transformations for the dynamic query
    SELECT
        @fields = COALESCE(@fields + ', ', '') + '[' + name + ']',
        @select = COALESCE(@select + ', ', '') + ''''''' + ISNULL(CAST([' + name + '] AS NVARCHAR(max)), ''NULL'')+'''''''
    FROM tempdb.sys.columns
    WHERE [object_id] = OBJECT_ID(N'tempdb..' + @input);

    -- Run a dynamic query with the fields from @select into a new temp table
    CREATE TABLE #ConvertQueryToInsertTemp (strings nvarchar(max))
    DECLARE @stmt NVARCHAR(max) = 'INSERT INTO #ConvertQueryToInsertTemp SELECT ''' + @select + ''' AS [strings] FROM ' + @input
    exec sp_executesql @stmt

    -- Output the final insert statement
    SELECT 'INSERT INTO ' + @target + ' (' + @fields + ') VALUES (' + REPLACE(strings, '''NULL''', 'NULL') + ')' FROM #ConvertQueryToInsertTemp

    -- Clean up temp tables
    DROP TABLE #ConvertQueryToInsertTemp
    SET @stmt = 'DROP TABLE ' + @input
    exec sp_executesql @stmt
END
You can then use it by writing the output of your query into a temp table and running the procedure:
-- Example table
CREATE TABLE Dummy (Id INT, Comment NVARCHAR(50), TimeStamp DATETIME)
INSERT INTO Dummy VALUES (1 , 'Foo', GetDate()), (2, 'Bar', GetDate()), (3, 'Foo Bar', GetDate())
-- Run query and procedure
SELECT * INTO #TempTableForConvert FROM Dummy WHERE Id < 3
EXEC dbo.ConvertQueryToInsert '#TempTableForConvert', 'dbo.Dummy'
Note:
This procedure only casts the values to a string which can cause the data to look a bit different. With DATETIME for example the seconds will be lost.
I created the following procedure:
if object_id('tool.create_insert', 'P') is null
begin
    exec('create procedure tool.create_insert as');
end;
go

alter procedure tool.create_insert(@schema varchar(200) = 'dbo',
                                   @table varchar(200),
                                   @where varchar(max) = null,
                                   @top int = null,
                                   @insert varchar(max) output)
as
begin
    declare @insert_fields varchar(max),
            @select varchar(max),
            @error varchar(500),
            @query varchar(max);

    declare @values table(description varchar(max));

    set nocount on;

    -- Get columns
    select @insert_fields = isnull(@insert_fields + ', ', '') + c.name,
           @select = case type_name(c.system_type_id)
                          when 'varchar' then isnull(@select + ' + '', '' + ', '') + ' isnull('''''''' + cast(' + c.name + ' as varchar) + '''''''', ''null'')'
                          when 'datetime' then isnull(@select + ' + '', '' + ', '') + ' isnull('''''''' + convert(varchar, ' + c.name + ', 121) + '''''''', ''null'')'
                          else isnull(@select + ' + '', '' + ', '') + 'isnull(cast(' + c.name + ' as varchar), ''null'')'
                      end
    from sys.columns c with(nolock)
    inner join sys.tables t with(nolock) on t.object_id = c.object_id
    inner join sys.schemas s with(nolock) on s.schema_id = t.schema_id
    where s.name = @schema
      and t.name = @table;

    -- If there are no columns...
    if @insert_fields is null or @select is null
    begin
        set @error = 'There''s no ' + @schema + '.' + @table + ' inside the target database.';
        raiserror(@error, 16, 1);
        return;
    end;

    set @insert_fields = 'insert into ' + @schema + '.' + @table + '(' + @insert_fields + ')';

    if isnull(@where, '') <> '' and charindex('where', ltrim(rtrim(@where))) < 1
    begin
        set @where = 'where ' + @where;
    end
    else
    begin
        set @where = isnull(@where, ''); -- keep an explicit WHERE clause as-is
    end;

    set @query = 'select ' + isnull('top(' + cast(@top as varchar) + ')', '') + @select + ' from ' + @schema + '.' + @table + ' with (nolock) ' + @where;

    insert into @values(description)
    exec(@query);

    set @insert = isnull(@insert + char(10), '') + '--' + upper(@schema + '.' + @table);

    select @insert = @insert + char(10) + @insert_fields + char(10) + 'values(' + v.description + ');' + char(10) + 'go' + char(10)
    from @values v
    where isnull(v.description, '') <> '';
end;
go
Then you can use it that way:
declare @insert varchar(max),
        @part varchar(max),
        @start int,
        @end int;

set @start = 1;

exec tool.create_insert @schema = 'dbo',
                        @table = 'customer',
                        @where = 'id = 1',
                        @insert = @insert output;

-- Print one line at a time to avoid the maximum 8000 characters problem
while len(@insert) > 0
begin
    set @end = charindex(char(10), @insert);

    if @end = 0
    begin
        set @end = len(@insert) + 1;
    end;

    print substring(@insert, @start, @end - 1);
    set @insert = substring(@insert, @end + 1, len(@insert) - @end + 1);
end;
The output would be something like that:
--DBO.CUSTOMER
insert into dbo.customer(id, name, type)
values(1, 'CUSTOMER NAME', 'F');
go
If you just want to get a range of rows, use the @top parameter as below:
declare @insert varchar(max),
        @part varchar(max),
        @start int,
        @end int;

set @start = 1;

exec tool.create_insert @schema = 'dbo',
                        @table = 'customer',
                        @top = 100,
                        @insert = @insert output;

-- Print one line at a time to avoid the maximum 8000 characters problem
while len(@insert) > 0
begin
    set @end = charindex(char(10), @insert);

    if @end = 0
    begin
        set @end = len(@insert) + 1;
    end;

    print substring(@insert, @start, @end - 1);
    set @insert = substring(@insert, @end + 1, len(@insert) - @end + 1);
end;
You can use SQL Server Integration Services packages, which are specifically designed for import and export operations.
Visual Studio has a project type for developing these packages if you do a full SQL Server installation.
Integration Services in Business Intelligence Development Studio
I think it's also possible with ad hoc queries: you can export the result to an Excel file, then import that file into your DataTable object (or use it as it is), and then import the Excel file into the second database.
Have a look at this link, it can help you a lot:
http://vscontrols.blogspot.com/2010/09/import-and-export-excel-to-sql-server.html
If you are using Oracle (or connect the application to SQL Server), then Oracle SQL Developer does this for you. Choose 'Unload' for a table and follow the options through (untick DDL if you don't want all the table-creation stuff).
I found the SSMSBoost add-on, which is free and does exactly this, among other things. You can right-click on the results and select 'Script data as'.
You can use this Q2C.SSMSPlugin, which is free and open source. You can right click and select "Execute Query To Command... -> Query To Insert...". Enjoy)
You can use an INSERT INTO SELECT statement, to insert the results of a select query into a table. http://www.w3schools.com/sql/sql_insert_into_select.asp
Example:
INSERT INTO Customers (CustomerName, Country)
SELECT SupplierName, Country
FROM Suppliers
WHERE Country='Germany'