I have two different SQL Server instances (two different databases).
The two servers have the same tables.
Now I want to transfer from Server 1's Person table to Server 2's Person table only the records with an ID between 1,000 and 50,000.
What is the easiest way to do this?
I tried Generate Scripts, but there is no option to select just those IDs; the script transfers all the records.
I tried using a SELECT statement on Server 1 and exporting the data as CSV, then importing the CSV file on Server 2, but apparently there are some problems because of the datetimeoffset fields...
I had the same problem: I had data in two domains that could not see each other over the network, and I had to move some of the data (not all of it) to the "other" server.
I wrote a script that takes all the data from a filegroup and creates a dump of that data, as well as the script to load it.
A little later we also started dumping data out to archive for records that needed to be retained, since the flat-file dump can always be restored, regardless of which database is in use "7 years" from now...
Anyway, it's just a big "print" statement that uses BCP to move massive amounts of data between servers. You can tweak it to do what you like: just alter the query a bit. The top of the script contains the "control" variables.
/*******************************************************************
this script will generate the bcp out commands for all data from the
user's currently connected database. This script will only work if
both databases have the same DDL version, meaning the same tables,
the same columns and the same data definitions.
*******************************************************************/
SET NOCOUNT ON
GO
DECLARE @Path nvarchar(2000) = 'f:\export\' -- storage location for the bcp dump (needs lots of space!)
, @Batchsize nvarchar(40) = '1000000' -- COMMIT EVERY n RECORDS
, @Xmlformat bit = 0 -- 1 for an XML format file, 0 for non-XML
, @SourceServerinstance nvarchar(200) = 'localhost' -- SQL Server\instance name
, @Security nvarchar(800) = ' -T ' -- options are -T (trusted), -Uloginid -Ploginpassword
, @GenerateDump bit = 0 -- 0 to generate the export (dump) commands, 1 to generate the load commands
, @FileGroup sysname = 'Data'; -- table filegroup that we are interested in
--> set output to text and execute the query, then copy the generated commands, validate and execute them
--------------------------------Do not edit below this line-----------------------------------------------------------------
DECLARE @filter TABLE(TABLE_NAME sysname)
INSERT INTO @filter (TABLE_NAME)
SELECT o.name
FROM sys.indexes as i
JOIN sys.objects as o on o.object_id = i.object_id
WHERE i.data_space_id = FILEGROUP_ID(@FileGroup)
AND i.type_desc = 'CLUSTERED'
AND o.name not like 'sys%'
ORDER BY 1
if (@GenerateDump = 0)
begin
--BCP-OUT TABLES
SELECT 'bcp "' + QUOTENAME( TABLE_CATALOG ) + '.' + QUOTENAME( TABLE_SCHEMA )
+ '.' + QUOTENAME( TABLE_NAME ) + '" out "' + @path + TABLE_NAME + '.dat" -q -b"'
+ @batchsize + '" -e"' + @path + 'Error_' + TABLE_NAME + '.err" -n -CRAW -o"' + @path
+ TABLE_NAME + '.out" -S"' + @SourceServerinstance + '" ' + @security
FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_NAME IN (SELECT TABLE_NAME FROM @filter)
if (@Xmlformat = 0)
begin
print 'REM CREATE NON-XML FORMAT FILE'
SELECT 'bcp "' + QUOTENAME( TABLE_CATALOG ) + '.' + QUOTENAME( TABLE_SCHEMA ) + '.'
+ QUOTENAME( TABLE_NAME ) + '" format nul -n -CRAW -f "' + @path
+ TABLE_NAME + '.fmt" -S"' + @SourceServerinstance + '" ' + @security
FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_NAME IN (SELECT TABLE_NAME FROM @filter)
end
else
begin
PRINT 'REM XML FORMAT FILE'
SELECT 'bcp "' + QUOTENAME( TABLE_CATALOG ) + '.' + QUOTENAME( TABLE_SCHEMA )
+ '.' + QUOTENAME( TABLE_NAME ) + '" format nul -x -n -CRAW -f "'
+ @path + TABLE_NAME + '.xml" -S"' + @SourceServerinstance + '" ' + @security
FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_NAME IN (SELECT TABLE_NAME FROM @filter)
end
end
else
begin
print '--Make sure you back up your database first'
--GENERATE CONSTRAINT NOCHECK
PRINT '--NOCHECK CONSTRAINTS'
SELECT 'ALTER TABLE ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME( TABLE_NAME ) + ' NOCHECK CONSTRAINT ' + QUOTENAME( CONSTRAINT_NAME )
FROM INFORMATION_SCHEMA.TABLE_CONSTRAINTS WHERE TABLE_NAME IN (SELECT TABLE_NAME FROM @filter)
PRINT '--DISABLE TRIGGERS'
SELECT 'ALTER TABLE ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME( TABLE_NAME ) + ' DISABLE TRIGGER ALL'
FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_NAME IN (SELECT TABLE_NAME FROM @filter)
--TRUNCATE TABLE
SELECT 'TRUNCATE TABLE ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME( TABLE_NAME ) + '
GO '
FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_NAME IN (SELECT TABLE_NAME FROM @filter)
--BULK INSERT
SELECT DISTINCT 'BULK INSERT ' + QUOTENAME(TABLE_CATALOG) + '.'
+ QUOTENAME( TABLE_SCHEMA ) + '.' + QUOTENAME( TABLE_NAME ) + '
FROM ''' + @path + TABLE_NAME + '.dat''
WITH (FORMATFILE = ''' + @path + TABLE_NAME + '.fmt'',
BATCHSIZE = ' + @batchsize + ',
ERRORFILE = ''' + @path + 'BI_' + TABLE_NAME + '.err'',
TABLOCK);
GO '
FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_NAME IN (SELECT TABLE_NAME FROM @filter)
--RE-ENABLE CONSTRAINTS WITH CHECK TO VERIFY DATA AFTER LOAD
PRINT '--CHECK CONSTRAINTS'
SELECT 'ALTER TABLE ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME( TABLE_NAME ) + ' WITH CHECK CHECK CONSTRAINT ALL'
FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_NAME IN (SELECT TABLE_NAME FROM @filter)
SELECT 'ALTER TABLE ' + QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME( TABLE_NAME ) + ' ENABLE TRIGGER ALL'
FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE' AND TABLE_NAME IN (SELECT TABLE_NAME FROM @filter)
end
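For the original question's 1,000 to 50,000 ID range you don't even need the whole script: a single hand-written bcp queryout/in pair will do, and native format (-n) sidesteps the datetimeoffset trouble you hit with CSV. A sketch, assuming a database named MyDb and trusted connections (names are placeholders):
REM export the ID range from server 1 (native format keeps datetimeoffset intact)
bcp "SELECT * FROM MyDb.dbo.Person WHERE ID BETWEEN 1000 AND 50000" queryout "f:\export\Person.dat" -n -S Server1 -T
REM load the file into server 2
bcp MyDb.dbo.Person in "f:\export\Person.dat" -n -S Server2 -T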
In the end, the easiest way was to create a linked server between the two and execute my queries taking data from both servers, excluding the IDs from the first server.
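A minimal sketch of that approach, assuming the linked server is named SERVER2 and the column list is a placeholder for your actual Person columns:
EXEC sp_addlinkedserver @server = N'SERVER2', @srvproduct = N'SQL Server';

-- hypothetical column list; use the Person table's real columns
INSERT INTO [SERVER2].MyDb.dbo.Person (ID, Name, CreatedAt)
SELECT ID, Name, CreatedAt
FROM MyDb.dbo.Person
WHERE ID BETWEEN 1000 AND 50000;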
Thanks to everyone for the responses.
I am rather new to this and I am having a hard time converting the SQL query below into an UPDATE statement for a stored procedure.
SELECT 'select'+
stuff((
SELECT ',' + 'dbo.' + Function_Name + '(' + Parameters_List + ')' FROM
[SPECIFIC_DATABASE]..Specific_table c WHERE c.Table_Name = t.Table_Name FOR
XML PATH('')),1,1,'')
+' from [' + Database_Name +'].[dbo].['+Table_Name+'] '
+ 'Where Audit_ID>' + CAST(@Audit_ID as nvarchar(100))
As 'Specific Queries'
FROM (SELECT Distinct Database_Name, Table_Name FROM [SPECIFIC_DATABASE]..Specific_table) t
The UPDATE query should be something like
UPDATE Table_name
SET Column_name = Function_Name(Parameters_List)
WHERE Audit_id >= @Audit_ID
FROM [SPECIFIC_DATABASE]..Specific_table
Any suggestions and guidelines on this would be much appreciated!
I think this should give you what you want, but I don't see any reference to a Column_Name, so I'm assuming you will hardcode that.
select 'UPDATE tbl ' + stuff((
select ' set Column_Name = ' + 'dbo.' + Function_Name + '(' + Parameters_List + ')'
from [SPECIFIC_DATABASE]..Specific_table c
where c.Table_Name = t.Table_Name
for xml PATH('')
), 1, 1, '')
+ ' from [' + Database_Name + '].[dbo].[' + Table_Name + '] tbl'
+ ' where Audit_ID > '
+ CAST(@Audit_ID as nvarchar(100)) as 'Specific Queries'
from (
select distinct Database_Name, Table_Name
from [SPECIFIC_DATABASE]..Specific_table
) t
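For illustration (all names here are made up), a row in Specific_table such as Database_Name = 'MyDb', Table_Name = 'Orders', Function_Name = 'fnClean', Parameters_List = 'OrderRef', with @Audit_ID = 42, would generate something like:
UPDATE tbl set Column_Name = dbo.fnClean(OrderRef) from [MyDb].[dbo].[Orders] tbl where Audit_ID > 42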
If the answer's not right, it might help if you post the current output of your first query, and maybe some more details about the contents of the table called "Specific_table".
I want to loop through all the columns of all the tables in my database to check which column contains the string "Kroki Homes".
I have approximately 56 tables in my database, so it is hard to check all the columns of those 56 tables by hand. Is there an easy way to achieve this in SQL Server?
You can create a procedure and pass it the string to search for. I found this somewhere; it might help you.
CREATE PROC [dbo].[SearchDataFromAllTables] (@SearchStr NVARCHAR(100))
AS
BEGIN
SET NOCOUNT ON;
CREATE TABLE #Results
(
ColumnName NVARCHAR(370),
ColumnValue NVARCHAR(3630)
)
DECLARE @TableName NVARCHAR(256)
, @ColumnName NVARCHAR(128)
, @SearchStr2 NVARCHAR(110)
SET @TableName = ''
SET @SearchStr2 = QUOTENAME('%' + @SearchStr + '%', '''')
WHILE @TableName IS NOT NULL
BEGIN
SET @ColumnName = ''
SET @TableName = (
SELECT MIN(QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME))
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
AND QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME) > @TableName
AND OBJECTPROPERTY(OBJECT_ID(QUOTENAME(TABLE_SCHEMA) + '.' + QUOTENAME(TABLE_NAME)), 'IsMSShipped') = 0
)
WHILE (@TableName IS NOT NULL)
AND (@ColumnName IS NOT NULL)
BEGIN
SET @ColumnName = (
SELECT MIN(QUOTENAME(COLUMN_NAME))
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = PARSENAME(@TableName, 2)
AND TABLE_NAME = PARSENAME(@TableName, 1)
AND DATA_TYPE IN (
'char'
,'varchar'
,'nchar'
,'nvarchar'
)
AND QUOTENAME(COLUMN_NAME) > @ColumnName
)
IF @ColumnName IS NOT NULL
BEGIN
INSERT INTO #Results
EXEC (
'SELECT ''' + @TableName + '.' + @ColumnName + ''', LEFT(' + @ColumnName + ', 3630)
FROM ' + @TableName + ' (NOLOCK) ' + ' WHERE ' + @ColumnName + ' LIKE ' + @SearchStr2
)
END
END
END
SELECT ColumnName, ColumnValue FROM #Results
END
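Once the procedure exists, you call it with the string you are looking for:
EXEC dbo.SearchDataFromAllTables 'Kroki Homes'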
You need to write a stored procedure to scan through all the tables.
The script below gives you the tables and columns. From this you can build the SQL on the fly and execute it dynamically to SELECT from the tables. Let us know if you need more details.
SELECT tb.name AS table_name,
       c.name AS column_name,
       c.column_id,
       tp.name AS column_data_type,
       c.max_length,
       c.precision,
       c.scale,
       CASE c.is_nullable WHEN 1 THEN 'YES' ELSE 'NO' END AS is_nullable
FROM sys.tables tb
INNER JOIN sys.columns c ON c.object_id = tb.object_id
INNER JOIN sys.types tp ON tp.user_type_id = c.user_type_id
WHERE SCHEMA_NAME(tb.schema_id) = 'dbo'
ORDER BY table_name, column_id;
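As a rough sketch of that dynamic step (the search string and the restriction to character columns are assumptions), you could generate one probe query per column and run the batch:
DECLARE @sql nvarchar(max) = N'';
SELECT @sql = @sql + N'SELECT DISTINCT ''' + tb.name + N'.' + c.name
    + N''' AS found_in FROM ' + QUOTENAME(SCHEMA_NAME(tb.schema_id)) + N'.' + QUOTENAME(tb.name)
    + N' WHERE ' + QUOTENAME(c.name) + N' LIKE ''%Kroki Homes%'';'
FROM sys.tables tb
INNER JOIN sys.columns c ON c.object_id = tb.object_id
INNER JOIN sys.types tp ON tp.user_type_id = c.user_type_id
WHERE tp.name IN ('char', 'varchar', 'nchar', 'nvarchar');
EXEC sp_executesql @sql; -- one result set per column that contains a match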
I'm trying to create some overviews and eventually some dynamic select statements for a database.
select 'select ''' + TABLE_SCHEMA + '.' + TABLE_NAME + ''', count(*) from [' + TABLE_SCHEMA + '].[' + TABLE_NAME + '] union all'
from INFORMATION_SCHEMA.TABLES
where TABLE_TYPE != 'VIEW'
and (select count(*) from TABLE_SCHEMA.TABLE_NAME) > 0
I'm stuck on this part: select count(*) from TABLE_SCHEMA.TABLE_NAME
Which throws Invalid object name
Is there any way I can make that work dynamically?
If you want to exclude tables with no rows from your result set, add a HAVING COUNT(*) > 0 clause to your generated SQL:
select 'select ''' + TABLE_SCHEMA + '.' + TABLE_NAME + ''', count(*) from [' + TABLE_SCHEMA + '].[' + TABLE_NAME + '] having count(*) > 0 union all'
from INFORMATION_SCHEMA.TABLES
where TABLE_TYPE != 'VIEW'
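Note that the generated script still ends with a dangling union all, which you must remove before running it. If you would rather execute the whole thing in one go, a sketch (assuming at least one non-view table exists):
DECLARE @sql nvarchar(max) = N'';
SELECT @sql = @sql + 'select ''' + TABLE_SCHEMA + '.' + TABLE_NAME + ''', count(*) from ['
    + TABLE_SCHEMA + '].[' + TABLE_NAME + '] having count(*) > 0 union all '
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE != 'VIEW';
SET @sql = LEFT(@sql, LEN(@sql) - 9); -- strip the trailing 'union all'
EXEC sp_executesql @sql;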
I have a stored procedure which, when run, gives a table output. I want to export this output to a CSV file, but I want to wrap all the columns with a CHAR/VARCHAR datatype in double/single quotes.
For Example:
Stored Proc O/P:
ID Name Address SSN
1 abd 9301,LeeHwy, 22031 64279100
Output in CSV File:
1,"abd","9301,LeeHwy, 22031",64279100
Can anyone also help me with how I can use a BAT file to execute the procedure and generate this CSV file?
One way to do this is to loop through the table schema to extract the varchar columns. I have tested this on one of my tables, and it worked:
DECLARE @tableName VARCHAR(Max) = '[Put your table name here]';
DECLARE @currColumns VARCHAR(Max) = NULL;
SELECT @currColumns = COALESCE(@currColumns + ','
+ CASE WHEN t.Name = 'varchar' THEN '''"'' + ' ELSE '' END
+ '[', '[') + c.name + ']'
+ CASE WHEN t.Name = 'varchar' THEN '+ ''"''' ELSE '' END
+ ' as [' + c.name + ']'
FROM
sys.columns c
INNER JOIN
sys.types t ON c.user_type_id = t.user_type_id
WHERE
c.object_id = OBJECT_ID(@tableName)
EXEC('SELECT ' + @currColumns + ' FROM ' + @tableName);
It's a quick and dirty way.
UPDATE (comment):
Inserting into a table is really easy. Just do this:
INSERT INTO [TABLE]
EXEC('SELECT ' + @currColumns + ' FROM ' + @tableName);
I have found a solution for my problem.
Credits also go to @Rogala (the developer who gave the initial answer to the question) for triggering the idea of using the system tables.
The code is as below:
DECLARE @tableName VARCHAR(Max) = '[Put your table name here]';
DECLARE @currColumns VARCHAR(Max) = NULL;
DECLARE @Delim CHAR(5) = '''"''+'
SELECT @currColumns = COALESCE(@currColumns + ',' + CASE WHEN DATA_TYPE = 'varchar' THEN '''"'' + ' ELSE '' END + '[', '[') + COLUMN_NAME + ']'
+ CASE WHEN DATA_TYPE = 'varchar' THEN '+ ''"''' ELSE '' END + ' as [' + COLUMN_NAME + ']'
FROM INFORMATION_SCHEMA.Columns
WHERE table_name = @tableName
SET @currColumns = @Delim + @currColumns
EXEC('SELECT ' + @currColumns + ' FROM ' + @tableName);
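For the BAT-file part of the question, sqlcmd can run the procedure and stream the result to a file. A sketch in which the server, database and procedure names are all placeholders:
REM export.bat -- all names below are hypothetical
sqlcmd -S MYSERVER -d MyDb -E -Q "SET NOCOUNT ON; EXEC dbo.MyExportProc" -s "," -W -h -1 -o "C:\out\export.csv"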
How can I check if any column in a given table only has null or empty string values? Can I somehow extend this to every table in the database?
Here is a stored proc for finding an arbitrary value in the database. It's a fairly small modification to make it search for empty columns.
The procedure generates a list of all the tables and all the columns in the database, and creates a temporary table for storing the results. Then it generates a dynamic SQL and uses the INSERT INTO ... EXEC to fill the result table.
Here's a runnable example against the StackOverflow sample database:
-- Look for NULLs
DECLARE @sql AS varchar(max)
SELECT @sql = COALESCE(@sql + ' UNION ALL ', '') + sql
FROM (
SELECT 'SELECT ''' + c.TABLE_NAME + '.' + c.COLUMN_NAME + ''' AS COLUMN_NAME, COUNT(NULLIF(' + QUOTENAME(c.COLUMN_NAME) + ', '''')) AS NON_NULL_COUNT, COUNT(*) AS TOTAL_COUNT FROM ' + QUOTENAME(c.TABLE_CATALOG) + '.' + QUOTENAME(c.TABLE_SCHEMA) + '.' + QUOTENAME(c.TABLE_NAME) AS sql
FROM INFORMATION_SCHEMA.COLUMNS AS c
INNER JOIN INFORMATION_SCHEMA.TABLES AS t
ON t.TABLE_CATALOG = c.TABLE_CATALOG
AND t.TABLE_SCHEMA = c.TABLE_SCHEMA
AND t.TABLE_NAME = c.TABLE_NAME
WHERE c.DATA_TYPE IN ('nvarchar', 'varchar')
UNION ALL
SELECT 'SELECT ''' + c.TABLE_NAME + '.' + c.COLUMN_NAME + ''' AS COLUMN_NAME, COUNT(' + QUOTENAME(c.COLUMN_NAME) + ') AS NON_NULL_COUNT, COUNT(*) AS TOTAL_COUNT FROM ' + QUOTENAME(c.TABLE_CATALOG) + '.' + QUOTENAME(c.TABLE_SCHEMA) + '.' + QUOTENAME(c.TABLE_NAME) AS sql
FROM INFORMATION_SCHEMA.COLUMNS AS c
INNER JOIN INFORMATION_SCHEMA.TABLES AS t
ON t.TABLE_CATALOG = c.TABLE_CATALOG
AND t.TABLE_SCHEMA = c.TABLE_SCHEMA
AND t.TABLE_NAME = c.TABLE_NAME
WHERE c.DATA_TYPE NOT IN ('nvarchar', 'varchar')
AND c.IS_NULLABLE = 'YES'
) AS checks
SET @sql = 'SELECT * FROM (' + @sql + ') AS checks WHERE NON_NULL_COUNT = 0'
EXEC (@sql)
A few things to note:
There are two columns it finds which are completely NULL/blank:
Posts.OwnerDisplayName, Badges.CreationDate
It converts '' to NULL for nvarchar and varchar columns (if you have char or nchar columns, you would have to change this)
You can't normally put a condition on a query for all columns in a table; you have to pick the columns you want. To get around this you need dynamic SQL and the INFORMATION_SCHEMA views.
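If you do have char or nchar columns, the change the note above mentions is just widening both type filters in the SQL-generating query, one per branch:
WHERE c.DATA_TYPE IN ('nvarchar', 'varchar', 'nchar', 'char')
WHERE c.DATA_TYPE NOT IN ('nvarchar', 'varchar', 'nchar', 'char')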