Dynamically changing databases in SQL Server 2000

At work we have a number of databases that we need to run the same operations on. I would like to write one stored procedure that loops over the operations and sets the database at the beginning of each pass of the loop (example below). I've tried sp_executesql('USE ' + @db_id), but that only sets the database for the scope of that call. I don't really want to loop with hard-coded database names, because we need to do similar things in many different places and it's tough to remember where things need to change if we add another DB.
Any thoughts?
Example:
DECLARE zdb_loop CURSOR FAST_FORWARD FOR
    SELECT DISTINCT db_id FROM DBS ORDER BY db_id
OPEN zdb_loop
FETCH NEXT FROM zdb_loop INTO @db_id
WHILE @@FETCH_STATUS = 0
BEGIN
    USE @db_id
    --Do stuff against 3 or 4 different DBs
    FETCH NEXT FROM zdb_loop INTO @db_id
END
CLOSE zdb_loop
DEALLOCATE zdb_loop

You can use the stored procedure sp_MSforeachdb for this. This example will perform a database backup, then a "DBCC CHECKDB", against each database:

declare @cmd1 varchar(500)
declare @cmd2 varchar(500)
declare @cmd3 varchar(500)

set @cmd1 = 'if ''?'' <> ''tempdb'' print ''*** Processing DB ? ***'''
set @cmd2 = 'if ''?'' <> ''tempdb'' backup database ? to disk=''c:\temp\?.bak'''
set @cmd3 = 'if ''?'' <> ''tempdb'' dbcc checkdb(?)'

exec sp_MSforeachdb @command1 = @cmd1,
                    @command2 = @cmd2,
                    @command3 = @cmd3

So far it looks like dynamic SQL is the only way to do this. Pretty lame.
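For what it's worth, here is a minimal sketch of that dynamic-SQL approach: instead of USE, each statement is built with a three-part name (database.schema.object) and run through sp_executesql. It reuses the DBS table from the question and assumes db_id holds the database name; dbo.SomeTable and the UPDATE are placeholders for whatever the real per-database work is.

-- Sketch only: assumes DBS.db_id holds the database name and that each
-- database has a dbo.SomeTable the real logic would touch.
DECLARE @db sysname, @sql nvarchar(4000)

DECLARE zdb_loop CURSOR FAST_FORWARD FOR
    SELECT DISTINCT db_id FROM DBS ORDER BY db_id

OPEN zdb_loop
FETCH NEXT FROM zdb_loop INTO @db

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Fully qualify the objects instead of switching databases with USE.
    SET @sql = N'UPDATE ' + QUOTENAME(@db) + N'.dbo.SomeTable SET SomeCol = 1'
    EXEC sp_executesql @sql

    FETCH NEXT FROM zdb_loop INTO @db
END

CLOSE zdb_loop
DEALLOCATE zdb_loop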

Related

Copy SQL Server database for development but smaller

We have a production database of over 100 GB. I want to duplicate this database and give it to every developer to test their code against, but the size is too large. Is there any way I can back up just the top 1000 rows of each table, with FKs, and restore that to a new DB? Or duplicate the DB first and delete all records from all tables but keep 1000 rows with FKs, or any other way to keep the size below 5 GB?
I did search, but none of the solutions were for tables having foreign keys.
This is the idea:
First: create a new database.
Second: select only a small number of records from each table, for example:
select top 500 * from YourTable
then insert them into the corresponding table in the new database you created.
Third: back up the new database and give it to every developer.
Hope it helps; a small single-table sketch of the first two steps follows.
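A minimal sketch of those first two steps for a single table; the database name db_to_dev and the table name Customers are only placeholders:

-- Step 1: create the empty development database (name is a placeholder).
CREATE DATABASE db_to_dev;
GO

-- Step 2: copy a small slice of one table into it.
-- SELECT ... INTO creates the target table with the same columns,
-- but without constraints, indexes, or foreign keys.
SELECT TOP 500 *
INTO db_to_dev.dbo.Customers
FROM dbo.Customers;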
Assuming that you have created a new database named db_to_dev and you are working in your current database:
This procedure will insert the data from your working database, provided the db_to_dev database already exists.
Using INFORMATION_SCHEMA you can select all your tables:
CREATE PROCEDURE PROC_TRANSFER_DATA @NUM_OF_RECORDS nvarchar(255) AS
BEGIN
    SET NOCOUNT ON;
    DECLARE @message varchar(80), @tablename nvarchar(50);
    DECLARE @sqlstmt nvarchar(255);
    PRINT '-------- List of tables --------';

    DECLARE Schema_cursor CURSOR FOR
        SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES

    OPEN Schema_cursor
    FETCH NEXT FROM Schema_cursor INTO @tablename

    IF @@FETCH_STATUS <> 0
        PRINT ' <<None>>'

    WHILE @@FETCH_STATUS = 0
    BEGIN
        SELECT @message = ' ' + @tablename
        SET @sqlstmt = 'select top ' + @NUM_OF_RECORDS + ' * into [db_to_dev].[dbo].[' + @tablename + '] from ' + @tablename;
        EXEC sp_executesql @sqlstmt
        PRINT @message
        FETCH NEXT FROM Schema_cursor INTO @tablename
    END

    CLOSE Schema_cursor
    DEALLOCATE Schema_cursor
END
To use, with the record-count parameter:
EXEC PROC_TRANSFER_DATA '500'
The parameter value is up to you, depending on how many records you want to transfer into your new database db_to_dev.
This stored procedure is tested.
Good luck.
There are a number of projects on github which seek to do exactly that: make a subset that preserves referential integrity. Here is one such project:
https://github.com/18F/rdbms-subsetter

Use “insert into” only with databases that have a certain table (SQL Server 2008 R2)

I need to make a "log" of all the information in a certain table, across all databases, inside a new table that I will create (with the same structure).
But not all databases have this table.
I can write a query to find all databases that have the table I want:
SELECT name
FROM sys.databases
WHERE CASE
WHEN state_desc = 'ONLINE'
THEN OBJECT_ID(QUOTENAME(name) + '.[dbo].[tblLogdiscador]', 'U')
END IS NOT NULL
It lists only the databases with the table I want to log. But now I need a loop to go through all those databases, inserting the information from tbllogdiscador into the table I created. I was thinking of sp_MSforeachdb, but I see a lot of people saying not to use it.
How can I loop through all databases that have the table and, if a database has it, insert into the new log table?
The code below is not helping me:
exec sp_msforeachdb 'if ((select count(*)
    from [?].sys.tables where name in (''tbllogdiscador'')) = 1)
begin
    insert into [The new tbl log]
    select * from [?].dbo.tbllogdiscador
end'
I'm trying to use a cursor, but I'm having problems.
Any ideas how to do this with WHILE?
To do what you are thinking, you need some kind of process-flow logic (a cursor seems to be the most fitting choice) and dynamic SQL.
At a high level, we need to get all of the DB names, put them into the cursor, and then use the cursor to execute a dynamic SQL statement that tests whether the table exists and pulls the records if so.
Here is an example cursor that loops through the DBs on a server looking for a particular table name and, if it exists, does something (you'll have to write the SQL for @Sql2):
declare @DBName varchar(256)
declare @SQL1 nvarchar(max)
declare @Sql2 nvarchar(max)
declare @TableCount int

DECLARE db_cursor CURSOR FOR
    SELECT name
    FROM sys.databases
    WHERE state_desc = 'ONLINE'

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @DBName

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Check whether the table exists in this database and read the count back out.
    set @SQL1 = 'select @cnt = count(*) from ' + quotename(@DBName) + '.sys.objects where name = ''tblLogdiscador'''
    exec sp_executesql @SQL1, N'@cnt int output', @cnt = @TableCount output

    -- set @Sql2 = ... your select and insert statement selecting from quotename(@DBName) + '.dbo.tblLogdiscador'

    if @TableCount > 0
    begin
        exec sp_executesql @Sql2
    end

    FETCH NEXT FROM db_cursor INTO @DBName
END

close db_cursor
deallocate db_cursor
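Since the question specifically asks about WHILE, here is an alternative sketch that avoids a cursor: collect the candidate database names into a table variable and walk it with a WHILE loop. The target table name [The new tbl log] is taken from the question; everything else is illustrative.

-- A sketch using WHILE instead of a cursor.
DECLARE @dbs TABLE (id int IDENTITY(1,1), name sysname);
DECLARE @i int = 1, @n int, @db sysname, @sql nvarchar(max);

-- Keep only databases that are online and contain dbo.tblLogdiscador.
INSERT INTO @dbs (name)
SELECT name
FROM sys.databases
WHERE state_desc = 'ONLINE'
  AND OBJECT_ID(QUOTENAME(name) + '.[dbo].[tblLogdiscador]', 'U') IS NOT NULL;

SELECT @n = COUNT(*) FROM @dbs;

WHILE @i <= @n
BEGIN
    SELECT @db = name FROM @dbs WHERE id = @i;

    SET @sql = N'INSERT INTO [The new tbl log] SELECT * FROM '
             + QUOTENAME(@db) + N'.dbo.tblLogdiscador';
    EXEC sp_executesql @sql;

    SET @i = @i + 1;
END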

SQL Server 2008 r2: How to check all views for runtime errors?

I have a LOT of views in the database.
Each view, of course, refers to one or more tables.
There was some work done on those tables (altering and deleting columns) and now I need to check all views for any runtime errors.
I went about it straightforwardly: I got the list of all views, iterated over it, and ran SELECT TOP 0 * FROM view_name dynamically, so any errors should appear in the Messages pane.
This is my code:
DECLARE @view_name_template varchar(max) = '%'
DECLARE @columnList varchar(75) = '*'
--------------------------
DECLARE @tmp_views AS TABLE (view_name varchar(max))
DECLARE @view_name varchar(max)
DECLARE @sqlCommand nvarchar(max)
DECLARE @num int = 1
DECLARE @total_count int

SET NOCOUNT ON

INSERT INTO @tmp_views
SELECT name FROM sys.views
WHERE name LIKE @view_name_template

SELECT @total_count = COUNT(*) FROM sys.views WHERE name LIKE @view_name_template

DECLARE db_cursor CURSOR FOR
    SELECT view_name FROM @tmp_views ORDER BY LOWER(view_name)

OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @view_name

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sqlCommand = 'SELECT TOP 0 ' + @columnList + ' FROM ' + @view_name
    PRINT CAST(@num AS varchar(31)) + '/' + CAST(@total_count AS varchar(31)) + ' ' + @sqlCommand
    EXECUTE sp_executesql @sqlCommand
    FETCH NEXT FROM db_cursor INTO @view_name
    SET @num = @num + 1
END

CLOSE db_cursor
DEALLOCATE db_cursor
It works fine except that it completely freezes on some views (selecting from those same views in another window is extremely fast and fine). I think it is a server memory issue or something similar.
Please tell me: what is the most lightweight way to check whether a view has errors? Maybe SQL Server has a special function or stored procedure?
The code is not "hanging". It is waiting for the view to run, despite the TOP 0.
SQL Server offers several ways of testing queries. In addition to TOP 0, you also have:
SET PARSEONLY ON
SET NOEXEC ON
And then there is the more recent sp_describe_first_result_set.
Each of these does something different. PARSEONLY checks for syntax errors but doesn't look at table layouts. I believe NOEXEC completely compiles the query, creating the execution plan. TOP 0 will compile the query and also run it.
In some cases, the optimizer may not recognize that a query that returns no rows needs to do no work. For instance, there might be subqueries that are run despite the TOP 0, and this is what causes the delay.
Two approaches. The first is to use NOEXEC ON (documented here). The second, if feasible, would be to create another database with the same structure and no data. You can then test the queries on that database.
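For illustration, a minimal sketch of the NOEXEC approach applied to the cursor loop above (reusing the @sqlCommand, @columnList, and @view_name variables): prepending SET NOEXEC ON to the dynamic batch makes SQL Server compile and bind each view's query, reporting errors such as missing columns, without actually executing it. SET options changed inside a dynamic batch revert when that batch ends.

-- Inside the WHILE loop, instead of running the SELECT, just compile it:
SET @sqlCommand = N'SET NOEXEC ON; SELECT TOP 0 ' + @columnList
                + N' FROM ' + QUOTENAME(@view_name)
EXECUTE sp_executesql @sqlCommand  -- reports compile/binding errors, skips execution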

Use SELECT results as a variable in a loop

I've searched here and elsewhere, and haven't found an answer yet. Hope I didn't miss it.
Using SQL Server Management Studio 2008 R2.
I have n specific databases on my server (there are other DBs as well, but I'm only interested in some of them).
Each of these databases has a table within it, and all of those tables have the same name; the only difference is the DB name. I want to aggregate these tables together to make one big table in a different database (different from the other DBs).
I can get the db names from the results of a query.
N is unknown.
Is a loop the way to go about this?
I was thinking something along the lines of the following pseudocode:
Set #dbnames = SELECT DISTINCT dbname FROM MyServer.dbo.MyTable
For each #name in #dbnames
INSERT INTO ADifferentDB.dbo.MyOtherTable
SELECT * FROM #name.dbo.table
Next name
(Clearly I'm new to using SQL variables as well, as you can see.)
Your first problem is iterating over the databases: you can do that with a cursor.
Then you have another problem: executing a query where part of it (the database name) is variable. You can do that with the EXECUTE function.
All together it is something similar to this:
DECLARE @query VARCHAR(max)
DECLARE @dbname VARCHAR(100)

DECLARE my_db_cursor CURSOR
FOR SELECT DISTINCT dbname FROM MyServer.dbo.MyTable

OPEN my_db_cursor
FETCH NEXT FROM my_db_cursor INTO @dbname

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @query = 'INSERT INTO ADifferentDB.dbo.MyOtherTable
                  SELECT * FROM ' + @dbname + '.dbo.table'
    EXECUTE(@query)
    FETCH NEXT FROM my_db_cursor INTO @dbname
END

CLOSE my_db_cursor
DEALLOCATE my_db_cursor
What you want to do is define a CURSOR for row-level operations; here is some documentation on them.
I would suggest using sp_MSForEachDB:
EXEC sp_MSForEachDB '
    -- Include only the databases you care about.
    IF NOT EXISTS (
        SELECT *
        FROM MyServer.dbo.MyTable
        WHERE dbname = ''?''
    )
        -- Exit if the database is not in your table.
        RETURN

    -- Otherwise, perform your insert.
    INSERT INTO ADifferentDB.dbo.MyOtherTable
    SELECT * FROM ?.dbo.table
'
In this case, ? is a token that is replaced with the name of each database on the server.
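As a quick illustration of the token substitution (what it prints depends on which databases exist on your server):

-- Prints one line per database; '?' is replaced with each database's name.
EXEC sp_MSForEachDB 'PRINT ''Current database: ?'''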

How do I go through a cursor to perform logic on multiple tables? (The table names are in the cursor)

I get the feeling this is pretty basic database work, but it isn't for me. I'm trying to get a list of all of my tombstone tables from the system tables and store the results in a cursor. I'm then trying to perform some logic on each of those tables, but I'm having trouble doing so.
Any help would be greatly appreciated.
Here is the error I get:
Must declare the table variable "@tablename".
Here is the code:
declare tombstonetables cursor for
    (select name from sys.objects
     where name like '%tombstone%'
     and type = 'U' --for user_table
    )

Print 'Begin purging tombstone tables'
declare @tablename varchar(250)

open tombstonetables
fetch next from tombstonetables into @tablename

while @@FETCH_STATUS = 0
begin
    select * from @tablename --real logic goes here later
    fetch next from tombstonetables into @tablename
end

close tombstonetables
deallocate tombstonetables
Looks like you need to use dynamic SQL.
Here is a reference to a simple walk-through: http://www.mssqltips.com/tip.asp?tip=1160
You will probably need to make use of sp_executesql.
Here is a simple example of using dynamic SQL with your example:
DECLARE @DynamicSQL nvarchar(500)

WHILE @@FETCH_STATUS = 0
begin
    SET @DynamicSQL = 'select * from ' + @tablename --real logic goes here later
    EXEC sp_executesql @DynamicSQL
    fetch next from tombstonetables into @tablename
end
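Putting that together with the original cursor, a minimal sketch (QUOTENAME is added defensively in case a table name needs bracketing; the SELECT * is still just a placeholder for the real purge logic):

DECLARE @tablename sysname
DECLARE @DynamicSQL nvarchar(4000)

DECLARE tombstonetables CURSOR FOR
    SELECT name FROM sys.objects
    WHERE name LIKE '%tombstone%' AND type = 'U'  -- user tables only

OPEN tombstonetables
FETCH NEXT FROM tombstonetables INTO @tablename

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build and run the per-table statement; real purge logic goes here later.
    SET @DynamicSQL = N'SELECT * FROM ' + QUOTENAME(@tablename)
    EXEC sp_executesql @DynamicSQL

    FETCH NEXT FROM tombstonetables INTO @tablename
END

CLOSE tombstonetables
DEALLOCATE tombstonetables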