I have almost 150 databases that all have the same tables. I know it's bad, but I don't have control over it. I'm trying to improve performance with some indexes. I know what the indexes should be, but I need to build them on the same tables in every database. Is there a way to do this besides creating them all separately?
I had a similar situation a while back, so I came up with this code. You can use dynamic SQL with sp_MSforeachdb to loop through your databases. I've excluded the system databases below, but you can include/exclude databases as you like in that first IF.
This code checks each database for your specific table and also checks whether the index already exists on that table. If not, it creates it. I included a RAISERROR to show progress through the databases in the SSMS Messages tab. Just change the table/index names below and update the CREATE INDEX statement as appropriate.
DECLARE @command varchar(1000)
SELECT @command = 'IF ''?'' NOT IN(''master'', ''model'', ''msdb'', ''tempdb'')
BEGIN USE ?
EXEC(''
DECLARE @DB VARCHAR(200)
SET @DB = DB_NAME()
RAISERROR (@DB, 10, 1) WITH NOWAIT
IF OBJECT_ID(''''dbo.TableName'''', ''''U'''') IS NOT NULL
BEGIN
IF NOT EXISTS (SELECT 1 FROM sys.indexes WHERE name=''''IX_TableName'''' AND object_id = OBJECT_ID(''''TableName''''))
BEGIN
CREATE INDEX [IX_TableName] ON TableName (indexColumn)
END
END
'') END'
EXEC sp_MSforeachdb @command
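If you'd rather not rely on the undocumented sp_MSforeachdb, here is a rough sketch of the same idea using a cursor over sys.databases. It uses the same placeholder names as above (TableName, IX_TableName, indexColumn), so treat it as a starting point rather than a finished script:
DECLARE @db sysname, @sql nvarchar(max)
DECLARE dbs CURSOR FOR
    SELECT name FROM sys.databases
    WHERE name NOT IN ('master', 'model', 'msdb', 'tempdb')
      AND state_desc = 'ONLINE'
OPEN dbs
FETCH NEXT FROM dbs INTO @db
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Same conditional CREATE INDEX as above, built for the current database
    SET @sql = N'USE ' + QUOTENAME(@db) + N';
IF OBJECT_ID(''dbo.TableName'', ''U'') IS NOT NULL
   AND NOT EXISTS (SELECT 1 FROM sys.indexes
                   WHERE name = ''IX_TableName''
                     AND object_id = OBJECT_ID(''dbo.TableName''))
    CREATE INDEX IX_TableName ON dbo.TableName (indexColumn);'
    RAISERROR (@db, 10, 1) WITH NOWAIT   -- progress in the Messages tab
    EXEC (@sql)
    FETCH NEXT FROM dbs INTO @db
END
CLOSE dbs
DEALLOCATE dbs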
I am working in an environment where many users have the same (or almost identical) test database set up on a common MSSQL server. We are talking about well over 100 databases for testing purposes, and at least 95% of them will contain the table I am trying to target.
These test databases are only filled with junk data - I will not be impacting anyone by doing any kind of search. I am looking at one table specifically, and I need to determine whether any test database has that table actually containing data. It doesn't matter what the data is; I just need to find a table that actually contains data, so I can determine why that data exists in the first place. (This DB is quite old - almost two decades - so sometimes no one has a clear answer why something in it exists.)
I have been trying to build a SQL statement that iterates through all the databases and checks that particular table to see if it has any content, to bring back a list of databases where that table contains data.
So to be specific: I need to find all databases where a specific table has any content at all (COUNT(*) > 0). Right now I'm totally stuck, without much of a clue as to how to proceed.
In both methods, replace <tablename> with your table name.
Using sp_MSforeachdb
You can use sp_MSforeachdb:
CREATE TABLE ##TBLTEMP(dbname varchar(100), rowscount int)
DECLARE @command varchar(4000)
SELECT @command =
'if exists(select 1 from [?].INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = ''<tablename>'') insert into ##TBLTEMP(dbname, rowscount) select ''[?]'', count(*) from [?].dbo.<tablename>'
EXEC sp_MSforeachdb @command
SELECT * FROM ##TBLTEMP WHERE rowscount > 0
DROP TABLE ##TBLTEMP
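If the target tables are large, COUNT(*) can be slow. Here is a hedged variant of the same sp_MSforeachdb approach that reads the (approximate, but good enough for an "is it empty?" check) row counts from sys.partitions instead; again, <tablename> is your table:
CREATE TABLE ##TBLTEMP(dbname varchar(100), rowscount int)
DECLARE @command varchar(4000)
SELECT @command =
'if exists(select 1 from [?].INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = ''<tablename>'')
 insert into ##TBLTEMP(dbname, rowscount)
 select ''?'', SUM(p.rows)
 from [?].sys.partitions p
 where p.object_id = OBJECT_ID(''[?].dbo.<tablename>'')
   and p.index_id IN (0, 1)'
EXEC sp_MSforeachdb @command
SELECT * FROM ##TBLTEMP WHERE rowscount > 0
DROP TABLE ##TBLTEMP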
Using CURSOR
CREATE TABLE ##TBLTEMP(dbname varchar(100), rowscount int)
DECLARE @dbname varchar(100), @strQuery varchar(4000)
DECLARE csr CURSOR FOR SELECT [name] FROM sys.databases
OPEN csr
FETCH NEXT FROM csr INTO @dbname
WHILE @@FETCH_STATUS = 0
BEGIN
SET @strQuery = 'if exists(select 1 from [' + @dbname + '].INFORMATION_SCHEMA.TABLES WHERE TABLE_NAME = ''<tablename>'') INSERT INTO ##TBLTEMP(dbname, rowscount) SELECT ''' + @dbname + ''', COUNT(*) FROM [' + @dbname + '].[dbo].<tablename>'
EXEC(@strQuery)
FETCH NEXT FROM csr INTO @dbname
END
CLOSE csr
DEALLOCATE csr
SELECT * FROM ##TBLTEMP WHERE rowscount > 0
DROP TABLE ##TBLTEMP
References
sp_MSforeachdb
Run same command on all SQL Server databases without cursors
DECLARE CURSOR (Transact-SQL)
I'm having quite a problem.
The process data from our machines is stored in multiple MS SQL databases.
It is possible to mount and unmount data which is no longer used or which should be archived.
That's the reason why there are multiple databases.
In each database there are multiple tables with the data values for one or more measuring points.
With a JOIN query I can get the values from one table together with the corresponding tag name:
> SELECT HA_DATA_yyyy.R_xxxxx.time, HA_DATA_yyyy.R_xxxxx.value, HA_DATA_yyyy.tags.tagname FROM HA_DATA_yyyy.R_xxxxx
> INNER JOIN HA_DATA_yyyy.RI_xxxxx ON HA_DATA_yyyy.R_xxxxx.id = HA_DATA_yyyy.RI_xxxxx.id
> INNER JOIN HA_DATA_yyyy.tags on HA_DATA_yyyy.RI_xxxxx.tag_uuid = HA_DATA_yyyy.tags.tag_uuid
> WHERE (HA_DATA_yyyy.tags.tagname like 'tagname')
yyyy - represents a number for the database
xxxxx - represents a number which is unique on the database server, but differs in each database.
But now I'm looking for a solution to get this for all R_xxxxx tables of a database and for all mounted databases.
Is there any way to do this without external software? Just with the right query request or user defined function / stored procedure?
Maybe dynamic SQL is an option.
As a starting point you could use the list of databases:
create table #dblist (dbname varchar(128))
insert into #dblist (dbname)
select d.name from sys.databases d where d.name like 'HA_DATA_%'
Then, for each database, gather the list of tables to read data from (you can use a cursor or any other loop you prefer):
declare @dbname varchar(128) = ''
declare @dynsql nvarchar(max)
create table #listoftables (name varchar(128), db varchar(128))
while exists(select top 1 dbname from #dblist where dbname > @dbname order by dbname)
begin
-- move on to the next database
select top 1 @dbname = dbname from #dblist where dbname > @dbname order by dbname
set @dynsql = 'insert into #listoftables(name,db) select name,''' + @dbname + ''' from [' + @dbname + '].sys.tables'
exec sp_executesql @dynsql
end
Now you have a table list to loop over to build a dynamic query that reads all the data at once.
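For example, here is a rough sketch of that last step. It assumes the tables live in the dbo schema, that each R_xxxxx table has a matching RI_xxxxx table plus a tags table in the same database, and that the columns are the ones from the question (id, time, value / id, tag_uuid / tag_uuid, tagname):
declare @union nvarchar(max) = N''

-- build one big UNION ALL over every R_xxxxx table found above
select @union = @union +
       case when @union = N'' then N'' else N' union all ' end +
       N'select t.tagname, r.time, r.value
         from [' + db + N'].dbo.[' + name + N'] r
         join [' + db + N'].dbo.[' + replace(name, 'R_', 'RI_') + N'] ri on r.id = ri.id
         join [' + db + N'].dbo.tags t on ri.tag_uuid = t.tag_uuid'
from #listoftables
where name like 'R[_]%' and name not like 'RI[_]%'

exec sp_executesql @union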
Beware of the many issues you may run into using dynamic SQL; a quick search will turn up plenty of explanations of why you have to be careful with it.
Please have a look at the answer to this Stack Overflow question:
Archiving large amounts of old data in SQL Server
I think it might be what you need.
Update:
You can query the mounted databases by using the SQL query:
select Name into #myDatabases -- get all mounted databases into #myDatabases
from sys.databases
where (has_dbaccess(name) > 0)
and name not in ('master', 'tempdb', 'model', 'msdb')
order by 1
select * from #myDatabases -- list all mounted databases
drop table #myDatabases -- don't forget to drop it at the end
This can be used to create a dynamic SQL statement, which you execute via the sp_executesql command:
EXECUTE sp_executesql @SQL_String, @Parameter_Definition, @Param1, ..., @ParamN
The parameter definition is a string containing the list of parameter names and their datatypes.
So you can build up your own query based on the database list above and then execute it.
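For instance, here is a minimal sketch (the database name HA_DATA_2015 and the table R_12345 are just illustrations). Note that only values can be bound as parameters, so the database and table names still have to be concatenated in, while the result comes back through a typed OUTPUT parameter:
DECLARE @SQL_String nvarchar(max), @dbname sysname = N'HA_DATA_2015', @rows int

-- object names can't be parameterized, so the database name is concatenated in
SET @SQL_String = N'SELECT @cnt = COUNT(*) FROM ' + QUOTENAME(@dbname) + N'.dbo.R_12345'

EXECUTE sp_executesql @SQL_String, N'@cnt int OUTPUT', @cnt = @rows OUTPUT

PRINT @dbname + N': ' + CAST(@rows AS nvarchar(20))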
Is there a way to drop all objects in a db, with the objects belonging to two different schemas?
I had been previously working with one schema, so I query all objects using:
Select * From sysobjects Where type=...
then dropped everything using
Drop Table ...
Now that I have introduced another schema, every time I try to drop an object it says something about not having permission or the object not existing. BUT, if I prefix the object with the schema ([schema].[object]) it works. I don't know how to automate this, because I don't know which objects exist or which of the two schemas each object belongs to. Does anyone know how to drop all objects inside a db, regardless of which schema they belong to?
(The user in question owns both schemas, and the objects in the DB were created by and are being removed by that same user - which works if I use the prefix, i.e. DROP TABLE Schema1.blah.)
Use sys.objects in combination with OBJECT_SCHEMA_NAME to build your DROP TABLE statements, review, then copy/paste to execute:
SELECT 'DROP TABLE ' +
QUOTENAME(OBJECT_SCHEMA_NAME(object_id)) + '.' +
QUOTENAME(name) + ';'
FROM sys.objects
WHERE type_desc = 'USER_TABLE';
Or use sys.tables to avoid the need for the type_desc filter:
SELECT 'DROP TABLE ' +
QUOTENAME(OBJECT_SCHEMA_NAME(object_id)) + '.' +
QUOTENAME(name) + ';'
FROM sys.tables;
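If you'd rather not copy/paste, here is a small sketch that concatenates the generated statements into one batch and executes it. Review the PRINT output first, and be aware that foreign keys may force you to drop referencing tables before the tables they reference:
DECLARE @drop nvarchar(max) = N'';

SELECT @drop = @drop + 'DROP TABLE ' +
       QUOTENAME(OBJECT_SCHEMA_NAME(object_id)) + '.' +
       QUOTENAME(name) + ';' + CHAR(10)
FROM sys.tables;

PRINT @drop;      -- review first
--EXEC (@drop);   -- then run it when you're happy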
Neither of the other answers seems to address the all objects part of the question.
I'm amazed you have to roll your own for this - I expected there to be a drop schema blah cascade. Surely every single person who sets up a dev server has to do this, and having to do some meta-programming before being able to do normal programming is seriously horrible. Anyway... rant over!
I started looking at some articles on doing this by clearing out a schema: there's an old article about it, but the tables mentioned there are now marked as deprecated. I've also looked at the documentation for the newer catalog views to help understand what is going on.
There's another answer, and a great dynamic SQL resource it links to.
After looking at all this for a while, it just seemed a bit too messy.
I think the better option is to go for
ALTER DATABASE blah SET SINGLE_USER WITH ROLLBACK IMMEDIATE
DROP DATABASE blah
CREATE DATABASE blah
instead. The extra incantation at the top is basically there to force-drop the database, as mentioned here.
It feels a bit wrong, but the amount of complexity involved in writing the drop script is a good reason to avoid it, I think.
If there turn out to be problems with dropping the database, I might revisit some of the links and post another answer.
Try this with SQL Server 2012 or above.
This script may help to delete all objects owned by a selected schema.
Note: the script below targets the dbo schema for all objects, but you can change @MySchemaName in the very first line.
DECLARE @MySchemaName VARCHAR(50)='dbo', @sql VARCHAR(MAX)='';
DECLARE @SchemaName VARCHAR(255), @ObjectName VARCHAR(255), @ObjectType VARCHAR(255), @ObjectDesc VARCHAR(255), @Category INT;
DECLARE cur CURSOR FOR
SELECT (s.name)SchemaName, (o.name)ObjectName, (o.type)ObjectType,(o.type_desc)ObjectDesc,(so.category)Category
FROM sys.objects o
INNER JOIN sys.schemas s ON o.schema_id = s.schema_id
INNER JOIN sysobjects so ON so.id = o.object_id
WHERE s.name = @MySchemaName
AND so.category = 0
AND o.type IN ('P','PC','U','V','FN','IF','TF','FS','FT','PK','TT')
OPEN cur
FETCH NEXT FROM cur INTO @SchemaName,@ObjectName,@ObjectType,@ObjectDesc,@Category
SET @sql='';
WHILE @@FETCH_STATUS = 0 BEGIN
IF @ObjectType IN('FN', 'IF', 'TF', 'FS', 'FT') SET @sql=@sql+'Drop Function '+@MySchemaName+'.'+@ObjectName+CHAR(13)
IF @ObjectType IN('V') SET @sql=@sql+'Drop View '+@MySchemaName+'.'+@ObjectName+CHAR(13)
IF @ObjectType IN('P') SET @sql=@sql+'Drop Procedure '+@MySchemaName+'.'+@ObjectName+CHAR(13)
IF @ObjectType IN('U') SET @sql=@sql+'Drop Table '+@MySchemaName+'.'+@ObjectName+CHAR(13)
--PRINT @ObjectName + ' | ' + @ObjectType
FETCH NEXT FROM cur INTO @SchemaName,@ObjectName,@ObjectType,@ObjectDesc,@Category
END
CLOSE cur;
DEALLOCATE cur;
SET @sql=@sql+CASE WHEN LEN(@sql)>0 THEN 'Drop Schema '+@MySchemaName+CHAR(13) ELSE '' END
PRINT @sql
EXECUTE (@sql)
I do not know which version of SQL Server you are using, but assuming it is 2008 or later, the following command may be very useful (note that it drops ALL tables in one simple line):
EXEC sp_MSforeachtable 'USE DATABASE_NAME; DROP TABLE ?'
This will execute DROP TABLE ... for every table in database DATABASE_NAME. It is very simple and works well. The same pattern can be used to execute other SQL statements, for example:
EXEC sp_MSforeachtable 'USE DATABASE_NAME; SELECT * FROM ?'
I've searched here and elsewhere, and haven't found an answer yet. Hope I didn't miss it.
Using SQL Server Management Studio 2008 R2.
I have n specific databases on my server (there are other DBs as well, but I'm only interested in some of them)
Each of these databases has a table within it, which all have the same name. The only difference is the DB name. I want to aggregate these tables together to make one big table on a different database (different to the other DBs).
I can get the db names from the results of a query.
N is unknown.
Is a loop the way to go about this?
I was thinking something along the lines of the following pseudocode:
Set @dbnames = SELECT DISTINCT dbname FROM MyServer.dbo.MyTable
For each @name in @dbnames
INSERT INTO ADifferentDB.dbo.MyOtherTable
SELECT * FROM @name.dbo.table
Next name
(Clearly I'm new to using SQL variables as well, as you can see)
Your first problem is iterating over the databases: you can do that with a cursor.
Then you have another problem: executing a query where part of it (the database name) is variable. You can do that with the EXECUTE function.
All that put together looks something like this:
DECLARE @query VARCHAR(max)
DECLARE @dbname VARCHAR(100)
DECLARE my_db_cursor CURSOR
FOR SELECT DISTINCT dbname FROM MyServer.dbo.MyTable
OPEN my_db_cursor
FETCH NEXT FROM my_db_cursor
INTO @dbname
WHILE @@FETCH_STATUS = 0
BEGIN
SET @query = 'INSERT INTO ADifferentDB.dbo.MyOtherTable
SELECT * FROM ' + @dbname + '.dbo.table'
EXECUTE(@query)
FETCH NEXT FROM my_db_cursor
INTO @dbname
END
CLOSE my_db_cursor
DEALLOCATE my_db_cursor
What you want to do is define a CURSOR for row-level operations. Here is some doc.
I would suggest using sp_MSForEachDB:
EXEC sp_MSForEachDB '
-- Include only the databases you care about.
IF NOT EXISTS (
SELECT *
FROM MyServer.dbo.MyTable
WHERE dbname = ''?''
)
-- Exit if the database is not in your table.
RETURN
-- Otherwise, perform your insert.
INSERT INTO ADifferentDB.dbo.MyOtherTable
SELECT * FROM ?.dbo.table
'
In this case, ? is a token that is replaced with each database on the server.
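One hedged caveat: ? is substituted as the bare database name, so if any of your database names contain spaces or other special characters, it is safer to bracket the token in the INSERT above, along these lines:
INSERT INTO ADifferentDB.dbo.MyOtherTable
SELECT * FROM [?].dbo.table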
I want to create a stored procedure and job in MS SQL Server that would delete all tables in a schema that are more than a week old.
I create backups every single day, and I want to automate the process by scheduling a job to delete any backups (SQL tables) that are older than a week.
Your help would be greatly appreciated.
You can use the sys.objects catalog view within SQL Server to accomplish this.
The query would be something like:
USE [Your_Database_Name];
GO
SELECT name AS object_name
,SCHEMA_NAME(schema_id) AS schema_name
,type_desc
,create_date
,modify_date
FROM sys.objects
WHERE create_date < GETDATE() - [No_Of_Days_Old]
ORDER BY create_date;
GO
The above example is a slight variation on the first example shown on the MSDN page that details the sys.objects catalog view (A. Returning all the objects that have been modified in the last N days).
That page can be found here:
sys.objects (Transact-SQL)
and is a great resource for many different methods of querying the "metadata" of your database objects.
Of course, the above will simply SELECT the names of the tables that need to be deleted; it won't actually delete (or DROP) anything from your database.
To achieve that, you will need a mechanism to issue a DROP TABLE command. Unfortunately, DROP TABLE won't accept a variable for the table name (i.e. you can't do DROP TABLE @tablename), but you can build a complete T-SQL statement in a string/varchar and EXECUTE it.
To do this, you can use a CURSOR to loop through the results of the earlier SELECT statement, building a new T-SQL command in a string/varchar that drops each table. An example of this is below:
DECLARE @tname VARCHAR(100)
DECLARE @sql VARCHAR(max)
DECLARE db_cursor CURSOR FOR
SELECT name AS tname
FROM sys.objects
WHERE type_desc = 'USER_TABLE'
AND create_date < GETDATE() - 7
OPEN db_cursor
FETCH NEXT FROM db_cursor INTO @tname
WHILE @@FETCH_STATUS = 0
BEGIN
SET @sql = 'DROP TABLE ' + @tname
--EXEC (@sql)
PRINT @sql
FETCH NEXT FROM db_cursor INTO @tname
END
CLOSE db_cursor
DEALLOCATE db_cursor
Note that in the above example, I have commented out the line EXEC (@sql). This is the line that actually executes the T-SQL statement in the @sql variable, and since it's a destructive command, I simply commented it out and used a PRINT @sql command instead (the line below it). Run this as-is to see which tables you're likely to delete, and when you're happy, uncomment the EXEC (@sql) command and comment out the PRINT @sql command!
You can use this query to get the list of tables that are more than a week old:
SELECT
[name]
,create_date
FROM
sys.tables
WHERE DATEDIFF(day, create_date, getdate()) > 7
So in your SP, you could loop over the tables returned by that query and delete them; a sketch follows below. You'll have to take into account that if the tables have foreign keys, the order in which you delete them is important, so this idea will probably need some tweaking if that's your scenario.
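Here is a minimal sketch of such a procedure (the procedure name, the default 7-day threshold, and the PRINT-before-EXEC safety switch are just suggestions), which you could then call from a daily SQL Server Agent job:
CREATE PROCEDURE dbo.DropOldBackupTables
    @DaysOld int = 7
AS
BEGIN
    DECLARE @name sysname, @sql nvarchar(max)

    -- tables in the current database older than @DaysOld days
    DECLARE old_tables CURSOR FOR
        SELECT [name]
        FROM sys.tables
        WHERE DATEDIFF(day, create_date, GETDATE()) > @DaysOld

    OPEN old_tables
    FETCH NEXT FROM old_tables INTO @name
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'DROP TABLE ' + QUOTENAME(@name)
        PRINT @sql            -- swap for EXEC (@sql) once you trust the output
        FETCH NEXT FROM old_tables INTO @name
    END
    CLOSE old_tables
    DEALLOCATE old_tables
END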