Get number of rows for tables across all databases - sql

Searched a bit and didn't see anything that really fit my needs (although I don't really understand SQL beyond the basic select statement so maybe I just missed it). Took a SQL class in school many moons ago.
I have:
Virtual Windows Server 2008 R2 running SQL 2008 R2. I connect via MS SQL Server Management Studio
90-100 databases
All DB's have the same table structure (each one is a different client)
I want to:
Search all databases and return a list of all databases which have a table (TableName) that is larger than, say, 5000 records.
Some sort of script that I can schedule which will use that list and, if it finds a DB where the TableName table has more than 5000 records, will delete anything older than x days (say 30). Any kind of logging to know what happened over the last few days would be a bonus.
Any help would be appreciated. Thank you.
EDIT/UPDATE (2/24/15): Hiren Dhaduk provided a nice stored procedure that works. Thanks!

You can use the following stored procedure in any database. Then run it, passing the table name and NoOfRows (the minimum number of rows for the table; in your case it will be 5000):
CREATE PROCEDURE usp_FindLargeTables
    @TableName VARCHAR(256), @NoOfRows INT
AS
BEGIN
    DECLARE @DBName VARCHAR(256)
    DECLARE @varSQL VARCHAR(MAX)
    DECLARE @getDBName CURSOR
    SET @getDBName = CURSOR FOR
        SELECT name
        FROM sys.databases
        WHERE state != 6    -- skip OFFLINE databases

    -- Holds the name and row count of the matching table in each database
    CREATE TABLE #TmpTable (TableName VARCHAR(256),
                            NoOfRows BIGINT,
                            DBName VARCHAR(256))

    OPEN @getDBName
    FETCH NEXT FROM @getDBName INTO @DBName
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @varSQL = 'USE ' + @DBName + ';
            INSERT INTO #TmpTable
            SELECT
                sysobjects.Name AS TableName
                , sysindexes.Rows AS NoOfRows
                , ''' + @DBName + ''' AS DBName
            FROM sysobjects
                INNER JOIN sysindexes
                    ON sysobjects.id = sysindexes.id
            WHERE type = ''U''
                AND sysindexes.IndId < 2
                AND sysobjects.Name = ''' + @TableName + '''
                AND sysindexes.Rows > ' + CONVERT(VARCHAR, @NoOfRows)
        EXEC (@varSQL)
        FETCH NEXT FROM @getDBName INTO @DBName
    END
    CLOSE @getDBName
    DEALLOCATE @getDBName

    -- STEP 1: list the databases where the table exceeds the threshold
    SELECT *
    FROM #TmpTable
    WHERE DBName != 'master'

    -- STEP 2: build and run a DELETE against each of those databases
    DECLARE @DYNAMICQUERY VARCHAR(MAX)
    SET @DYNAMICQUERY =
        REPLACE((
            SELECT 'DELETE FROM [' + DBName + '].[dbo].[' + TableName + '] WHERE Createdate < DATEADD(day, -30, GETDATE());'
            FROM #TmpTable
            WHERE DBName != 'master'
            FOR XML PATH('')
        ), '&lt;', '<');    -- FOR XML PATH escapes "<" as "&lt;", so undo that
    EXEC (@DYNAMICQUERY);
    DROP TABLE #TmpTable
END
Example: EXEC usp_FindLargeTables 'DeltaStuds', 5000
Clarification on your second point:
Unless you run a trace while the changes happen, it is not possible to tell how old a row is.
So to do this, I would suggest adding a column called CreateDate to this table; then you will be able to delete records created more than 30 days ago.
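For example, a minimal sketch of both steps, assuming the table is the dbo.DeltaStuds one from the example call above (the default-constraint name is just an illustration):
-- Add a CreateDate column that records when each row was inserted;
-- existing rows get the current date via the default.
ALTER TABLE dbo.DeltaStuds
    ADD CreateDate DATETIME NOT NULL
        CONSTRAINT DF_DeltaStuds_CreateDate DEFAULT (GETDATE());

-- Scheduled cleanup step: remove rows older than 30 days.
DELETE FROM dbo.DeltaStuds
WHERE CreateDate < DATEADD(DAY, -30, GETDATE());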

What you are asking for is very specific to your setup...
You could use the following query to identify tables with a row count greater than 5000:
SELECT sc.name +'.'+ ta.name TableName
FROM sys.tables ta
INNER JOIN sys.partitions pa
ON pa.OBJECT_ID = ta.OBJECT_ID
INNER JOIN sys.schemas sc
ON ta.schema_id = sc.schema_id
WHERE ta.is_ms_shipped = 0 AND pa.index_id IN (1,0)
and ta.name = 'DeltaStuds'
GROUP BY sc.name,ta.name
having SUM(pa.rows)>5000
But you would need something like this link to run it for all databases: How to find column names for all tables in all databases in SQL Server. I don't know of an easier way...
For the second part, you will have to set up a delete process that takes the result set from the above query, creates a dynamic query, and deletes rows based on your date field. In that process, you could inject an audit mechanism that logs the delete action, row count, datetime, etc. A rough sketch of that delete-and-log step follows below.
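Something along these lines could work as a starting point. This is a sketch only: the dbo.DeltaStuds table and CreateDate column come from the earlier discussion, the dbo.PurgeLog audit table is an assumption, and the loop simply walks all user databases; in practice you would restrict it to the databases returned by the row-count query above.
-- Sketch: purge rows older than 30 days from each user database and log what was done.
IF OBJECT_ID('dbo.PurgeLog') IS NULL
    CREATE TABLE dbo.PurgeLog (
        LogId       INT IDENTITY(1,1) PRIMARY KEY,
        DBName      SYSNAME,
        TableName   SYSNAME,
        RowsDeleted INT,
        PurgedAt    DATETIME DEFAULT (GETDATE()));

DECLARE @db SYSNAME, @sql NVARCHAR(MAX), @rows INT;
DECLARE purge_cur CURSOR FAST_FORWARD FOR
    SELECT name FROM sys.databases
    WHERE database_id > 4 AND state = 0;   -- online user databases only
OPEN purge_cur;
FETCH NEXT FROM purge_cur INTO @db;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Delete in the target database and capture the affected row count.
    SET @sql = N'DELETE FROM ' + QUOTENAME(@db) + N'.dbo.DeltaStuds
                 WHERE CreateDate < DATEADD(DAY, -30, GETDATE());
                 SET @cnt = @@ROWCOUNT;';
    EXEC sys.sp_executesql @sql, N'@cnt INT OUTPUT', @cnt = @rows OUTPUT;

    INSERT INTO dbo.PurgeLog (DBName, TableName, RowsDeleted)
    VALUES (@db, N'DeltaStuds', @rows);

    FETCH NEXT FROM purge_cur INTO @db;
END;
CLOSE purge_cur;
DEALLOCATE purge_cur;
A batch like this could then be scheduled as a SQL Server Agent job to get both the recurring cleanup and the logging history.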

Related

SQL Query against multiple databases

I'm trying to run a query against multiple databases on the same server.
I need to pull all the values from 2 tables in a database, based on a criterion from a 3rd table in the database, if the database was created after a certain date.
I have a query to find when the database was created:
SELECT *
FROM sys.databases
WHERE STATE = 0 --ignores offline databases
AND database_id > 4 --does not include master, model, msdb, tempdb
AND create_date > CONVERT(datetime, '2021-01-01')
And the query that I need run on each database is generally as follows:
SELECT *
FROM Table1
INNER JOIN Table3 ON Table3.Column6=Table1.Column2
AND Table3.Column3='Value1'
AND Table3.Column4='Value2'
INNER JOIN Table2 ON Table3.Column6=Table2.Column2
I did find this question, which is essentially what I would like to do, but when I look at INFORMATION_SCHEMA.TABLES the TABLE_CATALOG column does not have the database names I would like to query against. I thought I could try pulling the names from the sys.databases table like above, so I tried modifying it to:
DECLARE @cmd VARCHAR(max) = N''
SELECT @cmd += COALESCE(@cmd + ' UNION ALL ', '') + 'SELECT *
    FROM [' + name + '].dbo.Table1
    INNER JOIN [' + name + '].dbo.Table3 on Table3.Column6=Table1.Column2
        AND Table3.Column3= ''Value1''
        AND Table3.Column4=''Value2''
    INNER JOIN [' + name + '].dbo.Table2 on Table3.Column6=Table2.Column2'
FROM sys.databases
WHERE STATE = 0
    AND database_id > 4
    AND create_date > CONVERT(datetime, '2021-08-26')
SET @cmd = STUFF(@cmd, CHARINDEX('UNION ALL', @cmd), 10, '')
PRINT @cmd
EXEC(@cmd)
But when I run it with a date earlier than 2021-08-26 (which grabs more than 5 tables), I get a memory error. I need to run this at least to the beginning of April (preferably up to the beginning of the year) which will grab around 500 tables.
What is the recommended way to run a query against multiple databases in SQL?
My recommendation would be that, instead of trying to build one massive UNION ALL dynamic SQL statement, you build a #temp table to hold the results of each output; then it's much easier to send the same string to each database:
CREATE TABLE #hold(dbname sysname, Column1 {data type}, ...);
DECLARE @sql nvarchar(max), @exec nvarchar(1024);
SET @sql = N'SELECT DB_NAME(), *
    FROM dbo.Table1
    INNER JOIN dbo.Table3
        ON Table3.Column6 = Table1.Column2
        AND Table3.Column3 = ''Value1''
        AND Table3.Column4 = ''Value2''
    INNER JOIN dbo.Table2
        ON Table3.Column6 = Table2.Column2;';
DECLARE @dbname sysname, @c cursor;
SET @c = CURSOR FORWARD_ONLY STATIC READ_ONLY FOR
    SELECT name FROM sys.databases
    WHERE state = 0 -- ignores offline databases
        AND database_id > 4 -- does not include master, model, msdb, tempdb
        AND create_date > CONVERT(datetime, '20210101');
OPEN @c;
FETCH NEXT FROM @c INTO @dbname;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @exec = QUOTENAME(@dbname) + N'.sys.sp_executesql';
    INSERT #hold EXEC @exec @sql;
    FETCH NEXT FROM @c INTO @dbname;
END;
SELECT * FROM #hold;
You might also consider investing in sp_ineachdb, a procedure I wrote to help simplify running the same command in the context of each database.
Execute a Command in the Context of Each Database in SQL Server using sp_ineachdb
Execute a Command in the Context of Each Database in SQL Server - Part 2
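If you install it, a call would look roughly like this (the @command parameter name is taken from the linked articles; check the version you download for the full option list, and the IF check is just an illustration for databases that may not have the table):
-- Run the same statement in the context of every database via sp_ineachdb.
EXEC dbo.sp_ineachdb
    @command = N'IF OBJECT_ID(N''dbo.Table1'') IS NOT NULL
                     SELECT DB_NAME(), COUNT(*) FROM dbo.Table1;';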

Retrieve Max loaded date across all tables on a DB

Output I'm trying to get to (database name = ATT):
Table Name
Column Name
Max Loaded Date = MAX(loaded_date) for this column only
loaded_date is a column in around 50 tables in the database, all with the same name and datatype (datetime).
select * FROM sys.tables
select * FROM syscolumns
I've been exploring the system tables without much luck; looking at some posts, it seems it may be done with dynamic SQL, which I've never used.
You can write SQL that writes SQL:
SELECT REPLACE(
'select ''{tn}'' as table_name, max(loaded_date) as ld from {tn} union all'
,'{tn}',table_name)
FROM
information_schema.columns
WHERE
column_name = 'loaded_date'
Run that, then copy all but the final UNION ALL out of the results window and into the query window, and run it again.
If you wanted to get all this into a single string for dynamic execution, I guess it'd look something like this (untested) inside a procedure:
DECLARE @x NVARCHAR(MAX);
SELECT @x =
STRING_AGG(
REPLACE(
'select ''{tn}'' as table_name, max(loaded_date) as ld from {tn}'
,'{tn}',table_name)
,' union all ')
FROM
information_schema.columns
WHERE
column_name = 'loaded_date';
EXECUTE sp_executesql @x;
If your SQL Server version is old and doesn't have STRING_AGG, it's a bit more awkward, but there are many examples of "turn rows into CSV" for SQL Server that use the STUFF ... FOR XML PATH pattern: https://duckduckgo.com/?t=ffab&q=rows+to+CSV+SQLS&ia=web
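For reference, an untested sketch of that pre-STRING_AGG form, built from the same generator query, might look like this:
DECLARE @x NVARCHAR(MAX);
SELECT @x = STUFF((
    SELECT ' union all ' + REPLACE(
        'select ''{tn}'' as table_name, max(loaded_date) as ld from {tn}'
        , '{tn}', table_name)
    FROM information_schema.columns
    WHERE column_name = 'loaded_date'
    FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)')
    , 1, 11, '');   -- strip the leading ' union all ' (11 characters)
EXECUTE sp_executesql @x;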
I wrote up a more permanent type of script that does this. It returns a result set listing the tables in the current database that have a column named loaded_date, along with the MAX(loaded_date) result from each table. The script queries each table individually, looping through them and keeping track of the max value for each table in a table variable. It also has a @Debug variable that lets you see the text of the queries that would be run instead of actually running them, and it raises custom error messages to make any issues easier to troubleshoot.
/*disable row count messages*/
SET NOCOUNT ON;
/*set to 1 to debug (aka just print queries instead of running)*/
DECLARE @Debug bit = 0;
/*get list of tables to query and assign a unique index to each row to assist in looping*/
DECLARE @TableList TABLE(
    SchemaAndTableName nvarchar(257) NOT NULL
    ,OrderToQuery bigint NOT NULL
    ,MaxLoadedDate datetime NULL
    ,PRIMARY KEY (OrderToQuery)
);
INSERT INTO @TableList (SchemaAndTableName, OrderToQuery)
SELECT
    CONCAT(QUOTENAME(s.name), N'.', QUOTENAME(t.name)) AS SchemaAndTableName
    ,ROW_NUMBER() OVER(ORDER BY s.name, t.name) AS OrderToQuery
FROM
    sys.columns AS c
    INNER JOIN sys.tables AS t ON c.object_id = t.object_id
    INNER JOIN sys.schemas AS s ON t.schema_id = s.schema_id
WHERE
    c.name = N'loaded_date';
/*declare and set some variables for loop*/
DECLARE @NumTables int = (SELECT TOP (1) OrderToQuery FROM @TableList ORDER BY OrderToQuery DESC);
DECLARE @I int = 1;
DECLARE @CurMaxDate datetime;
DECLARE @CurTable nvarchar(257);
DECLARE @CurQuery nvarchar(max);
/*start loop*/
WHILE @I <= @NumTables
BEGIN
    /*build text of current query*/
    SET @CurTable = (SELECT SchemaAndTableName FROM @TableList WHERE OrderToQuery = @I);
    SET @CurQuery = CONCAT(N'SELECT @MaxDateOut = MAX(loaded_date) FROM ', @CurTable, N';');
    /*check debugging status*/
    IF @Debug = 0
    BEGIN
        BEGIN TRY
            EXEC sys.sp_executesql @stmt = @CurQuery
                ,@params = N'@MaxDateOut datetime OUTPUT'
                ,@MaxDateOut = @CurMaxDate OUTPUT;
        END TRY
        BEGIN CATCH
            DECLARE @ErrorMessage nvarchar(max) = CONCAT(
                N'Error querying table ', @CurTable, N'.', NCHAR(13), NCHAR(10)
                ,N'Errored query: ', NCHAR(13), NCHAR(10), @CurQuery, NCHAR(13), NCHAR(10)
                ,N'Error message: ', ERROR_MESSAGE()
            );
            RAISERROR(@ErrorMessage, 16, 1) WITH NOWAIT;
            /*on error end loop so error can be investigated*/
            SET @I = @NumTables + 1;
        END CATCH;
    END
    ELSE /*currently debugging*/
    BEGIN
        PRINT(CONCAT(N'Debug output: ', @CurQuery));
    END;
    /*update value in our table variable*/
    UPDATE @TableList
    SET MaxLoadedDate = @CurMaxDate
    WHERE
        OrderToQuery = @I;
    /*increment loop*/
    SET @I = @I + 1;
END;
SELECT
    SchemaAndTableName AS TableName
    ,MaxLoadedDate AS Max_Loaded_date
FROM
    @TableList;
I like this solution better, as querying each table one at a time has much less system impact than attempting one large UNION ALL query. Querying a large set of tables all at once could cause some serious resource semaphore or locking contention (depending on the usage of your db).
It is fairly well commented, but let me know if something is not clear.
Also, just a note: dynamic SQL should be used as a last resort. I provided this script to answer your question, but you should explore better options than something like this.
You can go for the undocumented stored procedure sp_MSforeachtable. But don't use it in production code, as this stored procedure might not be available in future versions.
Read more on sp_MSforeachtable
EXEC sp_MSforeachtable 'SELECT ''?'' as tablename, max(loaded_Date) FROM ?'

SQL: Looping through a column, stored the value as a variable, run SQL, then move on to the next line?

I'm currently shifting roles at my job and trying to teach myself some SQL Skills.
Scenario: I'm in charge of 1 database - 10 tables with 10 Primary Keys. Every month, our code team publishes updates to the tables. I am supposed to drop the tables and generate scripts to create the updated tables.
Rather than just drop the old tables and stored procedures, I want to rename my current tables to preserve the structure/data for whatever reason.
In my database, I have an additional table called "TableUpdateList" with 1 column "TableName" and 10 rows, each row containing the name of an updated table (Row 1 = TableName1, Row 2 = TableName2, Row 3 = TableName3, etc.)
I would like to be able to "loop" through the TableUpdateList Table and insert each value into a set of SQL statements.
For Example, here are the SQL statements I want to run:
--drop the previous backup table
IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES where TABLE_NAME = '*TableName1*'+'_Old') DROP TABLE TableName1_Old
-- rename the current tables to _old
EXEC sp_rename *TableName1*, TableName1_Old;
I'm trying to find a way to scroll through the column of my TableUpdateList and run the above two statements filling in where I've italicized with whatever value is present in that row.
Just taking a wild stab here, because I think that in order to get an answer you have to try something, so here is my pseudo-code:
Declare @TableNames as List
For i in @TableNames
IF EXISTS (SELECT * FROM INFORMATION_SCHEMA.TABLES where TABLE_NAME = '*i*'+'_Old') DROP TABLE TableName1_Old
-- rename the current tables to _old
EXEC sp_rename *i*, TableName1_Old;
Oi, thanks in advance for any help or a point in the right direction to where I could do some further reading about the above online.
You can use sp_executesql with cursors for this type of work. Here is what I think you need:
Test objects:
CREATE TABLE TableName1 ( ID INT )
GO
CREATE TABLE TableName2 ( ID INT )
GO
CREATE TABLE TableNames ( Name NVARCHAR(MAX) )
GO
INSERT INTO TableNames
VALUES ( 'TableName1' ),
( 'TableName2' )
Script itself:
DECLARE @name NVARCHAR(MAX) ,
    @dropStatement NVARCHAR(MAX),
    @renameStatement NVARCHAR(MAX)
DECLARE cur CURSOR FAST_FORWARD READ_ONLY
FOR
    SELECT Name
    FROM dbo.TableNames
OPEN cur
FETCH NEXT FROM cur INTO @name
WHILE @@FETCH_STATUS = 0
BEGIN
    IF EXISTS ( SELECT *
                FROM INFORMATION_SCHEMA.TABLES
                WHERE TABLE_NAME = @name + '_Old' )
    BEGIN
        SET @dropStatement = 'DROP TABLE ' + @name + '_Old'
        EXEC sp_executesql @dropStatement
    END
    SET @renameStatement = 'sp_rename ' + @name + ', ' + @name + '_Old';
    EXEC sp_executesql @renameStatement
    FETCH NEXT FROM cur INTO @name
END
CLOSE cur
DEALLOCATE cur
After this you should add TableName1 and TableName2 again.
Cursors should be avoided whenever possible.
--Preparing a script which checks whether the old tables exist. If they do,
--it drops the old table.
--e.g. first the value 'Table1' is found in the TableUpdateList table.
--Then Table1_Old is dropped and Table1 is renamed to Table1_Old.
SELECT 'DROP TABLE ' + b.name + '_Old; EXEC sp_rename ''' + b.name + ''', ''' + b.name + '_Old'';' AS [Action]
INTO #Action
FROM INFORMATION_SCHEMA.TABLES a JOIN TableUpdateList b ON a.TABLE_NAME = b.name + '_Old'
DECLARE @sql VARCHAR(8000)
SELECT @sql = COALESCE(@sql + ' ', '') + [Action]
FROM #Action
SELECT @sql
--EXEC (@sql)
First verify the value of the variable @sql. Then uncomment the last line to execute the code.
SQL fiddle

Search SQL DB's For A Specific Word

I am completely new to SQL and have no experience whatsoever with it, so please bear with me on this question.
I need to know if it is possible to search a SQL database for a specific word and if so how?
We are currently going through a rebranding project and I need to look in our CMS (Content Management System) database for all references to an email address. All I need to search for is:
.co.uk
Below is a screenshot of the database in question with all the tables it contains. I just can't get my head around SQL and have had no joy on Google trying to find the answer.
I need to search everything in this database, but I don't know which tables, views, column names, etc. the content sits in, as it's spread across all of them.
There are other tables I need to search but hopefully an answer will be provided which I can modify to search these.
Databases aren't really meant for such vague search descriptions; you should have some definition, model, or requirement spec describing where values like that could exist.
But of course, you could opt for an insanely slow method of doing it by using dynamic SQL.
I made this quickly and only tested it briefly, but it should work:
SET NOCOUNT ON
IF OBJECT_ID('tempdb..#SEARCHTABLE') IS NOT NULL
    DROP TABLE #SEARCHTABLE
IF OBJECT_ID('tempdb..#RESULTS') IS NOT NULL
    DROP TABLE #RESULTS
CREATE TABLE #SEARCHTABLE (ROWNUM INT IDENTITY(1,1), SEARCHCLAUSE VARCHAR(2000) COLLATE DATABASE_DEFAULT)
-- Build one search query per (n)char/(n)varchar/xml column in every user table
INSERT INTO #SEARCHTABLE (SEARCHCLAUSE)
SELECT 'SELECT TOP 1 '''+TAB.name+''', '''+C.name+'''
    FROM ['+S.name+'].['+TAB.name+']
    WHERE '
    +CASE WHEN T.name <> 'xml'
        THEN '['+C.name+'] LIKE ''%.co.uk%'' AND ['+C.name+'] LIKE ''%@%'''
        ELSE 'CAST(['+C.name+'] AS VARCHAR(MAX)) LIKE ''%.co.uk%'' AND CAST(['+C.name+'] AS VARCHAR(MAX)) LIKE ''%@%'''
    END AS SEARCHCLAUSE
FROM sys.tables TAB
    JOIN sys.schemas S on S.schema_id = TAB.schema_id
    JOIN sys.columns C on C.object_id = TAB.object_id
    JOIN sys.types T on T.user_type_id = C.user_type_id
WHERE TAB.type_desc = 'USER_TABLE'
    AND (T.name LIKE '%char%' OR
         T.name LIKE '%xml%')
    AND CASE WHEN C.max_length = -1 THEN 10 ELSE C.max_length END >= 6 -- To only search through sufficiently long columns
CREATE TABLE #RESULTS (ROWNUM INT IDENTITY(1,1), TABLENAME VARCHAR(256) COLLATE DATABASE_DEFAULT, COLNAME VARCHAR(256) COLLATE DATABASE_DEFAULT)
DECLARE @ROWNUM_NOW INT, @ROWNUM_MAX INT, @SQLCMD VARCHAR(2000), @STATUSSTRING VARCHAR(256)
SELECT @ROWNUM_NOW = MIN(ROWNUM), @ROWNUM_MAX = MAX(ROWNUM) FROM #SEARCHTABLE
-- Run each search, storing hits and printing progress as we go
WHILE @ROWNUM_NOW <= @ROWNUM_MAX
BEGIN
    SELECT @SQLCMD = SEARCHCLAUSE FROM #SEARCHTABLE WHERE ROWNUM = @ROWNUM_NOW
    INSERT INTO #RESULTS
    EXEC(@SQLCMD)
    SET @STATUSSTRING = CAST(@ROWNUM_NOW AS VARCHAR(25))+'/'+CAST(@ROWNUM_MAX AS VARCHAR(25))+', time: '+CONVERT(VARCHAR, GETDATE(), 120)
    RAISERROR(@STATUSSTRING, 10, 1) WITH NOWAIT
    SELECT @ROWNUM_NOW = @ROWNUM_NOW + 1
END
SET NOCOUNT OFF
SELECT 'This table and column contains the strings ".co.uk" and "@"' INFORMATION, TABLENAME, COLNAME FROM #RESULTS
-- Uncomment to drop the created temp tables
--IF OBJECT_ID('tempdb..#SEARCHTABLE') IS NOT NULL
--    DROP TABLE #SEARCHTABLE
--IF OBJECT_ID('tempdb..#RESULTS') IS NOT NULL
--    DROP TABLE #RESULTS
What it does is search the DB for all user-created tables (with their schemas) that have (n)char/(n)varchar/xml columns of sufficient length, and then search each of them one by one until at least one match is found, at which point it moves on to the next one on the list. A match is defined as any string (or XML cast as a string) which contains the text ".co.uk" and an "@" sign somewhere in it.
It will show the progress of the script (how many searchable TABLE.COLUMN combinations have been found and which one on that list is currently running, as well as the current timestamp down to the second) on the Messages tab. When ready, it will show you all the table and column names that contained at least one match.
So from that list, you'll have to search through the tables and columns manually to find exactly how many and what kinds of matches there are, and decide what it is you actually want to do; a quick follow-up query like the one below works for that.
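For example (dbo.SomeTable and SomeColumn are placeholders; substitute a TABLENAME/COLNAME pair reported in #RESULTS):
-- Count and then inspect the actual matches in one reported table/column.
SELECT COUNT(*) AS MatchCount
FROM dbo.SomeTable
WHERE SomeColumn LIKE '%.co.uk%' AND SomeColumn LIKE '%@%';

SELECT *
FROM dbo.SomeTable
WHERE SomeColumn LIKE '%.co.uk%' AND SomeColumn LIKE '%@%';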
Edit: Again, I disregarded using sysname for the system object names, but I'll modify it later if needed.
I threw together a quick query that seems to work for me:
--Search for a word in the current database
SET NOCOUNT ON;
--First make a hit list of possible tables/ columns
DECLARE @HitList TABLE (
    Id INT IDENTITY(1,1) PRIMARY KEY,
    TableName VARCHAR(255),
    SchemaName VARCHAR(255),
    ColumnName VARCHAR(255));
INSERT INTO
    @HitList (
        TableName,
        SchemaName,
        ColumnName)
SELECT
    t.name,
    s.name,
    c.name
FROM
    sys.tables t
    INNER JOIN sys.columns c ON c.object_id = t.object_id
    INNER JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE
    c.system_type_id = 167;
--Construct Dynamic SQL
DECLARE @Id INT = 1;
DECLARE @Count INT;
SELECT @Count = COUNT(*) FROM @HitList;
DECLARE @DynamicSQL VARCHAR(1024);
WHILE @Id <= @Count
BEGIN
    DECLARE @TableName VARCHAR(255);
    DECLARE @SchemaName VARCHAR(255);
    DECLARE @ColumnName VARCHAR(255);
    SELECT @TableName = TableName FROM @HitList WHERE Id = @Id;
    SELECT @SchemaName = SchemaName FROM @HitList WHERE Id = @Id;
    SELECT @ColumnName = ColumnName FROM @HitList WHERE Id = @Id;
    SELECT @DynamicSQL = 'SELECT * FROM [' + @SchemaName + '].[' + @TableName + '] WHERE [' + @ColumnName + '] LIKE ''%co.uk%''';
    --PRINT @DynamicSQL;
    EXECUTE (@DynamicSQL);
    IF @@ROWCOUNT != 0
    BEGIN
        PRINT 'We have a hit in ' + @TableName + '.' + @ColumnName + '!!';
    END;
    SELECT @Id = @Id + 1;
END;
Basically it makes a list of all VARCHAR columns (you might need to change this to include NVARCHARs if you have Unicode text columns; just change the test on system_type_id from 167 to 231, or include both, as sketched below), then performs a search on each one. When you run this from Management Studio, switch to the Messages pane to see the hits and just ignore the results.
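If you want both types in one pass, a variant of the hit-list query might look like this (an illustration, not part of the original script):
-- Hit-list query covering both VARCHAR (167) and NVARCHAR (231) columns.
SELECT
    t.name,
    s.name,
    c.name
FROM
    sys.tables t
    INNER JOIN sys.columns c ON c.object_id = t.object_id
    INNER JOIN sys.schemas s ON s.schema_id = t.schema_id
WHERE
    c.system_type_id IN (167, 231);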
It will be slow if your database is any sort of size... but then that is to be expected?

Sql Server 2008 - How to query for status of fulltext catalogs on all user databases?

I have several databases in a SQL Server 2008 R2 instance. Some of those databases have a full-text enabled table. The name of the full-text table is the same in all databases, but the databases have different names and they are created on demand (I never know which databases exist and which don't).
The thing is: I need to query all catalogs in all databases to check whether a population is done, but I have no idea how many databases I have (of course I know, but they are created on demand as I said). The script must query all databases and check whether a population is done in a table (whose name I know, because it never changes; only the database name changes).
I have seen many people using things like:
sys.fulltext_catalogs
But it does not work if I am using the master database, for example.
Any ideas?
Edit: Here is more complete code with a cursor, a full list of databases (even those without catalogs), and the right catalog view name:
SET NOCOUNT ON;
DECLARE @sql NVARCHAR(MAX) = N'';
SELECT @sql += ' UNION ALL
    SELECT [name] = ''' + QUOTENAME(name) + ''',
        catalog_name = name COLLATE Latin1_General_CI_AI,
        is_importing
    FROM ' + QUOTENAME(name) + '.sys.fulltext_catalogs'
FROM sys.databases WHERE database_id > 4;
SET @sql = 'SELECT [database] = d.name,
        s.catalog_name,
        s.is_importing
    FROM sys.databases AS d
    LEFT OUTER JOIN (' + STUFF(@sql, 1, 10, '') + ') AS s
        ON QUOTENAME(d.name) = s.name
    WHERE d.database_id > 4;';
CREATE TABLE #temp(db SYSNAME, catalog_name NVARCHAR(255), is_importing BIT);
INSERT #temp EXEC sp_executesql @sql;
DECLARE @db SYSNAME, @catalog_name NVARCHAR(255), @is_importing BIT;
DECLARE c CURSOR LOCAL STATIC FORWARD_ONLY READ_ONLY
    FOR SELECT db, catalog_name, is_importing FROM #temp;
OPEN c;
FETCH NEXT FROM c INTO @db, @catalog_name, @is_importing;
WHILE @@FETCH_STATUS = 0
BEGIN
    IF @catalog_name IS NULL
    BEGIN
        PRINT 'No catalogs for ' + @db;
    END
    ELSE
    BEGIN
        IF @is_importing = 1
        BEGIN
            PRINT 'Do something to ' + @db + ' (importing)';
        END
        ELSE
        BEGIN
            PRINT @db + ' is not importing.';
        END
    END
    FETCH NEXT FROM c INTO @db, @catalog_name, @is_importing;
END
CLOSE c;
DEALLOCATE c;
DROP TABLE #temp;
This gives you a complete list of used catalogs.
CREATE TABLE #info (
databasename VARCHAR(128)
, [Fulltext Catalog Name] VARCHAR(128));
SET NOCOUNT ON;
INSERT INTO #info
EXEC sp_MSforeachdb 'use ?
SELECT ''?''
, name
FROM sys.fulltext_catalogs;'
SELECT * FROM #info
-- get rid of temp table
DROP TABLE #info;