Select from all tables in all databases with a specific name - sql

I need to query all tables with a specific name across all databases on a server.
The databases are created daily by ISA, and their names are generated from the mask ISALOG_current_date_WEB_000. Each database contains a WebProxyLog table. There are 60 databases in total.
My goal is to query the WebProxyLog table in all databases, or in the databases for specific dates.
Something like a foreach loop:
foreach($db in $databases)
{
if($db.Name.Contains("_web"))
{
SELECT [ClientUserName],[logTime],[uri],[UrlDestHost],[bytesrecvd],[bytessent],[rule]
FROM [$db].[dbo].[WebProxyLog]
WHERE [ClientUserName] like ''%username%''
}
}
Ideally, the results of the query would be merged into a single table or view.
Is there a way to do that?

There is an undocumented stored procedure called sp_MSForEachDB; however, I would not rush to use undocumented features. This can be done with dynamic SQL that gets the database names from the sys.databases catalog view:
DECLARE @SQL nvarchar(max) = N''
SELECT @SQL = @SQL +
'UNION ALL
SELECT ['+ name +'] As DBName, [ClientUserName],[logTime],[uri],[UrlDestHost],[bytesrecvd],[bytessent],[rule]
FROM [' + name + '].[dbo].[WebProxyLog]
WHERE [ClientUserName] like ''%username%''
'
FROM sys.DataBases
WHERE name LIKE '%ISALOG%WEB%'
SET @SQL = STUFF(@SQL, 1, 10, '') + ' ORDER BY DBName'
PRINT @SQL
--EXEC(@SQL)
Once you've printed the SQL and tested it, you can remove the PRINT row and un-comment the EXEC row.
Further reading: Aaron Bertrand's "Bad habits to kick: relying on undocumented behavior", and his answer to a question on SO about sp_MSForEachDB.
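For comparison only, here is a rough sketch of how the undocumented sp_MSForEachDB route would look (the ? placeholder is replaced with each database name; since the procedure is undocumented, its behavior is not guaranteed between versions):
EXEC sp_MSForEachDB N'
IF ''?'' LIKE ''%ISALOG%WEB%''
    SELECT ''?'' AS DBName, [ClientUserName],[logTime],[uri],[UrlDestHost],[bytesrecvd],[bytessent],[rule]
    FROM [?].[dbo].[WebProxyLog]
    WHERE [ClientUserName] LIKE ''%username%'';'
Note that this returns one result set per matching database rather than a single merged result, which is another reason to prefer the dynamic SQL above.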
Edit: a small correction to the SELECT:
'UNION ALL
SELECT [ClientUserName],[logTime],[uri],[UrlDestHost],[bytesrecvd],[bytessent],[rule]
FROM [' + name + '].[dbo].[WebProxyLog]
WHERE [ClientUserName] like ''%username%''
'
So as a result it prints a listing of the queries against the tables, right?

Related

Dynamic SQL w/ Loop Over All Columns in a Table

I recently was pulled off of an ASP.net conversion project at my new job to help with a rather slow, mundane, but desperate task another department is handling. Basically, they are using a simple SQL script on every column of every table in every database (it's horrible) to generate a count of all of the distinct records on each table for each column. My SQL experience is limited and my dynamic SQL experience is zero, more or less, but since I have not been given permissions yet to even access this particular database I went to work attempting to formulate a more automated query to perform this task, testing on a database I do have access to.
In short, I ran into some issues and I was hoping someone might be able to help me fill in the blanks. It'll save this department more than a month of speculated time if something more automated can be utilized.
These are the two scripts I was given and told to run on each column. The first one was for any non-bit/boolean, non-datetime column. The second was to be used for any datetime column.
select columnName, count(*) qty
from tableName
group by columnName
order by qty desc

select year(a.columnName), count(*) qty
from tableName a
group by year(a.columnName)
order by qty desc
Doing this thousands of times doesn't seem like a lot of fun to me, so here is more or less some pseudo-code that I came up with that I think could solve the issue, I will point out which areas I am fuzzy on.
declare @sql nvarchar(2500)
set @sql = 'the first part(s) of statement'
[pseudo-pseudo] Get "List" of All Column Names in Table (I do not believe there is a Collection datatype in SQL code, but you get the idea)
[pseudo-pseudo] Loop Through "List" of Column Names
(I know this dot notation wouldn't work, but I would like to perform something similar to this)
IF ColumnName.DataType LIKE 'date%'
set @sql = @sql + ' something'
IF ColumnName.DataType = bit
set @sql = @sql + ' something else' --actually it'd be preferable to skip bit/boolean datatypes if possible as these aren't necessary for the reports being created by these queries
ELSE
set @sql = @sql + ' something other than something else'
set @sql = @sql + ' ending part of statement'
EXEC(@sql)
So to summarize, for simplicity's sake I'd like to let the user plug the table's name into a variable at the start of the query:
declare @tableName nvarchar(50)
set @tableName = 'TABLENAME' --Enter Query's Table Name Here
Based on this, the code will loop through every column of that table, checking for datatype. If the datatype is a datetime (or other date like datatype), the "year" code would be added to the dynamic SQL. If it is anything else (except bit/boolean), then it will add the default logic to the dynamic SQL code.
Again, for simplicity's sake (even if it is bad practice) I figure the end result will be a dynamic SQL statement with multiple selects, one for each column in the table. Then the user would simply copy the output to excel (which they are doing right now anyway). I know this isn't the perfect solution so I am open to suggestions, but since time is of the essence and my experience with dynamic SQL is close to null, I thought a somewhat quick and dirty approach would be tolerable in this case.
I do apologize for my very haphazard preparation with this question but I do hope someone out there might be able to steer me in the right direction.
Thanks so much for your time, I certainly appreciate it.
Here's an example working through all the suggestions in the comments.
declare @sql nvarchar(max);
declare stat_cursor cursor local fast_forward for
select
case when x.name not in ('date', 'datetime2', 'smalldatetime', 'datetime') then
N'select
' + quotename(s.name, '''') + ' as schema_name,
' + quotename(t.name, '''') + ' as table_name,
' + quotename(c.name) + ' as column_name,
count(*) qty
from
' + quotename(s.name) + '.' + quotename(t.name) + '
group by
' + quotename(c.name) + '
order by
qty desc;'
else
N'select
' + quotename(s.name, '''') + ' as schema_name,
' + quotename(t.name, '''') + ' as table_name,
year(' + quotename(c.name) + ') as column_name,
count(*) qty
from
' + quotename(s.name) + '.' + quotename(t.name) + '
group by
year(' + quotename(c.name) + ')
order by
qty desc;'
end
from
sys.schemas s
inner join
sys.tables t
on s.schema_id = t.schema_id
inner join
sys.columns c
on c.object_id = t.object_id
inner join
sys.types x
on c.system_type_id = x.user_type_id
where
x.name not in (
'geometry',
'geography',
'hierarchyid',
'xml',
'timestamp',
'bit',
'image',
'text',
'ntext'
);
open stat_cursor;
fetch next from stat_cursor into @sql;
while @@fetch_status = 0
begin
exec sp_executesql @sql;
fetch next from stat_cursor into @sql;
end;
close stat_cursor;
deallocate stat_cursor;
Example SQLFiddle (note this only shows the first iteration through the cursor. Not sure if this is a limitation of SQLFiddle or a bug).
I'd probably stash the results in a separate database if I were doing this. Also, I'd probably put the SQL-building bits into user-defined functions for maintainability (the slow bit will be running the queries; there's no point optimizing their generation).
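If the separate result sets are awkward to consume, one variation (a sketch only; the staging table here is illustrative and not part of the code above) is to land everything in a single table by wrapping each execution in an INSERT ... EXEC:
-- Illustrative staging table; each generated statement returns
-- schema_name, table_name, the grouped value, and qty.
CREATE TABLE dbo.ColumnStats
(
    schema_name   sysname,
    table_name    sysname,
    grouped_value sql_variant,  -- the column value (or its YEAR); (n)varchar(max) columns would need extra handling
    qty           int
);
-- Then, inside the cursor loop, replace the plain EXEC with:
-- INSERT dbo.ColumnStats EXEC sp_executesql @sql;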

Get list of all databases that have a view named 'foo' in them

I have a few servers that have a bunch of databases in them.
Some of the databases have a view called vw_mydata.
What I want to do is create a list of all databases containing a view named vw_mydata, and then execute that view and store its contents in a table that then contains all the data from all the vw_mydata views.
I know I can find all the databases containing that view using
sp_msforeachdb 'select "?" AS dbName from [?].sys.views where name like ''vw_mydata'''
But then I have as many recordsets as I have databases. How do I use that to loop through the databases?
What I would prefer is a single neat list of the database names that I can then store in a result set. Then it would be pretty straightforward.
I have thought about running the above TSQL and storing the results in a table, but I would like to keep it all in one SSIS package and not have all kinds of tables/procedures lying around. Can I use a #table in an Execute SQL Task in SSIS?
DECLARE @Tsql VARCHAR(MAX)
SET @Tsql = ''
SELECT @Tsql = @Tsql + 'SELECT ''' + d.name + ''' AS dbName FROM [' + d.name + '].sys.views WHERE name LIKE ''vw_mydata'' UNION '
FROM master.sys.databases d
--"trim" the last UNION from the end of the tsql.
SET @Tsql = LEFT(@Tsql, LEN(@Tsql) - 6)
PRINT @Tsql
--Uncomment when ready to proceed
--EXEC (@Tsql)
To use a temp table in SSIS, you'll need to use a global temp table (##TABLE).
On the properties for the connection, I'm pretty sure you'll need to set RetainSameConnection to TRUE.
On the SQL task after you create the temp table, you'll need to set DelayValidation to TRUE.
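Putting that together with the dynamic SQL above, a minimal sketch (the temp table name is illustrative) of what the first Execute SQL Task might run:
-- First Execute SQL Task (connection has RetainSameConnection = TRUE):
IF OBJECT_ID('tempdb..##DbsWithView') IS NOT NULL
    DROP TABLE ##DbsWithView;
CREATE TABLE ##DbsWithView (dbName sysname);

-- @Tsql built exactly as above, in the same batch:
INSERT ##DbsWithView (dbName)
EXEC (@Tsql);

-- Later tasks (with DelayValidation = TRUE) can then simply read from it:
-- SELECT dbName FROM ##DbsWithView;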

Running the same SQL code against a number of tables sequentially

I have a number of tables (around 40) containing snapshot data about 40 million plus vehicles. Each snapshot table is at a specific point in time (the end of the quarter) and is identical in terms of structure.
Whilst most of our analysis is against single snapshots, on occasion we need to run some analysis against all the snapshots at once. For instance, we may need to build a new table containing all the Ford Focus cars from every single snapshot.
To achieve this we currently have two options:
a) write a long, long, long batch file repeating the same code over and over again, just changing the FROM clause
[drawbacks - it takes a long time to write, and changing a single line of code in one of the blocks requires fiddly changes in all the other blocks]
b) use a view to union all the tables together and query that instead
[drawbacks - our tables are stored in separate database instances and cannot be indexed, plus the resulting view is something like 600 million records long by 125 columns wide, so is incredibly slow]
So, what I would like to find out is whether I can either use dynamic sql or put the SQL into a loop to spool through all tables. This would be something like:
for each *table* in TableList
INSERT INTO output_table
SELECT *table* as OriginTableName, Make, Model
FROM *table*
next *table* in TableList
Is this possible? This would mean that updating the original SQL when our client changes what they need (a very regular occurrence!) would be very simple and we would benefit from all the indexes we already have on the original tables.
Any pointers, suggestions or help will be much appreciated.
If you can identify your tables (e.g. a naming pattern), you could simply say:
DECLARE @sql NVARCHAR(MAX);
SELECT @sql = N'';
SELECT @sql = @sql + 'INSERT output_table SELECT ''' + name + ''', Make, Model
FROM dbo.' + QUOTENAME(name) + ';'
FROM sys.tables
WHERE name LIKE 'pattern%';
-- or WHERE name IN ('t1', 't2', ... , 't40');
EXEC sp_executesql @sql;
This assumes they're all in the dbo schema. If they're not, the adjustment is easy... just replace dbo with ' + QUOTENAME(SCHEMA_NAME([schema_id])) + '...
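Spelled out, that adjustment would make the build step look roughly like this (same query as above, with the schema name made dynamic):
SELECT @sql = @sql + 'INSERT output_table SELECT ''' + name + ''', Make, Model
  FROM ' + QUOTENAME(SCHEMA_NAME([schema_id])) + '.' + QUOTENAME(name) + ';'
FROM sys.tables
WHERE name LIKE 'pattern%';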
In the end I used two methods:
Someone on another forum suggested making use of sp_msforeachtable and a table which contains all the table names. Their suggestion was:
create table dbo.OutputTable (OriginTableName nvarchar(500), RecordCount INT)
create table dbo.TableList (Name nvarchar (500))
insert dbo.TableList
select '[dbo].[swap]'
union select '[dbo].[products]'
union select '[dbo].[structures]'
union select '[dbo].[stagingdata]'
exec sp_msforeachtable @command1 = 'INSERT INTO dbo.OutputTable SELECT ''?'', COUNT(*) from ?'
,@whereand = 'and syso.object_id in (select object_id(Name) from dbo.TableList)'
select * from dbo.OutputTable
This works perfectly well for some queries, but seems to suffer from the fact that one cannot use a GROUP BY clause within the query (or, at least, I could not find a way to do this).
The final solution I used was to use Dynamic SQL with a lookup table containing the table names. In a very simple form, this looks like:
DECLARE @TableName varchar(500)
DECLARE @curTable CURSOR
DECLARE @sql NVARCHAR(1000)
SET @curTable = CURSOR FOR
SELECT [Name] FROM Vehicles_LookupTables.dbo.AllStockTableList
OPEN @curTable
FETCH NEXT
FROM @curTable INTO @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
SET @sql = 'SELECT ''' + @TableName + ''', Make, sum(1) as Total FROM ' + @TableName + ' GROUP BY Make'
EXEC sp_executesql @sql
FETCH NEXT
FROM @curTable INTO @TableName
END
CLOSE @curTable
DEALLOCATE @curTable

Accessing 400 tables in a single query

I want to delete rows with a condition from multiple tables.
DELETE
FROM table_1
WHERE lst_mod_ymdt = '2011-01-01'
The problem is that the number of tables is 400, from table_1 to table_400.
Can I apply the query to all the tables in a single query?
If you're using SQL Server 2005 or later, you can try something like this (other versions and RDBMSs also have similar ways to do this):
DECLARE @sql VARCHAR(MAX)
SET @sql = (SELECT 'DELETE FROM [' + REPLACE(Name, '''','''''') + '] WHERE lst_mod_ymdt = ''' + @lst_mod_ymdt + ''';' FROM sys.tables WHERE Name LIKE 'table_%' FOR XML PATH(''))
--PRINT @sql;
EXEC ( @sql );
And as always with dynamic SQL, remember to escape the ' character.
This will likely fall over if you have, say, a table_341 that doesn't have a lst_mod_ymdt column.
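One way to guard against that (a sketch; it uses QUOTENAME instead of the REPLACE above, and the variable declarations are added here for completeness) is to only generate DELETEs for tables that actually have the column:
DECLARE @sql VARCHAR(MAX)
DECLARE @lst_mod_ymdt VARCHAR(10)
SET @lst_mod_ymdt = '2011-01-01'

SET @sql = (SELECT 'DELETE FROM ' + QUOTENAME(t.name) +
                   ' WHERE lst_mod_ymdt = ''' + @lst_mod_ymdt + ''';'
            FROM sys.tables t
            WHERE t.name LIKE 'table[_]%'
              AND EXISTS (SELECT 1 FROM sys.columns c
                          WHERE c.object_id = t.object_id
                            AND c.name = 'lst_mod_ymdt')
            FOR XML PATH(''))

--PRINT @sql;
EXEC (@sql);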

SQL Dynamic SELECT statement from values stored in a table

I have been researching this for a couple of days and feel like I am going around in circles. I have basic knowledge of SQL but there are many areas I do not understand.
I have a table that stores the names and fields of all the other tables in my database.
tblFields
===================================================
TableName FieldName BookmarkName
---------------------------------------------------
Customer FirstName CustomerFirstName
Customer LastName CustomerLastName
Customer DOB CustomerDOB
I want to write a SELECT statement like the following, but I am unable to get it to work:
SELECT (SELECT [FieldName] FROM [TableName]) FROM tblFields
Is this possible? The application I have developed requires this for user customization of reports.
If I understand what you are trying to do, I think this will help you. It is not pretty and it works for SQL Server 2005 and above, but maybe this is what you are looking for:
declare @tableName nvarchar(100)
declare @sqlQuery nvarchar(max)
declare @fields varchar(500)
set @tableName = 'YourTableName'
set @fields = ''
select @fields = @fields + QUOTENAME(t.fieldname) + ',' from (
select distinct fieldname from tblfields where tablename = @tableName)t
set @sqlQuery = 'select ' + left(@fields, LEN(@fields)-1) + ' from ' + QUOTENAME(@tableName)
execute sp_executesql @sqlQuery
Edit: As Martin suggested, I edited it so that the column and table names use QUOTENAME.
If I understand correctly what you are trying to do, you are probably better off doing this as two separate queries from your program: one that gets the fields you want to select, which you then use in your program to build up the second query that actually gets the data.
If it must be done entirely in SQL, then you will need to tell us what database you are using. If it is SQL Server, you might be able to use a cursor over the first query to build up the second query, which you then execute with the sp_executesql stored procedure. But doing it outside of SQL would be recommended.
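For what it's worth, a sketch of that cursor idea for SQL Server (tblFields and its columns come from the question; the table name and everything else here is illustrative):
DECLARE @tableName nvarchar(100)
DECLARE @fieldName sysname
DECLARE @cols nvarchar(max)
DECLARE @sql nvarchar(max)

SET @tableName = 'Customer'  -- illustrative; assumes at least one matching row in tblFields
SET @cols = N''

DECLARE col_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT DISTINCT FieldName FROM tblFields WHERE TableName = @tableName
OPEN col_cur
FETCH NEXT FROM col_cur INTO @fieldName
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cols = @cols + QUOTENAME(@fieldName) + N', '
    FETCH NEXT FROM col_cur INTO @fieldName
END
CLOSE col_cur
DEALLOCATE col_cur

-- trim the trailing ', ' and build the final query
SET @sql = N'SELECT ' + LEFT(@cols, LEN(@cols) - 1) + N' FROM ' + QUOTENAME(@tableName)
EXEC sp_executesql @sql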