Accessing 400 tables in a single query

I want to delete rows with a condition from multiple tables.
DELETE
FROM table_1
WHERE lst_mod_ymdt = '2011-01-01'
The problem is that the number of tables is 400, from table_1 to table_400.
Can I apply the query to all the tables in a single query?

If you're using SQL Server 2005 or later you can try something like this (other versions and RDBMSs have similar ways to do it):
DECLARE @lst_mod_ymdt VARCHAR(10);
SET @lst_mod_ymdt = '2011-01-01'; -- the date to purge
DECLARE @sql VARCHAR(MAX);

SET @sql = (SELECT 'DELETE FROM [' + REPLACE(Name, '''', '''''') + '] WHERE lst_mod_ymdt = ''' + @lst_mod_ymdt + ''';'
            FROM sys.tables
            WHERE Name LIKE 'table[_]%' -- [_] escapes the underscore wildcard
            FOR XML PATH(''));

--PRINT @sql;
EXEC (@sql);
And as always with dynamic SQL, remember to escape the ' character.
This will likely fall over if you have, say, a table_341 that doesn't have a lst_mod_ymdt column.
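A minimal sketch of that safeguard, assuming the same table_N naming scheme: join sys.tables to sys.columns so the script is only generated for tables that actually have the column.
DECLARE @lst_mod_ymdt VARCHAR(10);
SET @lst_mod_ymdt = '2011-01-01';
DECLARE @sql VARCHAR(MAX);

-- Only generate DELETEs for tables that actually have a lst_mod_ymdt column
SET @sql = (SELECT 'DELETE FROM [' + t.name + '] WHERE lst_mod_ymdt = ''' + @lst_mod_ymdt + ''';'
            FROM sys.tables t
            JOIN sys.columns c
              ON c.object_id = t.object_id
             AND c.name = 'lst_mod_ymdt'
            WHERE t.name LIKE 'table[_]%'
            FOR XML PATH(''));

--PRINT @sql;
EXEC (@sql);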


Select from all tables in all databases with specific name

I need to query all tables with a specific name across all databases on a server.
The databases are created daily by ISA, and their names are generated from the mask ISALOG_current_date_WEB_000. Each database contains a WebProxyLog table. There are 60 databases in total.
My goal is to query the WebProxyLog table in all databases, or in the databases for specific dates.
Something like a foreach loop:
foreach($db in $databases)
{
    if($db.Name.Contains("_web"))
    {
        SELECT [ClientUserName],[logTime],[uri],[UrlDestHost],[bytesrecvd],[bytessent],[rule]
        FROM [$db].[dbo].[WebProxyLog]
        WHERE [ClientUserName] like '%username%'
    }
}
Ideally, the result of the query would be merged into a single table or view.
Is there a way to do that?
There is an undocumented stored procedure called sp_MSForEachDB; however, I would not rush to use undocumented features. This can be done with dynamic SQL that gets the database names from the sys.databases catalog view:
DECLARE @SQL nvarchar(max) = N''

SELECT @SQL = @SQL +
'UNION ALL
SELECT ''' + name + ''' As DBName, [ClientUserName],[logTime],[uri],[UrlDestHost],[bytesrecvd],[bytessent],[rule]
FROM [' + name + '].[dbo].[WebProxyLog]
WHERE [ClientUserName] like ''%username%''
'
FROM sys.databases
WHERE name LIKE '%ISALOG%WEB%'

SET @SQL = STUFF(@SQL, 1, 10, '') + ' ORDER BY DBName' -- strip the leading UNION ALL
PRINT @SQL
--EXEC(@SQL)
Once you've printed the SQL and tested it, you can remove the PRINT line and un-comment the EXEC line.
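The question also asks for the databases of specific dates. Assuming the date is embedded in the database name (the exact mask format isn't shown, so the pattern below is an assumption), you can simply tighten the LIKE filter, for example:
-- Hypothetical: list only one day's databases, assuming the mask embeds
-- the date as e.g. ISALOG_2017_01_15_WEB_000 (exact format is assumed)
SELECT name
FROM sys.databases
WHERE name LIKE 'ISALOG[_]2017[_]01[_]15[_]WEB[_]%';
Swap that pattern into the WHERE clause of the script above to generate the UNION for just those databases.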
Further reading: Aaron Bertrand's Bad habits to kick: relying on undocumented behavior, and his answer to a question on SO about sp_MSForEachDB.
Edit: a small correction of the SELECT (this variant drops the DBName column; if you use it, drop the ORDER BY DBName as well):
'UNION ALL
SELECT [ClientUserName],[logTime],[uri],[UrlDestHost],[bytesrecvd],[bytessent],[rule]
FROM [' + name + '].[dbo].[WebProxyLog]
WHERE [ClientUserName] like ''%username%''
'
As a result it prints a listing of the queries against the tables, right?

Update data in a SQL Server temp table where the column names are unknown

In a stored procedure I dynamically create a temp table by selecting the names of applications from a regular table. Then I add a date column and fill it with the last 12 months. (A screenshot of the resulting temp table appeared here in the original post.)
So far so good. Now I want to update the data in the columns by querying another regular table. Normally it would be something like:
UPDATE ##TempTable
SET [columnName] = (SELECT SUM(columnName)
                    FROM RegularTable
                    WHERE FORMAT(RegularTable.Date,'MM/yyyy') = FORMAT(##TempMonths.x,'MM/yyyy'))
However, since I don't know what the name of the columns are at any given time, I need to do this dynamically.
So my question is, how can I get the column names of a temp table dynamically while doing an update?
Thanks!
I think you can use something like the following.
select name as 'ColumnName'
from tempdb.sys.columns
where object_id = object_id('tempdb..##TempTable');
And then generate dynamic SQL using something like the following.
DECLARE @tableName nvarchar(50)
SET @tableName = 'RegularTable'

DECLARE @sql NVARCHAR(MAX)
SET @sql = ''

SELECT @sql = @sql + ' UPDATE ##TempTable ' + CHAR(13) +
    ' SET [' + c.name + '] = (SELECT SUM([' + c.name + ']) ' + CHAR(13) +
    ' FROM ' + @tableName + CHAR(13) +
    ' WHERE FORMAT(' + @tableName + '.Date,''MM/yyyy'') = FORMAT(##TempMonths.x,''MM/yyyy''));' + CHAR(13)
from tempdb.sys.columns c
where object_id = object_id('tempdb..##TempTable');

print @sql
-- exec sp_executesql @sql;
The PRINT statement in the above snippet shows that the @sql variable contains the following text.
UPDATE ##TempTable
SET [Test Application One] = (SELECT SUM([Test Application One])
FROM RegularTable
WHERE FORMAT(RegularTable.Date,'MM/yyyy') = FORMAT(##TempMonths.x,'MM/yyyy'));
UPDATE ##TempTable
SET [Test Application Two] = (SELECT SUM([Test Application Two])
FROM RegularTable
WHERE FORMAT(RegularTable.Date,'MM/yyyy') = FORMAT(##TempMonths.x,'MM/yyyy'));
So now you use sp_executesql to run the updates (un-comment it in the snippet above).
exec sp_executesql @sql;
If it's a one-time UPDATE you can PRINT the dynamic SQL statement (as shown above) and then execute it in an SSMS query window.
I recommend using the PRINT statement first to make sure the generated UPDATE statements are what you want, and only then running sp_executesql (or the printed UPDATE statements in the query window).
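One caveat: tempdb.sys.columns will also return the date column you added, so the generated script would try to SUM it. A minimal sketch of a tighter column list, assuming the application columns are numeric (the type list here is an assumption):
-- List only the numeric columns of ##TempTable, so the generated
-- UPDATEs skip the date column (the type list is an assumption)
select c.name
from tempdb.sys.columns c
join tempdb.sys.types ty on ty.user_type_id = c.user_type_id
where c.object_id = object_id('tempdb..##TempTable')
  and ty.name in ('int', 'bigint', 'decimal', 'numeric', 'money', 'float');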

SQL Server 2008: create trigger across all tables in db

Using SQL Server 2008, I've created a database where every table has a datetime column called "CreatedDt". What I'd like to do is create a trigger for each table so that when a value is inserted, the CreatedDt column is populated with the current date and time.
If you'll pardon my pseudocode, what I'm after is the T-SQL equivalent of:
foreach (Table in MyDatabase)
{
    create trigger CreatedDtTrigger
    {
        on insert createddt = datetime.now;
    }
}
If anyone would care to help out, I'd greatly appreciate it. Thanks!
As @EricZ says, the best thing to do is bind a default to the column. Here's how you'd add one to every table using a cursor and dynamic SQL:
declare @table sysname, @cmd nvarchar(max)

declare c cursor for
    select name from sys.tables where is_ms_shipped = 0 order by name

open c; fetch next from c into @table

while @@fetch_status = 0
begin
    set @cmd = 'ALTER TABLE ' + quotename(@table) + ' ADD CONSTRAINT DF_' + @table + '_CreatedDt DEFAULT GETDATE() FOR CreatedDt'
    exec sp_executesql @cmd
    fetch next from c into @table
end

close c; deallocate c
No need for a cursor. Just copy the result of the query below and execute it:
select distinct 'ALTER TABLE '+ t.name +
' ADD CONSTRAINT DF_'+t.name+'_crdt DEFAULT getdate() FOR '+ c.name
from sys.tables t
inner join sys.columns c on t.object_id=c.object_id
where c.name like '%your column name%'
Here's another method:
DECLARE @SQL nvarchar(max);
SELECT @SQL = Coalesce(@SQL + '
', '')
+ 'ALTER TABLE ' + QuoteName(T.TABLE_SCHEMA) + '.' + QuoteName(T.TABLE_NAME)
+ ' ADD CONSTRAINT ' + QuoteName('DF_'
+ CASE WHEN T.TABLE_SCHEMA <> 'dbo' THEN T.Table_Schema + '_' ELSE '' END
+ C.COLUMN_NAME) + ' DEFAULT (GetDate()) FOR ' + QuoteName(C.COLUMN_NAME)
+ ';'
FROM
INFORMATION_SCHEMA.TABLES T
INNER JOIN INFORMATION_SCHEMA.COLUMNS C
ON T.TABLE_SCHEMA = C.TABLE_SCHEMA
AND T.TABLE_NAME = C.TABLE_NAME
WHERE
C.COLUMN_NAME = 'CreatedDt'
;
EXEC (@SQL);
This yields, and runs, a series of statements similar to the following:
ALTER TABLE [schema].[TableName] -- (line break added)
ADD CONSTRAINT [DF_schema_TableName] DEFAULT (GetDate()) FOR [ColumnName];
Some notes:
This uses the INFORMATION_SCHEMA views. It is best practice to use these where possible instead of the system tables, because they are guaranteed not to change between versions of SQL Server (and moreover are supported on many DBMSes, so all things being equal it's best to use standards-compliant/portable code).
In a database with a case-sensitive default collation, you MUST use upper case for the INFORMATION_SCHEMA view names and column names.
When generating script it's important to pay attention to schema names and proper escaping (using QuoteName). Not doing so will break in someone's system someday.
I think it is best practice to put the DEFAULT expression inside parentheses. While no error is raised without them in this case, with them in place nothing will break if GetDate() is ever replaced with a more complex expression.
If you decide that column defaults won't work for you, the triggers you imagined are still possible, but they take serious work: you have to manage whether each trigger already exists and ALTER or CREATE it appropriately, JOIN to the inserted pseudo-table inside the trigger, and do so on the full list of primary key columns for the table (if a primary key exists; if it doesn't, you're out of luck). It is quite possible, but extremely difficult--you can end up with nested, nested, nested dynamic SQL. I have one such automated object-creating script that contains 13 quote marks in a row...
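For illustration only, here is roughly the shape of one generated trigger for a single table; dbo.Orders and its key column OrderID are hypothetical names the generator would have to substitute per table:
-- Hypothetical example of the per-table trigger such a generator would emit.
-- The table dbo.Orders and its primary key OrderID are assumed names.
CREATE TRIGGER dbo.CreatedDtTrigger_Orders
ON dbo.Orders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE t
    SET CreatedDt = GETDATE()
    FROM dbo.Orders AS t
    INNER JOIN inserted AS i
        ON i.OrderID = t.OrderID; -- must cover the table's full primary key
END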

Checking whether conditions are met by all rows with dynamic SQL

I have a table in SQL Server 2008 which contains custom validation criteria in the form of expressions stored as text, e.g.
StagingTableID  CustomValidation
--------------  ------------------------------------------------------------------
1               LEN([mobile])<=30
3               [Internal/External] IN ('Internal','External')
3               ([Internal/External] <> 'Internal') OR (LEN([Contact Name])<=100)
...
I am interested in determining whether all rows in a table pass the conditional statement. For this purpose I am writing a validation stored procedure which checks whether all values in a given field in a given table meet the given condition(s). SQL is not my forte, so after reading this question, here is my first stab at the problem:
EXEC sp_executesql N'SELECT @passed = 0 WHERE EXISTS (' +
    N'SELECT * FROM (' +
    N'SELECT CASE WHEN ' + @CustomValidationExpr + N' THEN 1 ' +
    N'ELSE 0 END AS ConditionalTest ' +
    N'FROM ' + @StagingTableName +
    N')t ' +
    N'WHERE t.ConditionalTest = 0)'
    ,N'@passed BIT OUTPUT'
    ,@passed = @PassedCustomValidation OUTPUT
However, I'm not sure if the nested queries can be re-written as one, or if there is an entirely better way for testing for validity of all rows in this scenario?
Thanks in advance!
You should be able to reduce this by at least one subquery, like this (note that sp_executesql won't accept a concatenated expression as its statement argument, so build the statement into a variable first):
DECLARE @stmt nvarchar(max);
SET @stmt = N'SELECT @passed = 0 WHERE EXISTS (' +
            N'SELECT 1 FROM ' + @StagingTableName +
            N' WHERE NOT (' + @CustomValidationExpr + N'))';

EXEC sp_executesql @stmt
    ,N'@passed BIT OUTPUT'
    ,@passed = @PassedCustomValidation OUTPUT
Before we answer the original question: have you looked into implementing constraints? These would prevent bad data from entering your database in the first place. Or is the point that the rules must be set dynamically by the application?
ALTER TABLE StagingTable
WITH CHECK ADD CONSTRAINT [StagingTable$MobileValidLength]
CHECK (LEN([mobile])<=30)
GO
ALTER TABLE StagingTable
WITH CHECK ADD CONSTRAINT [StagingTable$InternalExternalValid]
CHECK ([Internal/External] IN ('Internal','External'))
GO
--etc...
You need to concatenate the expressions together. I agree with @PinnyM that a WHERE clause is easier for full-table validation. However, the next question will be how to identify which rows fail which tests. I'll wait for you to ask that question before answering it (ask it as a separate question rather than as an edit to this one).
To create the where clause, something like this:
declare @WhereClause nvarchar(max);

select @WhereClause = (select CustomValidation + ' and '
                       from Validations v
                       for xml path ('')
                      ) + '1=1';

-- for xml path('') escapes < and > as &lt; and &gt;, so undo that:
select @WhereClause = replace(replace(@WhereClause, '&lt;', '<'), '&gt;', '>');
This strange construct, with the for xml path('') and the double select, is the most convenient way to concatenate values in SQL Server.
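For the three sample validations above, @WhereClause would come out roughly as follows (note that an expression with a top-level OR, like the third sample, would need an extra pair of wrapping parentheses to survive AND's higher precedence):
-- Approximate contents of @WhereClause for the sample rows
LEN([mobile])<=30 and
[Internal/External] IN ('Internal','External') and
([Internal/External] <> 'Internal') OR (LEN([Contact Name])<=100) and
1=1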
Also, put together your query before making the sp_executesql call. It gives you more flexibility:
declare @sql nvarchar(max);

select @sql = '
select @passed = count(*)
from ' + @StagingTableName + '
where ' + @WhereClause;
That query yields the number of rows that pass all validation tests. The where clause for the failures is:
declare @WhereClause nvarchar(max);

select @WhereClause = (select 'not ' + CustomValidation + ' or '
                       from Validations v
                       for xml path ('')
                      ) + '1=0';
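A hypothetical usage sketch, plugging the failure clause into the same pattern to list the offending rows (@StagingTableName and the Validations table are assumed from the snippets above, and the same &lt;/&gt; fix-up applies):
declare @sql nvarchar(max);

-- list every row that fails at least one validation
select @sql = '
select *
from ' + @StagingTableName + '
where ' + @WhereClause;

exec sp_executesql @sql;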

Is there a way to replace a character or string in all fields without writing it for each field?

I will warn you up front, this question borders on silly, but I'm asking anyway.
The impetus for my question is creating a CSV from a query result when some of the fields already contain commas. Obviously, the CSV doesn't know any better and just merrily jacks up my good mood by leaving some stragglers in non-field columns.
I know I can write
Replace(FieldName, OldChar, NewChar)
for each field, but I'm more curious than anything whether there's a shortcut to replace them all in the query output.
Basically what I'm looking for (logically) is:
Replace(AllFields, OldChar, NewChar)
I don't know all of the SQL tricks (or many of them), so I thought the SO community might be able to enlighten me...or call me nuts.
There is no SQL syntax to do what you describe, but as you've seen there are many ways to do it with dynamic SQL. Here's the way I prefer (this assumes you want to replace commas with pipes; change as you see fit):
DECLARE @table NVARCHAR(511),
    @newchar NCHAR(1),
    @sql NVARCHAR(MAX);

SELECT @table = N'dbo.table_name',
    @newchar = N'|', -- tailor accordingly
    @sql = N'';

SELECT @sql = @sql + ',
' + QUOTENAME(name)
    + ' = REPLACE(CONVERT(NVARCHAR(MAX), ' + QUOTENAME(name) + '),'','','''
    + @newchar + ''')'
FROM sys.columns
WHERE [object_id] = OBJECT_ID(@table)
ORDER BY column_id;

SELECT @sql = N'SELECT ' + STUFF(@sql, 1, 1, '') + '
FROM ' + @table;

PRINT @sql;
-- EXEC sp_executesql @sql;
I feel your pain. I often have one-time cleansing steps in ETL routines, and I find a script like this helps when you need to remove some oddity from an import (rogue page breaks, whitespace, etc.):
declare @tableName nvarchar(100) = 'dbo.YourTable';
declare @col nvarchar(max);

-- remove quotes and trim every column, kill page breaks, etc.
;with c_Col (colName)
as ( select c.name
     from sys.tables t
     join sys.columns c on
         c.object_id = t.object_id
     where t.object_id = object_id(@tableName)
   )
select @col = stuff(a.n, 1, 1, '')
from ( select top 100 percent
           ',' + c.colName + '= nullif(replace(replace(replace(rtrim(ltrim(' + c.colName + ')), ''"'', ''''), char(13), ''''), char(10), ''''), '''') '
       from c_Col c
       for xml path('')
     ) as a(n)

declare @cmd nvarchar(max)
set @cmd = 'update ' + @tableName + ' set ' + @col
print @cmd;
--exec(@cmd);
If you are just looking to save yourself some typing for a one-time query affecting all fields in a table, then this is a trick I've used in the past.
First, query the schema to produce a result set that returns all the field names in the table you specify. You can modify the template below; it provides the basic structure of an UPDATE statement around the field names.
select column_name + ' = Replace(' + column_name + ',OldChar,NewChar),'
from information_schema.columns
where table_name = 'YourTableName'
The result set comes back in Query Analyzer as a series of rows that you can highlight (by clicking the column name) and then copy and paste right back into your query window. From there, add your UPDATE statement at the beginning and your WHERE clause at the end. You'll also need to get rid of one extra trailing comma.
You can then run the query to produce the desired outcome.
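Assembled from that output, the final statement would look something like this (YourTableName, the column names, and the WHERE clause are placeholders, with OldChar/NewChar standing in for your actual characters):
-- Hypothetical result after pasting the generated rows and cleaning up:
UPDATE YourTableName
SET column_one = Replace(column_one, OldChar, NewChar),
    column_two = Replace(column_two, OldChar, NewChar),
    column_three = Replace(column_three, OldChar, NewChar) -- trailing comma removed
WHERE column_one LIKE '%' + OldChar + '%'; -- optional filter added at the end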