SQL Server 2008: create trigger across all tables in db

Using SQL Server 2008, I've created a database where every table has a datetime column called "CreatedDt". What I'd like to do is create a trigger for each table so that when a value is inserted, the CreatedDt column is populated with the current date and time.
If you'll pardon my pseudocode, what I'm after is the T-SQL equivalent of:
foreach (Table in MyDatabase)
{
create trigger CreatedDtTrigger
{
on insert createddt = datetime.now;
}
}
If anyone would care to help out, I'd greatly appreciate it. Thanks!

As @EricZ says, the best thing to do is to bind a default for the column. Here's how you'd add it to every table using a cursor and dynamic SQL:
declare @table sysname, @cmd nvarchar(max)

declare c cursor for
    select name from sys.tables where is_ms_shipped = 0 order by name

open c; fetch next from c into @table
while @@fetch_status = 0
begin
    set @cmd = 'ALTER TABLE ' + quotename(@table) + ' ADD CONSTRAINT DF_' + @table + '_CreatedDt DEFAULT GETDATE() FOR CreatedDt'
    exec sp_executesql @cmd
    fetch next from c into @table
end
close c; deallocate c

No need for a cursor. Just copy the result of the query below and execute it.
select distinct 'ALTER TABLE '+ t.name +
' ADD CONSTRAINT DF_'+t.name+'_crdt DEFAULT getdate() FOR '+ c.name
from sys.tables t
inner join sys.columns c on t.object_id=c.object_id
where c.name like '%your column name%'
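Run against the question's schema (where the column is CreatedDt), the query above emits one statement per matching table, along these lines (MyTable is a placeholder table name):
-- Example of the generated output, not something you run directly
ALTER TABLE MyTable ADD CONSTRAINT DF_MyTable_crdt DEFAULT getdate() FOR CreatedDt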

Here's another method:
DECLARE @SQL nvarchar(max);

SELECT @SQL = Coalesce(@SQL + '
', '')
    + 'ALTER TABLE ' + QuoteName(T.TABLE_SCHEMA) + '.' + QuoteName(T.TABLE_NAME)
    + ' ADD CONSTRAINT ' + QuoteName('DF_'
        + CASE WHEN T.TABLE_SCHEMA <> 'dbo' THEN T.TABLE_SCHEMA + '_' ELSE '' END
        + T.TABLE_NAME + '_' + C.COLUMN_NAME) + ' DEFAULT (GetDate()) FOR ' + QuoteName(C.COLUMN_NAME)
    + ';'
FROM
    INFORMATION_SCHEMA.TABLES T
    INNER JOIN INFORMATION_SCHEMA.COLUMNS C
        ON T.TABLE_SCHEMA = C.TABLE_SCHEMA
        AND T.TABLE_NAME = C.TABLE_NAME
WHERE
    C.COLUMN_NAME = 'CreatedDt';

EXEC (@SQL);
This yields, and runs, a series of statements similar to the following:
ALTER TABLE [schema].[TableName] -- (line break added)
ADD CONSTRAINT [DF_schema_TableName_ColumnName] DEFAULT (GetDate()) FOR [ColumnName];
Some notes:
This uses the INFORMATION_SCHEMA views. It is best practice to use these where possible instead of the system tables because they are guaranteed to not change between versions of SQL Server (and moreover are supported on many DBMSes, so all things being equal it's best to use standards-compliant/portable code).
In a database with a case-sensitive default collation, one MUST use upper case for the INFORMATION_SCHEMA view names and column names.
When creating script it's important to pay attention to schema names and proper escaping (using QuoteName). Not doing so will break in someone's system some day.
I think it is best practice to put the DEFAULT expression inside parentheses. No error occurs without them in this simple case, but if GetDate() is ever replaced with a more complex expression, the parentheses ensure nothing will break.
If you decide that column defaults are not going to work for you, then the triggers you imagined are still possible. But it will take some serious work to manage whether the trigger already exists and alter or create it appropriately, to JOIN to the inserted pseudo-table inside the trigger, and to do it based on the full list of primary key columns for the table (if they exist; if they don't, you're out of luck). It is quite possible, but extremely difficult--you could end up with nested, nested, nested dynamic SQL. I have such an automated object-creating script that contains 13 quote marks in a row...
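For reference, here is a minimal sketch of what one generated trigger might look like, assuming a hypothetical table dbo.MyTable with a single-column primary key MyTableID (the default-constraint approach above remains the simpler option):
-- Minimal sketch for one hypothetical table; a generator would have to emit
-- and maintain one of these per table, joining on that table's key column(s).
CREATE TRIGGER dbo.TR_MyTable_CreatedDt
ON dbo.MyTable
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE T
    SET CreatedDt = GETDATE()
    FROM dbo.MyTable AS T
    INNER JOIN inserted AS I
        ON I.MyTableID = T.MyTableID;
END
GO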

Related

How to pick a table_name value from one table and delete records from the table_name table based on a condition?

We have a table. Let's call it Table_A.
Table_A holds a bunch of table_names and a numeric value associated with each table_name. Refer to the picture below.
Can someone help me write a query to:
Select table_names from Table_A one by one; go to that table, check the Date_inserted of each record against NO_OF_DAYS in Table_A, and if the record is older than NO_OF_DAYS, delete that record from that specific table.
I'm guessing we have to create dynamic values for this query but I'm having a hard time.
So, in the above picture, the query should:
Select the first table_name (T_Table1) from Table_A
Go to that Table (T_Table1)
Check the date inserted of each record in (T_Table1) against the condition
If the record was inserted more than NO_OF_DAYS ago (90 in this case), delete the record; otherwise move to the next record
Move on to the next table (T_Table2) in Table_A
Continue till all the table_names in Table_A have been executed
What you posted as your attempt (in a comment), quite simply isn't going to work. Let's actually format that first, shall we:
SET SQL = '
DELETE [' + dbo + '].[' + TABLE_NAME + ']
where [Date_inserted ] < '
SET SQL = SQL + ' convert(varchar, DATEADD(day, ' + CONVERT(VARCHAR, NO_OF_DAYS) + ',' + '''' + CONVERT(VARCHAR, GETDATE(), 102) + '''' + '))'
PRINT SQL
EXEC (SQL)
Firstly, I actually have no idea what you're even trying to do here. You have things like [' + dbo + '], which means you're referencing a column called dbo; as you're using SET, no such column can exist. Also, variables are prefixed with an @ in SQL Server; yours have none.
Anyway, the solution. Some might not like this one, as I'm using a CURSOR, rather than doing it all in one go. I, however, do have my reasons. A CURSOR isn't actually a "bad" thing, like many believe; the problem is that people constantly use them incorrectly. Using a CURSOR to loop through records and create a hierarchy for example is a terrible idea; there are far better dataset approaches.
So, what are my reasons? Firstly I can parametrise the dynamic SQL; this would be harder outside a CURSOR as I'd need to declare a different parameter for every DELETE. Also, with a CURSOR, if the DELETE fails on one table, it won't on the others; one long piece of dynamic SQL would mean if one of the transactions fail, they would all be rolled back. Also, depending on the size of the deletes, that could be a very big DELETE.
It's important, however, that you understand what I've done here; if you don't, that's a problem unto itself. What happens if you need to troubleshoot it in the future? SO isn't a website for support like that; you need to support your own code. If you can't understand the code you're given, don't use it, or learn what it's doing first (otherwise you're doing the wrong thing).
Note I use my own objects, in the absence of consumable sample data:
CREATE TABLE TableOfTables (TableName sysname,
                            NoOfDays int);
GO
INSERT INTO TableOfTables
VALUES ('T1',10),
       ('T2',15),
       ('T3',5);
GO

DECLARE Deletes CURSOR FOR
    SELECT TableName, NoOfDays
    FROM TableOfTables;

DECLARE @SQL nvarchar(MAX), @TableName sysname, @Days int;

OPEN Deletes;
FETCH NEXT FROM Deletes
INTO @TableName, @Days;

WHILE @@FETCH_STATUS = 0 BEGIN
    SET @SQL = N'DELETE FROM ' + QUOTENAME(@TableName) + NCHAR(10) +
               N'WHERE DATEDIFF(DAY, InsertedDate, GETDATE()) >= @dDays;'
    PRINT @SQL; --Say hello to your best friend. o/
    --EXEC sp_executesql @SQL, N'@dDays int', @dDays = @Days; --Uncomment to run

    FETCH NEXT FROM Deletes
    INTO @TableName, @Days;
END

CLOSE Deletes;
DEALLOCATE Deletes;
GO
DROP TABLE TableOfTables;
GO
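For the sample rows above, the PRINT emits one statement per table in TableOfTables, along these lines, with @dDays supplied as a proper parameter once the EXEC line is uncommented:
-- Example of what the PRINT produces for the first row ('T1', 10)
DELETE FROM [T1]
WHERE DATEDIFF(DAY, InsertedDate, GETDATE()) >= @dDays;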

Dynamic SQL w/ Loop Over All Columns in a Table

I recently was pulled off of an ASP.net conversion project at my new job to help with a rather slow, mundane, but desperate task another department is handling. Basically, they are using a simple SQL script on every column of every table in every database (it's horrible) to generate a count of all of the distinct records on each table for each column. My SQL experience is limited and my dynamic SQL experience is zero, more or less, but since I have not been given permissions yet to even access this particular database I went to work attempting to formulate a more automated query to perform this task, testing on a database I do have access to.
In short, I ran into some issues and I was hoping someone might be able to help me fill in the blanks. It'll save this department more than a month of speculated time if something more automated can be utilized.
These are the two scripts I was given and told to run on each column. The first was for any column that is neither bit/boolean nor datetime. The second was to be used for any datetime column.
select columnName, count(*) qty
from tableName
group by columnName
order by qty desc
select year(a.columnName), count(*) qty
from tableName a
group by year(a.columnName)
order by qty desc
Doing this thousands of times doesn't seem like a lot of fun to me, so here is more or less some pseudo-code that I came up with that I think could solve the issue, I will point out which areas I am fuzzy on.
declare @sql nvarchar(2500)
set @sql = 'the first part(s) of statement'
[pseudo-pseudo] Get "List" of All Column Names in Table (I do not believe there is a Collection datatype in SQL, but you get the idea)
[pseudo-pseudo] Loop Through "List" of Column Names
(I know this dot notation wouldn't work, but I would like to perform something similar to this)
IF ColumnName.DataType LIKE 'date%'
    set @sql = @sql + ' something'
IF ColumnName.DataType = bit
    set @sql = @sql + ' something else' --actually it'd be preferable to skip bit/boolean datatypes entirely, as these aren't necessary for the reports being created by these queries
ELSE
    set @sql = @sql + ' something other than something else'
set @sql = @sql + ' ending part of statement'
EXEC(@sql)
So to summarize, for simplicity's sake I'd like to let the user plug the table's name into a variable at the start of the query:
declare @tableName nvarchar(50)
set @tableName = 'TABLENAME' --Enter Query's Table Name Here
Based on this, the code will loop through every column of that table, checking for datatype. If the datatype is a datetime (or other date like datatype), the "year" code would be added to the dynamic SQL. If it is anything else (except bit/boolean), then it will add the default logic to the dynamic SQL code.
Again, for simplicity's sake (even if it is bad practice) I figure the end result will be a dynamic SQL statement with multiple selects, one for each column in the table. Then the user would simply copy the output to excel (which they are doing right now anyway). I know this isn't the perfect solution so I am open to suggestions, but since time is of the essence and my experience with dynamic SQL is close to null, I thought a somewhat quick and dirty approach would be tolerable in this case.
I do apologize for my very haphazard preparation with this question but I do hope someone out there might be able to steer me in the right direction.
Thanks so much for your time, I certainly appreciate it.
Here's an example working through all the suggestions in the comments.
declare @sql nvarchar(max);
declare stat_cursor cursor local fast_forward for
select
case when x.name not in ('date', 'datetime2', 'smalldatetime', 'datetime') then
N'select
' + quotename(s.name, '''') + ' as schema_name,
' + quotename(t.name, '''') + ' as table_name,
' + quotename(c.name) + ' as column_name,
count(*) qty
from
' + quotename(s.name) + '.' + quotename(t.name) + '
group by
' + quotename(c.name) + '
order by
qty desc;'
else
N'select
' + quotename(s.name, '''') + ' as schema_name,
' + quotename(t.name, '''') + ' as table_name,
year(' + quotename(c.name) + ') as column_name,
count(*) qty
from
' + quotename(s.name) + '.' + quotename(t.name) + '
group by
year(' + quotename(c.name) + ')
order by
qty desc;'
end
from
sys.schemas s
inner join
sys.tables t
on s.schema_id = t.schema_id
inner join
sys.columns c
on c.object_id = t.object_id
inner join
sys.types x
on c.system_type_id = x.user_type_id
where
x.name not in (
'geometry',
'geography',
'hierarchyid',
'xml',
'timestamp',
'bit',
'image',
'text',
'ntext'
);
open stat_cursor;
fetch next from stat_cursor into @sql;
while @@fetch_status = 0
begin
    exec sp_executesql @sql;
    fetch next from stat_cursor into @sql;
end;
close stat_cursor;
deallocate stat_cursor;
Example SQLFiddle (note this only shows the first iteration through the cursor. Not sure if this is a limitation of SQLFiddle or a bug).
I'd probably stash the results into a separate database if I were doing this. Also, I'd probably put the SQL-building bits into user-defined functions for maintainability (the slow bit will be running the queries; there's no point optimizing their generation).
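If you do stash the results rather than returning thousands of separate result sets, one option (a sketch; the table name and column types here are my own assumptions) is a holding table that each generated statement inserts into:
-- Hypothetical holding table; sql_variant can hold the assorted value types being counted
-- (text/ntext/image are already excluded by the cursor's WHERE clause above).
CREATE TABLE dbo.ColumnValueCounts (
    schema_name  sysname,
    table_name   sysname,
    column_name  sysname,
    column_value sql_variant,
    qty          int
);
-- Each generated statement would then be prefixed with something like:
-- insert into dbo.ColumnValueCounts (schema_name, table_name, column_name, column_value, qty)
-- select 'dbo', 'MyTable', 'MyColumn', [MyColumn], count(*) from dbo.MyTable group by [MyColumn];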

How to create an alias of database in SQL Server

We have a very old piece of software, created around 10 years ago, and we don't have the source code.
The software uses two databases, DB01 and DB02, on the same SQL Server 2012 instance.
There are SQL statements such as db01..table1 join db02..table2, but the main issue is that our processes don't allow us to use db02 as a database name.
The question is: how can we create an alias for a database?
I was trying to use CREATE SYNONYM
CREATE SYNONYM [db02] FOR [db02_new_name];
but it doesn't work for database names.
Please suggest how this can be solved without patching binary files to correct the SQL statements.
Create a database with the name you want to impersonate. Re-jig the DDL code generator to create a view for every table in the database that holds the tables you need to access via the hardcoded name. Basically, each view will have a statement that looks like this:
CREATE VIEW schemaname.tablename as SELECT * FROM targetdbname.schemaname.tablename
Example:
The hardcoded target database name is ProdDBV1, the source DB you have is named ProductDatabaseDatabaseV1, the schema is dbo, and the table name is customer.
Create the database called ProdDBV1 using SSMS or script.
CREATE VIEW dbo.customer as SELECT * FROM ProductDatabaseDatabaseV1.dbo.customer
You can enumerate each table in your "source" database and then create the DDL as above. If you want I can update this posting with a code example (using the sp_MSforeachtable procedure if possible).
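Here is one sketch of generating that DDL from the source database's catalog views (rather than sp_MSforeachtable, since the views have to be created in the other database); it uses the example names above, so adjust both database names to yours and run the generated output inside ProdDBV1:
-- Sketch: generate one CREATE VIEW statement per table in the source database.
-- Schemas other than dbo must exist in ProdDBV1 before running the output.
SELECT 'CREATE VIEW ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name) +
       ' AS SELECT * FROM ProductDatabaseDatabaseV1.' +
       QUOTENAME(s.name) + '.' + QUOTENAME(t.name) + ';'
FROM ProductDatabaseDatabaseV1.sys.tables AS t
INNER JOIN ProductDatabaseDatabaseV1.sys.schemas AS s
    ON t.schema_id = s.schema_id;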
I had a similar issue.
Solved with this workaround, using synonyms.
Short version: You flood your database with a synonym of every object you'll ever need to reference. Later you re-create every synonym with the other database name.
Here's a stored proc to do it. Simply add it to your database and call it with the target database. It will create synonyms for all tables in the target database, and create the schemas if they don't exist. I've left a commented out section in case someone knows of a way to get the create schemas working without a cursor.
CREATE PROCEDURE CreateSynonymsForTargetDatabase (
    @databaseName sysname
)
AS BEGIN

    DECLARE @TSQL nvarchar(max) = N''
    DECLARE @rn char(2),
            @SchemaName sysname;
    SET @rn = char(13) + char(10)

    CREATE TABLE #DBSynonym(
        [Schema] sysname NOT NULL,
        [Table] sysname NOT NULL
    )

    SET @TSQL = N'
        INSERT INTO #DBSynonym ([Schema], [Table])
        SELECT Schemas.name, Tables.name
        FROM [' + @databaseName + '].sys.tables
        INNER JOIN [' + @databaseName + '].sys.schemas on tables.schema_id = schemas.schema_id
    '
    EXEC (@TSQL)
    SET @TSQL = N''

    DECLARE MissingSchemasCursor CURSOR
    READ_ONLY
    FOR
    SELECT newSchemas.[Schema]
    FROM #DBSynonym newSchemas
    LEFT JOIN sys.schemas on newSchemas.[Schema] = schemas.name
    WHERE schemas.schema_id is null
    GROUP BY newSchemas.[Schema]

    OPEN MissingSchemasCursor

    FETCH NEXT FROM MissingSchemasCursor INTO @SchemaName
    WHILE (@@fetch_status <> -1)
    BEGIN
        IF (@@fetch_status <> -2)
        BEGIN
            SET @TSQL = N'CREATE SCHEMA ' + QUOTENAME(@SchemaName) + N';'
            EXEC sp_executesql @TSQL
        END
        FETCH NEXT FROM MissingSchemasCursor INTO @SchemaName
    END

    CLOSE MissingSchemasCursor
    DEALLOCATE MissingSchemasCursor

    /*
    SELECT @TSQL = @TSQL +
        N'
        GO
        CREATE SCHEMA ' + QUOTENAME([Schema]) + N';'
    FROM #DBSynonym newSchemas
    LEFT JOIN sys.schemas on newSchemas.[Schema] = schemas.name
    WHERE schemas.schema_id is null
    GROUP BY newSchemas.[Schema]

    PRINT 'CREATE SCHEMAS : ' + ISNULL(@TSQL,'')
    EXEC sp_executesql @TSQL
    */

    SET @TSQL = N''

    SELECT @TSQL = @TSQL +
        N'
        CREATE SYNONYM ' + QUOTENAME([Schema]) + N'.' + QUOTENAME([Table]) + N'
        FOR ' + QUOTENAME(@databaseName) + N'.' + QUOTENAME([Schema]) + N'.' + QUOTENAME([Table]) + N';'
    FROM #DBSynonym

    EXEC sp_executesql @TSQL
    SET @TSQL = N''

END
GO
Use it as follows:
EXEC CreateSynonymsForTargetDatabase 'targetDbName'
The question is: how can we create an alias for a database?
I know this is an old post but...
This is why I only use the 2 part naming convention for SQL objects. It allows me to have 2 part synonyms that point to differently named databases depending on what environment I'm in. There are some places where it doesn't work so well but, for the most part, those places are very rare.
As for software that you don't have the source code of and if that software uses the 3 part naming convention, you're probably just out of luck unless you know what the 3 part naming convention is for each object and create a 3 part synonym for each object.
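For example (the object and database names here are made up), a two-part reference such as dbo.Customer in the application can be satisfied by a synonym that points at whichever physical database the current environment uses:
-- Hypothetical names: the app says dbo.Customer; the synonym decides which database that means.
CREATE SYNONYM dbo.Customer FOR SalesDB_Test.dbo.Customer;
-- In production, the same synonym would be created pointing at SalesDB_Prod instead.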
I found Charles' answer (and the linked workaround in the comment by maxcastaneda) very useful. I followed this approach and it works for me. I have streamlined it a bit and created the following query that brings up all required synonyms to create.
As a prerequisite for this snippet, both the original DB and the synonym/alias DB have to be on the same server; if you are using a linked server or similar, you will have to modify it a bit.
It should be fairly easy to put this into a small sp to update the synonyms automatically.
USE <SYNONYMDB>
SELECT
'[' + TABLE_NAME + ']',
'[' + TABLE_SCHEMA + '].[' + TABLE_NAME + ']',
'IF EXISTS (SELECT * FROM sys.synonyms WHERE name = ''' + TABLE_NAME + ''') DROP SYNONYM ['+ TABLE_NAME + ']; CREATE SYNONYM [' + TABLE_NAME + '] FOR <ORIGINALDB>.' + TABLE_SCHEMA + '.[' + TABLE_NAME + ']' AS SynonymUpdateScript
FROM <ORIGINALDB>.INFORMATION_SCHEMA.TABLES
Don't forget to enter your DB names at the <...> spots.
Just copy the content of the SynonymUpdateScript Column and execute it in the synonym DB - or create a stored procedure for this task.
Be aware there is an issue if you have views in place that refer to tables or other db objects without the 2 part naming convention. Those synonyms won't work. You should fix this in the original objects / views.
Go to the database in which you wish to create the alias,
Create an Alias Folders table with the preferred design,
Go to the unique IDs table and check the last code sequence for the table created.
For example, if the last code is 10, then update it to 11.
Open the Cabinets table, go right to the bottom, and create the name of the Alias cabinet you want.
You can create an alias from 'SQL Server Configuration Manager' under Configuration Tools in the SQL Server folder.
Detailed source: http://www.mssqltips.com/sqlservertip/1620/how-to-setup-and-use-a-sql-server-alias/
http://technet.microsoft.com/en-us/library/ms190445.aspx

Dynamically search columns for given table

I need to create a search for a java app I'm building where users can search through a SQL database based on the table they're currently viewing and a search term they provide. At first I was going to do something simple like this:
SELECT * FROM <table name> WHERE CAST((SELECT COLUMN_NAME
FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = '<table name>')
AS VARCHAR) LIKE '%<search term>%'
but that subquery returns more than one result, so then I tried to make a procedure to loop through all the columns in a given table and put any relevant fields in a results table, like this:
CREATE PROC sp_search
    @tblname VARCHAR(4000),
    @term VARCHAR(4000)
AS
SET nocount on

SELECT COLUMN_NAME
INTO #tempcolumns
FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = @tblname

ALTER TABLE #tempcolumns
ADD printed BIT,
    num SMALLINT IDENTITY

UPDATE #tempcolumns
SET printed = 0

DECLARE @colname VARCHAR(4000),
        @num SMALLINT

WHILE EXISTS(SELECT MIN(num) FROM #tempcolumns WHERE printed = 0)
BEGIN
    SELECT @num = MIN(num)
    FROM #tempcolumns
    WHERE printed = 0

    SELECT @colname = COLUMN_NAME
    FROM #tempcolumns
    WHERE num = @num

    SELECT * INTO #results FROM @tblname WHERE CAST(@colname AS VARCHAR)
    LIKE '%' + @term + '%' --this is where I'm having trouble

    UPDATE #tempcolumns
    SET printed = 1
    WHERE @num = num
END

SELECT * FROM #results
GO
This has two problems: first, it gets stuck in an infinite loop somehow, and second, I can't select anything from @tblname. I tried using dynamic SQL as well, but I don't know how to get results from that or if that's even possible.
This is for an assignment I'm doing at college and I've gotten this far after hours of trying to figure it out. Is there any way to do what I want to do?
You need to only search columns that actually contain strings, not all columns in a table (which may include integers, dates, GUIDs, etc).
You shouldn't need a #temp table (and certainly not a ##temp table) at all.
You need to use dynamic SQL (though I'm not sure if this has been part of your curriculum so far).
I find it beneficial to follow a few simple conventions, all of which you've violated:
use PROCEDURE not PROC - it's not a "prock," it's a "stored procedure."
use dbo. (or alternate schema) prefix when referencing any object.
wrap your procedure body in BEGIN/END.
use vowels liberally. Are you saving that many keystrokes, never mind time, by saying @tblname instead of @tablename or @table_name? I'm not fighting for a specific convention, but saving characters at the cost of readability lost its charm in the 70s.
don't use the sp_ prefix for stored procedures - this prefix has special meaning in SQL Server. Name the procedure for what it does. It doesn't need a prefix, just like we know they're tables even without a tbl prefix. If you really need a prefix there, use another one like usp_ or proc_ but I personally don't feel that prefix gives you any information you don't already have.
since table names are stored using Unicode (and some of your columns might be too), your parameters should be NVARCHAR, not VARCHAR. And identifiers are capped at 128 characters, so there is no reason to support more than 257 characters (schema.table) for @tablename.
terminate statements with semi-colons.
use the catalog views instead of INFORMATION_SCHEMA - though the latter is what your professor may have taught and might expect.
CREATE PROCEDURE dbo.SearchTable
    @tablename NVARCHAR(257),
    @term NVARCHAR(4000)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @sql NVARCHAR(MAX);
    SET @sql = N'SELECT * FROM ' + @tablename + ' WHERE 1 = 0';

    SELECT @sql = @sql + '
    OR ' + c.name + ' LIKE ''%' + REPLACE(@term, '''', '''''') + '%'''
    FROM
        sys.all_columns AS c
        INNER JOIN sys.types AS t
            ON c.system_type_id = t.system_type_id
            AND c.user_type_id = t.user_type_id
    WHERE
        c.[object_id] = OBJECT_ID(@tablename)
        AND t.name IN (N'sysname', N'char', N'nchar',
                       N'varchar', N'nvarchar', N'text', N'ntext');

    PRINT @sql;
    -- EXEC sp_executesql @sql;
END
GO
When you're happy that it's outputting the SELECT query you're after, comment out the PRINT and uncomment the EXEC.
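Once it's live, a call looks something like this (dbo.Customers and the search term are made-up placeholder values):
-- Hypothetical usage; substitute your own table name and search term.
EXEC dbo.SearchTable @tablename = N'dbo.Customers', @term = N'smith';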
You get into an infinite loop because EXISTS(SELECT MIN(num) FROM #tempcolumns WHERE printed = 0) will always return a row even if there are no matches (an aggregate with no GROUP BY always produces exactly one row) - you need EXISTS (SELECT * ...) instead.
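In other words, the loop condition would become:
-- Corrected condition: EXISTS over the raw rows, not over an aggregate
WHILE EXISTS (SELECT * FROM #tempcolumns WHERE printed = 0)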
To use dynamic SQL, you need to build up a string (varchar) of the SQL statement you want to run, then you call it with EXEC
eg:
declare @s varchar(max)
select @s = 'SELECT * FROM mytable '
Exec (@s)
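If you also need to get a value back out of the dynamic SQL (which you asked about), sp_executesql supports OUTPUT parameters. A small sketch, where mytable and somecolumn are placeholder names:
-- Sketch: returning a value from dynamic SQL via an OUTPUT parameter.
DECLARE @sql nvarchar(max), @matches int;
SET @sql = N'SELECT @cnt = COUNT(*) FROM mytable WHERE somecolumn LIKE ''%'' + @t + ''%'';';
EXEC sp_executesql @sql, N'@cnt int OUTPUT, @t varchar(100)', @cnt = @matches OUTPUT, @t = 'searchterm';
SELECT @matches;  -- the count captured from inside the dynamic statement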

Accessing 400 tables in a single query

I want to delete rows with a condition from multiple tables.
DELETE
FROM table_1
WHERE lst_mod_ymdt = '2011-01-01'
The problem is that, the number of table is 400, from table_1 to table_400.
Can I apply the query to all the tables in a single query?
If you're using SQL Server 2005 or later, you can try something like this (other versions and RDBMSs also have similar ways to do this):
DECLARE @lst_mod_ymdt VARCHAR(10)
SET @lst_mod_ymdt = '2011-01-01' -- the cutoff date from the question
DECLARE @sql VARCHAR(MAX)
SET @sql = (SELECT 'DELETE FROM [' + REPLACE(Name, '''','''''') + '] WHERE lst_mod_ymdt = ''' + @lst_mod_ymdt + ''';' FROM sys.tables WHERE Name LIKE 'table_%' FOR XML PATH(''))
--PRINT @sql;
EXEC ( @sql );
And as always with dynamic sql, remember to escape the ' character.
This will likely fall over if you have, say, a table_341 which doesn't have an lst_mod_ymdt column.
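One way to guard against that (a sketch building on the answer above) is to only generate statements for tables that actually have the column, by joining sys.tables to sys.columns:
-- Sketch: only generate DELETEs for tables that actually have an lst_mod_ymdt column.
DECLARE @lst_mod_ymdt VARCHAR(10)
SET @lst_mod_ymdt = '2011-01-01'
DECLARE @sql VARCHAR(MAX)
SET @sql = (SELECT 'DELETE FROM [' + REPLACE(t.Name, '''','''''') + '] WHERE lst_mod_ymdt = ''' + @lst_mod_ymdt + ''';'
            FROM sys.tables t
            INNER JOIN sys.columns c ON c.object_id = t.object_id AND c.name = 'lst_mod_ymdt'
            WHERE t.Name LIKE 'table_%'
            FOR XML PATH(''))
--PRINT @sql;
EXEC ( @sql );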