How do I update triggers across multiple databases? - sql

I have a query that selects, from sys.databases, the databases containing the triggers I wish to update, and from there I can create a cursor. However, inside the cursor, when I try to update my triggers using a dynamic database name @DatabaseExecuteName set to MyDatabaseName.dbo, I receive the error "'CREATE/ALTER TRIGGER' does not allow specifying the database name as a prefix to the object name." Because I am in a cursor I am not able to execute a USE MyDatabaseName ... GO; the GO statement is not allowed inside the cursor. I have also tried SQLCMD mode with :setvar DatabaseName "MyDatabaseName" and USE [$(DatabaseName)]; to try to set the database. I feel I am very close, but SQL queries are not my strength. I could use some assistance on what I am missing.

You can nest EXEC calls, which lets you issue a USE and then execute a further statement without needing GO to separate the batches. This is a complete script to demonstrate the technique:
create database DB1
go
create database DB2
go
use DB2
go
create table T1 (ID int not null)
go
create table T2 (ID int not null)
go
use DB1
go
exec('use DB2; exec(''create trigger T_T on T1 after insert as
insert into T2(ID) select i.ID from inserted i'')');
select DB_NAME()
insert into DB2..T1(ID) values (1),(2);
select * from DB2..T2
This shows that the connection is still in the DB1 database, but the trigger was successfully created on the T1 table within the DB2 database.
What you have to watch for is getting your quote-escaping correct.
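Applied to the original question, a minimal sketch of the cursor approach could look like this; the sys.databases filter and the trigger body are placeholders to adapt, and the point is that the outer EXEC performs the USE while the inner EXEC lets ALTER TRIGGER be the first statement of its own batch:
DECLARE @DatabaseName sysname,
        @Sql nvarchar(max),
        @TriggerBody nvarchar(max) =
            N'ALTER TRIGGER dbo.T_T ON dbo.T1 AFTER INSERT AS
              INSERT INTO dbo.T2(ID) SELECT i.ID FROM inserted i;';
DECLARE db_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT name FROM sys.databases WHERE name LIKE N'MyDatabase%';  -- your own filter here
OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @DatabaseName;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Double up any single quotes in the body so it survives being embedded
    -- in the inner EXEC's string literal.
    SET @Sql = N'USE ' + QUOTENAME(@DatabaseName) + N'; '
             + N'EXEC(N''' + REPLACE(@TriggerBody, N'''', N'''''') + N''');';
    EXEC (@Sql);
    FETCH NEXT FROM db_cursor INTO @DatabaseName;
END
CLOSE db_cursor;
DEALLOCATE db_cursor;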

Related

How to execute commands/functions in order

I'm writing a long SQL query that I will be using to automate the process of ingesting large-ish flat files (using python to flatten heavily nested JSON files) and normalizing them for scalability and ease of use with PowerBI reports and dashboards.
Currently, I've got a long process that slices the table into multiple tables, generates mapping tables between them and the primary table, remaps a PK/FK link back to the primary table and drops the old unneeded columns from the primary table.
I'm still building and debugging the script, and I'm getting really frustrated with something that I think I'm doing wrong as I'm not very proficient in SQL.
Currently, if I try to run all of my code at once it will fail saying I'm using invalid column names. The column names are invalid with the tables in their current state, but if it would simply execute from top to bottom, they would be valid by the time it got to them. I've got to highlight and execute my drop tables statement by itself every time I want to rerun the entire script even though I've got the same drop tables statement at the top.
Any advice on how to make the script simply execute from top to bottom or how to make it step through and ignore the "current" state of the tables (prior to execution) would be greatly helpful.
Some example pseudocode of what I've got:
CREATE OR ALTER PROCEDURE DropTables
AS
BEGIN
DROP TABLE IF EXISTS
t1,
t2,
t3
END
GO
CREATE OR ALTER PROCEDURE GenerateTable1
AS
BEGIN
~make table~
END
GO
CREATE OR ALTER PROCEDURE GenerateTable2
AS
BEGIN
~make table~
END
GO
CREATE OR ALTER PROCEDURE GenerateTable3
AS
BEGIN
~make table~
ALTER TABLE t1 ADD ~fk from t3~
UPDATE t1
SET ~keys to match~
FROM t3 WHERE t1.old_col = t3.new_col
ALTER TABLE t1
DROP COLUMN old_col
END
GO
EXEC DropTables
GO
EXEC GenerateTable1
GO
EXEC GenerateTable2
GO
EXEC GenerateTable3
Upon executing this I get "Invalid column name old_col", because old_col doesn't currently exist; however, if it would just execute from top to bottom, old_col would exist by the time it got to it.
My current workaround is highlighting DropTables and executing it by itself first; then I can execute everything at once.
GO breaks the script into batches; you just need to scope the batches so that each one compiles. Or use dynamic SQL, which is just a different way to issue separate batches.
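For example, a minimal sketch of the dynamic SQL route, wrapping only the statements that reference old_col so they are compiled at run time rather than when the batch is parsed (the SET column names are placeholders):
-- The dynamic batch compiles only when it executes, by which point
-- old_col exists again, so the whole script can run top to bottom.
EXEC sys.sp_executesql N'
    UPDATE t1
    SET    t1_fk = t3.new_col           -- placeholder column names
    FROM   t3
    WHERE  t1.old_col = t3.new_col;

    ALTER TABLE t1 DROP COLUMN old_col;
';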
It may not be the most elegant solution, but simply wrapping everything in an EXEC block with single quotes, EXEC(' stuff ');, seems to work. SQL Server doesn't try to get ahead of itself and is forced to execute in order from top to bottom. Example:
CREATE OR ALTER PROCEDURE DropTables
AS
BEGIN
EXEC('
DROP TABLE IF EXISTS
t1,
t2,
t3
')
;
END
GO
CREATE OR ALTER PROCEDURE GenerateTable1
AS
BEGIN
EXEC('
~make table~
')
;
END
GO
CREATE OR ALTER PROCEDURE GenerateTable2
AS
BEGIN
EXEC('
~make table~
')
;
END
GO
CREATE OR ALTER PROCEDURE GenerateTable3
AS
BEGIN
EXEC('
~make table~
ALTER TABLE t1 ADD ~fk from t3~
UPDATE t1
SET ~keys to match~
FROM t3 WHERE t1.old_col = t3.new_col
ALTER TABLE t1
DROP COLUMN old_col
')
;
END
GO
EXEC DropTables
GO
EXEC GenerateTable1
GO
EXEC GenerateTable2
GO
EXEC GenerateTable3

What is the process during renaming and re-creating an MS-SQL table using a stored procedure?

I have a table called myTable into which rows are continuously being inserted. I will rename that table to myTable_Date and create a new table, myTable, through a stored procedure.
I want to know what will happen during the renaming and re-creation of the table: will it drop any packets (lose any inserts)?
SQL Server has sp_rename built in if you just want to change the name of a table.
sp_rename myTable, myTable_Date
This would change the name from myTable to myTable_Date.
But it only changes the name reference in sys.objects, so make sure any other references are updated, and read the documentation about it :)
The Microsoft documentation for sp_rename covers the details.
When you rename myTable to myTable_Date, myTable won't exist anymore, so if someone tries to insert into myTable it will fail.
Once you create a new myTable with the same name and columns, everything will be fine and the insertion process will continue.
I suggest you make a little script that renames the table and creates the new one. Something like this:
sp_rename myTable, myTable_Date
GO
CREATE TABLE myTable(
-- Table definition
)
When you rename the table you will get a warning like this: "Caution: Changing any part of an object name could break scripts and stored procedures." So you had better create the new table quickly.
Another option is to create a table exactly like myTable, insert all the data from myTable into it, and then delete those rows from myTable. No renaming, no dropping, and the insertion process will not be interrupted.
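A minimal sketch of that copy-and-purge approach, assuming myTable_Date already exists as the archive table and the column list shown is illustrative; the table lock is held so no rows can slip in between the copy and the delete:
BEGIN TRANSACTION;
-- Copy the current rows into the archive table while holding an exclusive
-- table lock, so nothing is inserted between the copy and the delete.
INSERT INTO dbo.myTable_Date (col1, col2)      -- placeholder columns
SELECT col1, col2
FROM dbo.myTable WITH (TABLOCKX);
DELETE FROM dbo.myTable;
COMMIT TRANSACTION;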
I want to know what will happen during the renaming and re-creation of the table: will it drop any packets?
Inserts attempted after the table is renamed will fail until the table is recreated. You can avoid that by executing both tasks in a transaction. Short-term blocking will happen if an insert is attempted before the transaction is committed, but no rows will be lost. For example:
CREATE PROC dbo.RenameMytableWithDate
AS
DECLARE @NewName sysname = 'mytable_' + CONVERT(nchar(8), SYSDATETIME(), 112);
SET XACT_ABORT ON;
BEGIN TRY
BEGIN TRAN;
EXEC sp_rename N'dbo.mytable', @NewName;
CREATE TABLE dbo.mytable(
col1 int
);
COMMIT;
END TRY
BEGIN CATCH
-- Roll back the doomed transaction before re-raising the error.
IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
THROW;
END CATCH;
GO
I don't know your use case for renaming tables like this but it seems table partitioning might be a better approach as @Damien_The_Unbeliever suggested. Although table partitioning previously required Enterprise Edition, the feature is available in Standard Edition beginning with SQL Server 2016 SP1 as well as Azure SQL Database.
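If partitioning does fit, a hedged sketch of the idea might look like this; the partition column, boundary values, and filegroup are assumptions, and the real design depends on the data and retention needs:
-- Partition by a date column instead of renaming tables.
CREATE PARTITION FUNCTION pf_mytable_date (date)
AS RANGE RIGHT FOR VALUES ('20240101', '20240201');   -- illustrative boundaries
CREATE PARTITION SCHEME ps_mytable_date
AS PARTITION pf_mytable_date ALL TO ([PRIMARY]);
CREATE TABLE dbo.mytable_partitioned
(
    col1      int  NOT NULL,
    load_date date NOT NULL
) ON ps_mytable_date (load_date);
-- Old partitions can then be switched out or truncated without
-- interrupting inserts into the current partition.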

How to redirect a request for DB1 to DB2

Suppose I have two databases named DB1 and DB2. In DB1 there is a table named Student; in DB2 there is a stored procedure named SP1. In SP1, I am selecting data from the Student table using the query below:
SELECT * FROM DB1.dbo.Student
I have more than 300 stored procedures with this kind of cross-database reference. Now I want to switch from DB1 to DB3, which is identical to DB1 from a data and schema perspective.
For this I would have to modify all 300 stored procedures that use the fully qualified database name, so the query would become:
SELECT * FROM DB3.dbo.Student
I don't want to change all the stored procedures to point to DB3 now, and I also don't want to convert the queries in the stored procedures into dynamic SQL (I know this could be done that way).
Is it possible that when we reference DB1.dbo.Student, it is redirected to DB3.dbo.Student? Is there any intermediate layer or SQL setting for this?
It would be a very big help for me. Thanks in advance!
If the purpose of your database renaming is to migrate a database, then why not rename the databases themselves?
e.g. say rename DB1 to DB1_old and then rename DB3 to DB1
I would simply script out all the stored procedures using the SQL Server script generation tool, then do a find-and-replace on the script, replacing the text 'DB1.dbo.' with 'DB3.dbo.'
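To see which procedures actually contain the reference before doing the replace, a query along these lines against sys.sql_modules can help (the LIKE pattern assumes the references are written exactly as DB1.dbo.):
-- List modules whose definition references DB1.dbo.
SELECT OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
       OBJECT_NAME(m.object_id)        AS object_name
FROM sys.sql_modules AS m
WHERE m.definition LIKE '%DB1.dbo.%';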
In the future you might want to consider using synonyms to reference external tables; then you would only have to update the synonyms instead of all of your procedures. Please see the following MSDN article on synonyms:
https://msdn.microsoft.com/en-us/library/ms187552.aspx
Example use of synonym:
USE [DB1]
GO
-- Create a synonym for table A located in DB2.
CREATE SYNONYM [dbo].[External_TableA] FOR [DB2].[dbo].[TableA]
GO
-- The synonym points to TableA in DB2; the select statement will return data from DB2 TableA.
SELECT *
FROM [External_TableA]
GO
-- Point the synonym to the same table, but in DB3
DROP SYNONYM [dbo].[External_TableA]
CREATE SYNONYM [dbo].[External_TableA] FOR [DB3].[dbo].[TableA]
GO
-- No update was needed to views or stored procedures.
-- The synonym now points to TableA in DB3; the select statement will return data from DB3 TableA.
SELECT *
FROM [External_TableA]
The following query will generate the required DROP and CREATE script to remap your synonyms from the old database to the new database.
DECLARE @oldDB NVARCHAR(100) = 'DB2';
DECLARE @newDB NVARCHAR(100) = 'DB3';
SELECT 'DROP SYNONYM [dbo].[' + name + ']' AS [Drop Script]
,'CREATE SYNONYM [dbo].[' + name + '] FOR ' + REPLACE(base_object_name, @oldDB, @newDB) AS CreateScript
FROM sys.synonyms
ORDER BY name
It's better to use the USE keyword:
use [database name you want to access]
-- queries and stored procedures you want to run
GO
e.g.
use [db1]
select * from yourTableName
exec yourStoredProcedure parm1,parm2,....
GO

How to perform the same DML statements on two different servers (Oracle and SQL)

I have two servers, Oracle and SQL Server.
I have the same database on both servers.
I want to perform DML (insert, update, delete).
How can I perform the same DML statements on both servers simultaneously?
If I run an insert in SQL Server, then the same statement should be applied in the Oracle database.
Thanks
You can use MS SQL Server (and Oracle probably has something similar) to create a linked server.
Please refer to these links:
http://msdn.microsoft.com/en-us/library/ms188279
http://msdn.microsoft.com/en-us/library/ff772782.aspx
http://msdn.microsoft.com/en-us/library/ms190479.aspx
This basically lets you use MS SQL Server as usual, but reference a completely different system through this functionality.
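For reference, a hedged sketch of creating such a linked server to Oracle from SQL Server; the provider name, TNS alias, and credentials are assumptions that depend on the Oracle client installed on the SQL Server machine:
-- Register the Oracle instance as a linked server named ORACLE_LINK.
EXEC sp_addlinkedserver
     @server = N'ORACLE_LINK',
     @srvproduct = N'Oracle',
     @provider = N'OraOLEDB.Oracle',
     @datasrc = N'MyTnsAlias';          -- TNS alias from tnsnames.ora
-- Map SQL Server logins to an Oracle account.
EXEC sp_addlinkedsrvlogin
     @rmtsrvname = N'ORACLE_LINK',
     @useself = 'FALSE',
     @rmtuser = N'oracle_user',
     @rmtpassword = N'oracle_password';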
Let's say you have the following simple table scenario:
CREATE TABLE Tbl
(
ID int,
SOMETHING NVARCHAR(100)
);
I wouldn't want to call two different INSERT statements, so I would write a stored procedure that does the inserting on my behalf.
Let's assume the stored procedure is named SP_MyTest and has two parameters, @ID and @SOMETHING. I would use a transaction in order to ensure I always insert into both tables.
But keep in mind that this implementation is synchronous, which means that if one insert takes "forever", the application will wait for that duration unless you make some additions. Also note that wrapping a linked-server insert in a transaction typically promotes it to a distributed transaction, which requires MSDTC to be configured between the servers.
CREATE PROCEDURE SP_MyTest
@ID INT,
@SOMETHING NVARCHAR(100)
AS
BEGIN
SET NOCOUNT ON;
BEGIN TRANSACTION TRN
INSERT INTO Tbl(ID,SOMETHING) VALUES(@ID,@SOMETHING);
INSERT INTO <LinkedServerName>.<LinkedServerDBName>.<schema>.<table>(ID,SOMETHING) VALUES(@ID, @SOMETHING);
COMMIT TRANSACTION TRN
END
And after that, I would call this sproc with:
EXEC SP_MyTest 1, 'Test1'
EXEC SP_MyTest 2, 'Test2'
EXEC SP_MyTest 3, 'Test3'

Select Values From SP And Temporary Tables

I have a stored procedure in MS SQL 2008; inside it I've created a temporary table and then executed several inserts into that temporary table.
How can I select all the columns of the temporary table outside the stored procedure? I mean, I have this:
CREATE PROCEDURE [dbo].[LIST_CLIENTS]
-- parameter list omitted
AS
BEGIN
CREATE TABLE #CLIENT(
-- varchar and numeric columns go here
)
/* Several SELECTs and INSERTs against the temporary table */
SELECT * FROM #CLIENT
END
In another query I'm doing this:
sp_configure 'Show Advanced Options', 1
GO
RECONFIGURE
GO
sp_configure 'Ad Hoc Distributed Queries', 1
GO
RECONFIGURE
GO
SELECT *
INTO #CLIENT
FROM OPENROWSET
('SQLOLEDB','Server=(local);Uid=Cnx;pwd=Cnx;database=r8;Trusted_Connection=yes;
Integrated Security=SSPI',
'EXEC dbo.LIST_CLIENTS ''20110602'', NULL, NULL, NULL, NULL, NULL')
But I get this error:
Msg 208, Level 16, State 1, Procedure LIST_CLIENTS, Line 43
Invalid object name '#CLIENT'.
I've tried with global temporary tables and it doesn't work.
I know this is about the scope of the temporary table, but how can I get the table outside the scope of the SP?
Thanks in advance
I think there is something deeper going on here.
One idea is to use a table variable inside the stored procedure instead of a #temp table (I have to assume you're using SQL Server 2005+ but it's always nice to state this up front). And use OPENQUERY instead of OPENROWSET. This works fine for me:
USE tempdb;
GO
CREATE PROCEDURE dbo.proc_x
AS
BEGIN
SET NOCOUNT ON;
DECLARE @x TABLE(id INT);
INSERT @x VALUES(1),(2);
SELECT * FROM @x;
END
GO
SELECT *
INTO #client
FROM OPENQUERY
(
[loopback linked server name],
'EXEC tempdb.dbo.proc_x'
) AS y;
SELECT * FROM #client;
DROP TABLE #client;
DROP PROCEDURE dbo.proc_x;
Another idea is that perhaps the error is occurring even without using SELECT INTO. Does the stored procedure reference the #CLIENT table in any dynamic SQL, for example? Does it work when you call it on its own or when you just say SELECT * FROM OPENROWSET instead of SELECT INTO? Obviously, if you are working with the #temp table in dynamic SQL you're going to have the same kind of scope issue working with a @table variable in dynamic SQL.
At the very least, name your outer #temp table something other than #CLIENT to avoid confusion - then at least nobody has to guess which #temp table is not being referenced correctly.
Since the global temp table failed, use a real table: run this when you start your create script, and drop the table once you are done to be sure.
IF OBJECT_ID('dbo.temptable', 'U') IS NOT NULL
BEGIN
DROP TABLE dbo.temptable
END
CREATE TABLE dbo.temptable
( ... )
You need to run the two queries within the same connection and use a global temp table.
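A minimal sketch of that pattern, with illustrative names; note that a global temp table (##) disappears once the creating session ends and no other session is referencing it:
-- Inside the procedure: materialize the results into a global temp table.
CREATE PROCEDURE dbo.LIST_CLIENTS_GLOBAL
AS
BEGIN
    IF OBJECT_ID('tempdb..##CLIENT') IS NOT NULL
        DROP TABLE ##CLIENT;
    SELECT c.ClientID, c.ClientName     -- placeholder columns
    INTO ##CLIENT
    FROM dbo.Clients AS c;              -- placeholder source table
END
GO
-- Later, from the same (still open) connection, read the results.
EXEC dbo.LIST_CLIENTS_GLOBAL;
SELECT * FROM ##CLIENT;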
In SQL Server 2008 you can declare user-defined table types, which represent the definition of a table structure. Once created, you can declare table parameters within your procs, pass them along, and access the table in other procs.
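A hedged sketch of that approach; the type name and columns are assumptions, and note that table-valued parameters are READONLY inside the receiving procedure:
-- Define a reusable table type.
CREATE TYPE dbo.ClientTableType AS TABLE
(
    ClientID   int           NOT NULL,
    ClientName nvarchar(100) NOT NULL
);
GO
-- A procedure that accepts the table type as a parameter.
CREATE PROCEDURE dbo.ProcessClients
    @Clients dbo.ClientTableType READONLY
AS
BEGIN
    SELECT ClientID, ClientName FROM @Clients;
END
GO
-- Fill a variable of the type and pass it along.
DECLARE @c dbo.ClientTableType;
INSERT INTO @c (ClientID, ClientName) VALUES (1, N'Acme');
EXEC dbo.ProcessClients @Clients = @c;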
I guess the reason for this behavior is that when you call OPENROWSET from another server it first, and separately, requests information about the procedure's output structure (metadata). The most interesting thing is that this output structure is taken from the first SELECT statement found in the procedure. Moreover, if that SELECT statement follows an IF condition, the metadata request ignores the IF condition, because there is no need to run the whole procedure: the first SELECT statement encountered is enough. (By the way, to switch off that behavior you can include SET FMTONLY OFF at the beginning of your procedure, but this might increase the procedure's execution time.)
The conclusions:
— when the metadata is requested for a temp table created in a procedure, the table does not actually exist, because the metadata request does not actually run the procedure and create the temp table.
— if the temp table can be replaced with a table variable, that solves the problem
— if it is vital for the business to use a temp table, the metadata request can be fed a fake first SELECT statement, like:
declare @t table(ID int, Name varchar(15));
if (0 = 1) select ID, Name from @t; -- fake SELECT statement
create table #T (ID int, Name varchar(15));
select ID, Name from #T; -- real SELECT statement
— and one more option is to use a common trick with FMTONLY (not my idea):
declare @fmtonlyOn bit = 0;
if 1 = 0 set @fmtonlyOn = 1;
set fmtonly off;
create table #T (ID int, Name varchar(15));
if @fmtonlyOn = 1 set fmtonly on;
select ID, Name from #T;
The reason you're getting the error is that the temp table #Client was not declared before you ran the procedure to insert into it. If you declare the table first, then execute the list proc with a direct insert:
CREATE TABLE #Client ( /* same columns as the proc's result set */ );
INSERT INTO #Client
EXEC LIST_CLIENTS