How to perform the same DML statements on two different servers (Oracle and SQL Server)?

I have two servers, one Oracle and one SQL Server.
I have the same database on both servers.
I want to perform DML (INSERT, UPDATE, DELETE).
How can I perform the same DML statements on both servers simultaneously?
If I execute an INSERT statement on the SQL Server side, the same change should be applied to the Oracle database.
Thanks

You can use MS SQL (and Oracle probably has something similar) to create a linked server.
Please refer to these links:
http://msdn.microsoft.com/en-us/library/ms188279
http://msdn.microsoft.com/en-us/library/ff772782.aspx
http://msdn.microsoft.com/en-us/library/ms190479.aspx
This basically lets you use MS SQL as usual, but reference a completely different system through this functionality.
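For example, a linked server pointing at an Oracle instance can be created roughly like this (a sketch: the linked server name, TNS alias, and credentials are placeholders, and it assumes the OraOLEDB.Oracle provider is installed):
-- Register the Oracle instance as a linked server
EXEC sp_addlinkedserver
    @server = N'ORACLE_LINK',
    @srvproduct = N'Oracle',
    @provider = N'OraOLEDB.Oracle',
    @datasrc = N'MyTnsAlias'; -- alias from tnsnames.ora
-- Map a SQL Server login to an Oracle account
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'ORACLE_LINK',
    @useself = 'FALSE',
    @rmtuser = N'oracle_user',
    @rmtpassword = N'oracle_password';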
Let's say you have the following simple table scenario:
CREATE TABLE Tbl
(
ID int,
SOMETHING NVARCHAR(100)
);
I wouldn't want to call two different INSERT statements, so I would write a stored procedure that does the inserting on my behalf.
Let's assume the stored procedure name is SP_MyTest and it has two parameters: @ID and @SOMETHING. I would use a transaction to ensure I always insert into both tables.
But keep in mind: this implementation is synchronous, which means that if one insert takes "forever", the application will block for that whole duration unless you make some additions.
CREATE PROCEDURE SP_MyTest
    @ID INT,
    @SOMETHING NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRANSACTION TRN
    INSERT INTO Tbl(ID, SOMETHING) VALUES(@ID, @SOMETHING);
    INSERT INTO <LinkedServerName>.<LinkedServerDBName>.<schema>.<table>(ID, SOMETHING) VALUES(@ID, @SOMETHING);
    COMMIT TRANSACTION TRN
END
And after that, I would call this sproc with:
EXEC SP_MyTest 1, 'Test1'
EXEC SP_MyTest 2, 'Test2'
EXEC SP_MyTest 3, 'Test3'
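Note that, as written, an error in the second INSERT can leave the transaction open. A more defensive sketch of the same procedure (same assumed names; SQL Server 2012+ because of THROW) would be:
CREATE PROCEDURE SP_MyTest_Safe -- hypothetical variant of SP_MyTest
    @ID INT,
    @SOMETHING NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;
    SET XACT_ABORT ON; -- auto-rollback on most errors
    BEGIN TRY
        BEGIN TRANSACTION;
        INSERT INTO Tbl(ID, SOMETHING) VALUES(@ID, @SOMETHING);
        INSERT INTO <LinkedServerName>.<LinkedServerDBName>.<schema>.<table>(ID, SOMETHING)
        VALUES(@ID, @SOMETHING);
        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF XACT_STATE() <> 0 ROLLBACK TRANSACTION;
        THROW; -- re-raise the original error
    END CATCH
END
Also be aware that a transaction spanning a local table and a linked server is normally promoted to a distributed transaction, so MSDTC must be running and configured on both machines.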

Related

Using results of one stored procedure in another stored procedure - SQL Server

Is there any way to use the results of one stored procedure in another stored procedure without using a table variable or temp table? The logic is as follows:
IF (@example = 1)
BEGIN
    DECLARE @temp TABLE (Id INT);
    INSERT INTO @temp
    EXEC [Procedure1] @ItemId = @StockId;
    SET @Cost = (SELECT TOP 1 Id FROM @temp);
END
Ideally I would like to know if there is a way to do this without having to use a temp table. I looked around online but couldn't find anything that works. Any advice would be great.
In general, if you want to use user-defined code in a SELECT, then it is better to phrase the code as a user-defined function rather than a user-defined procedure.
That is, procedures should be used for their side effects and functions should be used for their return values.
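As a sketch of that advice applied to this case (the function name and source table are hypothetical, since the body of [Procedure1] isn't shown):
CREATE FUNCTION dbo.fn_ItemCost (@ItemId INT) -- hypothetical replacement for [Procedure1]
RETURNS INT
AS
BEGIN
    -- assumed source table; mirror whatever [Procedure1] selects
    RETURN (SELECT TOP 1 Id FROM dbo.Stock WHERE ItemId = @ItemId);
END
GO
-- No temp table needed any more:
SET @Cost = dbo.fn_ItemCost(@StockId);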
That said, you can use OPENQUERY (documented here) to run an EXEC on a linked server. The linked server can be the server you are running on.
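A minimal sketch of the OPENQUERY route, assuming a loopback linked server named LOOPBACK is already configured (the EXEC must be a literal string, so parameter values have to be embedded in it):
SELECT TOP 1 @Cost = Id
FROM OPENQUERY(LOOPBACK, 'EXEC MyDb.dbo.Procedure1 @ItemId = 123'); -- MyDb is a placeholder database name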

How do I update triggers across multiple databases?

I have a query that selects from sys.databases the databases containing the triggers I wish to update. From there I can create a cursor. However, when I go into the cursor to update my triggers using a dynamic DB name @DatabaseExecuteName that is set to MyDatabaseName.dbo, I receive the error "'CREATE/ALTER TRIGGER' does not allow specifying the database name as a prefix to the object name." Because I am in a cursor I am not able to execute USE MyDatabaseName ... GO; the GO statement is not allowed inside the cursor. I have tried SQLCMD mode with :setvar DatabaseName "MyDatabaseName" and USE [$(DatabaseName)]; to try to set the current database. I feel I am very close, but my strength is not SQL queries. I could use some assistance on what I am missing.
You can nest EXEC calls, so that you can issue a USE and then execute a further statement, and you don't need GO to separate the batches. This is a complete script to demonstrate the technique:
create database DB1
go
create database DB2
go
use DB2
go
create table T1 (ID int not null)
go
create table T2 (ID int not null)
go
use DB1
go
exec('use DB2; exec(''create trigger T_T on T1 after insert as
insert into T2(ID) select i.ID from inserted i'')');
select DB_NAME()
insert into DB2..T1(ID) values (1),(2);
select * from DB2..T2
Which then shows that this connection is still in the DB1 database, but the trigger was successfully created on the T1 table within the DB2 database.
What you have to watch for is getting your quote-escaping correct.

Select Values From SP And Temporary Tables

I have a stored procedure in MSSQL 2008; inside it I've created a temporary table and then executed several inserts into the temporary table.
How can I select all the columns of the temporary table outside the stored procedure? I mean, I have this:
CREATE PROCEDURE [dbo].[LIST_CLIENTS]
AS
BEGIN
CREATE TABLE #CLIENT(
    -- varchar and numeric columns go here
)
/* Several SELECTs and INSERTs against the temporary table */
SELECT * FROM #CLIENT
END
In another query I'm doing this:
sp_configure 'Show Advanced Options', 1
GO
RECONFIGURE
GO
sp_configure 'Ad Hoc Distributed Queries', 1
GO
RECONFIGURE
GO
SELECT *
INTO #CLIENT
FROM OPENROWSET
('SQLOLEDB','Server=(local);Uid=Cnx;pwd=Cnx;database=r8;Trusted_Connection=yes;
Integrated Security=SSPI',
'EXEC dbo.LIST_CLIENTS ''20110602'', NULL, NULL, NULL, NULL, NULL')
But I get this error:
Msg 208, Level 16, State 1, Procedure LIST_CLIENTS, Line 43
Invalid object name '#CLIENT'.
I've tried with global temporary tables and it doesn't work.
I know the issue is the scope of the temporary table, but how can I get at the table outside the scope of the SP?
Thanks in advance
I think there is something deeper going on here.
One idea is to use a table variable inside the stored procedure instead of a #temp table (I have to assume you're using SQL Server 2005+, but it's always nice to state this up front), and use OPENQUERY instead of OPENROWSET. This works fine for me:
USE tempdb;
GO
CREATE PROCEDURE dbo.proc_x
AS
BEGIN
SET NOCOUNT ON;
DECLARE @x TABLE(id INT);
INSERT @x VALUES(1),(2);
SELECT * FROM @x;
END
GO
SELECT *
INTO #client
FROM OPENQUERY
(
[loopback linked server name],
'EXEC tempdb.dbo.proc_x'
) AS y;
SELECT * FROM #client;
DROP TABLE #client;
DROP PROCEDURE dbo.proc_x;
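If you don't already have a loopback linked server to use for that placeholder, one can be set up along these lines (a sketch; the name LOOPBACK and the provider are assumptions):
DECLARE @srv sysname = @@SERVERNAME; -- point the linked server back at this instance
EXEC sp_addlinkedserver
    @server = N'LOOPBACK',
    @srvproduct = N'',
    @provider = N'SQLNCLI', -- SQL Server Native Client; newer installs may use MSOLEDBSQL
    @datasrc = @srv;
EXEC sp_serveroption N'LOOPBACK', 'rpc out', 'true'; -- allow procedure calls through the linked server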
Another idea is that perhaps the error is occurring even without using SELECT INTO. Does the stored procedure reference the #CLIENT table in any dynamic SQL, for example? Does it work when you call it on its own, or when you just say SELECT * FROM OPENROWSET instead of SELECT INTO? Obviously, if you are working with the #temp table in dynamic SQL, you're going to have the same kind of scope issue working with a @table variable in dynamic SQL.
At the very least, name your outer #temp table something other than #CLIENT to avoid confusion - then at least nobody has to guess which #temp table is not being referenced correctly.
Since the global temp table failed, use a real table: run this at the start of your create script, and drop the table once you are done to make sure nothing is left behind.
IF OBJECT_ID('dbo.temptable', 'U') IS NOT NULL
BEGIN
DROP TABLE dbo.temptable
END
CREATE TABLE dbo.temptable
( ... )
You need to run the two queries within the same connection and use a global temp table.
In SQL Server 2008 you can declare user-defined table types, which represent the definition of a table structure. Once created, you can declare table parameters of that type within your procs, pass them along, and access the table in other procs.
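A short sketch of that approach (the type, procedure, and column names here are made up):
-- Define a reusable table structure
CREATE TYPE dbo.ClientTableType AS TABLE (Id INT, Name VARCHAR(50));
GO
-- A procedure that receives the table as a parameter (TVPs must be declared READONLY)
CREATE PROCEDURE dbo.ProcessClients
    @Clients dbo.ClientTableType READONLY
AS
BEGIN
    SELECT * FROM @Clients;
END
GO
-- Fill a variable of that type and pass it along
DECLARE @c dbo.ClientTableType;
INSERT INTO @c VALUES (1, 'Alice'), (2, 'Bob');
EXEC dbo.ProcessClients @Clients = @c;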
I guess the reason for this behavior is that when you call OPENROWSET from another server, it first makes a separate request for the procedure's output structure (the metadata). The most interesting thing is that this output structure is taken from the first SELECT statement found in the procedure. Moreover, if the SELECT statement follows an IF-condition, the metadata request ignores the IF-condition, because there is no need to run the whole procedure - the first SELECT statement it meets is enough. (By the way, to switch off that behavior you can include SET FMTONLY OFF at the beginning of your procedure, but this might increase the procedure's execution time.)
The conclusions:
- When the metadata is requested for a temp table created inside a procedure, the table does not actually exist, because the metadata request does not actually run the procedure and create the temp table.
- If the temp table can be replaced with a table variable, that solves the problem.
- If it is vital for the business to use a temp table, the metadata request can be fed a fake first SELECT statement, like:
declare @t table(ID int, Name varchar(15));
if (0 = 1) select ID, Name from @t; -- fake SELECT statement, seen only by the metadata request
create table #T (ID int, Name varchar(15));
select ID, Name from #T; -- real SELECT statement
- One more option is a common trick with FMTONLY (not my idea):
declare @fmtonlyOn bit = 0;
if 1 = 0 set @fmtonlyOn = 1; -- runs only during the FMTONLY metadata pass, where IF conditions are not evaluated
set fmtonly off;
create table #T (ID int, Name varchar(15));
if @fmtonlyOn = 1 set fmtonly on;
select ID, Name from #T;
The reason you're getting the error is that the temp table #Client was not created before you ran the procedure that inserts into it. If you create the table first, you can then execute the list proc with a direct insert:
CREATE TABLE #Client ( ... ) -- same columns as the procedure's result set
INSERT INTO #Client
EXEC LIST_CLIENTS

Why can't INSERT EXEC Stored procedures be nested?

I found a pretty good article detailing how to go about passing table data around (http://www.sommarskog.se/share_data.html#INSERTEXEC), and it mentions that the INSERT EXEC style of table data sharing has the drawback that it cannot be nested.
In other words (in SQL Server 2005 at least), in the pseudocode below PROC1's INSERT EXEC would error out at runtime. I was wondering if anyone knows why this is.
CREATE PROCEDURE PROC1
AS
-- Fill a table variable with data from somewhere
DECLARE @tbl TABLE (...)
INSERT INTO @tbl EXECUTE spI_Return_Data
-- Do some stuff to the data
-- 'Return' it
SELECT * FROM @tbl
GO
CREATE PROCEDURE PROC2
AS
-- Fill a table variable with data from PROC1
DECLARE @tbl TABLE (...)
INSERT INTO @tbl EXECUTE PROC1
-- Do some stuff to the data
-- 'Return' it
SELECT * FROM @tbl
GO
Internal implementation restrictions.
If you need to capture the output of stored procedures, then those procedures should probably be table-valued functions to start with. Ultimately, you can work around the restriction by using CLR procedures.
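For example, if spI_Return_Data only reads data, the function route might look like this sketch (the function name and source table are assumptions, not from the original article):
-- Hypothetical inline TVF standing in for spI_Return_Data
CREATE FUNCTION dbo.fnI_Return_Data ()
RETURNS TABLE
AS
RETURN (SELECT ID, Value FROM dbo.SourceTable); -- assumed source table
GO
-- Inside PROC1, no INSERT ... EXEC is needed, so nesting is no longer an issue:
INSERT INTO @tbl SELECT ID, Value FROM dbo.fnI_Return_Data()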

How to prevent an Insert query from enrolling into a Distributed Transaction?

I have a SQL Insert query inside a stored proc, for inserting rows into a linked server table.
Since the stored proc is called within a parent transaction, this INSERT statement tries to use DTC (the Distributed Transaction Coordinator) for inserting rows into the linked server table.
I would like to avoid getting DTC involved.
Is there any way I can do that (like a hint), so that the INSERT statement ignores the transactional scope?
My suggestion is that you store whatever you want to insert in a staging table, and once the procedure is over, run the cross-server insert. To my knowledge there is no way of ignoring the transaction you are in once you are within the sproc's execution.
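A rough sketch of that staging pattern (all object names here are made up for illustration):
-- Inside the proc, while the parent transaction is open: write locally only
INSERT INTO dbo.PendingRemoteRows (ID, Payload)
VALUES (@ID, @Payload);
-- Later, in a separate batch or agent job, after the parent transaction has committed:
INSERT INTO MyLinkedServer.RemoteDb.dbo.TargetTable (ID, Payload)
SELECT ID, Payload FROM dbo.PendingRemoteRows;
DELETE FROM dbo.PendingRemoteRows; -- clear the rows that were shipped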
In contrast, if you use .NET's System.Transactions namespace (available since .NET 2.0), you can tell specific statements not to participate in any parent scope transaction (e.g. by opening a TransactionScope with TransactionScopeOption.Suppress). This would require you to write some of your logic in code rather than in stored procedures, but it would work.
Here's a relevant link.
Good luck,
Alan.
Try using OPENQUERY to call the linked server query/SP instead of calling it directly.
That worked for me.
So instead of:
insert into ...
select * from mylinkedserver.pubs.dbo.authors
use, e.g.:
DECLARE @TSQL varchar(8000), @VAR char(2)
SELECT @VAR = 'CA'
SELECT @TSQL = 'SELECT * FROM OPENQUERY(MyLinkedServer,''SELECT * FROM pubs.dbo.authors WHERE state = ''''' + @VAR + ''''''')'
INSERT INTO .....
EXEC (@TSQL)