How to prevent an Insert query from enrolling into a Distributed Transaction? - sql

I have a SQL Insert query inside a stored proc, for inserting rows into a linked server table.
Since the stored proc is getting called within a parent transaction, this Insert statement tries to use a DTC for inserting rows into the linked server.
I would like to avoid DTC from getting involved.
Is there any way I can do that (like a hint) for the Insert SQL statement to ignore transactional scope?

My suggestion is that you store whatever you want to insert into a staging table, and once the procedure is over, run the cross-server insert. To my knowledge there is no way of ignoring the transaction you are in once you are inside the stored procedure's execution.
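As a rough sketch of that staging-table idea (the staging table, source table, and linked server names here are all made up for illustration):
-- Inside the stored procedure: write only to a local staging table,
-- so the statement stays in the local transaction and no DTC is needed.
INSERT INTO dbo.StagingRows (ID, Payload)
SELECT ID, Payload
FROM dbo.SourceRows;

-- Later, after the parent transaction has committed (e.g. a follow-up batch or job):
INSERT INTO MyLinkedServer.RemoteDb.dbo.TargetRows (ID, Payload)
SELECT ID, Payload
FROM dbo.StagingRows;

TRUNCATE TABLE dbo.StagingRows;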
In contrast, if you use the .NET 2.0 System.Transactions namespace, you can tell specific statements not to participate in any parent-scope transaction. This would require you to move some of your logic into code rather than stored procedures, but it would work.
Here's a relevant link.
Good luck,
Alan.

Try using OPENQUERY to call the linked server query/stored procedure instead of calling it directly.
That worked for me.
So instead of:
insert into ...
select * from mylinkedserver.pubs.dbo.authors
do something like this:
DECLARE @TSQL varchar(8000), @VAR char(2)
SELECT @VAR = 'CA'
SELECT @TSQL = 'SELECT * FROM OPENQUERY(MyLinkedServer,''SELECT * FROM pubs.dbo.authors WHERE state = ''''' + @VAR + ''''''')'
INSERT INTO .....
EXEC (@TSQL)

Related

Using results of one stored procedure in another stored procedure - SQL Server

Is there any way to use the results of one stored procedure in another stored procedure without using a table variable or temp table? The logic is as follows:
IF (@example = 1)
BEGIN
DECLARE @temp TABLE (Id INT);
INSERT INTO @temp
EXEC [Procedure1] @ItemId = @StockId
set @Cost = (select top 1 Id from @temp)
Ideally I would like to know if there is a way to do this without having to use a temp table. I've looked around online but can't find anything that works. Any advice would be great.
In general, if you want to use user-defined code in a SELECT, then it is better to phrase the code as a user-defined function rather than a user-defined procedure.
That is, procedures should be used for their side effects and functions should be used for their return values.
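For example, a minimal sketch of the function approach, assuming the logic inside Procedure1 can be expressed as a query (the function name and the dbo.Stock table are made up for illustration, and @Cost/@StockId are declared as in the question):
-- An inline table-valued function can be referenced directly in a query,
-- so no temp table or table variable is needed.
CREATE FUNCTION dbo.fn_GetFirstId (@ItemId int)
RETURNS TABLE
AS
RETURN
(
    SELECT TOP (1) Id
    FROM dbo.Stock          -- hypothetical table standing in for Procedure1's logic
    WHERE ItemId = @ItemId
);
GO
-- Usage: assign straight from the function
SELECT @Cost = Id FROM dbo.fn_GetFirstId(@StockId);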
That said, you can use openquery (documented here) to run an exec on a linked server. The linked server can be the server you are running on.
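A rough sketch of the OPENQUERY approach, assuming a loopback linked server (here called [LocalLoop]) has already been set up to point back at the same instance:
-- OPENQUERY only accepts a literal query string, so the parameter value
-- has to be baked into the text (or the whole call built with dynamic SQL).
SELECT TOP (1) Id
FROM OPENQUERY([LocalLoop], 'EXEC dbo.Procedure1 @ItemId = 42');
Note that this requires the procedure to return a single, well-defined result set so that SQL Server can work out the metadata.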

sp_executesql with user defined table type not working with two databases [duplicate]

I'm using SQL Server 2008.
How can I pass a table-valued parameter to a stored procedure across different databases, but on the same server?
Should I create the same table type in both databases?
Please give an example or a link relevant to the problem.
Thanks for any kind of help.
In response to this comment (assuming I'm correct that using TVPs between databases isn't possible):
What choice do I have in this situation? Using XML type?
The purist approach would be to say that if both databases are working with the same data, they ought to be merged into a single database. The pragmatist realizes that this isn't always possible - but since you can obviously change both the caller and callee, maybe just use a temp table that both stored procs know about.
I don't believe it's possible - you can't reference a table type from another database, and even with identical type definitions in both DBs, a value of one type isn't assignable to the other.
You don't pass the temp table between databases. A temp table is always stored in tempdb, and is accessible to your connection, so long as the connection is open and the temp table isn't dropped.
So, you create the temp table in the caller:
CREATE TABLE #Values (ID int not null,ColA varchar(10) not null)
INSERT INTO #Values (ID, ColA)
SELECT ... /* whatever you do to populate the table */

EXEC OtherDB..OtherProc
And then in the callee:
CREATE PROCEDURE OtherProc
/* No parameter passed */
AS
SELECT * from #Values
Table UDTs are only valid for stored procs within the same database.
So yes, you would have to create the type in each database and reference it in the stored procs - e.g. just run the first part of this example in both DBs: http://msdn.microsoft.com/en-us/library/bb510489.aspx.
If you don't need the efficiency you can always use other methods - e.g. pass an XML document parameter, or have the stored procedure expect a temp table with the input data.
Edit: added example
create database Test1
create database Test2
go
use Test1
create type PersonalMessage as TABLE
(Message varchar(50))
go
create proc InsertPersonalMessage @Message PersonalMessage READONLY AS
select * from @Message
go
use Test2
create type PersonalMessage as TABLE
(Message varchar(50))
go
create proc InsertPersonalMessage @Message PersonalMessage READONLY AS
select * from @Message
go
use Test1
declare @mymsg PersonalMessage
insert @mymsg select 'oh noes'
exec InsertPersonalMessage @mymsg
go
use Test2
declare @mymsg2 PersonalMessage
insert @mymsg2 select 'oh noes'
exec InsertPersonalMessage @mymsg2
Disadvantage is that there are two copies of the data.
But you would be able to run the batch against each database simultaneously.
Whether this is any better than using a temp table really comes down to what processing/data sizes you have - by the way, to use a temp table from a stored procedure you just access it from the stored procedure's code (and it fails if the table doesn't exist).
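For the XML alternative mentioned above, a minimal sketch in the same style as the example (the procedure and element names are made up); since xml is a built-in type, no per-database type definition is needed:
use Test2
go
create proc InsertPersonalMessageXml @Messages xml as
-- shred the XML back into rows
select m.value('.', 'varchar(50)') as Message
from @Messages.nodes('/Messages/Message') as t(m)
go
use Test1
go
declare @x xml = '<Messages><Message>oh noes</Message></Messages>'
exec Test2.dbo.InsertPersonalMessageXml @x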
Another way to solve this (though not necessarily the correct way) is to only utilize the UDT as a part of a dynamic SQL call.
USE [db1]
GO
CREATE PROCEDURE [dbo].[sp_Db2Data_Sync]
AS
BEGIN
/*
*
* Presumably, you have some other logic here that requires this sproc to live in db1.
* Maybe it's how you get your identifier?
*
*/
DECLARE @SQL VARCHAR(MAX) = '
USE [db2]
DECLARE @db2tvp tableType
INSERT INTO @db2tvp
SELECT dataColumn1
FROM db2.dbo.tblData td
WHERE td.Id = ' + CAST(@YourIdentifierHere AS VARCHAR) + '
EXEC db2.dbo.sp_BulkData_Sync @db2tvp
'
EXEC(@SQL)
END
It's definitely not a purist approach, and it doesn't work for every use case, but it is technically an option.

SELECT fieldnames FROM dynamic SQL query

I have a stored procedure that uses several parameters to build a dynamic query, which I execute. The query works fine; however, this procedure will be the data source for a Crystal Report which needs a "static" SELECT with field names it can reference. The Crystal Report is called from a Visual Basic application and gets its parameters passed to it from the application. It, in turn, passes the parameters to the SQL Server stored procedure.
Somehow I need to
SELECT fieldname1, fieldname2
FROM EXEC(@MydynamcSQL)
after I build #MydynamcSQL. It is a complicated application accessing specific tables based on year, and specific databases based on the user. I am pretty new to SQL, so maybe there are other methods I could use that I am unaware of?
Try creating a temporary table to hold the data, then select from that table:
DECLARE @MydynamcSQL varchar(1000);
SET @MydynamcSQL = 'select fieldname1, fieldname2 from table1';
CREATE TABLE #Result
(
fieldname1 varchar(1000),
fieldname2 varchar(1000)
)
INSERT INTO #Result EXEC(@MydynamcSQL)
SELECT fieldname1, fieldname2 -- here you have a "static" SELECT with field names
FROM #Result
DROP TABLE #Result
Did you try making the whole thing dynamic, such as:
EXEC('SELECT fieldname1, fieldname2 FROM ' + @MydynamcSQL)
It's worth noting, although out of scope, that you should make sure you are not vulnerable to SQL injection attacks: a dynamic query built by concatenating unvalidated input potentially leaves you exposed.
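Where the variable parts are values rather than table or database names, sp_executesql lets you keep the query dynamic while still parameterizing it; a minimal sketch (object and column names are illustrative):
-- Object names still have to be validated/whitelisted separately,
-- since they cannot be passed as parameters.
DECLARE @sql nvarchar(1000) =
    N'SELECT fieldname1, fieldname2 FROM dbo.table1 WHERE fieldname1 = @p1';
EXEC sp_executesql @sql, N'@p1 varchar(1000)', @p1 = 'some value';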

how to perform same DML statements on two different server ( Oracle and SQL )

I have two servers, Oracle and SQL Server.
I have the same database on both servers.
I want to perform DML (insert, update, delete).
How can I perform the same DML statements on both servers simultaneously?
If I run an insert on SQL Server, then the same change should also be applied to the Oracle database.
Thanks
You can create a linked server in MS SQL Server (and Oracle probably has something similar).
Please refer to these links:
http://msdn.microsoft.com/en-us/library/ms188279
http://msdn.microsoft.com/en-us/library/ff772782.aspx
http://msdn.microsoft.com/en-us/library/ms190479.aspx
This basically lets you use MS SQL Server as usual, but you can reference a completely different system through this functionality.
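For example, a hedged sketch of registering an Oracle linked server on the SQL Server side (the linked server name, TNS alias, provider, and credentials are placeholders that depend on your Oracle client install):
-- Register the Oracle instance as a linked server
EXEC sp_addlinkedserver
    @server     = N'ORACLE_LINK',
    @srvproduct = N'Oracle',
    @provider   = N'OraOLEDB.Oracle',
    @datasrc    = N'MyTnsAlias';

-- Map a remote login for that linked server
EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'ORACLE_LINK',
    @useself     = 'false',
    @rmtuser     = N'oracle_user',
    @rmtpassword = N'oracle_password';
Once it exists, the Oracle table can be referenced from the same stored procedure with four-part naming or OPENQUERY.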
Let's say you have the following simple table scenario:
CREATE TABLE Tbl
(
ID int,
SOMETHING NVARCHAR(100)
);
I wouldn't want to call two different INSERT statements, so I would write a stored procedure that does the inserting on my behalf.
Let's assume the stored procedure name is SP_MyTest and it has two parameters: @ID and @SOMETHING. I would use a transaction in order to ensure I always insert into both tables.
But keep in mind that this implementation is synchronous, which means: if one insert takes "forever", then the application will wait for that whole duration unless you make some additions.
CREATE PROCEDURE SP_MyTest
@ID INT,
@SOMETHING NVARCHAR(100)
AS
BEGIN
SET NOCOUNT ON;
BEGIN TRANSACTION TRN
INSERT INTO Tbl(ID,SOMETHING) VALUES(@ID,@SOMETHING);
INSERT INTO <LinkedServerName>.<LinkedServerDBName>.<schema>.<table>(ID,SOMETHING) VALUES(@ID, @SOMETHING);
COMMIT TRANSACTION TRN
END
And after that, I would call this sproc with:
EXEC SP_MyTest 1, 'Test1'
EXEC SP_MyTest 2, 'Test2'
EXEC SP_MyTest 3, 'Test3'

How to get results from "exec sp_showpendingchanges" into a table

EDIT: The things I've tried below came directly from the alleged duplicate. The solutions actually do work fine with a user-defined SP (and probably most system SPs), but for whatever reason they don't work with this one.
I can run exec sp_showpendingchanges on the distribution publication database without any issues. However, I want to capture the results in a table.
I've tried:
SELECT * INTO #tmpTable
FROM OPENROWSET('SQLNCLI', 'Server=SERVER; Trusted_Connection=yes;',
'EXEC sp_showpendingchanges')
and:
SELECT * INTO #tmpTable
FROM OPENQUERY(SERVER, 'exec sp_showpendingchanges')
Both of these statements return an error that says: Invalid object name 'sysmergepublications'.
I tried to specify the initial catalog in the connection string and even tried adding a USE statement in the last parameter of each statement (i.e. I used an embedded EXEC statement with double-single quotes and all that). But I still end up with the same error.
So how can I get the results from exec sp_showpendingchanges into a temporary table, preferably without having to define the table myself? If all else fails I will write a program in C#, but I'm really hoping there's a simpler way to do this with just SQL.
Here is a working example
You declare a table variable to hold the output:
DECLARE @result_table TABLE
(
destination_server SYSNAME ,
pub_name SYSNAME ,
destination_db_name SYSNAME ,
is_dest_subscriber BIT ,
article_name SYSNAME ,
pending_deletes INT ,
pending_ins_and_upd INT
)
then execute the procedure into it:
INSERT INTO @result_table
EXEC sp_showpendingchanges
and view the results:
SELECT * FROM @result_table
I read your question, but I honestly cannot understand what the problem is with creating a temp table. Anyway, if you can execute the SP directly but get an error when you go through the linked server or OPENROWSET, then the problem is permissions.
Check the permissions on the sysmergepublications table. If the login you use for the linked server or OPENROWSET has not been granted SELECT on that table, you need to add that permission.
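For example (the database name and login below are placeholders for your publication database and the login used by the remote connection):
USE YourPublicationDb;
GO
-- Allow the remote login to read the merge replication metadata table
GRANT SELECT ON dbo.sysmergepublications TO [linked_server_login];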
I hope it will help you.