I'm trying to create a temp table from a stored procedure, based on this link.
In the string he defines the SQL Server version. Our clients run different versions of SQL Server, from 2005 through 2012.
String: 'SQLNCLI', 'Server=(local)\SQL2008;Trusted_Connection=yes;','EXEC getBusinessLineHistory'
How can I use that command independently of the SQL Server platform?
The OPENROWSET function creates a dynamic link to a remote server.
http://technet.microsoft.com/en-us/library/ms190312.aspx
You can create a dynamic TSQL call to a dynamic link with changing parameters. Below is sample code. This can be converted into a stored procedure with @my_Server passed as a parameter.
Please note, this does not support multiple calls at the same time, since only one staging table exists.
You cannot use a local temp table, since there might be a scoping issue with EXEC calling sp_executesql inside a stored procedure.
These are things you will need to research.
-- Set the server info
DECLARE @my_Server SYSNAME;
SET @my_Server = 'Server=(local)\SQL2008';
-- Clear the staging table
TRUNCATE TABLE STAGE.dbo.MYTABLE;
-- Allow for dynamic server location
DECLARE @my_TSQL NVARCHAR(2048);
SET @my_TSQL =
'INSERT INTO STAGE.dbo.MYTABLE SELECT * FROM OPENROWSET(''SQLNCLI'', ''' + @my_Server +
';Trusted_Connection=yes;'', ''EXEC usp_My_Stored_Procedure'')';
-- Run the dynamic remote TSQL
EXEC sp_executesql @my_TSQL;
Please see the code below:
select top 1 * into #dbusers from dbusers
declare @tsql as varchar(1000)
set @tsql = 'select * from #dbusers'
exec (@tsql)
This works as I would expect i.e. one row is returned by the dynamic SQL. Is it possible to do this:
declare @tsql as varchar(1000)
set @tsql = 'select top 1 * into #dbusers from dbusers'
exec (@tsql)
select * from #dbusers
Here I get the error:
Invalid object name '#dbusers'
Is there a workaround?
I realise that you can have output parameters with dynamic SQL. However, I also know that when using stored procedures you cannot return tables as output parameters.
Is it possible to do this? Is there a workaround (except creating a physical table)?
Temporary tables are only available within the session that created them. With Dynamic SQL this means it is not available after the Dynamic SQL has run. Your options here are to:
Create a global temporary table, using a double hash (create table ##GlobalTemp); it will persist outside your session until it is explicitly dropped or otherwise cleared out of TempDB.
Because this table persists outside your session, you need to make sure you don't create two of them or have two different processes trying to process data within it. You need to have a way of uniquely identifying the global temp table you want to be dealing with.
You can create a regular table and remember to drop it again afterwards.
Include whatever logic needs to reference the temp table within the Dynamic SQL script
For your particular instance, though, you are best off simply executing a SELECT ... INTO, which will generate your table structure from the data that is selected.
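The first option, for example, can be sketched against the dbusers example from the question (a minimal illustration; the double hash is what makes the table global):

```sql
-- Create the table inside dynamic SQL, but as a GLOBAL temp table (##)
declare @tsql as varchar(1000)
set @tsql = 'select top 1 * into ##dbusers from dbusers'
exec (@tsql)

-- Unlike #dbusers, ##dbusers is still visible after the dynamic SQL returns
select * from ##dbusers

-- Clean up explicitly, since the global table outlives the dynamic SQL
drop table ##dbusers
```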
I am in the process of writing a SQL query file for Microsoft SQL Server 2008 R2 which will call a number of stored procedures to create a merge publication. The baseline script was generated through the New Publication Wizard in Microsoft SQL Server Management Studio.
The "problem" I am facing is that when creating the merge articles using the sp_addmergearticle stored procedure, I need to define a number of parameters which are common to all merge articles, such as the publication name, source owner, destination owner and so on.
The question, then: Is there a way to group a collection of named parameters and supply them in a common fashion so administering changes to these parameters would be simpler?
For example, consider the following query snippet:
use [MyDatabase]
exec sp_addmergearticle @publication=N'MyPub',
@article=N'MyTable#1',
@source_object=N'MyTable#1',
@source_owner=N'TheOwner',
@destination_owner=N'TheOwner',
@allow_interactive_resolver=N'true'
exec sp_addmergearticle @publication=N'MyPub',
@article=N'MyTable#2',
@source_object=N'MyTable#2',
@source_owner=N'TheOwner',
@destination_owner=N'TheOwner',
@allow_interactive_resolver=N'true'
etc...
GO
Now, I would like to make this piece of script easier to read and maintain so that the sp_addmergearticle calls would take a set of parameters which are common to all calls, and some specific parameters which are call-specific.
For example, like this:
use [MyDatabase]
-- Common parameters for all merge articles
DECLARE @common_parameters
-- @publication=N'MyPub'
-- @source_owner=N'TheOwner',
-- @destination_owner=N'TheOwner',
-- @allow_interactive_resolver=N'true'
exec sp_addmergearticle @common_parameters,
@article=N'MyTable#1',
@source_object=N'MyTable#1'
exec sp_addmergearticle @common_parameters,
@article=N'MyTable#2',
@source_object=N'MyTable#2'
etc...
GO
Does anyone know if this is possible? If possible, what means should I use to accomplish this?
You can just use local variables for some values (these can't, unfortunately, cross batch boundaries):
use [MyDatabase]
-- Common parameters for all merge articles
DECLARE @publication sysname
DECLARE @source_owner sysname
DECLARE @destination_owner sysname
DECLARE @allow_interactive_resolver nvarchar(5)
select @publication=N'MyPub',
@source_owner=N'TheOwner',
@destination_owner=N'TheOwner',
@allow_interactive_resolver=N'true'
exec sp_addmergearticle @publication=@publication,
@source_owner=@source_owner,
@destination_owner=@destination_owner,
@allow_interactive_resolver=@allow_interactive_resolver,
@article=N'MyTable#1',
@source_object=N'MyTable#1'
exec sp_addmergearticle @publication=@publication,
@source_owner=@source_owner,
@destination_owner=@destination_owner,
@allow_interactive_resolver=@allow_interactive_resolver,
@article=N'MyTable#2',
@source_object=N'MyTable#2'
And then at least there's only one place that these need to be updated, as required.
The problem:
I am using a linked server to call stored procedures on a remote server. While this works fine most of the time, I have a number of stored procedures that use table-valued parameters (TVPs), and SQL Server cannot call remote SPs that take TVPs as parameters.
The workaround is to execute SQL on the remote server and build the TVPs there. Again, this works fine.
The problem is that I have to compose the string that calls the SP. When I have few TVPs this is more or less easy, but I have SPs with a lot of TVPs.
Now, when profiling a Stored Procedure call, the call from .NET to sql in case of a TVP parameter stored procedure looks like:
declare @p1 <type>
insert into @p1 values(...)
insert into @p1 values(...)
...
exec myProc @p1
What I want to do is a wrapper on my server (identical to the remote SP) that, within it, calls the remote server with dynamic SQL.
Does anyone know how I can (if I can) access this query from a stored procedure? That is, access its own profiler-like query text, so that I can just send it to the remote server?
OK, so basically the solution is this (it kind of automates half of the problem):
declare @tvpVal_string nvarchar(max) = 'declare @tvpVal myTVPType;'
set @tvpVal_string += isnull(stuff((select ';insert into @tvpVal values(' + ...your values... + ')' as [text()] from #tvpVal for xml path('')),1,1,'') + ';','');
declare @sql nvarchar(max) = @tvpVal_string +
'exec myProc @tvpVal=@tvpVal,
@OtherVal=@OtherVal'
exec [REMOTESRV].DB..sp_executesql @sql, N'@OtherVal type', @OtherVal
I currently have a linked server that I am querying in a stored procedure. The query works just fine currently; however, it will need to change for every branch of code I have. I would like to know the best method for deriving the database name I am calling in the cross-server query.
Ex:
Server A has a link to server B. Server A contains 3 databases. SRV_A.DB1_DEV, SRV_A.DB2_Trunk, SRV_A.DB3_Prod Each are linked to their Server B counterpart... SRV_B.DB1_DEV, SRV_B.DB2_Trunk, SRV_B.DB3_Prod
Each database on Server A has the same stored procedure. The only thing that changes in the sproc is the cross server select. So SRV_A.DB1_Dev has a select in the sproc that reads:
SELECT foo FROM [SRV_B].[DB1_DEV].[foo_table] WHERE bar = 1
while the stored procedure on the trunk branch would be
SELECT foo FROM [SRV_B].[DB2_Trunk].[foo_table] WHERE bar = 1
Since I would like to have a VS project that will deploy the DB to every branch mentioned, I would like to be able to fill in the database name dynamically. The solution I have come up with, which works, is to use a series of IF checks with the CHARINDEX function and then create the query with dynamic SQL, like this:
DECLARE @dSql NVARCHAR(4000);
DECLARE @databaseName NVARCHAR(100) = DB_NAME();
DECLARE @tableName NVARCHAR(100);
IF CHARINDEX('Dev', @databaseName) > 0
    SET @tableName = '[SRV_B].[DB1_DEV].[foo_table]';
...Same IF & SET for Trunk
...Same IF & SET for Prod
SET @dSql = 'DECLARE @retID INT; SELECT foo FROM ' + @tableName
    + ' WHERE bar = 1; SET @retID = SCOPE_IDENTITY();';
EXEC(@dSql);
I would have to imagine there is a better solution though, if anyone can help me with one, it would be much appreciated. If by some outside shot this is the best way let me know as well.
Thanks,
James
One way to solve this problem might be to abstract the linked server name by wrapping it in a synonym:
(Note the extra part in the target table name: cross-server queries require a four-part name. I'm assuming this is a typo in the question and that foo_table is in the dbo schema.)
CREATE SYNONYM dbo.syn_foo_table
FOR [SRV_B].[DB1_DEV].[dbo].[foo_table]
which could then be referred to in the code as
SELECT foo FROM dbo.syn_foo_table WHERE bar = 1
You would then need to customise your deployment script to create the synonym(s) pointing at the correct server/database for the environment. This could use a similar dynamic SQL process to the one you've outlined above, but would only need to be executed once at deployment time (rather than on every execution).
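As a sketch of that deployment-time step (assuming, as in the example above, that the Server B database shares the name returned by DB_NAME() on Server A):

```sql
-- Recreate the synonym so it points at this environment's database
IF OBJECT_ID('dbo.syn_foo_table', 'SN') IS NOT NULL
    DROP SYNONYM dbo.syn_foo_table;

DECLARE @sql NVARCHAR(400);
SET @sql = N'CREATE SYNONYM dbo.syn_foo_table FOR [SRV_B].'
         + QUOTENAME(DB_NAME()) + N'.[dbo].[foo_table]';
EXEC sp_executesql @sql;
```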
Another possible solution is to use SQLCMD parameters in the stored procedure script, since (AFAIK) VS projects use SQLCMD to deploy database objects.
This feature allows you to parameterise SQL scripts with variables in the form $(variablename) - in your case:
SELECT foo FROM [SRV_B].[$(dbname)].[foo_table] WHERE bar = 1
The value of the variable can be set using an environment variable or passed into the command as an argument using the -v switch. See the SQLCMD MSDN link above for full details.
I was able to use a combination of environment variables, as mentioned above, for the DB name, and to dynamically generate the server name as well by using the following query:
DECLARE #ServerName NVARCHAR(100);
SET #ServerName = (SELECT name FROM sys.servers WHERE server_id = 1)
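Combining the two, the remote table reference can be assembled without the IF/CHARINDEX branches (a sketch; it assumes the linked server is the sys.servers entry with server_id = 1, as in the query above):

```sql
DECLARE @ServerName NVARCHAR(100);
DECLARE @tableName NVARCHAR(400);

SET @ServerName = (SELECT name FROM sys.servers WHERE server_id = 1);

-- e.g. [SRV_B].[DB1_DEV].[dbo].[foo_table] when run in DB1_DEV
SET @tableName = QUOTENAME(@ServerName) + N'.' + QUOTENAME(DB_NAME())
               + N'.[dbo].[foo_table]';
```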
I have a SQL Server stored procedure that I use to back up data from our database before doing an upgrade, and I'd really like to be able to run it on multiple databases by passing in the database name as a parameter. Is there an easy way to do this? The best I can figure is to dynamically build the SQL in the stored procedure, but that feels like the wrong way to do it.
Build a procedure to back up the current database, whatever it is, and install it on every database that you want to back up.
Then write another procedure that launches the backups. The details will depend on things you haven't mentioned, such as whether you have a table containing the names of the databases to back up. Basically, all you need to do is loop over the database names and build a string like:
SET @ProcessQueryString=
'EXEC '+DatabaseServer+'.'+DatabaseName+'.dbo.'+'BackupProcedureName param1, param2'
and then just:
EXEC (@ProcessQueryString)
to run it remotely.
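A minimal sketch of such a loop, assuming a hypothetical driver table dbo.DatabasesToBackup that lists the database names:

```sql
DECLARE @DatabaseName SYSNAME;
DECLARE @ProcessQueryString NVARCHAR(500);

DECLARE db_cursor CURSOR FOR
    SELECT DatabaseName FROM dbo.DatabasesToBackup;  -- hypothetical table

OPEN db_cursor;
FETCH NEXT FROM db_cursor INTO @DatabaseName;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- Run the backup procedure installed in each target database
    SET @ProcessQueryString = N'EXEC ' + QUOTENAME(@DatabaseName)
                            + N'.dbo.BackupProcedureName';
    EXEC (@ProcessQueryString);
    FETCH NEXT FROM db_cursor INTO @DatabaseName;
END;
CLOSE db_cursor;
DEALLOCATE db_cursor;
```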
There isn't any other way to do this; dynamic SQL is the only way. If you've got strict controls over DB names and who's running it, then you're okay just concatenating everything together, but if there's any doubt, use QUOTENAME to escape the parameter safely:
CREATE PROCEDURE doStuff
    @dbName NVARCHAR(50)
AS
DECLARE @sql NVARCHAR(1000)
SET @sql = 'SELECT stuff FROM ' + QUOTENAME(@dbName) + '..TableName WHERE stuff = otherstuff'
EXEC sp_executesql @sql
Obviously, if there's anything more being passed through then you'll want to double-check any other input, and potentially use parameterised dynamic SQL, for example:
CREATE PROCEDURE doStuff
    @dbName NVARCHAR(50),
    @someValue NVARCHAR(10)
AS
DECLARE @sql NVARCHAR(1000)
SET @sql = 'SELECT stuff FROM ' + QUOTENAME(@dbName) + '..TableName WHERE stuff = @pOtherStuff'
EXEC sp_executesql @sql, N'@pOtherStuff NVARCHAR(10)', @pOtherStuff = @someValue
This then makes sure that parameters for the dynamic SQL are passed through safely and the chances for injection attacks are reduced. It also improves the chances that the execution plan associated with the query will get reused.
Personally, I just use a batch file and shell out to sqlcmd for things like this. Otherwise, building the SQL in a stored proc (like you said) would work just fine; I'm not sure why it would be "wrong" to do that.
Best regards,
Don
SQL Server has an OPENQUERY(linkedserver, statement) function: if the server is linked, you specify it as the first parameter and it fires the statement against that server.
You could generate this OPENQUERY statement in a dynamic proc, and either have it fire the backup proc on each server or execute the statement directly.
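For example (names are illustrative; note that OPENQUERY expects a statement that returns a result set):

```sql
-- Runs the statement on the linked server and returns its rows locally
SELECT *
FROM OPENQUERY([REMOTESRV], 'EXEC MyDb.dbo.SomeProcThatReturnsRows');
```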
Do you use SSIS? If so, you could try creating a couple of SSIS packages and scheduling them or executing them remotely.