Here is the scenario:
Elastic database A is connected to database B.
Database B contains a stored procedure that returns a result set needed in database A.
As far as I can tell, what the title asks for can't be done. Only tables and views seem to work in my testing.
So the next thought was to wrap the stored procedure in a view in database B and then call that view from DB A.
But views can't call stored procedures. I even tried putting a table-valued function between the view and the stored procedure, but that's not permitted either.
How can I get the result set from a stored procedure in DB B into DB A?
Currently, the only way to execute remote stored procedure calls using Elastic Query is when your external data source is defined using a "sharded" setup, e.g. you have defined an external source with
CREATE EXTERNAL DATA SOURCE MyElasticDBQueryDataSrc WITH
(TYPE = SHARD_MAP_MANAGER, ... )
In that case, you have access to a utility procedure called SP_Execute_Fanout, which can be used to invoke a stored procedure (or perform ANY SQL operation) on each shard and return a UNION ALL result set. This utility proc is detailed here.
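A hedged sketch of the call, assuming each shard contains a procedure dbo.MyProc (the data source name is from the example above; check the docs for the exact parameter conventions in your preview version):

-- Runs the statement on every shard of the external data source and
-- returns the UNION ALL of the per-shard result sets.
EXEC sp_execute_fanout 'MyElasticDBQueryDataSrc', N'EXEC dbo.MyProc';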
This capability is not yet available with Elastic Query's "vertically partitioned" scenario (e.g. TYPE = RDBMS); however, we will be adding this type of functionality as the preview continues, so that invoking a remote stored proc in a single DB becomes simple.
Related
I have a stored procedure in the tempdb database (under System Databases) on server A. The stored procedure has 3 parameters, Param1, Param2, and Param3, which all accept varchars.
I would like to execute this stored procedure on server B for a database called SomeDB.
With the stored procedure, I'd like to pull data from different tables in SomeDB and then put the results in a new table called SomeNewTable, which will also be created in SomeDB.
Let's assume that the servers are linked.
How should I approach this?
Write the stored procedure on Server B. Test it so that it runs completely on Server B while you are connected to Server B, so that everything happens on Server B.
Now just call the Server B stored proc from Server A.
On Server B: EXEC SomeDb.dbo.myStoredProc @p1, @p2, @p3
On Server A: EXEC LinkedServerB.SomeDb.dbo.myStoredProc @p1, @p2, @p3
(Note that T-SQL procedure calls do not take parentheses around the arguments.)
Do not use tempdb for anything; you cannot reliably put a stored proc in tempdb.*
The fact that it will work in tempdb for a while is misleading.
If you are doing a coding exercise they may have you use tempdb, but that is a special situation.
You will have to deal with the security jump from Server A to Server B, depending on how the linked server was set up.
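For reference, one hedged example of controlling that mapping with sp_addlinkedsrvlogin (the login names here are hypothetical):

-- Map all local logins to a single SQL login on LinkedServerB.
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'LinkedServerB',
    @useself = N'False',
    @locallogin = NULL,          -- NULL applies the mapping to all local logins
    @rmtuser = N'remoteUser',
    @rmtpassword = N'<password>';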
*(Yes, I know how to do it permanently, but I doubt it works in Azure.)
Thank you for reading.
Temp_server
- Temp_db1
  - Table: Temp_table
  - Stored procedures (these refer to Temp_table in Temp_db1)
- Temp_db2
  - Stored procedures (these refer to Temp_table in Temp_db1)
Assume that:
there is a server (called Temp_server)
there are two databases (called Temp_db1, Temp_db2)
In Temp_db1, there is a Temp_table and some stored procedures that refer to Temp_table.
In this situation, I can view the list of stored procedures that refer to Temp_table.
But SSMS shows only the stored procedures in Temp_db1. That means a stored procedure saved in Temp_db2 that refers to Temp_table in Temp_db1 doesn't show up.
Can I view this, too, somehow?
You can use sys.sql_expression_dependencies. It shows you dependencies in other databases and even on other (linked) servers.
Here is an example where it shows my historical triggers that write to another database, Storico.
You can also see the test_powershell stored procedure, which uses xp_cmdshell from master,
and the sp_write_execution_log procedure, which uses a loopback server to execute another procedure, sp_write_execution_log_lnk.
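A minimal sketch of such a query, using the database and table names from the question above. Note that the dependency row lives in the database that contains the referencing object, so you run this in Temp_db2:

USE Temp_db2;
GO

-- List objects in this database that reference Temp_table in Temp_db1.
SELECT
    OBJECT_SCHEMA_NAME(d.referencing_id) AS referencing_schema,
    OBJECT_NAME(d.referencing_id)        AS referencing_object,
    d.referenced_database_name,
    d.referenced_schema_name,
    d.referenced_entity_name
FROM sys.sql_expression_dependencies AS d
WHERE d.referenced_database_name = N'Temp_db1'
  AND d.referenced_entity_name = N'Temp_table';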
I couldn't figure out how to use a stored procedure as the source dataset in an Azure Data Factory copy activity. Is there a way to have a stored procedure as the source data in a Copy Data task?
Yes, ADF supports reading data from a stored procedure in the Copy activity. See the picture below: using an Azure SQL dataset as an example, tick the Stored Procedure checkbox, select the stored procedure in your database, then fill in the parameters if needed. This doc provides more information. Thanks.
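For reference, a minimal sketch (all names hypothetical) of the kind of procedure that works as a Copy activity source: it takes parameters and returns a single result set:

CREATE PROCEDURE dbo.usp_GetOrdersSince
    @CutoffDate datetime
AS
BEGIN
    SET NOCOUNT ON;

    -- A single result set; ADF maps these columns to the sink schema.
    SELECT OrderId, CustomerId, OrderDate, Amount
    FROM dbo.Orders
    WHERE OrderDate >= @CutoffDate;
END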
Be careful: when using the stored procedure source with 'auto create table' set, a schema-inference step is performed which executes the code in the stored procedure in a VERY peculiar way that can cause pipeline errors, especially if you have any dynamic SQL and/or conditions in the stored procedure code. There is a work-around I have discovered which allows the stored procedure to be written in a fully functional way without breaking the pipeline. I will perhaps write a separate article on it.
I'm running a stored procedure on server1 from my application. The stored procedure does a bunch of stuff and populates a table on server2 with the results from the procedure.
I'm using a linked server to accomplish this.
When the stored procedure is done running, the application continues and tries to do some manipulation of the results from the stored procedure.
My problem is that the results from the stored procedure have not been completely inserted into the tables yet, so the manipulation of the tables fails.
So my question is: is it possible to ensure that the insert on the linked server is done synchronously? I would like the stored procedure not to return until the tables on the linked server are actually done.
You can use an output parameter in the first procedure. When the table on the second server has been populated, the output parameter value is returned to your application, indicating that the operation is complete.
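A minimal sketch of that pattern (object names hypothetical): the INSERT over the linked server blocks until the remote rows are written, and the output parameter is only set afterwards:

CREATE PROCEDURE dbo.usp_LoadRemoteTable
    @RowsInserted int OUTPUT
AS
BEGIN
    SET NOCOUNT ON;

    -- This statement does not complete until the remote insert has finished.
    INSERT INTO Server2.SomeDb.dbo.ResultTable (Col1, Col2)
    SELECT Col1, Col2
    FROM dbo.StagingTable;

    -- Only reached after the remote insert is done.
    SET @RowsInserted = @@ROWCOUNT;
END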
If that is difficult, you can try setting a different isolation level in your stored procedure:
http://msdn.microsoft.com/en-us/library/ms173763.aspx
I found the reason for this strange behavior. There was a line of code in my stored procedure, added during debugging, that did a SELECT on a temporary table before the data in that table was written to the linked server.
When that SELECT statement ran, control was given back to my application while the stored procedure continued running. I guess the stored procedure was running synchronously from the start.
I have a stored procedure, in one database within my SQL Server, that sets permissions on all stored procedures at once for that particular database. Is there a way to create this stored procedure so that I can call it easily from any database within the SQL Server, and if so, how do I go about doing that?
While the best solution to this specific question of granting execute permission on all procedures is the one provided by marc_s, the actual question was whether there is a way to create a single stored procedure and make it available to all databases.
The way to do this is documented at https://nickstips.wordpress.com/2010/10/18/sql-making-a-stored-procedure-available-to-all-databases/:
Create the stored procedure in the master database.
It must be named to start with sp_, e.g. sp_MyCustomProcedure
Execute sys.sp_MS_marksystemobject passing the name of the procedure, e.g. EXEC sys.sp_MS_marksystemobject sp_MyCustomProcedure
Here is a simple example which just selects the name of the current database:
use master
go
create procedure dbo.sp_SelectCurrentDatabaseName as begin
select db_name()
end
go
execute sys.sp_MS_marksystemobject sp_SelectCurrentDatabaseName
go
Calling exec dbo.sp_SelectCurrentDatabaseName from any database will then work.
To mark the procedure as no longer a system object, there are some nasty hacks suggested at https://social.msdn.microsoft.com/Forums/sqlserver/en-US/793d0add-6fd9-43ea-88aa-c0b3b89b8d70/how-do-i-undo-spmsmarksystemobject?forum=sqltools but it is safest and easiest to just drop and re-create the procedure.
Caveat
Of course, creating system procedures like this breaks the common rule of not naming your own procedures sp_xxx, due to the possibility of them conflicting with built-in procedures in future versions of SQL Server. Therefore this should be done with care; don't just create a load of randomly named procedures you happen to find useful.
A common, simple way to avoid this is to add your own company or personal prefix, which Microsoft is unlikely to use, e.g. sp_MyCompany_MyCustomProcedure.
I have a stored procedure, in one database within my SQL Server, that sets permissions on all stored procedures at once for that particular database.
You could achieve the same result much more easily:
create a new role, e.g. db_executor
CREATE ROLE db_executor
grant that role execute permissions without specifying any objects:
GRANT EXECUTE TO db_executor
This role now has execute permissions on all stored procedures and functions - and it will even get the same permissions for any future stored procedure that you add to your database!
Just add the users you need to this role and you're done.
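For example (the user name is hypothetical), to put a database user in the role:

-- SQL Server 2012+; on older versions use: EXEC sp_addrolemember N'db_executor', N'SomeUser';
ALTER ROLE db_executor ADD MEMBER [SomeUser];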
Have you tried a three- or four-part name?
InstanceName.DatabaseName.dbo.usp_Name
That procedure could in turn reference objects in other databases using the same convention. So you could parameterize the name of the database to be operated on and use dynamic SQL to generate four-part names to reference objects such as system tables.
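A minimal sketch of that idea (the procedure name is hypothetical): the database name is passed in as a parameter and dynamic SQL builds the multi-part reference:

-- Lists the procedures in whichever database name is passed in.
CREATE PROCEDURE dbo.usp_ListProceduresIn
    @DatabaseName sysname
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @sql nvarchar(max) =
        N'SELECT name FROM ' + QUOTENAME(@DatabaseName) + N'.sys.procedures;';

    EXEC sys.sp_executesql @sql;
END

-- Usage, e.g.: EXEC dbo.usp_ListProceduresIn @DatabaseName = N'SomeDb';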