I am trying to use dbt Cloud as an orchestration tool for loading a data warehouse. The data is loaded by calling a series of stored procedures. When I call a single stored procedure via dbt, the SQL executes successfully, but when I check the table, no new data has been loaded. Here is a sample of how I am calling the procedure. Any ideas? Thanks!
set autocommit = on;
call testschema.sample_proc(
    'parm1',
    'parm2',
    'parm3'
);
I know it's a weird scenario. But I need to execute a script inside a stored procedure in SQL Server.
I've created a SQL script to create a test table and populate it with some of the data from the original table. This script is in a separate file.
I also updated some stored procedures to support a test mode. I created a parameter called @IsTestMode in the stored procedures.
If @IsTestMode is true, the procedure checks whether the test table exists. If it does not exist, the procedure should run that SQL script to create the test table and populate it with data from the original table.
Is it possible to execute SQL script (in a separate file) inside a stored procedure?
Instead of storing the script in a file, create a new stored procedure using that code, say dbo.CreateTestData. Then, in your main proc:
CREATE PROC dbo.Whatever
    @IsTestMode BIT = 0  /* Default to no */
AS
BEGIN
    IF @IsTestMode = 1
        EXEC dbo.CreateTestData;

    -- DoOtherStuff...
END
GO
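A minimal sketch of what dbo.CreateTestData might contain, based on the question's description (the table names, column wildcard, and row limit here are hypothetical; adjust them to the real schema):

```sql
CREATE PROC dbo.CreateTestData
AS
BEGIN
    -- Only build the test table if it does not already exist
    IF OBJECT_ID('dbo.TestTable', 'U') IS NULL
    BEGIN
        -- Copy a subset of rows from the original table into the test table
        SELECT TOP (100) *
        INTO dbo.TestTable
        FROM dbo.OriginalTable;
    END
END
GO
```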
I couldn't figure out how to use a stored procedure as the source dataset in an Azure Data Factory copy activity. Is there a way to use a stored procedure as the source data in a Copy Data task?
Yes, ADF supports reading data from a stored procedure in the Copy activity. See the picture below: using an Azure SQL dataset as an example, check the Stored Procedure box, select the stored procedure in your database, then fill in the parameters if needed. This doc provides more information. Thanks.
Be careful: when using the stored procedure source with 'auto create table' set, a schema-inference step is performed that executes the stored procedure's code in a very peculiar way and can cause pipeline errors, especially if the stored procedure contains dynamic SQL and/or conditional logic. There is a workaround I have discovered that allows the stored procedure to be written in a fully functional way without breaking the pipeline. I may write a separate article on it.
Here is the scenario:
Elastic Database A connected to database B
Database B contains stored procedure that returns result set needed in database A
As far as I can tell, what the title asks can't be done. From my testing, only tables or views seem to work.
So the next thought was to expose the stored procedure as a view in database B and then call the view from database A.
But views can't call stored procedures. I even tried putting a table-valued function between the view and the stored procedure, but that's not permitted either.
How can I get the result set from a stored procedure in DB B into DB A?
Currently, the only way to execute remote stored procedure calls using Elastic Query is when your external data source is defined using a "sharded" setup, e.g. you have defined an external data source with:
CREATE EXTERNAL DATA SOURCE MyElasticDBQueryDataSrc WITH
    (TYPE = SHARD_MAP_MANAGER, ... );
In that case, you have access to a utility procedure called SP_Execute_Fanout, which can be used to invoke a stored procedure (or perform any SQL operation) on each shard and return a UNION ALL result set. This utility proc is detailed here.
This capability is not yet available in Elastic Query's "vertically partitioned" scenario (e.g. TYPE = RDBMS); however, we will be adding this type of functionality as the preview continues, so that invoking a remote stored procedure in a single DB becomes simple.
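A hedged sketch of what a fan-out call might look like. The data source name matches the example above, but the remote procedure name and the exact argument list for SP_Execute_Fanout are assumptions — check the linked documentation for the current signature:

```sql
-- Assumed usage: the first argument names the external data source,
-- the second is the T-SQL statement to run on each shard;
-- the per-shard result sets come back combined with UNION ALL.
EXEC sp_execute_fanout
    'MyElasticDBQueryDataSrc',
    N'EXEC dbo.MyRemoteProc';  -- hypothetical remote procedure
```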
Is there any way to construct a stored procedure at run time? I will be dealing with many stored procedures. Instead of creating each one manually, I want to pass parameters to a generic SP that takes those values and constructs a new stored procedure. In other words, I'm asking for a tool that generates new stored procedures, where the tool is itself a stored procedure.
Hope I'm clear.
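One common way to do this in SQL Server is dynamic SQL: a generator procedure builds a CREATE PROC statement as a string and executes it with sp_executesql. A minimal sketch, assuming the caller supplies the new procedure's name and body (both names here are hypothetical):

```sql
CREATE PROC dbo.GenerateProc
    @ProcName SYSNAME,        -- name of the procedure to create
    @Body     NVARCHAR(MAX)   -- T-SQL body of the new procedure
AS
BEGIN
    -- Build the CREATE PROC statement; QUOTENAME guards the identifier
    DECLARE @sql NVARCHAR(MAX) =
        N'CREATE PROC dbo.' + QUOTENAME(@ProcName)
      + N' AS BEGIN ' + @Body + N' END';

    EXEC sys.sp_executesql @sql;
END
```

Note that the body string is executed as-is, so this pattern should only be used with trusted input.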
I'm running a stored procedure on server1 from my application. The stored procedure does a bunch of stuff and populates a table on server2 with the results from the procedure.
I'm using linked server to accomplish this.
When the stored procedure is done running, the application continues and tries to do some manipulation of the results from the stored procedure.
My problem is that the results from the stored procedure have not been completely inserted into the tables yet, so the manipulation of the tables fails.
So my question is: is it possible to ensure the insert on the linked server is done synchronously? I would like the stored procedure not to return until the tables on the linked server are actually done.
You can use an output parameter on the first procedure. When the table is created on the second server, the output parameter value will be returned to your application, indicating the operation is ready.
If things are more difficult than that, you can try setting a different isolation level in your stored procedure:
http://msdn.microsoft.com/en-us/library/ms173763.aspx
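A minimal sketch of the output-parameter approach (the procedure, table, and linked-server names here are hypothetical):

```sql
CREATE PROC dbo.LoadRemote
    @Done BIT OUTPUT
AS
BEGIN
    SET @Done = 0;

    -- The INSERT over the linked server runs synchronously at this point
    INSERT INTO [Server2].[TargetDb].dbo.Results (Col1)
    SELECT Col1 FROM dbo.Staging;

    SET @Done = 1;  -- set only after the remote insert has completed
END
GO

-- Caller:
DECLARE @Ready BIT;
EXEC dbo.LoadRemote @Done = @Ready OUTPUT;
-- @Ready = 1 once the remote insert is finished
```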
I found the reason for this strange behavior. There was a line of code in my stored procedure, added during debugging, that did a SELECT on a temporary table before the data in that same table was written to the linked server.
When the SELECT statement ran, control was given back to my application while the stored procedure continued running. I guess the stored procedure was running synchronously from the start after all.