How to orchestrate data lake activities? - azure-data-lake

How do we orchestrate the execution of stored procedures in data lake?
Example
1. execute sproc dbo.abc
2. execute sproc dbo.xyz
3. execute sproc dbo.aaa
The question could be restated more specifically: what integrations does Azure provide to execute U-SQL stored procedures? Azure Functions? Events?

I recommend using Data Factory. It's easy and powerful.
You can create a pipeline of U-SQL activities.
Check this:
https://learn.microsoft.com/pt-pt/azure/data-factory/transform-data-using-data-lake-analytics
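For example, the three calls from the question can go in one U-SQL script submitted by a single Data Lake Analytics U-SQL activity in the pipeline (a minimal sketch, assuming the procedures are registered in the master U-SQL database). If you need per-step monitoring or retries, split each call into its own U-SQL activity and chain them with activity dependencies.

    // RunDailyProcs.usql - submitted by a Data Lake Analytics U-SQL activity.
    // Statements run top to bottom, so the procedures execute in order.
    USE DATABASE master;   // assumption: the procedures live in this U-SQL database

    dbo.abc();
    dbo.xyz();
    dbo.aaa();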

Related

How to execute insert queries in one database from another database in Azure

I need to execute update and insert queries in multiple Azure databases from one stored procedure. Please help.
Actually, if you want to execute the same stored procedure in two different databases, you can keep the same stored procedure in both databases. If you are referring to cross-database queries, that is not possible directly; however, you can make use of ADF/Databricks to achieve the same.
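As a rough illustration (table and procedure names are made up): the same definition can be deployed to both databases and executed locally in each, but a three-part name that reaches into the other database is rejected by Azure SQL Database.

    -- Deployed identically to DB1 and DB2; works when run inside each database.
    CREATE PROCEDURE dbo.UpsertStatus @JobId INT, @Status VARCHAR(20)
    AS
    BEGIN
        UPDATE dbo.JobStatus SET Status = @Status WHERE JobId = @JobId;
        IF @@ROWCOUNT = 0
            INSERT INTO dbo.JobStatus (JobId, Status) VALUES (@JobId, @Status);
    END;

    -- A cross-database reference like the following fails in Azure SQL Database
    -- ("Reference to database and/or server name in 'DB2.dbo.JobStatus' is not supported..."):
    -- INSERT INTO DB2.dbo.JobStatus (JobId, Status) VALUES (1, 'DONE');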

Azure Data Factory Copy Activity with Source as Stored Procedure

I couldn't figure out how to use a stored procedure as the source dataset in an Azure Data Factory Copy activity. Is there a way to have a stored procedure as the source data in a Copy Data task?
Yes, ADF supports reading data from a stored procedure in the Copy activity. Using an Azure SQL dataset as an example: in the source settings, tick the Stored Procedure option, select the stored procedure from your database, and fill in the parameters if needed. The documentation provides more information. Thanks.
Be careful: when using a stored procedure source with 'Auto create table' set, a schema-inference step is performed that executes the code in the stored procedure in a very peculiar way, which can cause pipeline errors, especially if you have any dynamic SQL and/or conditional logic in the stored procedure code. There is a workaround I have discovered that allows the stored procedure to be written in a fully functional way without breaking the pipeline; I may write a separate article on it.
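For reference, here is a minimal sketch of the kind of source procedure that plays nicely with the Copy activity (all names are illustrative): a single, statically shaped result set, which you then pick under the Stored Procedure option in the source and whose parameter you bind to a pipeline value.

    -- Illustrative Copy-activity source procedure: returns exactly one result set
    -- with a fixed column list, so the source (and schema inference) can read it directly.
    CREATE PROCEDURE dbo.GetOrdersSince
        @CutoffDate DATETIME2
    AS
    BEGIN
        SET NOCOUNT ON;   -- suppress "rows affected" messages in the output stream

        SELECT OrderId, CustomerId, OrderDate, TotalAmount
        FROM   dbo.Orders
        WHERE  OrderDate >= @CutoffDate;
    END;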

Cross-database DML query in Azure SQL

Is it possible to use DML commands like INSERT and UPDATE across databases in Azure SQL?
In my scenario, I run one stored procedure on DB1, and after it executes I want to update the status in a table that belongs to DB2. I'm using Azure SQL. Is there any way to call stored procedures across databases?
Yes, you can leverage the elastic database query feature, which supports cross-database queries. Check sp_execute_remote for details on executing T-SQL statements against remote Azure SQL databases.
Here is a similar thread for your reference: Call stored procedure from Elastic Database in Azure
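A minimal sketch of that setup, run in DB1 (server, credential, and table names are illustrative): create an RDBMS external data source pointing at DB2, then use sp_execute_remote to run the status update there once the local procedure finishes.

    -- One-time setup in DB1:
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
    CREATE DATABASE SCOPED CREDENTIAL Db2Cred
        WITH IDENTITY = '<sql login>', SECRET = '<password>';
    CREATE EXTERNAL DATA SOURCE Db2Source WITH (
        TYPE = RDBMS,
        LOCATION = 'yourserver.database.windows.net',
        DATABASE_NAME = 'DB2',
        CREDENTIAL = Db2Cred
    );

    -- After the stored procedure on DB1 completes, push the status to DB2:
    EXEC sp_execute_remote
        @data_source_name = N'Db2Source',
        @stmt = N'UPDATE dbo.JobStatus SET Status = ''Done'' WHERE JobId = 42;';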

Execute SQL Task in TSQL

I have multiple SQL tasks that execute stored procedures. I intend to call these SQL tasks from some other stored procedure. How do I do this?
In these SQL tasks, all I have is an EXEC sp statement. All these SQL tasks need to be started from a stored procedure only.
You can't call individual SSIS tasks, but you can call an SSIS package from a stored procedure. The procedure for doing so is not totally straightforward, and I won't put full instructions here, as there are many sites that do.
However, if all these tasks do is call a stored procedure, why not just call the stored procedure directly?
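For instance (illustrative names), either chain the EXEC statements in a wrapper procedure, or start the package itself from T-SQL through the SSISDB catalog:

    -- Option A: a wrapper procedure that simply chains the EXEC calls in order.
    CREATE PROCEDURE dbo.RunNightlySteps
    AS
    BEGIN
        EXEC dbo.LoadStaging;
        EXEC dbo.TransformFacts;
        EXEC dbo.RefreshAggregates;
    END;

    -- Option B: start an SSIS package from T-SQL via the SSISDB catalog.
    DECLARE @execution_id BIGINT;
    EXEC SSISDB.catalog.create_execution
         @folder_name  = N'ETL',
         @project_name = N'NightlyLoad',
         @package_name = N'Master.dtsx',
         @execution_id = @execution_id OUTPUT;
    EXEC SSISDB.catalog.start_execution @execution_id;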

Testing Stored Procedures with MySQL

What is the best way to test MySQL stored procedures? How do you test stored procedures with output parameters in the MySQL GUI?
My standard method is to create SQL scripts that create the test data, call the stored procedure, and automatically verify the post-conditions of the stored procedure. You could use the MySQL Query Browser to write/run the scripts.
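For example, a test script might look like this (procedure and table names are made up for illustration); the OUT parameter is captured in a session variable so it can be inspected in the GUI:

    -- Arrange: create the test data.
    INSERT INTO orders (id, status) VALUES (1, 'NEW');

    -- Act: call the procedure, capturing the OUT parameter in a session variable.
    CALL update_order_status(1, 'SHIPPED', @rows_affected);

    -- Assert: inspect the OUT parameter and verify the post-condition.
    SELECT @rows_affected AS rows_affected;
    SELECT IF(status = 'SHIPPED', 'PASS', 'FAIL') AS post_condition
    FROM orders
    WHERE id = 1;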
HTH, Chris