Azure SQL tools for reports - parametrised stored procedure - sql

I am thinking of using some Azure services to share SQL reports with external users, and now I am a little confused because there seems to be no good option for what I need.
I have an on-premises SQL database with a parametrised stored procedure which returns a set of rows, like a select statement. The end result should be a solution which gives an external user the ability to execute this stored procedure with provided input parameters and save the results to an Excel file.
So I've thought about migrating the tables used by this procedure to an Azure SQL database and then maybe building some application using PowerApps or Functions, but at this moment I am just stuck and don't know where to start.
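Whatever service ends up hosting the logic, the "save to Excel" part can be covered by exporting the result set to CSV, which Excel opens directly. Below is a minimal Python sketch of just that export step; the stored procedure name, its parameters, and the pyodbc call in the comment are assumptions for illustration, not part of the question.

```python
import csv

def rows_to_csv(headers, rows, path):
    """Write a stored procedure's result set to a CSV file that Excel can open."""
    # In a real app the rows would come from the database, e.g. (hypothetical):
    #   cursor.execute("EXEC dbo.GetReport @From=?, @To=?", date_from, date_to)
    #   headers = [col[0] for col in cursor.description]
    #   rows = cursor.fetchall()
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(headers)
        writer.writerows(rows)
```

For a genuine .xlsx file rather than CSV, a library such as openpyxl would be needed; the CSV route keeps the export dependency-free.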

Related

How to run a T-SQL query daily on an Azure database

I am trying to migrate a database from a SQL Server into Azure. This database has two rather simple T-SQL scripts that insert data. Since SQL Agent does not exist on Azure, I am trying to find an alternative.
I see the Automation option, but it seems really complex for something as simple as running SQL scripts. Is there any better, or at least easier, way to do this?
I was under the impression that there was a scheduler for that, but I can't find it.
Thanks
There are several ways to run a scheduled task/job against an Azure SQL database for your use case:
If you are comfortable using the existing on-premises SQL Server Agent, you can connect to your Azure SQL DB (using linked servers) and execute jobs the same way as on an on-premises SQL Server.
Use an Automation Account/Runbooks to create SQL jobs. If you look in the marketplace you can find several examples for Azure SQL DB (backup, restore, indexing jobs, ...). I guess you already tried it and it does not seem a feasible solution to you.
Another, less well-known, way is to use WebJobs (under an App Service web app) to schedule tasks (you can use PowerShell scripts here). The disadvantage is that you cannot change anything once you create a WebJob.
As @jayendran suggested, Azure Functions is definitely an option to achieve this use case.
If none of these gives you a way to work with the SQL directly, there is also the "Scheduler Job Collection" available in Azure to schedule invocation of HTTP endpoints, and the SQL operation could be abstracted/implemented in that endpoint. This is only useful for lightweight SQL operations; if the operation takes longer, chances are it will time out.
You can use Azure Functions to run the T-SQL queries; to schedule them, use a Timer Trigger.
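For reference, a timer-triggered Azure Function is configured with an NCRONTAB expression (six fields, starting with seconds) in function.json. The config fragment below is a sketch with an assumed binding name and schedule; it would fire once a day at 02:00, and the function body would then open a connection and run the T-SQL.

```json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 2 * * *"
    }
  ]
}
```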
You can use Microsoft Flow (https://flow.microsoft.com) to create a scheduled flow with a SQL Server connector. In the connector you set the Azure SQL server, database name, username and password.
SQL Server connector
There are many options but the ones that you can use to run a T-SQL query daily are these:
SQL Connector options
Execute a SQL Query
Execute stored procedure
You can also edit your connection info in Data --> Connections menu.

Is there an easier way to deploy a Stored Procedure into multiple databases?

I have a stored proc saved as a SQL file and would like to deploy it to several databases.
What are my current options? I am not sure how to start creating a script for this. Should I use dynamic SQL?
You might want to look at SQL Server change-management tools such as Red Gate's. I don't know how dynamic SQL would play into deploying stored procedures across multiple databases.
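Rather than dynamic SQL, a small driver script can replay the same .sql file against each database. The sketch below is Python; the connection step is left as a caller-supplied function (in practice a pyodbc/ODBC wrapper, which is an assumption here), and the concrete logic shown is splitting the script into batches on GO separators, which drivers require because GO is a client-side directive, not T-SQL.

```python
import re

def split_batches(script):
    """Split a .sql script on GO lines; GO is a client-side batch
    separator, so database drivers reject it if sent as T-SQL."""
    parts = re.split(r"^\s*GO\s*$", script, flags=re.IGNORECASE | re.MULTILINE)
    return [p.strip() for p in parts if p.strip()]

def deploy(script, databases, execute):
    """Run every batch of the script against each database.

    `execute(database, batch)` is supplied by the caller -- for example
    a wrapper around a pyodbc connection (hypothetical; any driver works).
    """
    for db in databases:
        for batch in split_batches(script):
            execute(db, batch)
```

Because the execute step is injected, the same loop works whether the targets are databases on one server or connections to several servers.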

How to separate Stored Procedures (i.e. all the business logic) from Client Database so that it only contains client data?

Scenario:
Our application database (in SQL Server 2012) contains entire business logic in stored procedures. Every time we have to publish the database to the client, it unnecessarily results in copying the stored procedures to the client database.
Problem:
All the business logic gets copied to the client side and results in proprietary issues.
Solutions tried earlier:
Using WITH ENCRYPTION
CREATE PROCEDURE Proc_Name
WITH ENCRYPTION
AS
    -- procedure body
GO
This method results in encrypted, non-maintainable stored procedure code. We cannot tell which version of the code is running at the client side, hence cannot debug, and version control cannot be applied. Moreover, the client cannot perform database replication, since encrypted stored procedures do not get replicated.
Creating synonyms
CREATE SYNONYM SchemaName.Proc_Name FOR LinkedServer.RemoteDB.SchemaName.Proc_Name
This allows for the creation of references (synonyms) in Client_DB which point at the actual stored procedures residing in a Remote_Linked_Server_DB. On each stored procedure call, the data is read from Client_DB and transmitted to Remote_Linked_Server_DB, where the calculations are done and the result is sent back. This results in acute performance issues, and it requires 24x7 internet connectivity to the remote linked server.
Requirement
We are looking for a solution whereby the stored procedure code could be compiled (secured) and separated from the client database. Also, the compiled stored procedure code should be available at the client-end so that client does not require 24x7 connection to a remote location for accessing the stored procedures. Maybe Visual Studio database projects could be used to compile stored procedure code or something else.
[Edit]
I have learned that SQL Server 2014 allows for natively compiled stored procedures. Can that be of help? (msdn link) And is the SQL Server 2014 RTM version stable enough?

Directing ADO queries away from the database

I've got a large number of old Delphi apps accessing a remote SQL Server database using ADO. I would like to redirect those queries to a middleware layer instead of that database. The Delphi clients must run unchanged; I am not the owner of most of them.
Is it possible to do this? If so, how should I go about it?
Don't worry about parsing the T-SQL (both raw T-SQL and stored proc calls, incidentally).
Create a new SQL database, and use a combination of views, T-SQL and managed code to fake up enough database objects for the application to work.
Technique 1: Use real tables, but populate them asynchronously from the new data source.
Technique 2: Fake the tables and procedures
E.g. you can have a stored procedure which calls out to managed code that talks to your middleware, to replace the existing stored procedure.
Where the application reads directly from a table, you can use a view, which references a managed table-valued function.
You should have no trouble wherever stored procedures are used. If the application sends dynamic SQL, however, you have more of an uphill struggle.

Deploying SQL Server Stored Procedure to multiple servers

I manage several SQL Servers (2000, 2005 and 2008) and maintain several scripts in the form of stored procedures and functions for my daily activities. As the need arises I make changes to my local copies of those scripts, and then I need to apply those changes to the same stored procedures and functions on every database server; it's time-consuming and painful to connect to every database server and replace what's in there.
I've been trying to use the sqlcmd utility with the -i option, but it's not working well; it keeps erroring out.
Is there a way or a tool that I can use to deploy my local stored procedures and functions to multiple sql servers?
The stored procedures and functions have this structure (with Proc_Name standing in for the actual object name):
use dba
go
if object_id('dbo.Proc_Name') is not null
    drop procedure dbo.Proc_Name
go
create procedure dbo.Proc_Name
as
    ...
go
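To script the multi-server rollout with sqlcmd, one option is to generate one sqlcmd invocation per server from a server list. The sketch below (Python, standard library only) just builds the argument lists; -S, -d, -i and -b are real sqlcmd flags, while the server names are hypothetical and the actual execution via subprocess is left commented out as an assumption.

```python
def build_sqlcmd_commands(servers, script_path, database="dba"):
    """Build one sqlcmd invocation per server for the same script file:
    -S server, -d database, -i input file, -b abort the batch on error."""
    return [
        ["sqlcmd", "-S", server, "-d", database, "-i", script_path, "-b"]
        for server in servers
    ]

if __name__ == "__main__":
    # Hypothetical server names; print the commands instead of running them.
    for cmd in build_sqlcmd_commands(["SQLPROD01", "SQLPROD02"], "deploy_procs.sql"):
        print(" ".join(cmd))
        # To actually run them (assuming sqlcmd is installed, Windows auth):
        # import subprocess; subprocess.run(cmd + ["-E"], check=True)
```

Since the scripts already drop and recreate each object, replaying the same file on every server is idempotent, which is what makes this loop safe to rerun.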