I'm pretty new to Synapse, but we are looking for a script to shut down and start up Synapse outside business hours, to help reduce our costs during evenings and weekends.
Is anyone aware of a script that could assist with this? Any assistance would be appreciated.
You can use Azure Synapse pipelines to automate pausing and resuming your Synapse dedicated SQL pool.
To do that, first create a pipeline, then identify the list of dedicated SQL pools in your Azure Synapse workspace. Next, filter out any dedicated SQL pools that you don't want to pause or resume, loop over each remaining pool, check its state, and decide whether to pause or resume it based on your business need.
The pipeline uses the REST APIs below to list your dedicated SQL pools, check their state, and then pause or resume them (a standalone sketch of the same calls follows the links):
https://learn.microsoft.com/en-us/rest/api/synapse/sql-pools
https://learn.microsoft.com/en-us/rest/api/sql/
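If you would rather drive the same REST calls from a standalone script (for example, from Azure Automation or a scheduled task), here is a minimal Python sketch. It assumes the azure-identity and requests packages; the subscription, resource group, workspace, and excluded pool names are placeholders:

```python
import requests
from azure.identity import DefaultAzureCredential

SUBSCRIPTION = "<subscription-id>"        # placeholder
RESOURCE_GROUP = "<resource-group>"       # placeholder
WORKSPACE = "<synapse-workspace>"         # placeholder
API_VERSION = "2021-06-01"
BASE = (f"https://management.azure.com/subscriptions/{SUBSCRIPTION}"
        f"/resourceGroups/{RESOURCE_GROUP}/providers/Microsoft.Synapse"
        f"/workspaces/{WORKSPACE}/sqlPools")
EXCLUDE = {"pool-to-skip"}                # pools you never want paused/resumed

token = DefaultAzureCredential().get_token("https://management.azure.com/.default")
headers = {"Authorization": f"Bearer {token.token}"}

# 1. List the dedicated SQL pools in the workspace.
pools = requests.get(f"{BASE}?api-version={API_VERSION}", headers=headers).json()["value"]

for pool in pools:
    name = pool["name"]
    # 2. Filter out pools you don't want to touch.
    if name in EXCLUDE:
        continue
    # 3. Check the state and decide: pause anything still online.
    #    (An evening run pauses; a morning run would call /resume instead.)
    if pool["properties"]["status"] == "Online":
        requests.post(f"{BASE}/{name}/pause?api-version={API_VERSION}", headers=headers)
```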
Here is public documentation with a sample that demonstrates how to set up the pipeline to automate the process using Synapse pipelines:
Pause and resume dedicated SQL pools with Synapse Pipelines
Here is a Tech Community article by a Microsoft engineer that uses a script to automate pausing and resuming Synapse dedicated SQL pools: Automatic pause all Synapse Pools and keeping your subscription costs under control
Related
I want to use a SQL script file (present under the Develop hub) inside a pipeline (present under the Integrate hub). Currently I do not see any activity that serves this purpose.
There is a Script activity under the General section, but it only has Query and NonQuery options; there is no option for referencing a SQL script file created earlier.
Is that feature available at all in Azure Synapse Analytics? Can we refer to a SQL script by some other means?
If your Synapse workspace is paired with Azure DevOps, then I imagine it's easy to get the file content with a REST API call (e.g. here). However, you then have to parse the file, as GO is not supported by the Script activity. ADF/Synapse pipeline expression functions do not support a regex-style split, e.g. on a word boundary around GO (\bGO\b), so it starts to get fiddly. I had some success with the replace and uriComponent functions.
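For illustration, the split you would want is trivial in a general-purpose language; this hypothetical Python sketch shows the word-boundary batch split that pipeline expressions lack:

```python
import re

def split_batches(sql_script: str) -> list[str]:
    # Split on GO batch separators: GO alone on its own line,
    # case-insensitive - the \bGO\b-style split mentioned above.
    batches = re.split(r"(?im)^\s*GO\s*;?\s*$", sql_script)
    return [b.strip() for b in batches if b.strip()]

script = "CREATE TABLE t (id int);\nGO\nINSERT INTO t VALUES (1);\nGO"
for batch in split_batches(script):
    print(batch)  # each batch would go to its own Script activity run
```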
However, you would be better off using stored procedures and the Stored procedure activity in Synapse pipelines - a much simpler implementation.
I'm trying to use a serverless SQL pool integration dataset as a source in an Azure Synapse Analytics data flow, but I can't: the SQL pool is unavailable as a source in the data flow, and I don't know why.
What is the problem? I use SQL pool datasets in Azure Synapse pipelines and they work. Is it a problem with my licence or version, or am I doing something wrong?
Have you tried selecting a Synapse dataset instead of a SQL one? You will need a Synapse linked service as well.
How can I schedule a query in Azure Synapse on-demand (serverless SQL pool) and save the result to Azure Storage every hour?
My idea is to materialize the results into separate storage and use Power BI to access them.
Besides the fact that Power BI can directly access your Synapse instance, if you want to go this route you have several options:
This can be done using a pipeline in the new Synapse Workspace. You should be aware that this technology is still in preview.
Use PolyBase and stored procedures on a job scheduler to INSERT to a Blob Storage location. There is a lot of configuration in this option.
At present, I would recommend Azure Data Factory (ADF) on a Schedule Trigger. This is the simplest and most reliable of the current options. Based on the scenario you described, a single Copy activity could easily perform this task.
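For completeness, here is a hypothetical Python sketch of the hourly materialization step itself (query the serverless endpoint, write a CSV to Blob Storage), independent of whichever scheduler above you choose. It assumes the pyodbc and azure-storage-blob packages; the connection strings, query, and container name are placeholders:

```python
import csv
import datetime
import io

import pyodbc
from azure.storage.blob import BlobServiceClient

SQL_CONN = ("Driver={ODBC Driver 18 for SQL Server};"
            "Server=<workspace>-ondemand.sql.azuresynapse.net;...")  # placeholder
BLOB_CONN = "<storage-account-connection-string>"                    # placeholder

# Run the query against the serverless (on-demand) endpoint.
with pyodbc.connect(SQL_CONN) as conn:
    cursor = conn.cursor()
    cursor.execute("SELECT * FROM dbo.YourView")   # placeholder query
    columns = [col[0] for col in cursor.description]
    rows = cursor.fetchall()

# Materialize the result set as a timestamped CSV blob for Power BI.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(columns)
writer.writerows(rows)

blob_name = f"results/{datetime.datetime.utcnow():%Y%m%dT%H}.csv"
BlobServiceClient.from_connection_string(BLOB_CONN) \
    .get_blob_client(container="powerbi", blob=blob_name) \
    .upload_blob(buf.getvalue(), overwrite=True)
```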
I'm working on backup and recovery for Data Lake Store. In a nutshell, we need to back up one Data Lake Store to another. I've chosen AdlCopy for that purpose (if you want to know why, check out my previous post: Backup of Data Lake Store).
According to https://learn.microsoft.com/en-us/azure/data-lake-store/data-lake-store-best-practices#resiliency-considerations, AdlCopy supports orchestration through either Azure Automation or Windows Task Scheduler. However, I'm more keen on using Azure Automation. Can someone clarify how I'm supposed to use Azure Automation to run AdlCopy on a schedule? Do I need a VM? AdlCopy only supports Windows 10, and I can't figure out how Azure Automation will help me achieve a serverless approach (without Data Factory, if possible).
If you are going to have scheduled copies, it is best to use Azure Data Factory (ADF). AdlCopy works great for quick one-off transfers of data, but for scheduled copies that need full monitoring support, built-in retries, etc., ADF will be best. If there are reasons you cannot use ADF, please do let us know.
Thanks,
Sachin Sheth,
Program Manager, Azure Data Lake.
I have one SQL Agent maintenance job which checks the index fragmentation within a database and rebuilds indexes if required.
This runs well on my test server (Microsoft SQL Server 2012), but my production server is in Azure, and now I want to schedule that job there.
SQL Server Agent does not exist in Azure SQL Database, so how can I schedule a SQL job there?
Since this question was first asked, there is now another alternative to handle this problem:
Azure Functions
Here are a couple of examples that could easily be modified to call a stored procedure that rebuilds your indexes (a minimal sketch combining them follows the links below):
Create a function in Azure that is triggered by a timer
Use Azure Functions to connect to an Azure SQL Database
Also see
How to maintain Azure SQL Indexes and Statistics - this page has an example stored procedure for rebuilding your indexes that you can download.
Reorganize and Rebuild Indexes
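As a concrete starting point, here is a hypothetical sketch combining those two examples: a timer-triggered Azure Function (Python v2 programming model) calling an index-maintenance stored procedure like the one from the linked maintenance article. It assumes the azure-functions and pyodbc packages; the connection-string setting and procedure name are placeholders:

```python
import logging
import os

import azure.functions as func
import pyodbc

app = func.FunctionApp()

# NCRONTAB schedule: 02:00 UTC daily. Mind the Consumption-plan run-time
# limit mentioned below if the rebuild takes long.
@app.timer_trigger(schedule="0 0 2 * * *", arg_name="timer")
def rebuild_indexes(timer: func.TimerRequest) -> None:
    conn_str = os.environ["SQL_CONNECTION_STRING"]  # placeholder app setting
    with pyodbc.connect(conn_str, autocommit=True) as conn:
        # Placeholder proc name, modeled on the linked article's example.
        conn.execute("EXEC dbo.AzureSQLMaintenance @operation = 'index'")
    logging.info("Index maintenance run finished.")
```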
A few things to keep in mind with Azure Functions:
- They are built on top of the Azure WebJobs SDK and offer additional functionality.
- There are two different pricing models:
  - App Service plan (attach it to an existing plan): a predictable cost model, but it puts extra load on the same VM used by your web site.
  - Consumption plan: you get some free processing every month; the default maximum run time is 5 minutes to prevent billing problems, but it can be changed via the host.json file.
Edit September 5, 2021 to add additional information
It should be noted that if you need SQL Server Agent, you have another option now. I would suggest reading up on Azure SQL Managed Instance. You can see a comparison of Azure SQL Database to Azure SQL Managed Instance here in the Microsoft documentation. With Azure SQL Managed Instance, your transition to the cloud could be a lot simpler, since many of the on-premises features you are used to are already there (including SQL Server Agent, Database Mail, etc.).
This feature has been rejected by Microsoft (link no longer available).
To quote their response:
Today in Azure there are several alternatives:
SQL Database Elastic Jobs
https://learn.microsoft.com/en-us/azure/azure-sql/database/elastic-jobs-overview
The Azure Job Scheduler
http://www.windowsazure.com/en-us/services/scheduler/
The new preview of Azure Automation
http://azure.microsoft.com/en-us/services/automation/
SQL Server in a VM
Option 1 requires an additional dedicated cloud service, which increases cost. Option 2 is free (I think) as long as you don't run more than once per hour.
Azure SQL Database does not support SQL Agent jobs. From the documentation:
Microsoft Azure SQL Database does not support SQL Server Agent or jobs. You can, however, run SQL Server Agent on your on-premise SQL Server and connect to Microsoft Azure SQL Database.
WebJobs: if you have a website, you can create a WebJob and run it on a schedule. See more here (and a minimal sketch below).
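A scheduled (triggered) WebJob can be as small as a script plus a settings.job file holding the CRON schedule. A hypothetical Python sketch, with the connection-string setting and procedure name as placeholders:

```python
# run.py - entry point of a triggered WebJob, deployed next to a
# settings.job file such as: {"schedule": "0 0 * * * *"}  (hourly).
import os

import pyodbc

conn = pyodbc.connect(os.environ["SQLAZURECONNSTR_Maintenance"],  # placeholder setting
                      autocommit=True)
conn.execute("EXEC dbo.RebuildFragmentedIndexes")                 # placeholder proc
conn.close()
```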
Other alternatives - Scheduling job on SQL Azure
Another option is rovergo, a service that allows you to schedule SQL jobs with a cron expression. This is nice because you don't have to create a WebJob or Azure Function; you can simply schedule a SQL script.
(I'm a developer on rovergo)
You can use Azure Automation to schedule jobs on an Azure SQL Database, much like the on-premises SQL Server Agent.
See https://azure.microsoft.com/nl-nl/blog/azure-automation-your-sql-agent-in-the-cloud/ for more information.
Available for a couple of years now: Elastic Jobs for Azure SQL Database.
docs:
https://learn.microsoft.com/en-us/azure/azure-sql/database/job-automation-overview?view=azuresql
tutorial:
https://www.youtube.com/watch?v=JIMgqkXZFOQ
It currently seems to use the 2017 version of the SQL Agent stored procedures (or a close approximation), but the Elastic Jobs links now already point to the SQL Server 2022 preview, which contains a newer version of the agent stored procedures.