Our client is trying to move the Azure SQL server from one resource group to another - azure-sql-database

Our client is trying to move the Azure SQL database resources from one resource group to another. The Azure SQL server has around 1600 databases. Four elastic pools exist, and the databases are segregated into 500+500+500+100 (1600 databases).
When our client moves the server to another resource group, he gets the error below.
Error: The number of write requests for the subscription exceeded the limit of 1200 for the interval (1:00:00). Please wait for 303 seconds (code: subscription requests throttled).
I know the error was triggered because the maximum threshold was reached. Is there a way to achieve our client's requirement of moving the Azure SQL server from one resource group to another?

You can try other methods to copy the databases to another resource group.
Bacpac file: You can export a bacpac from one Azure SQL DB and import it into another (a scripted sketch follows the links below).
Export bacpac
Import bacpac
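If you have to script this for many databases, a minimal sketch using the azure-mgmt-sql Python SDK might look like the following. All names, credentials, and the storage URI are placeholders, and the exact model names should be verified against your SDK version:

    # Hypothetical sketch: export a bacpac to blob storage, then import it
    # under a server in the target resource group. Every name is a placeholder.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.sql import SqlManagementClient
    from azure.mgmt.sql.models import ExportDatabaseDefinition, ImportNewDatabaseDefinition

    client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

    bacpac_uri = "https://mystorage.blob.core.windows.net/bacpacs/Database1.bacpac"

    # Export Database1 from the source server to blob storage.
    client.databases.begin_export(
        "source-rg", "source-server", "Database1",
        ExportDatabaseDefinition(
            storage_key_type="StorageAccessKey",
            storage_key="<storage-account-key>",
            storage_uri=bacpac_uri,
            administrator_login="sqladmin",
            administrator_login_password="<password>",
        ),
    ).result()  # block until the export completes

    # Import the bacpac as a new database on a server in the target resource group.
    client.servers.begin_import_database(
        "target-rg", "target-server",
        ImportNewDatabaseDefinition(
            database_name="Database1",
            storage_key_type="StorageAccessKey",
            storage_key="<storage-account-key>",
            storage_uri=bacpac_uri,
            administrator_login="sqladmin",
            administrator_login_password="<password>",
        ),
    ).result()

With 1600 databases you would loop this, and note that these management calls count against the same hourly write-request quota, so pace them.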
Using Azure Data Factory: You can use ADF to copy data to and from Azure SQL Database in a different resource group (a rough pipeline sketch follows the link below).
Copy and transform data in Azure SQL Database by using Azure Data Factory
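For completeness, creating such a copy pipeline programmatically might look roughly like this with the azure-mgmt-datafactory SDK, assuming the factory, linked services, and the two Azure SQL datasets (hypothetically named SourceDataset and SinkDataset) already exist:

    # Hypothetical sketch: an ADF pipeline with a single copy activity between
    # two pre-existing Azure SQL Database datasets.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        AzureSqlSink, AzureSqlSource, CopyActivity, DatasetReference, PipelineResource,
    )

    adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    copy_activity = CopyActivity(
        name="CopySqlToSql",
        inputs=[DatasetReference(reference_name="SourceDataset")],
        outputs=[DatasetReference(reference_name="SinkDataset")],
        source=AzureSqlSource(),
        sink=AzureSqlSink(),
    )
    adf.pipelines.create_or_update(
        "my-rg", "my-data-factory", "CopySqlPipeline",
        PipelineResource(activities=[copy_activity]),
    )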

Related

Move Azure SQL Database to new Azure Account

I want to move a large 1 TB Azure database (PaaS) to a new Azure account. I am trying to use Export, but that is going to result in a lot of downtime for the database.
Will upping the price tier, currently P1 (DTU), improve export performance?
Yes, upgrading the service tier may improve the import/export performance.
Refer to the official documentation for a better understanding.
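If you want to script that scale-up around the export window, a rough sketch with the azure-mgmt-sql Python SDK (resource names are placeholders; check the DatabaseUpdate/Sku models against your SDK version) might be:

    # Hypothetical sketch: raise the DTU tier before an export, then restore it.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.sql import SqlManagementClient
    from azure.mgmt.sql.models import DatabaseUpdate, Sku

    client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

    def scale_to(sku_name: str) -> None:
        # begin_update returns a poller for the long-running scale operation
        client.databases.begin_update(
            "my-rg", "my-server", "my-db",
            DatabaseUpdate(sku=Sku(name=sku_name)),
        ).result()

    scale_to("P6")   # scale up from P1 for more export throughput
    # ... run the bacpac export here ...
    scale_to("P1")   # drop back to the normal tier afterwards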
Alternatively, you can easily export and import the database from one SQL server to another in the same or a different Azure account using SQL Server Management Studio (SSMS); the same flow can also be scripted with sqlpackage, as sketched after the steps below.
Connect to the source Azure account in SSMS.
Right-click the source database and select Tasks -> Export Data-tier Application.
Save the .bacpac file to your local disk.
Now connect to the destination Azure account in SSMS.
Right-click Databases and select Import Data-tier Application.
Select Import from local disk and browse for the .bacpac file saved in step 3.
Configure the database settings (give database name and other required settings)
Import the database
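The same flow can be scripted with the sqlpackage command-line utility instead of clicking through SSMS. A minimal sketch, assuming sqlpackage is installed and on PATH (server names, credentials, and the file path are placeholders):

    # Hypothetical sketch: drive sqlpackage from Python to export a bacpac
    # from one server and import it into another.
    import subprocess

    bacpac = r"C:\temp\mydb.bacpac"  # placeholder local path

    # Export the source database to a local .bacpac file (steps 1-3 above).
    subprocess.run([
        "sqlpackage", "/Action:Export",
        "/SourceServerName:source-server.database.windows.net",
        "/SourceDatabaseName:mydb",
        "/SourceUser:sqladmin", "/SourcePassword:<password>",
        f"/TargetFile:{bacpac}",
    ], check=True)

    # Import the .bacpac into the destination server (steps 4-7 above).
    subprocess.run([
        "sqlpackage", "/Action:Import",
        "/TargetServerName:dest-server.database.windows.net",
        "/TargetDatabaseName:mydb",
        "/TargetUser:sqladmin", "/TargetPassword:<password>",
        f"/SourceFile:{bacpac}",
    ], check=True)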

How to get the Azure SQL transaction log

How do I get the transaction logs for an Azure SQL DB? I'm trying to find the log in the Azure portal but am not having any luck.
If there is no way to get the log, where is that stated in the Microsoft docs? Any help is appreciated.
You don't, as it is not exposed in the service. Please step back and describe the problem you'd like to solve. If you want a DR solution, for example, then active geo-replication can solve this for you as part of the service offering.
The log format in Azure SQL DB is constantly changing and is "ahead" of the most recent version of SQL Server. So, it is probably not useful to expose the log (the format is not documented). Your use case will likely determine the alternative question you can ask instead.
Azure SQL Database auditing tracks database events and writes them to an audit log in your Azure storage account, or sends them to Event Hub or Log Analytics for downstream processing and analysis.
Blob audit
Audit logs stored in Azure Blob storage are kept in a container named sqldbauditlogs in the Azure storage account. The directory hierarchy within the container is of the form <ServerName>/<DatabaseName>/<AuditName>/<Date>/. The blob file name format is <CreationTime>_<FileNumberInSession>.xel, where CreationTime is in UTC hh_mm_ss_ms format, and FileNumberInSession is a running index in case session logs span multiple blob files.
For example, for database Database1 on Server1 the following is a possible valid path:
Server1/Database1/SqlDbAuditing_ServerAudit_NoRetention/2019-02-03/12_23_30_794_0.xel
Audit logs for read-only replicas are stored in the same container. The directory hierarchy within the container is of the form <ServerName>/<DatabaseName>/<AuditName>/<Date>/RO/, and the blob file name shares the same format.
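To pull those .xel files down for analysis, a small sketch with the azure-storage-blob package (the connection string and path prefix are placeholders; the container name comes from the docs above):

    # Hypothetical sketch: download a database's audit .xel blobs from the
    # sqldbauditlogs container.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string("<storage-connection-string>")
    container = service.get_container_client("sqldbauditlogs")

    # The prefix follows <ServerName>/<DatabaseName>/<AuditName>/<Date>/
    prefix = "Server1/Database1/SqlDbAuditing_ServerAudit_NoRetention/2019-02-03/"
    for blob in container.list_blobs(name_starts_with=prefix):
        with open(blob.name.replace("/", "_"), "wb") as f:
            f.write(container.download_blob(blob.name).readall())

The downloaded .xel files can then be read with sys.fn_get_audit_file or opened in SSMS.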

Creating Feeds between local SQL servers and Azure SQL servers?

We want to use Azure servers to run our Power Apps applications; however, we have local SQL servers containing our data warehouse. We want only certain tables to be on Azure, and we want to create data feeds between the two, with information flowing from one to the other.
Does anyone have any insight into how I can achieve this?
I have googled but there doesn't appear to be a wealth of information on this topic.
It depends on how soon after a change in your source (the on-premises SQL Server) you need that change reflected in your sink (Azure SQL).
If you can tolerate a few minutes of latency, or only need to update once a day, I would suggest a basic Data Factory pipeline (search for "data factory upsert"); how you achieve this depends on your data. A watermark-based variant is sketched below.
If you need it faster, or it is impossible to extract an incremental update from your source, you would need to either use triggers and write the changes from one database to the other, or get a change data capture tool that does this for you.
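For the simple case where the source tables carry a modified-date (or similar watermark) column, the incremental copy can also be hand-rolled. A minimal sketch with pyodbc, in which the connection strings, table, and column names are all placeholders:

    # Hypothetical sketch: copy only rows changed since the last sync, using a
    # ModifiedDate watermark column.
    import pyodbc

    src = pyodbc.connect("<on-premises SQL Server connection string>")
    dst = pyodbc.connect("<Azure SQL Database connection string>")

    # 1. Read the high-water mark already present in the sink.
    watermark = dst.execute(
        "SELECT COALESCE(MAX(ModifiedDate), '19000101') FROM dbo.MyTable"
    ).fetchval()

    # 2. Pull only newer rows from the source.
    rows = src.execute(
        "SELECT Id, Payload, ModifiedDate FROM dbo.MyTable WHERE ModifiedDate > ?",
        watermark,
    ).fetchall()

    # 3. Upsert them into the sink.
    cur = dst.cursor()
    cur.fast_executemany = True
    cur.executemany(
        """MERGE dbo.MyTable AS t
           USING (VALUES (?, ?, ?)) AS s (Id, Payload, ModifiedDate)
           ON t.Id = s.Id
           WHEN MATCHED THEN UPDATE SET Payload = s.Payload, ModifiedDate = s.ModifiedDate
           WHEN NOT MATCHED THEN INSERT (Id, Payload, ModifiedDate)
                VALUES (s.Id, s.Payload, s.ModifiedDate);""",
        [tuple(r) for r in rows],
    )
    dst.commit()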
It looks like you just want to sync the data in some tables between a local SQL Server and an Azure SQL database.
You can use Azure SQL Data Sync.
Summary:
SQL Data Sync is a service built on Azure SQL Database that lets you synchronize the data you select bi-directionally across multiple SQL databases and SQL Server instances.
With Data Sync, you can keep data synchronized between your on-premises databases and Azure SQL databases to enable hybrid applications.
A Sync Group has the following properties:
The Sync Schema describes which data is being synchronized.
The Sync Direction can be bi-directional or can flow in only one direction. That is, the Sync Direction can be Hub to Member, Member to Hub, or both.
The Sync Interval describes how often synchronization occurs.
The Conflict Resolution Policy is a group-level policy, which can be Hub wins or Member wins.
Next, you need to learn how to configure Data Sync. Please refer to this Azure document: Tutorial: Set up SQL Data Sync between Azure SQL Database and SQL Server on-premises.
In this tutorial, you learn how to set up Azure SQL Data Sync by creating a sync group that contains both Azure SQL Database and SQL Server instances. The sync group is custom configured and synchronizes on the schedule you set.
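If you would rather script the setup than click through the portal, sync groups are also exposed through the management SDK. A rough, unverified sketch with azure-mgmt-sql; every name is a placeholder and the exact SyncGroup fields should be checked against your SDK version:

    # Hypothetical sketch: create a sync group on the hub database with a
    # 5-minute interval and a hub-wins conflict policy.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.sql import SqlManagementClient
    from azure.mgmt.sql.models import SyncGroup

    client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

    client.sync_groups.begin_create_or_update(
        "my-rg", "hub-server", "hub-db", "my-sync-group",
        SyncGroup(
            interval=300,                         # sync every 300 seconds
            conflict_resolution_policy="HubWin",  # or "MemberWin"
            sync_database_id=(
                "/subscriptions/<sub>/resourceGroups/my-rg/providers/"
                "Microsoft.Sql/servers/hub-server/databases/sync-metadata-db"
            ),
            hub_database_user_name="sqladmin",
            hub_database_password="<password>",
        ),
    ).result()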
Hope this helps.
The most robust solution here is Transactional Replication. You can also use SSIS or Azure Data Factory for copying tables to/from Azure SQL Database. And Azure SQL Data Sync also exists.

Keeping my local database updated

I have a cloud-hosted database, maintained by a different company. Here is my scenario:
I would like to bring down the entire database, schema and data, and keep my local database updated in near real time (SQL Server 2008 R2).
I cannot setup replication inside SQL server, I do not have permissions on the cloud server.
I cannot use triggers.
I cannot use linked servers
They would provide me a nightly copy of the backup and access to the transaction logs every hour.
How can I use these to keep my entire database updated?
Thanks in advance

Azure: copy data from one DB to another using SSIS

I have two Azure SQL DBs, and I've created an SSIS job to transfer some data from one DB to the other.
The DB has millions of records.
The SSIS package is hosted on-premises. If I execute the package on my PC, will it copy the data directly from one Azure DB to the other on the fly, or fetch the data from one Azure DB to my local machine and then upload it to the other Azure DB?
A round trip from Azure to local and back again from local to Azure will be too costly if I have millions of records.
I am aware of Azure Data Sync, but my requirements call for SSIS to transfer particular data.
Also, does Azure Data Sync have an option to sync only particular tables?
Running the SSIS package on your local machine will cause the data to be moved to your machine before being sent out to the destination database.
When you configure a sync group in SQL Azure Data Sync, you should be able to select which tables to synchronize.
I'm pretty sure SQL Azure Data Sync does have the option to select just the tables you need to transfer. However, I don't think there is an option to transform the data being transferred.
As for SSIS, I don't see how a transfer would be possible without the data first coming to your premises. You have two connections established: one connection to the first SQL Azure server, and the other connection to the second SQL Azure server. SSIS will pull the data from the first connection and then push it to the second one.
I would suggest exploring SQL Azure Data Sync, as it might be the best choice for your scenario. Any other option would require data to first come on-premises, then be transferred back to the cloud.
Well, there is a third option. You can create a simple worker based on ADO.NET and the SqlBulkCopy class, put it in a worker role in the cloud, and trigger it with a message in an Azure queue or similar. That seems to be one of the best solutions, as you have total control over what is being copied (a rough analogue of such a worker is sketched after this list). All the data will stay in the MSFT datacenter, which means:
Fast transfer
No bandwidth charges (as long as all three resources, i.e. the two SQL Azure servers plus the worker role, are deployed in the same affinity group)
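SqlBulkCopy itself is an ADO.NET (.NET) class, but the shape of such a worker is easy to see in any language. Here is a hypothetical Python analogue that uses pyodbc's fast_executemany in place of SqlBulkCopy; connection strings and table names are placeholders:

    # Hypothetical sketch: stream rows between two Azure SQL databases in
    # batches, mimicking the ADO.NET SqlBulkCopy pattern.
    import pyodbc

    BATCH = 10_000

    src = pyodbc.connect("<first Azure SQL connection string>")
    dst = pyodbc.connect("<second Azure SQL connection string>")

    reader = src.execute("SELECT Id, Payload FROM dbo.BigTable")
    writer = dst.cursor()
    writer.fast_executemany = True  # send each batch as one bulk parameter array

    while True:
        rows = reader.fetchmany(BATCH)
        if not rows:
            break
        writer.executemany(
            "INSERT INTO dbo.BigTable (Id, Payload) VALUES (?, ?)",
            [tuple(r) for r in rows],
        )
        dst.commit()  # commit per batch to keep transactions small

Run from a worker deployed in the same region, the rows never leave the datacenter, which is the point of this option.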