I'm familiar with creating MDS databases once MDS is installed, but is it possible to install/enable/configure MDS for databases that existed prior to the installation of MDS?
I am unaware of any way to "apply" MDS to an existing database. A possible migration strategy is to recreate the database schema as an MDS model and use the entity based staging feature to migrate data from the original database to the MDS model. Here's a primer on EBS and how it works: http://msdn.microsoft.com/en-us/sqlserver/hh802433.aspx.
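For example, if you recreate a hypothetical Product entity in MDS, EBS generates a leaf staging table and stored procedure you can use to push the legacy rows in. A minimal sketch following the stg.<entity>_Leaf naming convention; the legacy source table is illustrative:

```sql
-- Load legacy rows into the entity's generated leaf staging table.
-- 'Product' is a hypothetical entity; MDS creates stg.Product_Leaf and
-- stg.udp_Product_Leaf when the entity is created.
INSERT INTO stg.Product_Leaf (ImportType, ImportStatus_ID, BatchTag, Code, Name)
SELECT 0,               -- 0 = create new members / update existing ones
       0,               -- 0 = ready to be staged
       N'LegacyLoad',   -- tag identifying this batch
       p.ProductCode,
       p.ProductName
FROM   LegacyDb.dbo.Products AS p;   -- the pre-existing database

-- Start the staging process for that batch against the target version.
EXEC stg.udp_Product_Leaf
     @VersionName = N'VERSION_1',
     @LogFlag     = 1,            -- log transactions during staging
     @BatchTag    = N'LegacyLoad';
```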
Due to reasons (I've been told it's a networking issue with MIs; regardless, we can't fix it, we're waiting on a solution from MS that may or may not come out this year), we cannot talk from on-prem to managed instances. However, we can reach Azure SQL Databases.
We would like to replicate lookup data from on-prem to Azure Managed Instances (MIs) as well as ASDs. Is there any way to use the ASD as a "jump" box for replication, maybe by putting the Distributor on an MI that can talk to the ASD?
We looked at Azure Data Sync, but the 5-minute minimum sync interval makes it a no-go.
Otherwise, our current fallback is to run an Azure VM/AKS instance, replicate to it, then from there to the ASDs/MIs. But man, I'd rather not have to do that.
Any suggestions appreciated.
One-way transactional replication is one option; SQL Data Sync for Azure is another, though you've noted its sync interval rules it out.
If you wish to keep replication running after the migration to Managed Instances, transactional replication will be the best option at this time. See: Replication to Azure SQL Database
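If you go the transactional replication route, note that Azure SQL Database can only be a push subscriber, so the subscription is created from the publisher/distributor side. A minimal sketch, with placeholder publication, server, database, and login names:

```sql
-- Run at the publisher (distribution already configured).
-- Azure SQL Database must be a push subscriber and requires SQL authentication.
EXEC sp_addsubscription
     @publication       = N'LookupPub',                      -- placeholder
     @subscriber        = N'myserver.database.windows.net',  -- ASD logical server
     @destination_db    = N'LookupDb',
     @subscription_type = N'Push';

EXEC sp_addpushsubscription_agent
     @publication              = N'LookupPub',
     @subscriber               = N'myserver.database.windows.net',
     @subscriber_db            = N'LookupDb',
     @subscriber_security_mode = 0,              -- 0 = SQL authentication
     @subscriber_login         = N'repl_login',  -- placeholder credentials
     @subscriber_password      = N'<password>';
```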
Alternatively, use ETL via Azure Data Factory.
You can transfer data from a SQL Server database to an Azure SQL Database using Azure Blob Storage as a staging area and Azure Data Factory (ADF); this is a supported (if older) technique that benefits from a staged copy.
The ADF pipeline consists of two copy activities that work together to transfer data between the SQL Server database and the Azure SQL Database on a regular basis. The two activities are as follows (a sketch of a typical source query appears after this list):
1. Copy data from the SQL Server database to an Azure Blob Storage account.
2. Copy data from the Blob Storage account to the Azure SQL Database.
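For scheduled runs, the first copy activity typically uses a watermark-style source query so only new or changed rows are staged to Blob Storage. A minimal sketch, assuming a hypothetical LastModifiedDate column and source table; in ADF the two watermark values would come from a watermark table or pipeline variables:

```sql
-- Watermark pattern: pick up only rows changed since the last pipeline run.
DECLARE @LastWatermark datetime2 = '2024-01-01T00:00:00';  -- last run's high-water mark
DECLARE @NewWatermark  datetime2 = SYSUTCDATETIME();       -- captured at run start

SELECT *
FROM   dbo.SourceTable                      -- hypothetical source table
WHERE  LastModifiedDate >  @LastWatermark
  AND  LastModifiedDate <= @NewWatermark;
```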
I have a need to load data into Azure Hyperscale incrementally.
The source data is in an Azure VM with SQL Server installed on it.
The source database is about 6 TB in size and has about 370 tables.
We need a way to get the incremental changes from the last X hours and sync them into the same database in Hyperscale.
Ideally, we would extend our database with an availability group setup, but since Hyperscale does not support that, we need to find another way to keep these in sync.
The source database does have change data capture (CDC) enabled.
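For reference, this is the kind of query we can already run against the CDC-enabled source to pull net changes for a time window (the capture instance name is illustrative):

```sql
-- Map "the last X hours" to an LSN range, then read net changes.
DECLARE @hours    int = 4;  -- X
DECLARE @from_lsn binary(10), @to_lsn binary(10);

SET @from_lsn = sys.fn_cdc_map_time_to_lsn(
                    'smallest greater than or equal',
                    DATEADD(HOUR, -@hours, GETDATE()));
SET @to_lsn   = sys.fn_cdc_get_max_lsn();

-- 'dbo_MyTable' is an illustrative capture instance; the net-changes function
-- exists only if CDC was enabled with @supports_net_changes = 1.
SELECT *
FROM cdc.fn_cdc_get_net_changes_dbo_MyTable(@from_lsn, @to_lsn, N'all');
```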
The best online migration option is to use the Azure Database Migration Service (link), where the online (continuous sync) migration scenario (link) you need is supported:
The sync essentially runs in the background until it completes, and you can access the data that has already been migrated while it runs. I believe this is a continuous copy scenario, not an incremental one. With PaaS database services, you do not have access to perform snapshot replication operations from external data sources. The Hyperscale instance is built upon snapshot replication, but it currently only serves the hosted database's own functionality.
Regards,
Mike
Is there a way to directly migrate a database in Azure SQL Database to an Azure PostgreSQL database (Hyperscale-Citus)?
I have looked into the Azure migration services, but they do not support this particular migration route.
I have an approach in mind, but I don't know if it will work:
We could make a backup of the Azure SQL database in the cloud itself, and then load that backup into the Azure PostgreSQL database.
But I do not know where to make the backup. In Azure Blob Storage, or somewhere else?
First, you could try the tutorial #ffffff01 provided for you.
There is another way that can help you achieve this: Azure Data Factory can migrate the data from the Azure SQL database to the Azure PostgreSQL database directly.
Refer to the tutorials below:
Copy data to and from Azure Database for PostgreSQL by using Azure Data Factory
Copy and transform data in Azure SQL Database by using Azure Data Factory
Create the Azure SQL database as the source dataset and the Azure PostgreSQL database as the sink.
Hope this helps.
I have a local SQL database (a database-first EF application) and an Azure database, which are synced by the Azure Data Sync service.
Now what do I have to do when I update the local schema of a table? Of course I have to update the schema in the sync service, but that is not enough: the schema of the Azure database isn't updated by the sync service itself.
Before I used Azure Data Sync I could simply run the SQL schema compare from Visual Studio, but now there are so many new tables that I don't know what to update and what not.
When I update the Azure database manually through the management portal, the sync does work. But isn't this also possible via Visual Studio schema compare (or SSMS)?
I think you already answered your own question: you said you could do it via VS, except that there are too many new tables.
Just skip any object that has a DSS prefix; those are used by the SQL Azure Data Sync Service.
But as you already mentioned, you still have to edit the dataset definition in your Data Sync Service sync group.
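To see what to skip, you can list the sync-owned objects first. A rough query, assuming the DSS prefix / DataSync schema conventions mentioned above (exact names vary by Data Sync version):

```sql
-- Objects created by Data Sync provisioning; exclude these in schema compare.
SELECT s.name AS schema_name,
       o.name AS object_name,
       o.type_desc
FROM   sys.objects AS o
JOIN   sys.schemas AS s ON s.schema_id = o.schema_id
WHERE  s.name = N'DataSync'          -- Data Sync's own schema
   OR  o.name LIKE N'dss[_]%'        -- DSS-prefixed helper objects
ORDER BY schema_name, object_name;
```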
Is it possible in any way to just update the database schema from the entity model, so that all the information in the database stays in the database? When you generate the database from the model, the information gets lost. I work against SQL Azure, and I have not found any tool to manage the tables and relations in the SQL Azure database in a proper way. It would be so nice if it could be done by the Entity Framework designer.
I think this will be resolved (from an Entity Framework point of view) in the next EF release.
From an Azure point of view, you can use the SQL Azure Migration Wizard.