Restore a large SQL Managed Instance database to an on-premises server - azure-sql-managed-instance

We are trying to restore a large SQL Managed Instance database to an on-premises server running SQL Server 2016. We tried a SQL export (dump), but since the database is large it is taking a long time to complete.
Are there alternative approaches to copy a SQL Managed Instance database to an on-premises server running SQL Server 2016?

Good question. I am sure there is a better way of doing this, but the only way I have managed it so far is to:
a) SELECT the tables INTO one or more separate databases (so no indexes etc.); see the sketch after this list.
b) Export the database(s) as a data-tier application (bacpac).
c) Import the data-tier application into SQL Server 2016 (it has to be Management Studio 18). If there is more than one database, SELECT * INTO again.
d) Add the users, then all the indexes and primary keys. We are fortunate in that our database is defined in a database project, so we can just apply that.
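For step a), here is a minimal T-SQL sketch of that copy; the staging database StagingCopy and the table dbo.Orders are hypothetical names for illustration. SELECT ... INTO creates the target table with data only, which is what strips out the indexes and constraints:

    -- Hypothetical names: StagingCopy is the staging database, SourceDb.dbo.Orders the table being copied.
    CREATE DATABASE StagingCopy;
    GO
    SELECT *
    INTO StagingCopy.dbo.Orders   -- creates the table with data only: no indexes, keys, or constraints
    FROM SourceDb.dbo.Orders;
    GO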

You can try using a powerful VM in the same region to speed up this process, or, as an alternative, you can generate the schema of your database and use the BCP utility for the big tables.
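As a rough illustration of the BCP route, a hedged sketch follows; the server names, database names, table name, and credentials are all placeholders, and it assumes the managed instance is reachable over its public endpoint:

    rem Export a big table from the managed instance to a native-format file (all names are placeholders).
    bcp SourceDb.dbo.BigTable out C:\dump\BigTable.dat -S "myinstance.public.abc123.database.windows.net,3342" -U admin_user -P <password> -n
    rem Load the file into the on-premises SQL Server 2016 database using Windows authentication, in batches.
    bcp TargetDb.dbo.BigTable in C:\dump\BigTable.dat -S onprem-sql -T -n -b 50000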

Use sqlpackage.exe in an Azure VM hosted in the same region. I used this for a 10 GB database, and the export followed by the import took 120 minutes in total.
SqlPackage
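For reference, a hedged sketch of that export/import round trip with SqlPackage; the server names, database names, file path, and credentials below are placeholders, not values from the original post:

    rem Export the managed instance database to a bacpac (run on the Azure VM in the same region).
    SqlPackage /Action:Export /SourceServerName:"myinstance.public.abc123.database.windows.net,3342" /SourceDatabaseName:"SourceDb" /SourceUser:"admin_user" /SourcePassword:"<password>" /TargetFile:"C:\dump\SourceDb.bacpac"
    rem Import the bacpac into the on-premises SQL Server 2016 instance.
    SqlPackage /Action:Import /SourceFile:"C:\dump\SourceDb.bacpac" /TargetServerName:"onprem-sql" /TargetDatabaseName:"RestoredDb" /TargetTrustedConnection:True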

Related

What is the best way to copy a large SQL database from Azure Managed Instance to an Azure single database?

Hello folks, first post on Stack; by the way, it's a wonderful community and it helps out a lot.
As mentioned in the title, what is the best way to copy such a large database? We have a database of roughly 500 GB, and I am currently moving it from a managed instance to an Azure SQL single database using SSMS ("Deploy Database to Microsoft Azure SQL Database"), which is taking me 22 hours right now. I feel like I'm back in the early 2000s.
It's all in the same subscription and also in the same network configuration. AFAIK the process is that SSMS creates a bacpac file and then imports it into the single database, but 16 hours is just too long. So do you know a better option to do this quicker? I have a lot more databases to copy, some of them even larger.
Did you think about using ETL tools, such as Azure Data Factory? It has good performance for migrating big data; refer to its copy-performance table.
It supports Azure SQL Database and Azure SQL Managed Instance. See these tutorials:
Copy and transform data in Azure SQL Database by using Azure Data Factory
Copy and transform data in Azure SQL Managed Instance by using Azure Data Factory
It may cost some money, but it saves a lot of time. As we all know, time is money.
HTH.

Azure Growing SQL Database

1. I am new to Azure. I want to know whether we can have the same replication mechanism on Azure SQL DB that on-premises SQL Server provides.
2. The issue we are facing is that a few of the tables are growing fast, with around 10k records inserted daily, so we are planning to keep only a few months (say 6) of data in the main DB and copy all the data to another DB using replication (not sure if that is feasible).
We also need to read data from the backup DB in the application for some reports.
Please suggest whether replication will work for this, or whether there is another solution.
Geo-replication uses a version of AlwaysOn with async replicas under the hood. It is very similar to a distributed Availability Group in SQL Server 2016, but you cannot control it; you can only turn it on or off.
Replication will work for that, but it would replicate all the data in the DB, not just the tables you want.
Link to Azure Documentation: https://azure.microsoft.com/en-us/documentation/articles/sql-database-geo-replication-overview/
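If turning geo-replication on via T-SQL is of interest, a minimal hedged sketch for Azure SQL Database looks like the following; the database and partner server names are hypothetical, and the statement runs in the master database of the primary logical server:

    -- Hypothetical names: MainDb is the primary database, myserver-secondary the partner logical server.
    ALTER DATABASE MainDb
        ADD SECONDARY ON SERVER [myserver-secondary]
        WITH (ALLOW_CONNECTIONS = ALL);   -- readable secondary, usable for the reporting reads mentioned above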

Importing data from one database to another when the schemas are different

I have a SQL Server 2005 DB and we are upgrading to a new DB in SQL Server 2008. The schemas will be slightly different between the databases. What would be the best option to copy data from the old DB to the new DB?
Define "slightly different". Integration Service is probably the way to go.
Since the schemas are only slightly different, I would suggest making a full backup of the SQL Server 2005 DB and restoring it on the SQL Server 2008 server. Once you have the direct copy, use a script to migrate the data for the few tables where the schema is different.
It really depends on the definition of slightly.
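As an illustration of the "script to migrate the data" step, here is a hedged T-SQL sketch; the database, table, and column names are invented for the example, and it assumes the old copy was restored alongside the new database on the 2008 server:

    -- Hypothetical schemas: the new table merges FirstName/LastName into a single FullName column.
    INSERT INTO NewDb.dbo.Customer (CustomerId, FullName, CreatedOn)
    SELECT c.CustomerId,
           c.FirstName + ' ' + c.LastName,   -- map the old columns onto the new one
           c.CreateDate
    FROM   OldDbCopy.dbo.Customer AS c;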
Use Integration Services to design a whole package, or connect to the 2005 instance from within 2008 SSMS, right-click the database, and select Tasks > "Import Data". This brings up a wizard that does the work for you (through SSIS) and lets you copy the data right over.
Another option is to use SQL Server Data Tools, a newer tool from MS that has a schema compare feature with which you can actually generate the scripts to create the tables and related objects.
http://msdn.microsoft.com/en-us/data/gg427686

SQL Script for SQL Azure

I'm developing an app that uses SQL Azure. I don't have an account to access Windows Azure, and I want to start writing SQL scripts.
Can I use my SQL Server 2008 instance to test my SQL Azure scripts?
I would also recommend a SQL Azure account. However, if you don't have access to one, you can create the DB in SQL Server 2008 and then export a SQL Azure-compatible script. Then use that script for testing purposes.
Here is a link: http://blogs.msdn.com/b/cesardelatorre/archive/2010/06/04/importing-exporting-data-to-sql-azure-databases-using-bcp-and-sql-scripts.aspx
One thing you should account for when coding for SQL Azure is a failover or retry policy, per this article: http://blogs.msdn.com/b/appfabriccat/archive/2010/12/11/sql-azure-and-entity-framework-connection-fault-handling.aspx#comments
Not really, because certain SQL statements are not supported, or only partially supported. So unless you are already very familiar with the differences between SQL Azure and SQL Server, the general recommendation is to create your scripts against a SQL Azure database.
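One concrete example of such a difference, added here as an illustration for the SQL Azure of that era rather than taken from the original answer: every table needed a clustered index before you could load data into it, so a heap that works on SQL Server 2008 would not behave the same way on SQL Azure.

    -- Works on SQL Server 2008, but early SQL Azure would not let you insert rows into a heap
    -- (a table without a clustered index).
    CREATE TABLE dbo.Demo  (Id int NOT NULL, Payload nvarchar(100));
    -- SQL Azure-friendly version of the same table.
    CREATE TABLE dbo.Demo2 (Id int NOT NULL PRIMARY KEY CLUSTERED, Payload nvarchar(100));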
Opening an account is really simple. Remember that when you create a SQL Azure database, your charges are prorated daily. So if you create a development database of 1 GB in size (the minimum), you will pay $9.99 per month (plus a really low transfer cost), or roughly 33 cents per day. If you create a 1 GB database on a Monday and drop it the following Wednesday, you will pay roughly a buck. There are no charges for the master database.
I would agree with Herve. It would likely be best if you actually use SQL Azure, as there are some differences between SQL Server and SQL Azure. You can get a free 30-day account (with no credit card) using the following:
Use this link: http://www.windowsazurepass.com/?campid=9FE3DB53-E4F0-DF11-B2EA-001F29C6FB82
Use this passcode: promo code = DPEWE01
What you want to do is actually a reasonable way to go: you can create a database in your local SQL Server 2008 instance; I would recommend using the latest community edition for this purpose.
Create the database and tables, work on it, then generate a script to export to SQL Azure later.
Make sure to check the documentation for recent changes and for features not to use, as they might not work on SQL Azure.

Sync between SQL Server and MySQL Server

I have 2 big tables in SQL Server that I need to sync to MySQL.
I need this to be an ongoing process.
The tables are 1 GB each and get new/updated/deleted rows every 0.1 seconds.
Can you recommend a tool that can do this and is not resource-expensive?
You can suggest OPEN SOURCE as well as commercial tools.
Thanks
You could create a linked server in SQL Server pointing to the MySQL instance. This article gives the step-by-step process. Once that is in place, provided you grant the MySQL user you connect on behalf of the proper permissions, you can write to the MySQL instance as you like. So you could easily update stored procedures to perform an additional step that inserts records into MySQL.
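A hedged sketch of what that linked-server setup and write can look like; it assumes a system ODBC DSN named MySQLDsn pointing at the MySQL server, and the linked server, login, and table names are invented for the example:

    -- Create the linked server over the OLE DB provider for ODBC (all names are placeholders).
    EXEC master.dbo.sp_addlinkedserver
         @server     = N'MYSQL_LINK',
         @srvproduct = N'MySQL',
         @provider   = N'MSDASQL',
         @datasrc    = N'MySQLDsn';

    -- Map SQL Server logins to a MySQL account that has INSERT permission on the target table.
    EXEC master.dbo.sp_addlinkedsrvlogin
         @rmtsrvname  = N'MYSQL_LINK',
         @useself     = 'FALSE',
         @rmtuser     = N'mysql_user',
         @rmtpassword = N'mysql_password';

    -- Example of the extra step a stored procedure could take: push a row into the MySQL table.
    INSERT INTO OPENQUERY(MYSQL_LINK, 'SELECT id, name FROM mydb.customer')
    VALUES (42, N'New customer');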