I have a simple Azure website and Azure SQL database. I have now created a new (empty) Azure SQL database and I want to copy the contents of the old database into the new one. The data is only a few kB in size but it would be a pain to do it manually. What is a quick and easy way to do this, using simple tools like Visual Studio and Azure portal?
So just to be clear I want to copy all tables and rows to the new DB.
It turns out you can do this using the Azure portal.
On the original database, choose Export. You need a storage account in the same region as the database for this.
After exporting the database to a .bacpac file, choose New / SQL Database / Import and point it to the .bacpac file. If this is in a different data centre, it will incur data bandwidth charges.
Very simple and this worked great; just a little difficult to find at first.
When I need to do such an operation, I use SQL Server Data Tools. It's a plugin for Visual Studio that allows you to copy data and schema, and to migrate from one version to another.
http://blogs.msdn.com/b/ssdt/
What is the best way to transfer a database copy, as a backed-up file, for outside maintenance on the application? That copy should not contain any sensitive data; it can only have dummy data. What is the efficient, best-practice way to erase all data in the tables and populate them with dummy data? (SQL Server 2019)
This is not a trivial task. A 3rd party solution would probably be easiest.
There are several answers available here that discuss copying objects in SQL Server Management Studio. Example: Backup SQL Schema Only?.
If you have access to SQL Server Integration Services, you can copy selected objects using the Transfer SQL Server Objects Task. I have tried this once a long time ago, so I have very little experience to describe how it works.
Another option is to create a job that runs a copy-only backup, restores the database, and then runs a manual series of SQL queries to clear or mask sensitive data.
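If you go that scripted route, the scrubbing step can be a small script that runs TRUNCATE/UPDATE statements against the restored copy. Here is a minimal sketch using pyodbc; the database, table, and column names are purely hypothetical placeholders and would need to match your own schema:

```python
# Rough sketch of the "restore, then scrub" step, run against a restored copy.
# Database, table, and column names below are hypothetical placeholders.
# Requires: pip install pyodbc
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=CustomerDb_Masked;Trusted_Connection=yes;"
)
cur = conn.cursor()

# Wipe tables that only hold sensitive transactional data.
cur.execute("TRUNCATE TABLE dbo.PaymentDetails;")

# Replace personally identifiable columns with deterministic dummy values.
cur.execute("""
    UPDATE dbo.Customers
    SET Email     = CONCAT('user', CustomerId, '@example.com'),
        FirstName = 'Test',
        LastName  = CONCAT('User', CustomerId),
        Phone     = '000-000-0000';
""")

conn.commit()
conn.close()
```

The masked copy can then be backed up and handed over for outside maintenance.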
Good morning! I've found TONS of articles, questions, and guides on how to import data from local excel documents to Azure SQL databases, or how to pull from an Azure database to Excel, but nothing about how I could use SQL to query an excel online document (which would be hosted on SharePoint).
I'm fairly new in my learning - I'd be setting this up via a query in SQL written/executed via Azure Data Studio. The excel file is one that I'd be creating, and hosting via our company's SharePoint system. The Azure SQL database will also be one that I'm constructing myself, which is in progress. I've tried to find walkthroughs, scripts, explanations, something. But it's totally silent. Granted, that could be an indicator that it can't be done, but I figured I'd ask here. Overall, I'm just trying to figure out what is possible, so I can come up with a decent range of simple, easy-to-use means of data input for my team, or, in this case, to capture some of the ways they're tracking their work.
Not sure if this is sufficient detail, please feel free to ask any follow-up questions.
Azure Data Studio is a tool for working with SQL databases, most notably MS SQL Server, though you can connect to some other types as well.
Therefore, in order to use Data Studio to query your data, it needs to be in a SQL database. To accomplish that, you need to set up a process that loads the data from your Excel document into a table in your database, and run that process on a regular basis to update the table. You could look into Azure Data Factory for that, though I don't see why you should bother just to use Azure Data Studio, when you can browse the data in Excel or use Power BI, Qlik, or any other tool that can connect directly to Excel.
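For illustration, the load step itself can be a few lines of Python run on a schedule. This is a minimal sketch that assumes the workbook has already been downloaded or synced locally (reading it straight from SharePoint would additionally require the Microsoft Graph API or a synced drive); the file, server, and table names are hypothetical:

```python
# Minimal sketch: load a locally available Excel workbook into an Azure SQL table.
# File name, connection details, and table name are hypothetical placeholders.
# Requires: pip install pandas sqlalchemy pyodbc openpyxl
import pandas as pd
import sqlalchemy as sa

# Read the worksheet into a DataFrame.
df = pd.read_excel("team_tracking.xlsx", sheet_name="Sheet1")

# Placeholder connection string for your Azure SQL database.
engine = sa.create_engine(
    "mssql+pyodbc://user:password@myserver.database.windows.net/mydb"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

# Replace the table contents on each scheduled run.
df.to_sql("TeamTracking", engine, schema="dbo", if_exists="replace", index=False)
```

Once the data lands in a table this way, you can query it from Azure Data Studio like any other table.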
Hello folks, first post on Stack; by the way, it's a wonderful community and it helps out a lot.
Like mentioned in the title, what is the best way to copy such a large database? We have a ~500 GB database and I'm currently moving it from a managed instance to an Azure single database using SSMS ("Deploy Database to Microsoft Azure SQL Database"), and right now it is taking me 22 hours. I feel like I'm back in the early 2000s.
It's all in the same subscription and also in the same network configuration. AFAIK the process is that SSMS creates a bacpac file and then imports it into the single database, but 16 hours is just too long. So do you know any better option to do this quicker? Because I have plenty more, and partly larger, databases to copy.
Did you think about using ETL tools such as Azure Data Factory? It has good performance for migrating big data; see this performance table:
It supports Azure SQL Database and Azure SQL Managed Instance. See these tutorials:
Copy and transform data in Azure SQL Database by using Azure Data Factory
Copy and transform data in Azure SQL Managed Instance by using Azure Data Factory
It may cost some money but it saves a lot of time. As we all know, time is money.
HTH.
I need to create a database solely for analytical purposes. The idea is for it to start off as a 1:1 replica of a current SQL Server database, to which we will then add additional tables. The goal is to have read-write access to a DB without inadvertently dropping anything in production.
We would ideally like to set a daily refresh schedule to update all tables in the new DB to match the tables in the live environment.
In terms of the DBMS for the new database, I am flexible: MySQL, SQL Server, or PostgreSQL would be great. I am not hugely familiar with the Google Cloud Storage/BigQuery stack, but if this is an easy option, I'm open to it.
You could use a standard HA/DR solution with a readable secondary (Availability Groups, mirroring, or log shipping), then have a second database on the new server for your additional tables.
Cloud Storage and BigQuery are not RDBMS services themselves, but could be used in this case to store the backups/exports/dumps from the replica, and then have the analytical work performed on those backups.
Here is an example workflow:
Perform a backup and restore in a different database
Add the new tables in the new database
Export the database as a CSV file on your local machine
Here you could either directly load the CSV file into BigQuery, or upload the file to a Cloud Storage bucket created beforehand
Query the data
I suggest taking a look at the multiple methods for loading data into BigQuery, as well as the methods for querying external data sources, which may help determine which database replication/export method is best for your use case.
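As a sketch of the "load the CSV into BigQuery" step from the workflow above, using the google-cloud-bigquery client library; the project, dataset, table, and file names are hypothetical placeholders:

```python
# Minimal sketch: load an exported CSV file into a BigQuery table.
# Project, dataset, table, and file names are hypothetical placeholders.
# Requires: pip install google-cloud-bigquery
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,   # skip the header row
    autodetect=True,       # infer the schema from the file
)

with open("sales_export.csv", "rb") as source_file:
    load_job = client.load_table_from_file(
        source_file,
        "my-analytics-project.analytics.sales",
        job_config=job_config,
    )

load_job.result()  # wait for the load to finish
print(f"Loaded {load_job.output_rows} rows.")
```

The same load could instead point at a Cloud Storage URI if you uploaded the CSV to a bucket first.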
I am new to Azure SQL.
We have a client DB which is in Azure SQL. We need to set up a process automation which extracts query results to .CSV files and loads them onto our server (on-premises SQL Server 2008 R2).
What is the best method to generate CSV files from Azure SQL and make them accessible to the on-premises server?
Honestly, the most professional approach is to use Azure Data Factory with a self-hosted Integration Runtime installed on-premises.
You can of course use BCP, but it will be cumbersome in the long run: a lot of scripts, tables, and maintenance, with no logging, no metrics, no alerts... Honestly, don't do it.
SSIS is another option, but in my opinion it takes more effort than the ADF solution.
Azure Data Factory will allow you to do this in a professional way through a user interface with no coding. It can also be parameterized, so you just change the table name parameter and suddenly you are exporting 20, 50, or 100 tables with ease.
Here is a video example and intro to Data Factory if you want a quick overview. The overview also includes a demo which imports CSV into Azure SQL; you can change it a little to do Azure SQL -> CSV and CSV -> SQL Server, or just directly Azure SQL -> SQL Server.
https://youtu.be/EpDkxTHAhOs
It really is straightforward.
Consider using simple bcp from the on-prem environment: save the results to CSV and then load the CSV into the on-prem server.
You can also use SSIS to implement an automated task.
Though I would like to know why you need the intermediate CSV file? You can simply copy data between databases (cloud -> on-prem) with a scheduled SSIS package.
If you have on-prem SQL access, then a simple SSIS package is probably the quickest and easiest way to go. If your source is Azure SQL and the ultimate destination is on-prem SQL, you could use SSIS and skip the CSV altogether.
If you want to stick to an Azure PaaS solution, you could consider using Azure Data Factory. You can set up a gateway to access the on-prem SQL Server directly, or if you really want to stick to a CSV, look into using a Logic App.
Azure Data Factory is surely an option.
A simple solution would be the pyodbc driver with a little bit of Python. https://learn.microsoft.com/en-us/sql/connect/python/python-driver-for-sql-server?view=sql-server-2017
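For example, a minimal sketch of the export side with pyodbc and the standard csv module; the server, credentials, query, and file name are placeholders you would swap for your own:

```python
# Minimal sketch: export an Azure SQL query result to a CSV file with pyodbc.
# Server, credentials, query, and file name are hypothetical placeholders.
# Requires: pip install pyodbc
import csv
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"
    "UID=myuser;PWD=mypassword;Encrypt=yes;"
)
cur = conn.cursor()
cur.execute("SELECT OrderId, CustomerId, OrderDate, Total FROM dbo.Orders")

with open("orders.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cur.description])  # header row
    for row in cur:
        writer.writerow(row)

conn.close()
```

The resulting file can then be dropped onto a network share or Blob Storage for the on-prem job to pick up, and the script itself scheduled with Task Scheduler, cron, or a runbook.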
You can also try sqlcmd with a bit of PowerShell or Bash on top.
https://learn.microsoft.com/en-us/sql/tools/sqlcmd-utility?view=sql-server-2017