Import an SQL file to Azure

I am new to the field and need some guidance. I want to import an SQL file containing data (~8 GB, if that matters) into Azure. I have created an Azure SQL database and want to import the file there. Can someone point me in the right direction?

I'd suggest BCP to connect to your Azure SQL DB and upload the data directly.
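For example, a minimal sketch of the BCP route could look like the following, run from PowerShell on a machine that has the bcp utility installed; the server, database, table, credentials and file path are placeholders, and the flags assume a comma-separated file with a header row:

    # Bulk-load a local data file straight into the Azure SQL table with bcp.
    # -c = character mode, -t = field terminator, -F 2 = skip the header row
    bcp YourDb.dbo.YourTable in "C:\data\yourtable.csv" `
        -S "yourserver.database.windows.net" `
        -U youruser -P yourpassword `
        -c -t "," -F 2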
The other way of doing it would be to upload your CSV data to a storage account and create an external table on top of it. See here.
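Note that external tables over CSV files are generally a PolyBase/Synapse feature rather than an Azure SQL Database one; on Azure SQL Database a similar effect is usually achieved with an external data source of TYPE = BLOB_STORAGE plus a bulk load. A rough sketch, run here through Invoke-Sqlcmd (SqlServer PowerShell module), with every name, URL and SAS secret being a placeholder:

    # T-SQL that registers the storage container and bulk-loads the CSV from it.
    # Run the CREATE statements once; they are one-time setup.
    $tsql = "
    -- A database master key must exist before creating the credential:
    -- CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
    CREATE DATABASE SCOPED CREDENTIAL BlobCred
        WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
             SECRET = '<sas-token-without-the-leading-?>';

    CREATE EXTERNAL DATA SOURCE MyBlobStorage
        WITH (TYPE = BLOB_STORAGE,
              LOCATION = 'https://yourstorage.blob.core.windows.net/yourcontainer',
              CREDENTIAL = BlobCred);

    -- The target table must already exist with matching columns.
    BULK INSERT dbo.YourTable
        FROM 'data/yourfile.csv'
        WITH (DATA_SOURCE = 'MyBlobStorage', FORMAT = 'CSV', FIRSTROW = 2);
    "

    Invoke-Sqlcmd -ServerInstance "yourserver.database.windows.net" -Database "YourDb" `
        -Username youruser -Password yourpassword -Query $tsql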
This link lists some additional ways to do it.

Related

I'm trying to migrate the data from a table in an Azure DB to another table in a different AWS DB. What is the best way to achieve this?

I need to migrate the data from one table in an Azure DB to another table in a different AWS DB. What is the best way to copy all the information from a table in one database to a table that resides in a different database?
I am using SQL Server Management Studio, and the option to script the table fails with the error "invalid version 16" (Microsoft.SqlServer.Smo).
I could copy all the data in the table and paste it into an INSERT statement, but then I would have to format the data manually, which is error-prone. I do not have any formal training in SQL. What is the best way to migrate the data? If anyone can assist, it would be greatly appreciated.
AFAIK, your SSMS version is not compatible with this operation. When I tried with version 16, I also got that error. After upgrading and trying again with SSMS 18.12.1, it worked fine for me and the script file was created successfully.
To migrate an Azure SQL database table to an AWS database, first create a .bacpac file using the export option and store it in Azure Blob Storage.
Then copy the .bacpac file from Azure Storage to EBS storage on an Amazon EC2 instance and import it into Amazon RDS for SQL Server. In this way you can migrate the database from Azure SQL to AWS.
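A rough sketch of that flow using the SqlPackage utility, assuming it is installed on a machine that can reach both servers; every server, database, path and credential below is a placeholder (the portal's export-to-Blob-Storage option works equally well for the first step):

    # Export the Azure SQL database to a local .bacpac file.
    SqlPackage /Action:Export `
        /SourceServerName:yourserver.database.windows.net `
        /SourceDatabaseName:YourAzureDb `
        /SourceUser:youruser /SourcePassword:yourpassword `
        /TargetFile:C:\backups\YourAzureDb.bacpac

    # Copy the file to the EC2 instance (e.g. via Blob Storage),
    # then import it into Amazon RDS for SQL Server.
    SqlPackage /Action:Import `
        /SourceFile:C:\backups\YourAzureDb.bacpac `
        /TargetServerName:your-rds-endpoint.rds.amazonaws.com `
        /TargetDatabaseName:YourRdsDb `
        /TargetUser:admin /TargetPassword:yourpassword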

Excel into Azure Data Factory into SQL

I read a few threads on this but noticed most are outdated, since Excel only became a supported integration in 2020.
I have a few Excel files stored in Dropbox. I would like to automate the extraction of that data into Azure Data Factory, perform some ETL functions with data coming from other sources, and finally push the final, complete table to Azure SQL.
What is the most efficient way of doing this?
Would it be to automate a Logic App to extract the .xlsx files into Azure Blob Storage, use Data Factory for the ETL and joins with the other SQL tables, and finally push the final table to Azure SQL?
Appreciate it!
Before using a Logic App to extract the Excel files, review the known issues and limitations of the Excel connectors.
If you are importing large files with a Logic App, then depending on their size also consider this thread: logic apps vs azure functions for large files.
Just to summarize the approach, here are the steps:
Step 1: Use an Azure Logic App to upload the Excel files from Dropbox to Blob Storage.
Step 2: Create a Data Factory pipeline with a Copy Data activity.
Step 3: Use the Blob Storage service as the source dataset.
Step 4: Create the SQL database with the required schema.
Step 5: Do the schema mapping.
Step 6: Finally, use the SQL database table as the sink.

Importing a CSV file into SQL using Logic Apps (2020)

I have seen several older posts talking about how to import a CSV file into SQL using Logic Apps; however, I have not seen anything that's easy to implement and doesn't require 85+ steps.
Has anyone come up with an easy way? I've done this a million times using SQL Server Integration Services (SSIS) or other tools to automate it, but is there really nothing comparable in Logic Apps?
Please let me know if you have a simple solution.
As of now there is no out-of-the-box connector or function in Logic Apps that parses a CSV file.
You can always vote for the feature request: Read a csv file and bulk load into Azure SQL table.
The Logic Apps team suggests looking at Azure Data Factory to assist with this.
For example: Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool, and the blog https://marczak.io/posts/azure-loading-csv-to-sql/ that Hury Shen provided in a comment.
We can't import the CSV file into SQL directly; that's why these posts look complex and involve many steps. For Logic Apps alone, there isn't such an easy way for now.

Azure SQL: Archive a SQL database before deleting

I have a number of SQL databases in Azure SQL which I believe are no longer in use.
I'm planning on deleting them; however, as a precaution, I would like to take some kind of backup or archive copy that I can quickly use to restore each database if necessary.
I've googled around but haven't found anything concrete. I found one mention of making a copy in a storage account so that it can be recovered, but haven't been able to find out how to do it - the copy command makes a copy of the database to another database, and the "restore" option disappears after you remove a database.
The databases in question are all less than 10 MB in size.
Please consider exporting each database as a .bacpac file to a cheap Blob Storage account.
In the Storage field of the export blade you can provide an existing storage account or create a new one.
If you need to recover one of those databases, you just import it and specify the location of the .bacpac file you want to import.
You can export your Azure SQL database to a BACPAC backup file and store that file in Azure Blob Storage or on your on-premises computer.
You can then restore your Azure SQL database from the BACPAC file whenever you need to.
Alberto Morllo shows the way to export the database to Blob Storage in the portal.
Besides this, there are several other ways to do it; for reference:
Export to a BACPAC file using the SqlPackage utility
Export to a BACPAC file using SQL Server Management Studio (SSMS)
Export to a BACPAC file using PowerShell (sketched below)
For more details, please see:
Quickstart: Import a BACPAC file to a database in Azure SQL Database
Export an Azure SQL database to a BACPAC file:
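As a concrete example of the PowerShell route, something along these lines should work with the Az module and an authenticated session (Connect-AzAccount); the resource group, server, database, storage account, key and credentials below are all placeholders:

    # Export the database to a .bacpac in Blob Storage before deleting it.
    $adminPwd = ConvertTo-SecureString "yourpassword" -AsPlainText -Force

    New-AzSqlDatabaseExport -ResourceGroupName "my-rg" -ServerName "myserver" `
        -DatabaseName "OldDatabase" `
        -AdministratorLogin "sqladmin" -AdministratorLoginPassword $adminPwd `
        -StorageKeyType "StorageAccessKey" -StorageKey "<storage-account-key>" `
        -StorageUri "https://mystorage.blob.core.windows.net/backups/OldDatabase.bacpac"

    # If the database is ever needed again, import the .bacpac into a new database.
    New-AzSqlDatabaseImport -ResourceGroupName "my-rg" -ServerName "myserver" `
        -DatabaseName "OldDatabaseRestored" `
        -Edition "Standard" -ServiceObjectiveName "S0" -DatabaseMaxSizeBytes 5GB `
        -AdministratorLogin "sqladmin" -AdministratorLoginPassword $adminPwd `
        -StorageKeyType "StorageAccessKey" -StorageKey "<storage-account-key>" `
        -StorageUri "https://mystorage.blob.core.windows.net/backups/OldDatabase.bacpac"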
You can choose the way you like best.
Hope this helps.

Import a large table to an Azure SQL database

I want to transfer one table from my SQL Server instance database to a newly created database on Azure. The problem is that the insert script is 60 GB.
I know that one approach is to create a backup file, load it into storage, and then run an import on Azure. The problem is that when I try to do so, I get an error while importing on Azure:
Could not load package.
File contains corrupted data.
File contains corrupted data.
The second problem is that with this approach I can't copy only one table; the whole database has to be in the backup file.
So is there any other way to perform such an operation? What is the best solution? And if the backup is the best approach, why do I get this error?
You can use tools out there that make this very easy (point and click). If it's a one-time thing, you can use virtually any tool (Red Gate, BlueSyntax...). You always have BCP as well. Most of these approaches will allow you to back up or restore a single table.
If you need something more repeatable, you should consider using a backup API or coding it yourself using the SqlBulkCopy class.
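If you do go the code route, a minimal sketch with the SqlBulkCopy class, shown here from Windows PowerShell, looks roughly like this; the connection strings and table names are placeholders and the target table must already exist:

    # Stream rows from the local table into the Azure SQL table without a giant script.
    $sourceConnStr = "Server=localhost;Database=SourceDb;Integrated Security=True"
    $targetConnStr = "Server=yourserver.database.windows.net;Database=TargetDb;" +
                     "User ID=youruser;Password=yourpassword;Encrypt=True"

    $sourceConn = New-Object System.Data.SqlClient.SqlConnection $sourceConnStr
    $sourceConn.Open()
    $cmd = $sourceConn.CreateCommand()
    $cmd.CommandText = "SELECT * FROM dbo.BigTable"
    $reader = $cmd.ExecuteReader()

    $bulk = New-Object System.Data.SqlClient.SqlBulkCopy $targetConnStr
    $bulk.DestinationTableName = "dbo.BigTable"   # table must already exist in Azure SQL
    $bulk.BatchSize = 10000                       # commit in chunks, not one huge transaction
    $bulk.BulkCopyTimeout = 0                     # disable the timeout for a long load
    $bulk.WriteToServer($reader)

    $reader.Close()
    $sourceConn.Close()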
I don't know that I'd ever try to execute a 60 GB script. Scripts generally do single-row inserts, which aren't very optimized. Have you explored the various bulk import/export options?
http://msdn.microsoft.com/en-us/library/ms175937.aspx/css
http://msdn.microsoft.com/en-us/library/ms188609.aspx/css
If this is a one-time load, using an IaaS VM to do the import into the SQL Azure database might be a good alternative. The data file, once exported, could be compressed/zipped and uploaded to Blob Storage. Then pull that file back out of storage onto your VM so you can operate on it.
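A rough sketch of that shuttle step with the Az.Storage cmdlets; the storage account name, key, container and paths are placeholders:

    # Zip the exported data file and push it to Blob Storage...
    Compress-Archive -Path "C:\export\BigTable.dat" -DestinationPath "C:\export\BigTable.zip"
    $ctx = New-AzStorageContext -StorageAccountName "mystorage" -StorageAccountKey "<key>"
    Set-AzStorageBlobContent -File "C:\export\BigTable.zip" -Container "transfer" `
        -Blob "BigTable.zip" -Context $ctx

    # ...then, on the Azure VM, pull it back down and unpack it before loading.
    Get-AzStorageBlobContent -Container "transfer" -Blob "BigTable.zip" `
        -Destination "D:\import\" -Context $ctx
    Expand-Archive -Path "D:\import\BigTable.zip" -DestinationPath "D:\import"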
Have you tried using BCP in the command prompt?
As explained here: Bulk Insert Azure SQL.
You basically create a text file with all your table data in it and bulk copy it to your Azure SQL database using the BCP command in the command prompt.
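A rough sketch of that round trip, with placeholder names throughout, run from PowerShell (or adapted for a plain command prompt) with the bcp utility installed:

    # Export the table from the local SQL Server instance to a character-format file.
    bcp SourceDb.dbo.BigTable out "C:\data\BigTable.dat" -S localhost -T -c -t ","

    # Load that file into the matching table in the Azure SQL database.
    # -b commits in batches; -n (native format) is an alternative that avoids delimiter issues.
    bcp TargetDb.dbo.BigTable in "C:\data\BigTable.dat" `
        -S yourserver.database.windows.net -U youruser -P yourpassword -c -t "," -b 10000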