How to import a transactionally-inconsistent bacpac - azure-sql-database

It is well known that creating a bacpac on SQL Azure does not guarantee transactional consistency when exporting a live, changing database.
The accepted workaround is to create a snapshot of the database first, by copying it, and then doing an export.
This approach is pretty ridiculous, because it forces users to spend extra money on relational DB storage. In fact, in the early days of SQL Azure, databases were billed by the day, so creating daily bacpacs from production databases essentially used to double the cost (it's now billed by the hour, if I'm not mistaken).
However, my question is not about this. My question is as follows: if it is acceptable for me to have a transactionally inconsistent bacpac, is there any way of actually restoring it (i.e. importing it)? The problem is simple: because some constraints are no longer satisfied, the import fails (say, with an FK violation). While a bacpac restore is nothing more than re-creating the DB from the schema, followed by bulk imports, the entire process is completely opaque and not much control is given to the user. However, since the Azure SQL tools are always in flux, I would not be surprised if this has become possible.
So, to recap, the question: given a potentially inconsistent bacpac (i.e. some constraints won't hold), is there a way (without writing tons of code) to import it into an on-premises database?

Try using BCP.exe to import the data.
1) A bacpac is a zip file. You can open it by changing its file extension to .zip. All data is captured in .bcp file format in the 'Data' folder.
2) Move the Data folder out of the zip file and save it for step 4 below.
3) Change the .zip extension back to .bacpac and import it. This creates a database with schema only.
4) Using bcp.exe, import the .bcp files into the tables in the database (see the sketch after this answer): https://msdn.microsoft.com/en-us/library/ms162802.aspx
5) Troubleshoot and fix the data inconsistency.
If you already know which tables contain inconsistent data, you can move out the .bcp files for those tables only and import just them with bcp.
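For step 4, the bcp invocation looks roughly like this (a minimal sketch; the database, table, and file names are hypothetical, and the exact layout of the bacpac's Data folder can vary with the DacFx version that produced it):

    REM Hypothetical names; run once per .bcp file extracted from the bacpac.
    REM -n = native format (what a bacpac contains), -E = keep identity values,
    REM -T = Windows authentication against the local on-premises instance.
    bcp MyDb.dbo.Orders in "Data\dbo.Orders\TableData-000.bcp" -n -E -T -S localhost

Conveniently for this question, bcp does not validate CHECK or FOREIGN KEY constraints by default, so inconsistent rows will load; SQL Server just marks those constraints as not trusted afterwards.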

Related

Quickest way to import a large (50gb) csv file into azure database

I've just consolidated 100 csv files into a single monster file with a total size of about 50 GB.
I now need to load this into my Azure database. Given that I have already created my table in the database, what would be the quickest method to get this single file into the table?
The methods I've read about include: Import Flat File, blob storage/Data Factory, BCP.
I'm looking for the quickest method someone can recommend, please.
Azure Data Factory should be a good fit for this scenario, as it is built to process and transform data without worrying about scale.
Assuming the large csv file is stored somewhere on disk and you do not want to move it to any external storage (to save time and cost), it would be better to simply create a self-hosted integration runtime pointing to the machine hosting your csv file, and a linked service in ADF to read the file. Once that is done, simply ingest the file and point it at the sink, which is your Azure SQL database.
https://learn.microsoft.com/en-us/azure/data-factory/connector-file-system
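For reference, a file-system linked service bound to a self-hosted integration runtime looks roughly like this (a sketch based on the connector doc above; the IR name, host path, and credentials are placeholders):

    {
        "name": "OnPremFileSystem",
        "properties": {
            "type": "FileServer",
            "typeProperties": {
                "host": "D:\\data\\csv",
                "userId": "DOMAIN\\user",
                "password": { "type": "SecureString", "value": "<password>" }
            },
            "connectVia": {
                "referenceName": "MySelfHostedIR",
                "type": "IntegrationRuntimeReference"
            }
        }
    }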

How to mend specific records in a live SQL Azure instance? Best Practice?

I am trying to work out the best practice for recovering specific records that get corrupted in a live SQL Azure DB.
I have got the automatic BacPac exports working.
Currently I would import the relevant BacPac into a recovery DB instance, compare it with the live instance using MS SQL Data Tools or Red Gate Data Compare, and then create a sync script for the required records. Not sure if this is the best-practice approach though.
This is also part of a slightly bigger question, which is how to deal with certain backup situations, i.e.:
1) Complete DB corruption
Recover using complete BacPac
2) Significant number of records damaged
difficult to answer!!
3) Small number of corrupt records to mend, using user-specific records.
Use Data compare tools such as MS SQL Data Tools or RedGate Data Compare.
A colleague also mentioned "periodic transaction log backups" to me. Not sure if SQL Azure does these?
Thoughts?
Thanks in advance.

Import large table to azure sql database

I want to transfer one table from my SQL Server instance to a newly created database on Azure. The problem is that the insert script is 60 GB.
I know that one approach is to create a backup file, load it into storage, and then run an import on Azure. But the problem is that when I try to do so, I get an error while importing on Azure:
Could not load package.
File contains corrupted data.
File contains corrupted data.
The second problem is that with this approach I can't copy only one table; the whole database has to be in the backup file.
So is there any other way to perform such an operation? What is the best solution? And if a backup is the best, why do I get this error?
You can use tools out there that make this very easy (point and click). If it's a one-time thing, you can use virtually any tool (Red Gate, BlueSyntax...). You always have BCP as well. Most of these approaches will allow you to back up or restore a single table.
If you need something more repeatable, you should consider using a backup API or code this yourself using the SQLBulkCopy class.
I don't know that I'd ever try to execute a 60 GB script. Scripts generally do single inserts, which aren't very optimized. Have you explored the various bulk import/export options?
http://msdn.microsoft.com/en-us/library/ms175937.aspx
http://msdn.microsoft.com/en-us/library/ms188609.aspx
If this is a one-time load, using an IaaS VM to do the import into the SQL Azure database might be a good alternative. The data file, once exported, could be compressed/zipped and uploaded to blob storage. Then pull that file back out of storage into your VM so you can operate on it.
Have you tried using BCP in the command prompt?
As explained here: Bulk Insert Azure SQL.
You basically create a text file with all your table data in it and bulk copy it into your Azure SQL database using the BCP command in the command prompt, roughly as sketched below.
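The round trip looks roughly like this (a sketch with hypothetical server, database, and table names; -c produces a plain-text file as described above, and -b commits in batches so one failure doesn't roll back the entire load):

    REM Export the table from the local SQL Server instance as a text file.
    bcp MyDb.dbo.BigTable out bigtable.txt -c -T -S localhost
    REM Load it into the Azure SQL database, connecting with SQL authentication.
    bcp AzureDb.dbo.BigTable in bigtable.txt -c -b 10000 -S myserver.database.windows.net -U myadmin -P <password>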

Scheduled export of database structure (table, view, sp) to file

I'm using SQL 2005. I can right click on a database and create scripts for the database that will recreate the structure (tables, views, stored procedures) elsewhere. Or just as a backup, version, etc.
But, is there a way I can schedule it to do this? And output to a folder I choose?
I really appreciate the help.
Don
You could probably schedule this using SMO, though it may take some work to get up and running; see the rough sketch below.
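A minimal PowerShell sketch of the SMO route (assuming the SMO assemblies installed with the SQL Server client tools; the server, database, and output path are placeholders, and you would schedule the .ps1 via SQL Agent or Windows Task Scheduler):

    [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server "localhost"
    $db  = $srv.Databases["MyDatabase"]
    $out = "C:\DbScripts\MyDatabase_$(Get-Date -Format yyyyMMdd).sql"
    # Script out tables, views, and stored procedures, skipping system objects.
    @($db.Tables) + @($db.Views) + @($db.StoredProcedures) |
        Where-Object { -not $_.IsSystemObject } |
        ForEach-Object { $_.Script() | Out-File -FilePath $out -Append }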
However, a more elegant approach might be to schedule a full backup to a new file (with today's timestamp), and archive it. This way retrieving the scripts is as simple as restoring that version of the database somewhere, and extracting manually.
An even better approach: if you store your change scripts in source control, you should always be able to pull any version of the database.
I've used both SMO's predecessor (SQL-DMO) from VB as well as ApexSQLScript from the command line to do scheduled scripting of objects.
This is fine for very large databases where you do not have the ability to quickly restore a database just to look at schema versioning information for small tables/views/procs which happen to live in the same database.
In fact, this is a good argument for separating small, fast-changing schemas into databases separate from large, slowly-changing schemas.

Easiest way to copy a MySQL database?

Does anyone know of an easy way to copy a database from one computer to a file, and then import it on another computer?
Here are a few options:
mysqldump
The easiest, guaranteed-to-work way to do it is to use mysqldump. See the manual pages for the utility here:
http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html
Basically, it dumps the SQL scripts required to rebuild the contents of the database, including creation of tables, triggers, and other objects and insertion of the data (it's all configurable, so if you already have the schema set up somewhere else, you can just dump the data, for example).
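For example (a sketch; "mydb" and the credentials are placeholders, and --databases makes the dump include the CREATE DATABASE statement):

    # On the source machine, dump schema and data to a single SQL script:
    mysqldump -u root -p --databases mydb > mydb.sql
    # Copy mydb.sql to the other computer, then replay it there:
    mysql -u root -p < mydb.sql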
Copying individual MyISAM table files
If you have a large amount of data and you are using the MyISAM storage engine for the tables that you want to copy, you can just shut down mysqld and copy the .frm, .myd, and .myi files from one database folder to another (even on another system). This will not work for InnoDB tables, and may or may not work for other storage engines (with which I am less familiar).
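A sketch, assuming a default Linux data directory layout (paths and the service command vary by platform and configuration):

    # Stop the server so the table files are closed and consistent:
    /etc/init.d/mysql stop
    # Each MyISAM table is a .frm/.myd/.myi triplet inside the database's folder:
    cp -r /var/lib/mysql/mydb /destination/mysql-datadir/
    /etc/init.d/mysql start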
mysqlhotcopy
If you need to dump the contents of a database while the database server is running, you can use mysqlhotcopy (note that this only works for MyISAM and Archive tables):
http://dev.mysql.com/doc/refman/5.0/en/mysqlhotcopy.html
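Usage is roughly as follows (the database name and target directory are placeholders; it has to run on the database server itself, with read access to the data directory):

    mysqlhotcopy mydb /backups/mysql/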
Copying the entire data folder
If you are copying the entire database installation (that is, all of the databases and the contents of every database), you can just shut down mysqld, zip up your entire MySQL data directory, and copy it to the new server's data directory.
This is the only way (that I know of) to copy InnoDB files from one instance to another. This will work fine if you're moving between servers running the same OS family and the same version of MySQL; it may work for moving between operating systems and/or versions of MySQL; off the top of my head, I don't know.
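Roughly (paths are placeholders; stop the server first so the InnoDB files are consistent):

    /etc/init.d/mysql stop
    # Archive the whole data directory, including InnoDB log and tablespace files:
    tar czf mysql-data.tar.gz -C /var/lib/mysql .
    # Copy the archive to the new server and unpack it into that server's data directory.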
You may also want to try SQLyog, a product of Webyog. It uses techniques similar to those mentioned above, but gives you a good GUI so you can see what you are doing. You can get the community edition or a trial version from their site:
http://www.webyog.com/en/downloads.php#sqlyog
It has options for creating backups to a file and restoring the file onto a new server. There is also an even better option: exporting a database directly from one server to another.
Cheers,
RDJ