I'd like to get a dump of a HANA DB using the browser-based "SAP HANA Web-based Development Workbench".
I'm especially interested in exporting:
- the structure of the tables, including primary and foreign key constraints
- the data inside the tables
Once I log into the "SAP HANA Web-based Development Workbench", I'm able to open the "Catalog" and execute SQL commands such as SELECT * FROM MY_TABLE;. This lets me download the data from one table as a CSV. But is there also something similar to pg_dump in Postgres, i.e. a command that exports both table structure and data as, for example, a tar-compressed .sql file?
You can right-click on the database you would like to back up and select Export.
Be sure to activate the checkbox "Including data". I am not sure whether it is also necessary to check the "Including dependencies" checkbox.
You get a zip file containing the SQL commands to create the tables plus separate data files with the contents of the tables. Each table is saved in its own directory.
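Based on that description, the unzipped layout looks roughly like this (schema and table names are illustrative, and the exact file names may differ by HANA version):

```
export.zip
└── MY_SCHEMA/
    ├── TABLE_A/
    │   ├── create.sql   -- CREATE TABLE statement for TABLE_A
    │   └── data.csv     -- contents of TABLE_A
    └── TABLE_B/
        ├── create.sql
        └── data.csv
```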
The EXPORT command seems relevant.
The server will generate .sql files for the structure and .csv files for the data.
If the database is a managed service such as HANA Cloud, you don't have access to the filesystem and should dump the files to an S3 bucket or an Azure blob store.
Otherwise, just grab the files directly from the server box.
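A minimal sketch of the statement, assuming a schema named MY_SCHEMA and a target path writable by the HANA server process (both are placeholders; check the EXPORT documentation for your HANA version for the full option list, e.g. CSV vs. BINARY format):

```sql
-- Export every object in MY_SCHEMA: create-statements plus CSV data files.
-- Schema name and target path are placeholders.
EXPORT "MY_SCHEMA"."*" AS CSV INTO '/tmp/my_dump' WITH REPLACE THREADS 4;
```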
Related
In my application I use a multi-tenant design in my SQL Server database.
My question is: what if one of my customers comes and says, "I want my own data as a backup file"? How can I create a backup file for just that tenant?
Is there any way to do that?
Do I need a third-party tool? Can I do this in a .NET console application?
How can I create a backup file for just that tenant?
There is no way to do this without running an ETL job either to extract the tenant's data, or to remove the other tenants' data from a restored copy of the database.
This is one of the (many) reasons why you should favor using a database-per-tenant architecture.
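As a hedged sketch of the extract route, assuming a shared-schema design with a TenantId column on every table (all object names below are hypothetical):

```sql
-- Hypothetical names throughout; repeat the SELECT ... INTO per table,
-- or generate the statements from sys.tables.
CREATE DATABASE TenantExport;
GO

-- Copy one tenant's rows into the scratch database.
SELECT *
INTO TenantExport.dbo.Orders
FROM AppDb.dbo.Orders
WHERE TenantId = 42;
GO

-- Back up only the scratch database and hand the .bak file to the customer.
BACKUP DATABASE TenantExport TO DISK = 'C:\exports\tenant42.bak';
```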
We are a government organisation receiving data from different stakeholders. The data usually doesn't arrive in a single format: some comes as CSV, Excel, or database views, and the schema of the source data isn't always the same. Some data arrives as a continuous stream of CSV files in FTP folders. Is there any software that will automate all of this work?
This is where ETL tools come into the picture. Microsoft has SSIS (SQL Server Integration Services), which provides connectors to sources like another SQL Server instance, CSV files, Excel, or a shared location where the data files are dumped. You would design an ETL pipeline in SSIS to get all the data into your SQL Server database.
SSIS
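Outside of a full SSIS package, even a plain T-SQL BULK INSERT can pick up the recurring CSV drops. A minimal sketch, assuming a staging table whose columns match the file layout (table name and share path are placeholders):

```sql
-- Placeholders throughout; assumes the CSV columns line up with the table.
BULK INSERT dbo.StagingCustomers
FROM '\\ftpserver\drop\customers.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2   -- skip the header row
);
```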
I am looking for a way to programmatically pass in a database name and get back the schema and data as a SQL file. We work with several clients who have complex database configurations, and we would like to be able to back up those configurations for our records. The people doing the backups would not know how to use a tool such as SQL Server Management Studio, and manual exports carry a higher chance of producing erroneous SQL files.
I want to transfer one table from my SQL Server instance database to a newly created database on Azure. The problem is that the insert script is 60 GB.
I know that one approach is to create a backup file, upload it to storage, and then run an import on Azure. But when I try this, the import on Azure fails with an error:
Could not load package.
File contains corrupted data.
File contains corrupted data.
The second problem is that with this approach I can't copy only one table; the whole database has to be in the backup file.
So is there any other way to perform such an operation? What is the best solution? And if the backup approach is best, why do I get this error?
You can use tools out there that make this very easy (point and click). If it's a one-time thing, you can use virtually any tool (Red Gate, BlueSyntax...). You always have BCP as well. Most of these approaches will allow you to back up or restore a single table.
If you need something more repeatable, you should consider using a backup API or coding this yourself using the SqlBulkCopy class.
I don't know that I'd ever try to execute a 60 GB script. Scripts generally do single-row inserts, which aren't very well optimized. Have you explored the various bulk import/export options?
http://msdn.microsoft.com/en-us/library/ms175937.aspx
http://msdn.microsoft.com/en-us/library/ms188609.aspx
If this is a one-time load, using an IaaS VM to do the import into the SQL Azure database might be a good alternative. The data file, once exported, could be compressed/zipped and uploaded to blob storage. Then pull that file back out of storage onto your VM so you can operate on it.
Have you tried using BCP in the command prompt?
As explained here: Bulk Insert Azure SQL.
You basically create a text file with all your table data in it and bulk copy it into your Azure SQL database by using the bcp command in the command prompt.
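For reference, the linked BULK INSERT route looks roughly like this on Azure SQL Database, which cannot read local file paths and instead loads from Blob Storage via an external data source (all names are placeholders; a database-scoped credential is additionally needed for private containers):

```sql
-- Placeholders throughout; requires a recent Azure SQL Database / SQL Server.
CREATE EXTERNAL DATA SOURCE MyBlobStore
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://myaccount.blob.core.windows.net/mycontainer');

-- Load the uploaded CSV into the target table.
BULK INSERT dbo.MyTable
FROM 'mytable.csv'
WITH (DATA_SOURCE = 'MyBlobStore',
      FIELDTERMINATOR = ',',
      ROWTERMINATOR   = '\n');
```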
I want to duplicate an existing schema with the table structure, but not any of the existing data. Essentially, we are separating two companies that currently share a single schema in the database; they have the exact same data structure, but we want them in different schemas (for access control purposes).
Is it possible to copy the entire table structure of one schema into a new schema without bringing over any of the data?
You can do that in SSMS (SQL Server Management Studio):
- Right-click on the database
- Script Database as
- Create to
- File
Do a global search-and-replace in the resulting file, changing the old schema name to the desired new schema name.
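For example, a scripted table like the following (hypothetical names) just needs its schema qualifier swapped; remember to create the target schema first:

```sql
-- Scripted output before the replace (hypothetical table):
CREATE TABLE [CompanyA].[Customers] (
    CustomerId INT NOT NULL PRIMARY KEY,
    Name       NVARCHAR(100) NOT NULL
);

-- After replacing CompanyA with CompanyB, ready to run:
-- CREATE SCHEMA [CompanyB];  -- run once, before the table scripts
CREATE TABLE [CompanyB].[Customers] (
    CustomerId INT NOT NULL PRIMARY KEY,
    Name       NVARCHAR(100) NOT NULL
);
```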
Going forward, I suggest you maintain change scripts to apply any needed changes to the DB as the application is further developed. That way, you can just share the change scripts, and each side can apply them when ready to upgrade the app version.
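A change script in that scheme is just a small, numbered SQL file that each side runs when upgrading, e.g. (hypothetical):

```sql
-- 002_add_customer_email.sql (hypothetical change script)
ALTER TABLE [CompanyB].[Customers]
    ADD Email NVARCHAR(256) NULL;
```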