Backup db schema (structure) easily - sql

I have a pretty big database in terms of the number of different database objects, not its size. I want to back it up (its structure/schema). Preferably, I'd like to get the SQL code. Of course, I could navigate to each object and script it out individually, but as I said, there are a lot of them.
How do I do this easily?

You can use pg_dump --schema-only --format=plain > dump_file.sql to dump your schema (database objects without data) into an SQL file.
Details here: pg_dump.
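For example, a minimal sketch of the command line, assuming the database is called mydb, the role is postgres, and the server runs locally (all of these are placeholders to adjust):

# dump only the DDL (no data) for database "mydb" into schema_dump.sql
pg_dump --schema-only --format=plain --username=postgres --file=schema_dump.sql mydb
# replay the file into an empty database to recreate the structure
psql --username=postgres --dbname=newdb --file=schema_dump.sql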

Related

Copy Database from server to local DB (I need to copy only the structure, without data)

I have one problem. I have a database (MSSQL) on a server, and I need to copy this DB to my local DB. But this DB is huge: 7 GB. I don't need the data from this DB; I just need to copy its structure (by structure I mean the whole DB: all tables, relationships, etc., but not the data in those tables). As I understand it, the result will be a clean DB with the same structure.
I have no idea how to implement this. I have read that it can be done using PowerShell, Git and a SQL manager, but I didn't find an example. Could you tell me something about it?
Right click on your Database => Tasks => Generate Scripts
Note: don't do this on system databases.
Select the Tables / Views / Stored Procedures you want.
Change the scripting options depending on your preferences/requirements.
Generate the script
If you use MS SQL Server Management Studio, you can right-click on the tables you want to have in your other DB -> Generate Script for -> Create -> and then choose one option.
The only drawback is that you have to do that for every table individually.

Sybase: Generate a script between two SQL schemas?

I have a Sybase database where I will make a LOT of changes. I would like to make my changes (currently using PowerDesigner 16), save them as .SQL, and then generate a .SQL script that migrates my initial database to the new database structure.
I don't care about the data, I just want to update the structure with this script.
Any idea how to create this script?
EDIT: This has nothing to do with the suggested "duplicate"; the other question is about how to export the SQL for stored procedures etc. In my case I only want the difference between the two SQL schemas.
If you don't have one, you can create a model from your initial database, with File > Reverse Engineer > Database.
Create a copy of this model with File > Save As (as type Archived PDM), say copy.apm.
Then modify the model as you wish.
Then use Database > Apply Model Changes to Database, using the option Using an archive model (select the above copy.apm), to generate the ALTER script.
Anybody looking for a program-agnostic way of generating the DDL for a database can use the ddlgen utility that comes with ASE.
See http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.infocenter.dc30191.1572/html/utilityguide/CHDBBGGC.htm
So to create the DDL of database pubs2, you could do something like
ddlgen -Usa -Ppassword123 -SSERVERNAME -TDB -Npubs2 -Ooutput_file.sql

SQL, moving millions of records from one database to another database

I am a C# developer and I am not really good with SQL. I have a simple question here. I need to move more than 50 million records from one database to another. I tried to use the import function in MS SQL, but it got stuck because the log was full (I got the error message "The transaction log for database 'mydatabase' is full due to 'LOG_BACKUP'"). The database recovery model was set to simple. My friend said that importing millions of records using Tasks -> Import Data will cause the log to grow massively, and told me to use a loop instead to transfer the data. Does anyone know how and why? Thanks in advance.
If you are moving the entire database, use backup and restore; it will be the quickest and easiest.
http://technet.microsoft.com/en-us/library/ms187048.aspx
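A minimal sketch of the backup/restore route, assuming a database named SourceDb, illustrative file paths, and logical file names SourceDb/SourceDb_log (all placeholders; check the real logical names with RESTORE FILELISTONLY):

-- on the source server: take a full backup
BACKUP DATABASE SourceDb TO DISK = N'C:\Backups\SourceDb.bak' WITH INIT;
-- copy the .bak file to the destination server, then restore it there
RESTORE DATABASE SourceDb FROM DISK = N'C:\Backups\SourceDb.bak'
WITH MOVE 'SourceDb' TO N'D:\Data\SourceDb.mdf',
     MOVE 'SourceDb_log' TO N'D:\Data\SourceDb_log.ldf';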
If you are just moving a single table, read about and use the bcp command-line tool for this many records:
The bcp utility bulk copies data between an instance of Microsoft SQL Server and a data file in a user-specified format. The bcp utility can be used to import large numbers of new rows into SQL Server tables or to export data out of tables into data files. Except when used with the queryout option, the utility requires no knowledge of Transact-SQL. To import data into a table, you must either use a format file created for that table or understand the structure of the table and the types of data that are valid for its columns.
http://technet.microsoft.com/en-us/library/ms162802.aspx
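A minimal bcp sketch, assuming a table dbo.Orders in databases SourceDb and TargetDb, Windows authentication (-T), and placeholder server and file names:

rem export in native format (-n) from the source server
bcp SourceDb.dbo.Orders out C:\temp\orders.dat -n -T -S SOURCESERVER
rem import on the destination server, committing in batches of 100000 rows (-b)
bcp TargetDb.dbo.Orders in C:\temp\orders.dat -n -T -S DESTSERVER -b 100000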
The fastest and probably most reliable way is to bulk copy the data out via SQL Server's bcp.exe utility. If the schema on the destination database is exactly identical to that on the source database, including nullability of columns, export it in "native format":
http://technet.microsoft.com/en-us/library/ms191232.aspx
http://technet.microsoft.com/en-us/library/ms189941.aspx
If the schema differs between source and target, you will encounter...interesting (yes, interesting is a good word for it) problems.
If the schemas differ or you need to perform any transforms on the data, consider using text format. Or another format (BCP lets you create and use a format file to specify the format of the data for export/import).
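For the format-file route, a sketch using the same hypothetical dbo.Orders table (the format file is generated once, then reused for export and import):

rem generate a non-XML format file describing dbo.Orders in character format (-c)
bcp SourceDb.dbo.Orders format nul -c -f C:\temp\orders.fmt -T -S SOURCESERVER
rem export the data as text using that format file
bcp SourceDb.dbo.Orders out C:\temp\orders.txt -f C:\temp\orders.fmt -T -S SOURCESERVER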
You might consider exporting data in chunks: if you encounter problems it gives you an easier time of restarting without losing all the work done so far.
You might also consider zipping the exported data files up to minimize time on the wire.
Then FTP the files over to the destination server.
Then bcp them in. You can use the bcp utility on the destination server or the BULK INSERT statement in SQL Server to do the work; it makes no real difference.
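If you go the BULK INSERT route, a minimal sketch, reusing the hypothetical text file and format file from above (TABLOCK helps the load qualify for minimal logging):

BULK INSERT dbo.Orders
FROM 'C:\temp\orders.txt'
WITH (FORMATFILE = 'C:\temp\orders.fmt', BATCHSIZE = 100000, TABLOCK);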
The nice thing about using BCP to load the data is that the load is what is described as a 'non-logged' transaction, though it's really more like a 'minimally logged' transaction.
If the tables on the destination server have IDENTITY columns, you'll need to use SET IDENTITY_INSERT to allow explicit values in the identity column on the table(s) involved for the nonce (don't forget to turn it back off). After your data is imported, you'll need to run DBCC CHECKIDENT to get things back in sync.
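A sketch, assuming the hypothetical dbo.Orders table:

-- allow explicit values to be loaded into the identity column for the duration of the load
SET IDENTITY_INSERT dbo.Orders ON;
-- ... load the data ...
SET IDENTITY_INSERT dbo.Orders OFF;
-- then reseed the identity counter to match the highest loaded value
DBCC CHECKIDENT ('dbo.Orders', RESEED);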
And depending on what you're doing, it can sometimes be helpful to put the database in single-user mode or dbo-only mode for the duration of the surgery: http://msdn.microsoft.com/en-us/library/bb522682.aspx
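A sketch of those two modes, assuming the target database is named TargetDb:

-- single-user mode; kicks out other connections immediately
ALTER DATABASE TargetDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
-- or restricted (dbo-only) mode; only db_owner, dbcreator and sysadmin members can connect
ALTER DATABASE TargetDb SET RESTRICTED_USER WITH ROLLBACK IMMEDIATE;
-- when finished, return to normal
ALTER DATABASE TargetDb SET MULTI_USER;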
Another approach I've used to great effect is to use Perl's DBI/DBD modules (which provide access to the bulk copy interface) and write a Perl script to pull the data out of the source server, transform it, and bulk load it directly into the destination server, without having to save it to disk and move it. It also means you can trap errors and design things for recovery and restart right at the point of failure.
Use BCP to migrate data.
Another approach I have used in the past is to take a backup of the transaction log and shrink the log prior to the migration. Split the migration script into parts and run the log backup - shrink - migrate iteration a few times.
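A minimal sketch of that backup/shrink step, assuming the database is named mydatabase, its log's logical file name is mydatabase_log (check sys.database_files), and the backup path is a placeholder; BACKUP LOG only applies under the full or bulk-logged recovery model:

-- back up the transaction log so the inactive portion can be reused
BACKUP LOG mydatabase TO DISK = N'C:\Backups\mydatabase_log.trn';
-- shrink the log file back down (target size in MB)
DBCC SHRINKFILE (mydatabase_log, 1024);
-- ... run the next part of the migration, then repeat ...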

Batch file to get sql backup scripts

Is there any way I can use batch files to take a backup of selected scripts from the SQL database?
Say I have one stored procedure, one function and one view in a folder:
sp1.sql
vie1.sql
fn1.sql
Before running the batch file, I want to take a backup of these scripts.
Kindly note: I do not want to take an entire database backup, just the provided scripts alone.
Please help me achieve this.
The specific answer depends entirely on the flavor of your database engine. But the general answer is that you need to SELECT the definition from your database's data catalog (metadata). The function and procedure definitions will probably come out intact, but the view definition may come out as just the SELECT statement - you might have to prefix it with the CREATE VIEW XXXXXXX AS part.
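For example, on SQL Server a batch file could call sqlcmd and pull each object's definition out of the catalog. This is only a sketch; the server, database and object names (dbo.sp1 etc.) are placeholders:

rem dump the current definition of the stored procedure into a backup file
sqlcmd -S MYSERVER -d MyDatabase -E -h -1 -y 0 -Q "SET NOCOUNT ON; SELECT OBJECT_DEFINITION(OBJECT_ID('dbo.sp1'));" -o sp1_backup.sql
rem repeat for the view (vie1) and the function (fn1)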

Copy all data from one SQL Server database to another on the same machine

I want to copy all data from one database to another which has the same structure. The databases reside on the same machine and on the same SQL Server instance.
I have googled a bit and have found solutions like this:
INSERT states (statecode, statename)
SELECT statecode, statename
FROM server1.database1.dbo.states
But the problem is that this copies table by table, and I have more than 100 tables. Is there a way to copy all of the data at once?
Views and stored procedures should all be copied too.
Or should I be looking in some other direction to achieve this?
If this is a one-time need, use the (Database) > Tasks > Generate Scripts menu option in SQL Server Management Studio.
Some options:
Use the DB backup and restore tools to just move a big backup file. This is the simplest option (see the sketch after this list).
Slave the 2nd instance off of the 1st. It'll keep it up to date, but can be a pain.
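A minimal sketch of the backup/restore option, assuming the source database is SourceDb and its logical file names are SourceDb and SourceDb_log (placeholders; check with RESTORE FILELISTONLY). Restoring under a new name on the same instance needs MOVE so the files don't collide with the originals:

BACKUP DATABASE SourceDb TO DISK = N'C:\Backups\SourceDb.bak' WITH INIT;
RESTORE DATABASE SourceDb_Copy FROM DISK = N'C:\Backups\SourceDb.bak'
WITH MOVE 'SourceDb' TO N'C:\Data\SourceDb_Copy.mdf',
     MOVE 'SourceDb_log' TO N'C:\Data\SourceDb_Copy_log.ldf';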
Use the Import/Export Wizard to transfer the data from one DB to the other, and use Generate Scripts to transfer the procedures and views.
Check out tools like Red-Gate SQL Compare (for structural comparison) and SQL Data Compare (for data content compare). With Data Compare, you can also easily update one database from another (or a database backup, even).
They're not free - but if you have to do this over and over, just the time (not to speak of the hassle) you save yourself will easily outweigh the cost of purchasing these tools. Excellent stuff - highly recommended!