[pg_dump] Extract only tables and views from a schema - sql

I am trying to extract the DDL of all the tables and views present in a schema of a Postgres database.
I am able to export, but the export also includes CREATE FUNCTION statements and other objects.
Is there any way to extract only tables and views from a schema, either by limiting access to objects or by editing a file?
Thanks for the help!!

Some of the options may be redundant.
pg_dump --dbname=test15 --schema=public --schema-only -N '*catalog*' -t 'public.*' > test.sql
The main gotcha:
-t pattern
--table=pattern
Dump only tables with names matching pattern.
The manual has many examples; you can follow them through.
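If the wildcard pattern still pulls in more than you want, a hedged alternative is to build the -t list explicitly from the catalog. A sketch, assuming the same test15 database and public schema as above:
# Collect "-t schema.name" flags for every table and view in public,
# then pass them all to pg_dump (relies on shell word-splitting of $tables/$views).
tables=$(psql -At -d test15 -c "SELECT '-t ' || quote_ident(schemaname) || '.' || quote_ident(tablename) FROM pg_tables WHERE schemaname = 'public'")
views=$(psql -At -d test15 -c "SELECT '-t ' || quote_ident(schemaname) || '.' || quote_ident(viewname) FROM pg_views WHERE schemaname = 'public'")
pg_dump --dbname=test15 --schema-only $tables $views > tables_and_views.sql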

Related

Get SQL code to generate the schema for an existing database in PostgreSQL

Given an existing database in PostgreSQL, I want to get the required SQL code to generate an identical database with no records.
Is there any easy way to do so?
You can use the pg_dump command to do that. The -s option dumps only the schema and no data from the database.
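For example (the database name mydb and the output file are assumptions):
pg_dump -s mydb > schema.sql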

Exporting from one schema and importing to another with pg_dump

I have a table called units, which exists in two separate schemas within the same database (we'll call them old_schema, and new_schema). The structure of the table in both schemas are identical. The only difference is that the units table in new_schema is presently empty.
I am attempting to export the data from this table in old_schema and import it into new_schema. I used pg_dump to handle the export, like so:
pg_dump -U username -p 5432 my_database -t old_schema.units -a > units.sql
I then attempted to import it using the following:
psql -U username -p 5432 my_database -f units.sql
Unfortunately, this appeared to try to reinsert the data back into the old_schema. Looking at the generated SQL file, there is a line which I think is causing this:
SET search_path = mysql_migration, pg_catalog;
I can, in fact, alter this line to read
SET search_path = public;
And this does prove successful, but I don't believe this is the "correct" way to accomplish this.
Question: When importing data via a script generated through pg_dump, how can I specify into which schema the data should go without altering the generated file?
There are two main issues here based on the scenario you described.
1. The difference in the schemas, to which you alluded.
2. The fact that by dumping the whole table via pg_dump, you're dumping the table definition too, which will cause issues if the table is already present in the destination schema.
To dump only the data, if the table already exists in the destination database (which appears to be the case based on your scenario above), you can dump the table using pg_dump with the --data-only flag.
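A sketch of that dump step, reusing the connection details from the question (--data-only is the long form of the -a flag already used above):
pg_dump -U username -p 5432 my_database -t old_schema.units --data-only > units.sql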
Then, to address the schema issue, I would recommend doing a search/replace (sed would be a quick way to do it) on the output sql file, replacing old_schema with new_schema.
That way, it will apply the data (which is all that would be in the file, not the table definition itself) to the table in new_schema.
If you need a broader solution, say to support dynamically named schemas, you can use the same search/replace trick with sed, but instead of replacing old_schema with new_schema, replace it with some placeholder text, say $$placeholder_schema$$ (something highly unlikely to appear as a token elsewhere in the file). Then, when you need to apply that file to a particular schema, use the original file as a template: copy it, and modify the copy with sed or similar, replacing the placeholder token with the desired schema name.
You can set some options for psql on the command line, such as --set AUTOCOMMIT=off; however, a similar approach with SEARCH_PATH does not appear to have any effect.
Instead, it needs the SQL form SET search_path TO <schema>, which can be passed with the -c option, but not in combination with -f (it's either/or).
Given that, I think modifying the file with sed is probably the best all-around option in this case for use with -f.
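Applied to the question's files, the search/replace and import steps might look like this (a sketch; it assumes the dump schema-qualifies every reference and that the table data itself never contains the string old_schema):
# Rewrite the schema name everywhere in the data-only dump, then apply it.
sed 's/old_schema/new_schema/g' units.sql > units_new_schema.sql
psql -U username -p 5432 my_database -f units_new_schema.sql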

Getting MySQL Schemas for All Tables

I want to download/backup the schema of an entire MySQL database. Is there a way to easily do this? I haven't had much luck using a wildcard, but that could be an error on my part.
I would use the --no-data option to mysqldump to dump the schema and not table data.
mysqldump --no-data [db_name] -u[user] -p[password] > schemafile.sql
Log in as root, then
show databases; # lists all databases
use information_schema; # switch to the information_schema database
show tables; # the metadata tables available there
select * from tables; # one row of metadata per table across all databases
Everything you need is in the information_schema schema. If all you want to do is back up the database, use the built-in dump/restore capabilities.
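For instance, a one-liner like this (the database name my_database is an assumption) lists every table and view in one schema:
mysql -u root -p -e "SELECT table_name, table_type FROM information_schema.tables WHERE table_schema = 'my_database';"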
How about using mysqldump?
mysqldump -u root -p[password] [db_name] > backupfile.sql
If you want to see all the tables from a schema in MySQL, you can use
SHOW TABLES FROM MY_DATABASE;

Iterate over multiple MySQL tables, export 1 table from each

I have around 150 MySQL databases, and I need to export one table from each of them.
Is this possible? The username and password are identical for each DB.
I'm sure there's a more compact way to do it, but this should work.
#!/bin/bash
# List every database, skip the header row and information_schema,
# and dump the target table from each one into <db>.sql.
mysql -B -e "show databases" | egrep -v "Database|information_schema" | while read db; do
    echo "$db"
    mysqldump "$db" TableName > "$db.sql"
done
You may need to tweak the mysql and mysqldump calls depending on your connection information.
I think in this case, iteration would be more appropriate (rather than recursion).
If you are on Linux, I'd suggest writing a simple bash script that cycles through the 150 DB names and calls mysqldump on each one.
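A minimal sketch of that suggestion, assuming the 150 database names sit one per line in databases.txt and that the user, password, and table name placeholders are replaced with real values:
#!/bin/bash
# Dump one table from each database listed in databases.txt.
while read db; do
    mysqldump -u username -ppassword "$db" TableName > "${db}_TableName.sql"
done < databases.txt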
See link text; it generates metadata for all databases and all tables, and you may be able to adapt it to export the data for you. However, this is in PHP, and I am not certain which language you wish to use.

Is there a tool to generate a full database DDL for SQL Server? What about Postgres and MySQL?

Using Toad for Oracle, I can generate full DDL files describing all tables, views, source code (procedures, functions, packages), sequences, and grants of an Oracle schema. A great feature is that it separates each DDL declaration into different files (a file for each object, be it a table, a procedure, a view, etc.) so I can write code and see the structure of the database without a DB connection. The other benefit of working with DDL files is that I don't have to connect to the database to generate a DDL each time I need to review table definitions. In Toad for Oracle, the way to do this is to go to Database -> Export and select the appropriate menu item depending on what you want to export. It gives you a nice picture of the database at that point in time.
Is there a "batch" tool that exports
- all table DDLs (including indexes, check/referential constraints)
- all source code (separate files for each procedure, function)
- all views
- all sequences
from SQL Server?
What about PostgreSQL?
What about MySQL?
What about Ingres?
I have no preference as to whether the tool is Open Source or Commercial.
For SQL Server:
In SQL Server Management Studio, right click on your database and choose 'Tasks' -> 'Generate Scripts'.
You will be asked to choose which DDL objects to include in your script.
In PostgreSQL, simply use the -s option to pg_dump. You can get it as a plain SQL script (one file for the whole database) or in a custom format that you can then throw a script at to get one file per object if you want it.
The pgAdmin tool will also show you each object's SQL dump, but I don't think there's a nice way to get them all at once from there.
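A sketch of the custom-format route (the database name mydb and the file names are assumptions): dump the schema once, then list its contents, which a small splitting script can walk to write one file per object:
pg_dump -s -Fc mydb > mydb.dump
pg_restore -l mydb.dump > mydb.toc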
For MySQL, I use mysqldump. The command is pretty simple.
$ mysqldump [options] db_name [tables]
$ mysqldump [options] --databases db_name1 [db_name2 db_name3...]
$ mysqldump [options] --all-databases
Plenty of options for this. Take a look here for a good reference.
In addition to the "Generate Scripts" wizard in SSMS you can now use mssql-scripter which is a command line tool to generate DDL and DML scripts.
It's an open-source, Python-based tool that you can install via:
pip install mssql-scripter.
Here's an example of what you can use to script the database schema and data to a file.
mssql-scripter -S localhost -d AdventureWorks -U sa --schema-and-data > ./adventureworks.sql
More guidelines: https://github.com/Microsoft/sql-xplat-cli/blob/dev/doc/usage_guide.md
And here is the link to the GitHub repository: https://github.com/Microsoft/sql-xplat-cli
MySQL has a great tool called MySQL Workbench that lets you reverse- and forward-engineer databases, as well as synchronize them, which I really like. You can view the DDL when executing these functions.
I wrote SMOscript, which does what you are asking for (referring to MS SQL Server).
Following what Daniel Vassallo said, this worked for me:
pg_dump -f c:\filename.sql -C -n public -O -s -d Moodle3.1 -h localhost -p 5432 -U postgres -w
Try this Python-based tool: Yet another script to split PostgreSQL dumps into object files