I want to download/backup the schema of an entire MySQL database. Is there a way to easily do this? I haven't had much luck using a wildcard, but that could be an error on my part.
I would use the --no-data option to mysqldump to dump the schema and not table data.
mysqldump --no-data -u[user] -p[password] [db_name] > schemafile.sql
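To recreate the empty schema later, you can feed the file back through the mysql client (same placeholders as above):
mysql -u[user] -p[password] [db_name] < schemafile.sql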
Login as root, then
show databases; # lists all databases
use information_schema; # switch to the information_schema database
show tables;
select * from tables;
Everything you need is in the information_schema database. If all you want to do is back up the db, use the built-in dump/restore capabilities.
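For example, a minimal query against information_schema (with a hypothetical database name mydb) that lists every column definition in the database:
select table_name, column_name, column_type, is_nullable
from information_schema.columns
where table_schema = 'mydb'
order by table_name, ordinal_position;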
How about using mysqldump?
mysqldump -u root -p[password] [db_name] > backupfile.sql
If you want to see all the tables from a schema in MySQL, you can use
SHOW TABLES FROM MY_DATABASE;
I am trying to extract the DDL of all the tables and views which are present in a schema of a Postgres DB.
I am able to export, but the export also includes CREATE FUNCTION and other objects.
Is there any way we can extract only tables and views from a schema? Either by limiting the access to objects or by editing a file?
Thanks for the help!!
Some options may be redundant.
pg_dump --dbname=test15 --schema=public --schema-only -N '*catalog*' -t 'public.*' > test.sql
The main gotcha:
-t pattern
--table=pattern
Dump only tables with names matching pattern.
The manual has many examples you can follow.
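Note that -t can be given more than once, so a sketch that names specific relations instead of a wildcard (hypothetical table/view names) would be:
pg_dump --dbname=test15 --schema-only -t 'public.my_table' -t 'public.my_view' > test.sql
Per the pg_dump manual, -t matches tables, views, materialized views, sequences, and foreign tables, which is why the wildcard pattern above also pulls in views.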
I have tables in different databases, and I want to create a data warehouse database that contains replicas of tables from those different databases. I want the data in the warehouse to be synced with the source tables every day. I am using PostgreSQL.
I tried to do this using psql:
pg_dump -t table_to_copy source_db | psql target_db
However, it didn't work, as it keeps giving errors like "table does not exist".
It all worked when I dumped the whole database, not only a single table; however, I want the data to be synced, and I want to copy tables from different databases, not the whole database.
How can I do this?
Thanks!
Probably you need FDW (Foreign Data Wrapper). You can create foreign tables for the different external databases in different schemas on the local db. All the tables are then accessible by local queries. For storing a snapshot you can use local tables with just INSERT INTO local_table_YYYY_MM SELECT * FROM remote_table;
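A minimal sketch with postgres_fdw (server, schema, user, and table names here are all hypothetical):
CREATE EXTENSION IF NOT EXISTS postgres_fdw;
-- Register the remote database as a foreign server
CREATE SERVER source_srv FOREIGN DATA WRAPPER postgres_fdw
    OPTIONS (host 'localhost', dbname 'source_db', port '5432');
-- Map the local role to a role on the remote side
CREATE USER MAPPING FOR CURRENT_USER SERVER source_srv
    OPTIONS (user 'remote_user', password 'secret');
-- Mirror the remote table definitions into a local schema
CREATE SCHEMA IF NOT EXISTS source_remote;
IMPORT FOREIGN SCHEMA public LIMIT TO (table_to_copy)
    FROM SERVER source_srv INTO source_remote;
-- The daily snapshot is then a plain local insert
INSERT INTO local_table_2024_01 SELECT * FROM source_remote.table_to_copy;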
1.
pg_dump -t <table name> <source DB> | psql -d <target DB>
(Check the table name carefully; that is why it tells you the table doesn't exist.)
2.
pg_dump allows the dumping of only select tables:
pg_dump -Fc -f output.dump -t tablename databasename
(this dumps 'tablename' from database 'databasename' into the file 'output.dump' in pg_dump's binary custom format)
You can restore that with pg_restore:
pg_restore -d databasename output.dump
If the table itself already exists in your target database, you can import only the rows by adding the --data-only flag.
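So, for example (same hypothetical names as above):
pg_restore --data-only -d databasename output.dump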
Dblink
You cannot perform cross-database queries like in SQL Server; PostgreSQL does not support this. The DbLink extension of PostgreSQL is used to connect one database to another. You have to install and configure DbLink to execute cross-database queries.
Here is a step-by-step script and example for executing a cross-database query in PostgreSQL. Please visit this post:
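In the meantime, a minimal dblink sketch (connection string, table, and column list are all hypothetical):
CREATE EXTENSION IF NOT EXISTS dblink;
SELECT *
FROM dblink('dbname=source_db host=localhost user=postgres',
            'SELECT id, name FROM table_to_copy')
     AS t(id integer, name text);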
Given an existing database in PostgreSQL, I want to get the required SQL code to generate an identical database with no records.
Is there any easy way to do so?
You can use the pg_dump command to do that. The -s option dumps only the schema and no data from the database.
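For example (hypothetical database names; the second command replays the schema into an empty database):
pg_dump -s mydb > schema.sql
psql -d newdb -f schema.sql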
I have around 150 MySQL databases, and I need to export one table from each of them.
Is this possible? The username and password are identical for each DB.
I'm sure there's a more compact way to do it but this should work.
#!/bin/bash
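# List all databases, filter out the header line and information_schema,
# then dump the single table from each remaining database.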
mysql -B -e "show databases" | egrep -v "Database|information_schema" | while read db;
do
echo "$db";
mysqldump "$db" TableName > "$db.sql"
done
You may need to tweak the mysql and mysqldump calls depending on your connection information.
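For instance, with explicit credentials (hypothetical user and password), the two calls might become:
mysql -u backup_user -p'secret' -B -e "show databases"
mysqldump -u backup_user -p'secret' "$db" TableName > "$db.sql"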
I think in this case, iteration would be more appropriate (rather than recursion).
If you are on Linux, I'd suggest writing a simple bash script that cycles the 150 DB URLs and calls mysqldump on each one.
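A sketch of such a script, assuming the 150 database names sit one per line in a file called dblist.txt (file name, user, and table name are hypothetical):
#!/bin/bash
# Dump one table from every database named in dblist.txt
while read db; do
    mysqldump -u backup_user -p'secret' "$db" TableName > "$db.sql"
done < dblist.txt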
See link text; it generates metadata for all databases and all tables, so you may be able to adapt it to export data for you. However, this is in PHP and I am not certain of the language you wish to use...
Using Toad for Oracle, I can generate full DDL files describing all tables, views, source code (procedures, functions, packages), sequences, and grants of an Oracle schema. A great feature is that it separates each DDL declaration into different files (a file for each object, be it a table, a procedure, a view, etc.) so I can write code and see the structure of the database without a DB connection. The other benefit of working with DDL files is that I don't have to connect to the database to generate a DDL each time I need to review table definitions.
In Toad for Oracle, the way to do this is to go to Database -> Export and select the appropriate menu item depending on what you want to export. It gives you a nice picture of the database at that point in time.
Is there a "batch" tool that exports
- all table DDLs (including indexes, check/referential constraints)
- all source code (separate files for each procedure, function)
- all views
- all sequences
from SQL Server?
What about PostgreSQL?
What about MySQL?
What about Ingres?
I have no preference as to whether the tool is Open Source or Commercial.
For SQL Server:
In SQL Server Management Studio, right click on your database and choose 'Tasks' -> 'Generate Scripts'.
You will be asked to choose which DDL objects to include in your script.
In PostgreSQL, simply use the -s option to pg_dump. You can get it as a plain SQL script (one file for the whole database) or in a custom format that you can then throw a script at to get one file per object if you want.
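For example, a sketch of the custom-format route (hypothetical file and database names):
pg_dump -s -Fc -f schema.dump mydb
pg_restore -l schema.dump    # prints a table of contents, one line per object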
The PgAdmin tool will also show you each object's SQL dump, but I don't think there's a nice way to get them all at once from there.
For MySQL, I use mysqldump. The command is pretty simple.
$ mysqldump [options] db_name [tables]
$ mysqldump [options] --databases db_name1 [db_name2 db_name3...]
$ mysqldump [options] --all-databases
Plenty of options for this. Take a look here for a good reference.
In addition to the "Generate Scripts" wizard in SSMS you can now use mssql-scripter which is a command line tool to generate DDL and DML scripts.
It's an open-source, Python-based tool that you can install via:
pip install mssql-scripter
Here's an example of what you can use to script the database schema and data to a file.
mssql-scripter -S localhost -d AdventureWorks -U sa --schema-and-data > ./adventureworks.sql
More guidelines: https://github.com/Microsoft/sql-xplat-cli/blob/dev/doc/usage_guide.md
And here is the link to the GitHub repository: https://github.com/Microsoft/sql-xplat-cli
MySQL has a great tool called MySQL Workbench that lets you reverse- and forward-engineer databases, as well as synchronize them, which I really like. You can view the DDL when executing these functions.
I wrote SMOscript, which does what you are asking for (referring to MS SQL Server).
Following what Daniel Vassallo said, this worked for me:
pg_dump -f c:\filename.sql -C -n public -O -s -d Moodle3.1 -h localhost -p 5432 -U postgres -w
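For reference, the flags used here are: -C (include a CREATE DATABASE statement), -n public (dump only the public schema), -O (skip ownership commands), -s (schema only), and -w (never prompt for a password).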
Try this Python-based tool: Yet another script to split PostgreSQL dumps into object files