How to export a PostgreSQL DB into SQL that can be executed in another pgAdmin?
Exporting as a backup file doesn't work when there's a difference in version.
Exporting as a SQL file doesn't execute when run in a different pgAdmin.
I tried exporting a DB with pgAdmin III, but when I tried to execute the SQL in another pgAdmin it threw errors. When I tried to "restore" a backup file, it said there's a version difference and it can't do the import/restore.
So is there a "safe" way to export a DB into standard SQL that can be executed plainly in pgAdmin SQL editor, regardless of which version it is?
Don't try to use pgAdmin III for this. Use pg_dump and pg_restore directly if possible.
Use the version of pg_dump from the destination server to dump the origin server. So if you're going from (say) 8.4 to 9.2, you'd use 9.2's pg_dump to create a dump. If you create a -Fc custom format dump (recommended) you can use pg_restore to apply it to the new database server. If you made a regular SQL dump you can apply it with psql.
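For example, a sketch of going from 8.4 to 9.2 (host and database names here are placeholders, not from the question):

# run from the 9.2 machine so the newer pg_dump does the dumping
pg_dump -h oldhost -U postgres -Fc -f mydb.dump mydb
# create the target database, then restore the custom-format dump
createdb -h newhost -U postgres mydb
pg_restore -h newhost -U postgres -d mydb mydb.dump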
See the manual on upgrading your PostgreSQL cluster.
Now, if you're trying to downgrade, that's a whole separate mess.
You'll have a hard time creating an SQL dump that'll work in any version of PostgreSQL. Say you created a VIEW that uses a WITH query. That won't work when restored to PostgreSQL 8.3, because 8.3 didn't support WITH. There are tons of other examples. If you must support old PostgreSQL versions, do your development on the oldest version you still support and then export dumps of it for newer versions to load. You cannot sanely develop on a new version and export for old versions; it won't work well, if at all.
More troubling, developing on an old version won't always give you code that works on the new version either. Occasionally new keywords are added when support for new standard features is introduced, and sometimes issues are fixed in ways that affect user code. For example, if you were to develop on the (ancient and unsupported) 8.2, you'd have lots of problems with implicit casts to text on 8.3 and above.
Your best bet is to test on all supported versions. Consider setting up automated testing using something like Jenkins CI. Yes, that's a pain, but it's the price for software that improves over time. If Pg maintained perfect backward and forward compatibility it'd never improve.
Export/Import with pg_dump and psql
1. Set PGPASSWORD
export PGPASSWORD='123123123';
2. Export the DB with pg_dump
pg_dump -h <<host>> -U <<username>> <<dbname>> > /opt/db.out
/opt/db.out is the dump path; you can specify your own.
3. Then set PGPASSWORD again for the other host. If the host or password is the same, this step is not required.
4. Import the DB on the other host
psql -h <<host>> -U <<username>> -d <<dbname>> -f /opt/db.out
If the username is different, find and replace it with your local username in the db.out file, making sure only the username is replaced and not the data.
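Putting the steps together, a minimal end-to-end sketch (hosts, users, passwords, and DB names are placeholders):

export PGPASSWORD='source-password'
pg_dump -h source-host -U source-user sourcedb > /opt/db.out
export PGPASSWORD='target-password'
psql -h target-host -U target-user -d targetdb -f /opt/db.out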
If you still want to use pgAdmin, see the procedure below.
Export the DB with pgAdmin:
Select the DB and click Export.
File Options
Enter a file name and pick a local directory
Select Format - Plain
Ignore Dump Options #1
Dump Options #2
Check Use Insert Commands
Objects
Uncheck any tables you don't want
Import the DB with pgAdmin:
Create a new DB.
With the new DB selected, click Menu -> Plugins -> PSQL Console
Type the following command to import the DB:
\i /path/to/db.sql
If you want to export the schema and data separately:
Export Schema
File Options
Name the schema file and pick a local directory
Select Format - Plain
Dump Options #1
Check Only Schema
Check Blobs (checked by default)
Export Data
File Options
Name the data file and pick a local directory
Select Format - Plain
Dump Options #1
Check Only Data
Check Blobs (checked by default)
Dump Options #2
Check Use Insert Commands
Check Verbose messages (checked by default)
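For reference, the same schema/data split can be produced from the command line with pg_dump (a sketch; the database name is a placeholder):

pg_dump -s mydb > schema.sql
pg_dump -a --inserts mydb > data.sql

-s dumps only the schema, -a dumps only the data, and --inserts writes the data as INSERT commands, like the pgAdmin option above.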
Note: export/import time depends on the DB size, and pgAdmin adds some extra time on top.
Related
I have a SQL dump that I need to import into PostgreSQL via pgAdmin 4; however, when I run the command, the schema gets created but none of the data comes with it. I have the database set up in pgAdmin 4. This is my first time using PostgreSQL and pgAdmin, so I know I must be missing something.
The SQL dump file was sent to me directly; I did not use pg_dump to migrate anything. The file is in my Downloads folder and I need to plug it into pgAdmin.
I need this SQL dump because I need to log into several portals locally for a large project.
On Windows, using Postgres version 14, I've tried several approaches from other Stack Overflow solutions, first using the command line in both Bash and PowerShell.
This is the command a coworker told me to use; it should add the app's tables and data, and it worked fine for him:
C:\Program Files\PostgreSQL\14\bin>psql -h localhost -U postgres -d the_database -f PATH_TO_YOUR_DOWNLOADS\data_dump.sql
This command creates the schema in the pgAdmin database, but no data comes with it. (I know the data is missing because I can't use my dummy logins to get into the project.)
Second, I tried using the built-in restore and backup features in pgAdmin, and both of those end in an error:
Process failed: Restoring backup on the server 'PostgreSQL 14 (localhost:5432)'
Third, I tried using the Query Tool and loading the SQL file that way, but when I hit Execute I get an error there as well.
Using the Query Tool, when I load the downloaded file I can see the data in the query window, but it never makes it into the database:
ERROR: syntax error at or near "2"
LINE 3285: 2 Some Test 2020-11-13 07:42:29.356827 2020-11-13 04:32:...
^
SQL state: 42601
Character: 87447
Any advice?
Do I need the SQL file formatted in a particular way?
I just need the data to be imported into the pgAdmin 4 database WITH my schema.
I am trying to migrate my CockroachDB into PostgreSQL:
I have dumps of CockroachDB data in .sql format, like booking.sql etc.
I tried many ways to solve this problem:
I tried a direct import of the dump file using psql, but since the dump file came from CockroachDB it throws syntax errors.
My second plan was to restore the dump file back into a CockroachDB system and try running pg_dump from there, but I am not able to restore the database in CockroachDB:
ERROR: failed to open backup storage location: unsupported storage scheme: "" - refer to docs to find supported storage schemes
I tried again with CockroachDB's IMPORT statement, but to no avail.
With my limited knowledge I also searched Google and YouTube, but given the sparse documentation I didn't find anything useful.
Any help will be appreciated. Thank you.
There are some limitations when exporting data from CockroachDB: in newer versions you can't export your data to SQL directly.
The first way of exporting is the cockroach dump command, but it has been deprecated since version 20.2, so if you are using a newer version this won't work:
cockroach dump <database> <table> <table...> <flags>
Sample:
cockroach dump startrek --insecure --user=maxroach > backup.sql
In newer versions you can export your data to CSV files using SQL commands like EXPORT:
EXPORT DATABASE bank INTO 's3://{BUCKET NAME}/{PATH}?AWS_ACCESS_KEY_ID={KEYID}&AWS_SECRET_ACCESS_KEY={SECRET ACCESS KEY}' AS OF SYSTEM TIME '-10s';
To export to local node storage:
EXPORT DATABASE bank INTO ('nodelocal://1/{PATH}');
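If the goal is to get that data into PostgreSQL, the exported CSV files can then be loaded with psql's \copy once matching tables exist on the Postgres side (a sketch; the table name, database name, and file path are placeholders):

psql -d targetdb -c "\copy bank.customers FROM '/path/to/export.csv' WITH (FORMAT csv)"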
Another alternative is to use a database client such as DBeaver.
You can download and install DBeaver from https://dbeaver.io/download/.
After adding the connection, you can export the database via: right-click on the DB -> Tools -> Backup.
The fastest and easiest way of exporting is to use a database tool like DBeaver.
I hope this answer is helpful.
I am under a strict corporate environment and don't have access to Postgres' psql. Therefore I can't do what's shown e.g. in the SO question Convert SQLITE SQL dump file to POSTGRESQL. However, I can generate the SQLite dump file (.sql). The resulting dump.sql file is 1.3 GB.
What would be the best way to import this data into Postgres? I also have DBeaver and can connect to both databases simultaneously but unfortunately can't do INSERT from SELECT.
I think the term for that is 'absurd', not 'strict'.
DBeaver has an 'execute script' feature. But who knows, maybe it will be blocked.
EnterpriseDB offers binary downloads. If you unzip those to a local drive you might be able to execute psql from the bin subdirectory.
If you can install psycopg2 or pg8000 for Python, you should be able to connect to the database and then loop over the dump file, sending each line to the database with cur.execute(line). It might take some fiddling if the dump file has any multi-line commands, but the example you linked to doesn't show any of those.
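A minimal sketch of that loop with psycopg2 (the connection details are placeholders, and the naive line-by-line approach assumes one complete statement per line, per the caveat above):

import psycopg2

# connection parameters are placeholders
conn = psycopg2.connect(host="localhost", dbname="target_db",
                        user="postgres", password="secret")
conn.autocommit = True  # each statement commits on its own
cur = conn.cursor()

with open("dump.sql", encoding="utf-8") as f:
    for line in f:
        stmt = line.strip()
        # skip blank lines and SQL comments; assumes one statement per line
        if stmt and not stmt.startswith("--"):
            cur.execute(stmt)

cur.close()
conn.close()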
I have a working function and want to just transfer it to another newly installed Postgres DB.
I have checked around and am only seeing command-line options, which seems kind of dry for such a common task.
Is there a module import/export option?
I have exported the function into a .sql file from the database, so just looking for an import option.
I know very little about Postgres :)
How did you export it? If it exported as sql into a file named file_name.sql, you would just do:
psql -f file_name.sql
With whatever additional parameters are needed to connect to the correct server and database.
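For example (host, user, and database name are placeholders):

psql -h target-host -U postgres -d targetdb -f file_name.sql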
Using Toad for Oracle, I can generate full DDL files describing all tables, views, source code (procedures, functions, packages), sequences, and grants of an Oracle schema. A great feature is that it separates each DDL declaration into different files (a file for each object, be it a table, a procedure, a view, etc.) so I can write code and see the structure of the database without a DB connection. The other benefit of working with DDL files is that I don't have to connect to the database to generate a DDL each time I need to review table definitions. In Toad for Oracle, the way to do this is to go to Database -> Export and select the appropriate menu item depending on what you want to export. It gives you a nice picture of the database at that point in time.
Is there a "batch" tool that exports
- all table DDLs (including indexes, check/referential constraints)
- all source code (separate files for each procedure, function)
- all views
- all sequences
from SQL Server?
What about PostgreSQL?
What about MySQL?
What about Ingres?
I have no preference as to whether the tool is Open Source or Commercial.
For SQL Server:
In SQL Server Management Studio, right click on your database and choose 'Tasks' -> 'Generate Scripts'.
You will be asked to choose which DDL objects to include in your script.
In PostgreSQL, simply use the -s option to pg_dump. You can get it as a plain SQL script (one file for the whole database) or in a custom format that you can then throw a script at to get one file per object if you want.
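For example (database name is a placeholder):

pg_dump -s mydb > schema.sql

This writes one plain-SQL file with the DDL for the whole database; add -Fc to get the custom format instead.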
The PgAdmin tool will also show you each object's SQL dump, but I don't think there's a nice way to get them all at once from there.
For mysql, I use mysqldump. The command is pretty simple.
$ mysqldump [options] db_name [tables]
$ mysqldump [options] --databases db_name1 [db_name2 db_name3...]
$ mysqldump [options] --all-databases
Plenty of options for this. Take a look here for a good reference.
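For example, a plain dump of a single database (user and database name are placeholders):

mysqldump -u root -p db_name > db_name.sql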
In addition to the "Generate Scripts" wizard in SSMS you can now use mssql-scripter which is a command line tool to generate DDL and DML scripts.
It's an open-source, Python-based tool that you can install via:
pip install mssql-scripter
Here's an example of what you can use to script the database schema and data to a file.
mssql-scripter -S localhost -d AdventureWorks -U sa --schema-and-data > ./adventureworks.sql
More guidelines: https://github.com/Microsoft/sql-xplat-cli/blob/dev/doc/usage_guide.md
And here is the link to the GitHub repository: https://github.com/Microsoft/sql-xplat-cli
MySQL has a great tool called MySQL workbench that lets you reverse and forward engineer databases, as well as synchronize, which I really like. You can view the DDL when executing these functions.
I wrote SMOscript, which does what you are asking for (referring to MSSQL Server).
Following what Daniel Vassallo said, this worked for me:
pg_dump -f c:\filename.sql -C -n public -O -s -d Moodle3.1 -h localhost -p 5432 -U postgres -w
Here -s dumps the schema only, -C adds a CREATE DATABASE statement, -n public restricts the dump to the public schema, -O skips ownership commands, and -w never prompts for a password.
Try this Python-based tool: Yet another script to split PostgreSQL dumps into object files.