How can I get a plain text postgres database dump on heroku?

Due to a version incompatibility between my Postgres database on Heroku (9.1) and my local installation (8.4), I need a plain text SQL dump file so I can put a copy of my production data on my local testing environment.
It seems that on Heroku I can't make a dump using pg_dump directly, but can instead only do this:
$ heroku pgbackups:capture
$ curl -o my_dump_file.dump `heroku pgbackups:url`
...and this gives me the "custom database dump format" and not "plain text format" so I am not able to do this:
$ psql -d my_local_database -f my_dump_file.sql

You could just make your own pg_dump directly from your Heroku database.
First, get your postgres string using heroku config:get DATABASE_URL.
Look for the Heroku Postgres URL (example: HEROKU_POSTGRESQL_RED_URL: postgres://user3123:passkja83kd8@ec2-117-21-174-214.compute-1.amazonaws.com:6212/db982398), which has the format postgres://<username>:<password>@<host_name>:<port>/<dbname>.
Next, run this on your command line:
pg_dump --host=<host_name> --port=<port> --username=<username> --password --dbname=<dbname> > output.sql
The terminal will prompt for your password, then run the dump and write it to output.sql.
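For example, with the illustrative RED_URL values shown above (placeholders, not real credentials), the command would look something like:
pg_dump --host=ec2-117-21-174-214.compute-1.amazonaws.com --port=6212 --username=user3123 --password --dbname=db982398 > output.sql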
Then import it:
psql -d my_local_database -f output.sql

Assuming you have a DATABASE_URL configured in your environment, there is a far simpler method:
heroku run 'pg_dump $DATABASE_URL' > my_database.sql
This will run pg_dump in your container and pipe the contents to a local file, my_database.sql. The single quotes are important. If you use double quotes (or no quotes at all), DATABASE_URL will be evaluated locally rather than in your container.
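To see the quoting difference in isolation (a quick sketch, assuming DATABASE_URL is not set in your local shell):
heroku run 'pg_dump $DATABASE_URL' > my_database.sql    # $DATABASE_URL expands on the Heroku dyno
heroku run "pg_dump $DATABASE_URL" > my_database.sql    # $DATABASE_URL expands locally first, usually to nothing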
If your whole purpose is to load the contents into a local database anyways, you might as well pipe it straight there:
createdb myapp_devel # start with an empty database
heroku run 'pg_dump -xO $DATABASE_URL' | psql myapp_devel
The addition of -xO avoids dumping GRANT, REVOKE, and ALTER OWNER statements, which probably don't apply to your local database server. If any of your COPY commands fail with the error ERROR: literal carriage return found in data (mine did), see this answer.
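If you prefer the long option names, -x and -O are --no-privileges and --no-owner respectively, so the equivalent command is:
heroku run 'pg_dump --no-privileges --no-owner $DATABASE_URL' | psql myapp_devel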
It's quite possible this didn't work two and a half years ago when this question was originally asked, but for anyone looking for an easy way to get a dump of a Heroku Postgres database, this appears to be the simplest way to do it today.

Heroku's PGBackups actually uses pg_dump behind the scenes, and the "custom format" is actually pg_dump's custom format (-Fc parameter), not Heroku's own custom format.
This means you can use pg_restore, which is part of Postgres, to restore your Heroku backup into another database directly:
pg_restore -d mydatabase my_dump_file.dump
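If the target database doesn't exist yet, create it first; --no-owner is a common addition when restoring a Heroku dump locally (a sketch, assuming your local user should own everything):
createdb mydatabase
pg_restore --no-owner -d mydatabase my_dump_file.dump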
In addition, if you call pg_restore without specifying a database to restore to, it'll print SQL statements to standard out, so you can turn your Heroku backup into a SQL file that way:
pg_restore my_dump_file.dump > sql_statements.sql
UPDATE: on more recent versions of Postgres, the following command is required (thanks to a comment from PatKilg):
pg_restore latest.dump -f - > sql_statements.sql

For people like me who stumble onto this problem in 2020:
heroku pg:backups:capture -a app-name
heroku pg:backups:download -a app-name
The CLI will actually tell you what command to use after the capture. To get SQL from your latest.dump file:
pg_restore -f sqldump.sql latest.dump
and that's it.
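To load the resulting plain SQL into a local database (assuming my_local_database already exists):
psql -d my_local_database -f sqldump.sql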

pg_dump accepts a connection string, so you don't need to deconstruct it manually as mentioned here: https://stackoverflow.com/a/22896985/3163631.
Let's say your connection string looks like this (I randomized the username and password and added fillers for the rest; the "shape" of the connection string is correct):
postgres://Nb6n8BTA4rPK5m:DzEPtwZUkJfgbMSdYFUbqupvJeEekihiJNzqGXa3wN2pmYRGcLQ8Sa69ujGn2RSkb@ec2-00-000-000-000.compute-1.amazonaws.com:5432/j4aaaaaaaaaam1
Even though it is in the postgres://<username>:<password>@<host_name>:<port>/<dbname> format, you can use it directly like so:
pg_dump postgres://Nb6n8BTA4rPK5m:DzEPtwZUkJfgbMSdYFUbqupvJeEekihiJNzqGXa3wN2pmYRGcLQ8Sa69ujGn2RSkb@ec2-00-000-000-000.compute-1.amazonaws.com:5432/j4aaaaaaaaaam1 > output.sql
Maybe this was not possible with pg_dump at the time Alex (https://stackoverflow.com/users/3457661/alex) answered in 2014.
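As a convenience, you can combine this with the Heroku CLI so you never copy the URL by hand (a sketch, assuming the CLI is installed and your app is named app-name):
pg_dump "$(heroku config:get DATABASE_URL -a app-name)" > output.sql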

Here's what worked for me:
heroku pg:backups:capture
heroku pg:backups:download
pg_restore latest.dump -f latest.sql
psql -f 'latest.sql' -d '<DEV_DB_NAME>'
Explanation:
First we create a snapshot of the database on Heroku
Then we download the snapshot as 'latest.dump' (the name can be changed using -o '<name>.dump')
Convert the binary dump into plain SQL, which can be imported without raising "pg_dump: error: aborting because of server version mismatch"
Import the file into the local database
Of course, if the version of PostgreSQL you are running locally is compatible with Heroku's, heroku pg:pull DATABASE_URL <DEV_DB_NAME> is simpler to type and remember.
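For completeness, an explicit pg:pull invocation might look like this (note that pg:pull creates the local database, so it must not already exist):
heroku pg:pull DATABASE_URL myapp_devel -a app-name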

heroku pg:backups:capture
heroku pg:backups:download
Taken from https://devcenter.heroku.com/articles/heroku-postgres-import-export.
Now you have a binary file. To obtain the file in plain text format, the following worked for me. Note: You will need to install PostgreSQL.
pg_restore latest.dump > latest.sql

You could just download the Heroku dump file and convert it into plain text format.
In newer versions, directly redirecting the output of pg_restore to an SQL file won't work. Doing so will produce an error:
pg_restore my_dump_file.dump > my_dump_file.sql
pg_restore: error: one of -d/--dbname and -f/--file must be specified
Instead, to output the result in plain text format, -f should be used:
pg_restore my_dump_file.dump -f my_dump_file.sql
This will convert the Heroku "custom database dump format" to "plain text format".
Then import this file:
psql -d my_local_database -f my_dump_file.sql

Related

Errors ("invalid command") when opening a .sql file

I am trying to open a random .sql file off the internet using the following command:
psql -h localhost -d database_name -U postgres < file_name.sql
But when I run this command I just get errors like the following:
invalid command 's
invalid command 's
invalid command 'll
invalid command 'Moving
invalid command 's
invalid command "frequently
It just continuously prints out these invalid command error messages. I thought it might be an encoding problem but I confirmed the file is UTF-8 encoded.
Any suggestions on how I can open this file
To expand and clarify on a_horse_with_no_name's comment - the psql command you are running should be run directly in your shell, not inside pgadmin4.
youruser@yourmachine:~$ psql -h localhost -d database_name -U postgres < file_name.sql
That command should load the contents of file_name.sql in to database_name. Once it's complete, you can use pgadmin4 as normal to interact with the database.
One possibility is that the file contains tab characters, which are expanded if you feed the SQL script to psql via redirected standard input.
Try using the -f option:
psql -h localhost -d database_name -U postgres -f file_name.sql
Apparently the .sql file was generated through a MySQL dump. I thought it would not matter whether I used PostgreSQL or MySQL, but it did. Once I installed MySQL, my problem was resolved and I now have a database ready :)

Import dump file containing a database dump to dbeaver

I have a database dump in a binary file, thisdb_2022.dump, that I'm trying to import into DBeaver, but I haven't found a way to import the database so I can see it.
I found the post below in the DBeaver forum, but when I try to follow the instructions and create a new connection I don't see any option I can select that will open this file.
https://dbeaver.io/forum/viewtopic.php?f=2&t=895
Edit: The database is PostgreSQL 12. I'm not trying to load the dump into an existing db; rather, I want to create a new one from this dump.
The dump command looks like this: pg_dump -h blah.amazonaws.com -Fc -v --dbname="blah2" -f "/tmp/dump/20220203.dump".
And the target will be the same version, PostgreSQL 12.
The easiest way is to not use DBeaver at all.
Do the following (UPDATED with the correct command):
--In psql
CREATE DATABASE new_db;
--Exit psql
--At command line
pg_restore -d new_db -h <the_host> -p <the_port> -U postgres /tmp/dump/20220203.dump
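Alternatively, pg_restore can create the database for you if you pass -C and connect to an existing maintenance database such as postgres (a sketch using the same host/port placeholders):
pg_restore -C -d postgres -h <the_host> -p <the_port> -U postgres /tmp/dump/20220203.dump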
To work in DBeaver directly, see Backup/Restore.

pg_restore: [archiver] input file appears to be a text format dump. Please use psql

I was trying to create a new database (analyses_db) on a remote server from a sql file by the command:
pg_restore -d analyses_db byoryn_resource.sql
I received the error message
pg_restore: [archiver] input file appears to be a text format dump. Please use psql.
When I tried to follow the instructions (from https://stackoverflow.com/a/40632316/15721796):
To reload such a script into a (freshly created) database named newdb:
$ psql -d newdb -f db.sql
I received:
psql: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
As a rookie, I have no idea how to solve this. The SQL file should be alright, as it was provided to me.
After the connection error was solved, I tried the command
sudo -u postgres psql db_name < 'file_path'
(from https://stackoverflow.com/a/26610212/15721796)
which works just fine.
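For reference, the "could not connect to server: No such file or directory" error usually just means the local Postgres server isn't running (or is using a different socket directory); on a typical systemd-based Linux install it can be started with:
sudo systemctl start postgresql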
Hope this can help someone who has the same problem.
Also, there is some useful documentation here: https://www.postgresql.org/docs/9.1/backup-dump.html

How can I import a SQL Server RDS backup into a SQL Server Linux Docker instance?

I've followed the directions from the AWS documentation on importing / exporting a database from RDS using their stored procedures.
The command was similar to:
exec msdb.dbo.rds_backup_database
@source_db_name='MyDatabase',
@s3_arn_to_backup_to='my-bucket/myBackup.bak'
This part works fine, and I've done it plenty of times in the past.
However, what I want to achieve now is restoring this database to a local SQL Server instance, and I'm struggling at this point. I'm assuming this isn't a "normal" SQL Server dump, but I'm unsure what the difference is.
I've spun up a new SQL Server for Linux Docker instance, which seems all set. I have made a few changes so that the sqlcmd tool is installed, so technically the image I'm running is built from this Dockerfile; not much different.
FROM microsoft/mssql-server-linux:2017-latest
RUN apt-get update && \
apt-get install -y curl && \
curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - && \
apt-get update && \
apt-get install -y mssql-tools unixodbc-dev
This image works fine; I'm building it via docker build -t sql . and running it via docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=myPassword1!' -p 1433:1433 -v $(pwd):/backups sql
Within my local folder, I have my backup from RDS downloaded, so this file is now in /backups/myBackup.bak
I now try to run sqlcmd to import the data with the following command, and I'm running into an issue which makes me assume this isn't a traditional SQL dump. I'm unsure what a traditional SQL dump looks like, but the majority of the file looks garbled with ^@^@^@^@ and of course other things.
/opt/mssql-tools/bin/sqlcmd -S localhost -i /backups/myBackup.bak -U sa -P myPassword1! -x
And finally; I get this error:
Sqlcmd: Error: Syntax error at line 56048 near command 'GO' in file '/backups/myBackup.bak'.
Final Answer
My final solution for this mainly came from using -Q and running a RESTORE query rather than passing the file as input, but I also needed to include some MOVE options because the logical files were pointing at Windows file paths.
/opt/mssql-tools/bin/sqlcmd -U SA -P myPassword -Q "RESTORE DATABASE MyDatabase FROM DISK = N'/path/to/my/file.bak' WITH MOVE 'mydatabase' TO '/var/opt/mssql/mydatabase.mdf', MOVE 'mydatabase_log' TO '/var/opt/mssql/mydatabase.ldf', REPLACE"
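If you don't know the logical file names to use in the MOVE clauses, you can list them from the backup first (a sketch using the same connection and backup path as above):
/opt/mssql-tools/bin/sqlcmd -U SA -P myPassword -Q "RESTORE FILELISTONLY FROM DISK = N'/path/to/my/file.bak'"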
You should use the RESTORE DATABASE command to interact with your backup file instead of specifying it as an input file of commands to the database:
/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P myPassword1! -Q "RESTORE DATABASE MyDatabase FROM DISK='/backups/myBackup.bak'"
According to the sqlcmd Docs, the -i flag you used specifies:
The file that contains a batch of SQL statements or stored procedures.
That flag likely won't work properly if given a database backup file as an argument.

Create and import pgsql database after pg_dump

I am new to postgres. I have exported a large, complex database with the following command in the terminal
pg_dump -U USERNAME DBNAME > dbexport.pgsql
Now that I have transferred this .pgsql file to a different computer, what is the right command to automatically create and restore the exact same database as was exported? Any suggestions would be appreciated.
The way you dumped the database, the information about the database itself is not included in the dump (which is a plain SQL file).
You can either use the -C option to include CREATE DATABASE in the dump (the dump has to be restored with psql), or you use the custom format:
pg_dump -F c -U postgres DBNAME -f dbexport.pgsql
That can be restored with pg_restore like this:
pg_restore -C -d postgres -U postgres dbexport.pgsql
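The first alternative (a plain SQL dump that includes CREATE DATABASE via -C) would look something like this; the script is run while connected to an existing database such as postgres, and it creates and connects to DBNAME itself (the file name dbexport.sql is just an illustrative choice to distinguish it from the custom-format file above):
pg_dump -C -U USERNAME DBNAME > dbexport.sql
psql -U postgres -d postgres -f dbexport.sql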