PostgreSQL dump from Ruby on Rails controller - sql

I have built a Ruby on Rails app and I want a very easy way to create save points and load them from the app's views.
For now, before I implement a full undo stack, I want to do a SQL dump into a file from a Rails controller method, and also load the dumped file back into the database. How can I do this?

First, I wonder what problem you are actually trying to solve; maybe something like database transactions would make sense here?
Assuming they don't fit, however, and you really do need to take a full snapshot of the database and restore it, the approach depends on which database you are using. I'm most familiar with Postgres, and I know it ships the pg_dump and pg_restore commands for exactly this type of thing.
https://coderwall.com/p/2e088w/rails-rake-tasks-to-dump-restore-postgresql-databases has a walkthrough of the actual commands needed, wrapped in a Rake task. If you want to call it from a controller, however, I would pull those commands out into a new class that the controller can tell to dump or restore as needed.
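As a minimal sketch of what such a class could look like (the class name, file locations, and connection details are placeholders, not something from the linked walkthrough):

# app/services/database_snapshot.rb (hypothetical service object)
class DatabaseSnapshot
  BACKUP_PATH = Rails.root.join("db", "backups", "snapshot.psql")

  # Dump the whole database using pg_dump's custom archive format (-F c).
  def self.dump(role:, database:)
    system("pg_dump", "-F", "c", "-U", role, "-h", "localhost",
           database, "-f", BACKUP_PATH.to_s) or raise "pg_dump failed"
  end

  # Restore the archive written by .dump above (-c drops objects before recreating them).
  def self.restore(role:, database:)
    system("pg_restore", "-c", "-U", role, "-d", database,
           BACKUP_PATH.to_s) or raise "pg_restore failed"
  end
end

The controller then only needs to call DatabaseSnapshot.dump or DatabaseSnapshot.restore and stays free of shell details.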

I finally worked out an answer for how to dump and restore a Postgres database in Rails.
The problem with pg_restore is that the command cannot drop the database while other users (e.g. the Rails server) are still connected to it.
To dump the database:
cmd = "pg_dump -F c -v -U YOUR_ROLE -h localhost YOUR_DATABASE_NAME -f db/backups/hello.psql"
system cmd
To restore:
system "rake environment db:drop"
system "rake db:create"
system "pg_restore -C -F c -v -U YOUR_ROLE -d YOUR_DATABASE_NAME db/backups/hello.psql"
Finally, to get rake environment db:drop to work, you have to use this monkeypatch:
# config/initializers/postgresql_database_tasks.rb
module ActiveRecord
  module Tasks
    class PostgreSQLDatabaseTasks
      def drop
        establish_master_connection
        connection.select_all "select pg_terminate_backend(pg_stat_activity.pid) from pg_stat_activity where datname='#{configuration['database']}' AND state='idle';"
        connection.drop_database configuration['database']
      end
    end
  end
end
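Tied together in a controller, the whole flow might look something like this (controller and route names are made up for illustration; the commands are the ones above):

# app/controllers/snapshots_controller.rb (hypothetical)
class SnapshotsController < ApplicationController
  # Save point: dump the database to db/backups/hello.psql
  def create
    system %(pg_dump -F c -v -U YOUR_ROLE -h localhost YOUR_DATABASE_NAME -f db/backups/hello.psql)
    redirect_to root_path, notice: "Snapshot saved"
  end

  # Load: drop, recreate, and restore from the dump
  def restore
    system "rake environment db:drop"
    system "rake db:create"
    system "pg_restore -C -F c -v -U YOUR_ROLE -d YOUR_DATABASE_NAME db/backups/hello.psql"
    redirect_to root_path, notice: "Snapshot restored"
  end
end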

Related

PostgreSQL Query To Create A Directory

Files are being written to a directory using the COPY query:
Copy (SELECT * FROM animals) To '/var/lib/postgresql/data/backups/2020-01-01/animals.sql' With CSV DELIMITER ',';
However if the directory 2020-01-01 does not exist, we get the error
could not open file "/var/lib/postgresql/data/backups/2020-01-01/animals.sql" for writing: No such file or directory
The PostgreSQL server is running inside a Docker container with the volume mapping /mnt/backups:/var/lib/postgresql/data/backups
The Copy query is being sent from a Node.js app outside of the Docker container.
The mapped host directory /mnt/backups was created by Docker Compose and is owned by root, so the Node.js app sending the COPY query is unable to create the missing directories due to insufficient permissions.
The backup file is meant to be transferred out of the Docker container to the Docker host.
Question: Is it possible to use an SQL query to ask PostgreSQL 11.2 to create a directory if it does not exist? If not, how would you recommend the directory creation be done?
Using Node.js 12.14.1 on an Ubuntu 18.04 host; PostgreSQL 11.2 inside the container; Docker 19.03.5.
An easy way to solve this is to create the file directly on the client machine. With COPY ... TO STDOUT you can redirect the query output to the client's standard output, which you can capture and save to a file. For instance, using psql on the client machine:
$ psql -U your_user -d your_db -c "COPY (SELECT * FROM animals) TO STDOUT WITH CSV DELIMITER ','" > file.csv
Creating the output directory first, in case it does not exist:
$ mkdir -p /mnt/backups/2020-01/ && psql -U your_user -d your_db -c "COPY (SELECT * FROM animals) TO STDOUT WITH CSV DELIMITER ','" > /mnt/backups/2020-01/file.csv
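The same COPY ... TO STDOUT trick works from application code as well. The question's client is Node.js; as a sketch of the idea in Ruby (to match the rest of this thread) using the pg gem, with connection details and paths as placeholders:

require "pg"
require "fileutils"

conn = PG.connect(dbname: "your_db", user: "your_user")
FileUtils.mkdir_p("/mnt/backups/2020-01")   # directory is created client-side, no server permissions needed
File.open("/mnt/backups/2020-01/file.csv", "w") do |f|
  conn.copy_data("COPY (SELECT * FROM animals) TO STDOUT WITH CSV DELIMITER ','") do
    while (row = conn.get_copy_data)   # each call returns one CSV row; nil at end of data
      f.write(row)
    end
  end
end

The Node.js equivalent would stream the COPY output through its driver in the same way.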
On a side note: try to avoid exporting files onto the database server. Although it is possible, I consider it a bad practice: you end up either writing files into the postgres system directories or giving the postgres user permission to write somewhere else, neither of which you should be comfortable with. Export data directly to the client, either using COPY as I mentioned or following the advice from @Schwern. Good luck!
Postgres has its own backup and restore utilities which are likely to be a better choice than rolling your own.
When used with one of the archive file formats and combined with pg_restore, pg_dump provides a flexible archival and transfer mechanism. pg_dump can be used to backup an entire database, then pg_restore can be used to examine the archive and/or select which parts of the database are to be restored. The most flexible output file formats are the “custom” format (-Fc) and the “directory” format (-Fd). They allow for selection and reordering of all archived items, support parallel restoration, and are compressed by default. The “directory” format is the only format that supports parallel dumps.
A simple backup rotation script might look like this:
#!/bin/sh
table='animals'
url='postgres://username@host:port/database_name'
date=$(date -Idate)
file="/path/to/your/backups/$date/$table.sql"
mkdir -p "$(dirname "$file")"
pg_dump "$url" -w -Fc --table="$table" -f "$file"
To avoid hard-coding the database password, -w tells pg_dump not to prompt for a password and to look for a password file instead. Alternatively, you can use any of the many other Postgres authentication options.
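For reference, the password file is ~/.pgpass by default, with one entry per line in the form

hostname:port:database:username:password

and libpq will ignore the file unless its permissions are restricted (chmod 600 ~/.pgpass).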

T-SQL - Scripting variable not defined

I'm new to T-SQL and I'm trying to back up my databases (using SQL Server 2008).
When I run the script via sqlcmd -i inputfile, I get this error message:
'DATE' Scripting variable not defined.
The problem is I have a line like this:
...TO DISK = "FileName_$(ESCAPE_NONE(DATE)).BAK" ...
Having the date in the filename prevents the script from overwriting my old backups.
If I run it in Management Studio it works, but if I run it from the command line with sqlcmd -i, it doesn't.
EDIT:
I looked at the job history and I saw this error message:
"For SQL Server 2005 SP1 or later, you must use the appropriate ESCAPE_xxx
macro to update job steps containing tokens before the job can run"
I don't quite understand what that means. I've already used $(ESCAPE_NONE(DATE)), so what's wrong?
Old question, I know, but this is one of the first search results, and the answer isn't particularly easy to find for anyone else with the same problem.
Including the -x switch, which disables scripting variable substitution, fixed the problem for me:
sqlcmd -x -i inputfile
If you're trying to back up your SQL Server databases and append the date to the filenames using sqlcmd, there's an easy thing you can try.
First, create the stored procedure called sp_BackupDatabases, which you can find here:
http://support.microsoft.com/kb/2019698
You can invoke it from sqlcmd with a command like this:
sqlcmd -U Damieh -P ilovechocolate -S (local) -Q "EXEC sp_BackupDatabases @backupLocation ='C:\MyBackups\', @backupType='F'"
I'm sure you know this already, but just in case: -U is the user, -P is password, -S is server, and -Q is query. You can either backup all of your databases or some of them, there are parameters for that. You can find the stored proc parameters details on the same link I gave you.
The date is appended automatically, and you can tweak the sp's code if you want it in a different place or format. I use this regularly, with great success, on servers that only have SQL Server Express (meaning I can't schedule backups without a .bat file and Task Scheduler).
I apologize if this wasn't the answer you were looking for =). Have a nice day!
I know I'm coming along late on this thread, but you can use the following:
SQLCMD -S YourServer -E -d YourDatabase -i YourScript 2> nul
That will send the standard error output to the bit bucket.

Vacuum full and Reindex Heroku database

I want to perform a full vacuum and reindex on my database for my app hosted on Heroku.
I can't work out how to do it via the heroku command line remotely.
I can do it on my local Mac OS X machine via the commands below in Terminal...
psql database_name
>> VACUUM FULL;
>> \q
reindex database database_name
How can I perform a full vacuum and reindex of all my tables for my app on Heroku?
If possible I would like to do it without exporting the database.
Okay, so it seems Heroku doesn't support this functionality unless you pay up. Looks like I'll have to pull the database, perform the actions, and push it back upstream! Fun times.
You can use the psql interactive terminal with Heroku. From Heroku PostgreSQL:
If you have PostgreSQL installed on your system, you can open a direct psql console to your remote db:
$ heroku pg:psql
Connecting to HEROKU_POSTGRESQL_RED... done
psql (9.1.3, server 9.1.3)
SSL connection (cipher: DHE-RSA-AES256-SHA, bits: 256)
Type "help" for help.
rd2lk8ev3jt5j50=>
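Once connected like that, you can run VACUUM FULL; and REINDEX DATABASE your_database; directly at the prompt. If the app is a Rails app like the others in this thread, another option is to wrap the maintenance in a rake task and run it with heroku run; a sketch, with a made-up task name, assuming your database user has the required privileges:

# lib/tasks/db_maintenance.rake (hypothetical; run with: heroku run rake db:maintain)
namespace :db do
  desc "VACUUM FULL and REINDEX the current database"
  task :maintain => :environment do
    conn = ActiveRecord::Base.connection
    conn.execute("VACUUM FULL")
    conn.execute("REINDEX DATABASE #{conn.current_database}")
  end
end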
You can also pass in the parameters at the psql command line, or from a batch file. The first statements gather the necessary details for connecting to your database.
The final prompt asks for the constraint values, which will be used in the WHERE column IN() clause. Remember to single-quote strings and separate values with commas:
@echo off
echo "Test for Passing Params to PGSQL"
SET server=localhost
SET /P server="Server [%server%]: "
SET database=amedatamodel
SET /P database="Database [%database%]: "
SET port=5432
SET /P port="Port [%port%]: "
SET username=postgres
SET /P username="Username [%username%]: "
"C:\Program Files\PostgreSQL\9.0\bin\psql.exe" -h %server% -U %username% -d %database% -p %port% -e -v -f cleanUp.sql
Now in your SQL code file, add the clean-up SQL, VACUUM FULL (note the spelling: VACUUM, not "vaccuum"). Save this as cleanUp.sql:
VACUUM FULL;
In Windows, save the whole thing as a DOS batch file (.bat), save cleanUp.sql in the same directory, and launch the batch file. Thanks to Dave Page of EnterpriseDB for the original prompted script.
Also, Norto, check out my other posting if you want to add parameters to your script that can be evaluated in the SQL. Please vote it up.

Rails3: How to execute Rails commands (like "rails generate") with shell script

I need to be able to generate a model (and later a migration) by executing a Linux shell script.
The script is located directly in the app folder and looks like this:
#!/bin/bash
cd /home/<my_user_profile>/Websites/<my_app_name>
rails g model my_model name:string accepted:boolean [etc...]
The problem is: When I execute the script, the model does not get created. Any ideas why?
Try
exec "rails g model my_model name:string accepted:boolean"
To make sure it's executing in the same context your shell is in, remove the shebang, so you avoid starting up another bash that may or may not be the same as your current shell.
If you're using rvm or similar, you'd need to either (a) have a default set, (b) specify the version/gemset, or (c) rely on rvm-like cd fudgery.
Otherwise, it should work just fine; it does for me (without the shebang, so it uses whatever rvm environment I'm currently in).
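If the shell context keeps fighting you, another option is to invoke the generator from Ruby itself via rails runner, so it runs inside the already-booted app environment. A sketch, with a hypothetical file name:

# generate_model.rb (hypothetical; run from the app root with: rails runner generate_model.rb)
require "rails/generators"
Rails::Generators.invoke("model", %w[my_model name:string accepted:boolean])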

rake aborted! ERROR: must be owner of database

I am working through Michael Hartl's excellent tutorial, but when trying to prepare the test database with the command:
bundle exec rake db:test:prepare
I get this error message:
ERROR: must be owner of database sample_app_test...
which I never got when using the development database, because I had created the following database role for my Rails app:
CREATE ROLE demo_app WITH CREATEDB LOGIN
(this is using Postgresql)
Does anyone understand why this is failing in the test environment?
TIA...
Did you check the ownership of the test DB? Try running the \l command in the psql console and check the ownerships. You can also try the following query:
ALTER DATABASE sample_app_test OWNER TO demo_app;
First post, writing this down for posterity. I was having the same problem but was able to fix it. You just have to make sure you are signed in as a superuser when you create your databases (or the one that is throwing the error).
I was logging into psql with this command:
sudo sudo -u postgres psql
And created my databases. This is bad. You want to log in with these superuser credentials:
sudo su - postgres
And then, once you're logged in as postgres:
psql
Then create your databases. You can drop your old databases with the command
DROP DATABASE "database_to_drop";
Recreate them and you should be good to go!