mysqldump blocking Ruby on Rails

This is a trickier problem than it sounds. I run Ruby/MySQL on many servers and use mysqldump without any problems. However, I'm using an Ubuntu setup here, and it appears to behave quite differently from Fedora and RHEL. When I back up the production server using mysqldump, it causes Rails to become inaccessible. Apache is still able to serve images, and CPU/memory usage is low, so it appears to be purely a contention issue between RoR and MySQL. I am using InnoDB, which has allowed mysqldump to back up this exact same database on a Fedora server without any downtime. However, that server was running Ruby 1.8.7 and Rails 2.3.
Here are the complete server specs:
Ubuntu 10
Rails 3
RVM
Ruby 1.9.2
Passenger
Apache
MySQL
Additional "clues":
I can connect fine to the production database and access records using mysql client
I can use Rails console to load the production environment and query tables using ActiveRecord
I cannot access the production webserver through Apache/Passenger, nor can I access it when I run a production instance with Webrick (through 'rails s -e production')
Any thoughts as to why mysqldump would be blocking Rails (and only Rails)?

So this thing has an answer (from the comments above):
please use mysqldump with the --single-transaction option – Neo Apr 22 at
14:59
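For an InnoDB database, the fix in that comment amounts to running the dump inside a single consistent-read transaction instead of holding table locks for its duration, which is what lets Rails keep serving requests. A typical invocation (the user and database names are placeholders):

```shell
# --single-transaction takes a consistent snapshot for the whole dump
# (InnoDB only), so other connections are not blocked by table locks.
# --quick streams rows instead of buffering whole tables in memory.
mysqldump --single-transaction --quick -u backup_user -p production_db > production_db.sql
```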

Related

How do I create a PostgreSQL database locally with DataGrip

I'm getting started with DataGrip and I'm stuck before doing anything. The first thing I did was create a PostgreSQL Data Source with default parameters.
Now I opened a console, and tried to run a script:
drop table table1;
The console then prints:
Connection to localhost:5432 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.
What can I do about that, so that I can create my project in e.g. D:\Database\ instead of connecting to other hosts?
Sounds like you're not actually running PostgreSQL locally. Is it installed and running? PostgreSQL is a client/server database system, so it requires the server process to be running. It is not like SQLite or HSQLDB, which run self-contained within your application (or within DataGrip, in this case). Please see here for a tutorial on how to get started with Postgres on a Mac. There are similar tutorials for other operating systems.
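Since the error says nothing is listening on localhost:5432, the first step is to confirm the server process is installed and started. A sketch for a few common setups (service names and data-directory paths are typical, not universal):

```shell
# Is anything answering on the default PostgreSQL port?
pg_isready -h localhost -p 5432

# If not, start the server with whichever matches your platform:
sudo service postgresql start              # Debian/Ubuntu
brew services start postgresql             # macOS with Homebrew
pg_ctl -D /usr/local/var/postgres start    # pg_ctl directly; data dir varies
```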

Wrap CLI tool around interactive CLI (mysql) process

I've made a simple CLI tool to manage MySQL privileges. It's written in PHP and uses a PHP library to talk to MySQL. The first thing the tool needs is MySQL credentials, because the PHP library needs them to talk to the MySQL daemon (probably on the same server). That works, but I need it to work without credentials. On some servers, I can only run mysql via sudo mysql and then be logged in as the MySQL root user. That's fine too, but my tool can't use that.
Can I wrap my tool around the mysql client, instead of initiating a mysql client inside the tool?
If I can do that, the tool works with any auth method:
sudo mysql | tool
mysql --login-path=~/.mysql-creds | tool
mysql -ufoo -pbar | tool
mysql -hdb.server.com -upublic | tool
But how? How do I get the tool talking to the process around/outside it?
I could use PHP's proc_* functions to start the process in PHP and read/write from there, but I'm interested in wrapping any tool around any process, without knowing the process's name/location. (The tool shouldn't have to know mysql -ufoo -pbar or sudo mysql.)
These don't work:
mysql | tool
mysql < tool
tool < mysql
Or without PHP. How would Python do this? Or even Bash?
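In Bash, this kind of bidirectional wrapping is what coprocesses are for: the wrapper owns both the child's stdin and its stdout, which a one-way pipeline like mysql | tool cannot give you. A runnable sketch, with cat standing in for the real client so the example works anywhere; in real use the coproc line would launch e.g. sudo mysql --batch instead:

```shell
#!/usr/bin/env bash
# Drive an interactive child process from a wrapper using a bash coprocess.
# `cat` is a stand-in for the real client, e.g.: coproc DB { sudo mysql --batch; }
coproc DB { cat; }

# Send a "query" to the child's stdin...
echo "SELECT 1;" >&"${DB[1]}"

# ...and read its reply back from the child's stdout.
read -r reply <&"${DB[0]}"
echo "child replied: $reply"

# Close the child's stdin so it sees EOF and exits cleanly.
eval "exec ${DB[1]}>&-"
wait "$DB_PID"
```

The point of the coprocess over mysql | tool is that ${DB[0]} and ${DB[1]} are separate file descriptors, so the wrapper can interleave writes and reads with the child for as long as it likes. Python's subprocess.Popen with stdin/stdout pipes is the equivalent approach there.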

How to set the application database to be used by Resque?

I have a Rails application running with Resque, but I'm facing a problem when running it on production environment.
The application runs using the development database. How can I set the database used by Resque according to the application environment?
Thanks!
When starting the Rails server, you should also provide the environment:
rails server -e production
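If jobs are processed by a separate Resque worker process, that process needs the environment as well, since it loads config/database.yml on its own (the rake task and QUEUE variable below are the standard ones from the resque gem):

```shell
# App server in production...
rails server -e production

# ...and the Resque worker with the same environment, so it reads
# the production section of config/database.yml:
RAILS_ENV=production QUEUE='*' bundle exec rake resque:work
```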

Engine Yard Push/Load Database

I am thinking of deploying my Rails app to Engine Yard. I have a MySQL database with all of the data for the site. When I deploy to Engine Yard Cloud, will I be able to "push" this database to the server somehow?
Something like this (?):
https://blog.heroku.com/archives/2009/3/18/push_and_pull_databases_to_and_from_heroku/
Or can I somehow put the mysql database in the git repo so it is pushed to the server?
See: https://support.cloud.engineyard.com/entries/20996676-Restore-or-load-a-database
Use scp to copy your database dump to the server over SSH.
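In practice, "pushing" a MySQL database means dumping it locally, copying the dump up over SSH, and loading it on the server. A sketch of the three steps; the hostnames, users, paths, and database names are all placeholders:

```shell
# 1. Dump the local database to a file.
mysqldump -u root -p myapp_production > myapp.sql

# 2. Copy the dump to the instance over SSH.
scp myapp.sql deploy@your-instance.example.com:/tmp/

# 3. On the server, load it into the target database.
ssh deploy@your-instance.example.com 'mysql -u deploy -p myapp_production < /tmp/myapp.sql'
```

Keeping the database out of the git repo and restoring from a dump like this is the usual approach; a binary data directory does not belong in version control.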

Rails 3 Postgres working on Heroku but no longer locally PGError could not connect to server

I'm using postgres 8.4 and Rails 3.0.9
A couple of days ago I was using this combo fine locally and on Heroku. Today I started my Rails server locally and got two new messages: a deprecation warning and a PGError.
The warning was that this:
config.action_view.debug_rjs
will no longer be supported in Rails 3.1. I checked and I'm still using 3.0.9 just to be sure.
That was easily taken care of. However the PGError is leaving me scratching my head.
could not connect to server: Connection refused (0x0000274D/10061)
Is the server running on host "???" and accepting
TCP/IP connections on port 5432?
I tried a bundle install and a bundle update. The "is the server running on host '???'" part seems strange; what does that mean?
Everything is working fine on Heroku. I checked all my init and configure files and they all look the same as they were a few days ago. Have any updates happened that might have caused this?
Have you checked whether the PostgreSQL server is running on the specified host? Is the database you're trying to connect to running on localhost or on an external server?
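A quick way to test that diagnosis is to probe the port from the error message directly, then check whether the server process exists at all; host and port below are the defaults Rails reported:

```shell
# Does anything answer on localhost:5432?
pg_isready -h localhost -p 5432

# Is a postgres server process running at all?
ps aux | grep '[p]ostgres'
```

If pg_isready reports no response and no process is running, the local server was stopped (or an update changed how it starts), which would explain why Heroku still works while local connections are refused.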