Engine Yard Push/Load Database - ruby-on-rails-3

I am thinking of deploying my Rails app to Engine Yard. I have a MySQL database with all of the data for the site. When I deploy to Engine Yard Cloud, will I be able to "push" this database to the server somehow?
Something like this (?):
https://blog.heroku.com/archives/2009/3/18/push_and_pull_databases_to_and_from_heroku/
Or can I somehow put the MySQL database in the Git repo so it is pushed to the server?

See: https://support.cloud.engineyard.com/entries/20996676-Restore-or-load-a-database
Use scp to copy a dump of your database to the server over SSH.
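A minimal sketch of that workflow, assuming a local MySQL database and SSH access to the instance (the hostnames, usernames and database names below are placeholders, not Engine Yard specifics):

# dump the local database to a SQL file
mysqldump -u root -p mydb > mydb.sql
# copy the dump to the server over SSH
scp mydb.sql deploy@your-instance.example.com:/tmp/
# then, on the server, load it into the production database
mysql -u deploy -p mydb_production < /tmp/mydb.sql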

Related

Deploy an R Shiny application connected to a SQL database

How can I deploy my Shiny application, developed with RStudio and connected to a SQL database? I created the MySQL database on a local server (WampServer), and the application connects to it perfectly, but I have to deploy it together with the database.
Consider deploying your SQL database in the same place you deploy your application. Otherwise, you will need to look into options for exposing your database in such a way that your app can access it.
For example, you could deploy both the database and the app on an AWS EC2 instance. Alternatively, you could deploy the database to an RDS instance and connect to it remotely from your app on EC2. Those examples are only pertinent to Amazon resources, but the logic applies regardless of your platform.
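As a concrete sketch of the RDS variant (the endpoint, user and database names are placeholders, not part of the original answer), you can verify connectivity from the EC2 host with the MySQL client before pointing the app at it:

# succeeds only if the RDS security group allows this EC2 instance
mysql -h mydb.abc123.us-east-1.rds.amazonaws.com -u appuser -p appdb -e "SELECT 1"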

Replication for local and global databases

I have a server running a large database. Many computers and laptops will connect to my server and download this large database to run locally. They will create, modify, or delete data in their local databases. When everything is done, these local databases need to be synchronized back to the main database on the server.
I am using PostgreSQL, by the way.
The solution I have come up with uses replication, and rubyrep is my choice. However, rubyrep requires every local database to have a global IP, and I cannot find a way for my server to connect to those local databases. How can I solve that problem?
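One general workaround when the local machines have no global IP (a standard SSH technique, not something suggested in the original question) is a reverse tunnel: each laptop opens a connection to the server that exposes its local PostgreSQL port there, so server-side replication can reach it:

# run on each laptop: the server's localhost:54321 now forwards to this machine's PostgreSQL on 5432
ssh -N -R 54321:localhost:5432 syncuser@main-server.example.com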

Can I use NAS to access database on a network?

I want to ask if I can use a NAS to store the main database (for example, Firebird or MySQL) and access this database from the network? If so, how do I install the database server software?
Sure!
What's your NAS model?
For example, on my Synology I can enable the web server and install MySQL + phpMyAdmin in just a few clicks.
I just need to access the admin panel and install the software through the Package Center.
I'm sure it's possible to do the same with QNAP.
A "NAS" is simply Network Attached Storage. A personal example is the Western Digital My Cloud. You can use it to host the data files themselves, but no software can be installed.
There are networked solutions that are basically mini servers. They have slimmed down versions of a Linux build that run web servers, database servers and the like. I do not have any examples to provide since I do not have the need for one, but I know they are out there from prior research.
To learn about what a NAS is, you can check out the Wikipedia article.
A NAS is basically just storage; it doesn't really run a useful OS. You need a server to host MySQL or a similar DB. You can install MySQL on a Windows, Mac or Linux OS; the DB file(s) would reside on that machine, and the MySQL service would respond to requests appropriately. Here are some links on installing MySQL:
Windows - http://www.iis.net/learn/application-frameworks/install-and-configure-php-on-iis/install-and-configure-mysql-for-php-applications-on-iis-7-and-above
Linux - https://www.digitalocean.com/community/tutorials/how-to-install-linux-apache-mysql-php-lamp-stack-on-ubuntu
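For the Linux route, a minimal sketch on Ubuntu (the config file path varies by MySQL version, and user setup and hardening are omitted):

sudo apt-get update
sudo apt-get install mysql-server
# allow connections from other machines on the network (MySQL binds to 127.0.0.1 by default)
sudo sed -i 's/^bind-address.*/bind-address = 0.0.0.0/' /etc/mysql/my.cnf
sudo service mysql restart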

Looking for thoughts, help in migrating database from Postgres (under Heroku) to SQL Azure

Would the approach to migrate from Postgres to MySQL also work for a database migration from Postgres (under Heroku) to SQL Azure? If not, can someone help me understand what strategies to adopt? I'm considering the following:
Build a fresh database schema under SQL Azure and change database.yml to the SQL Server adapter (not sure how).
Migrate the complete schema (tables and entity relationships; I don't need the data), and change database.yml to the SQL Server adapter (not sure how).
If neither works, figure out a way to sync data between Postgres and SQL Azure.
Thoughts, ideas greatly appreciated.
If you have a Rails application and are using ActiveRecord with database migrations, this is fairly simple. Follow this guide (http://blogs.msdn.com/b/silverlining/archive/2011/09/14/using-active-record-with-sql-azure.aspx) to get your app working against SQL Azure, upload your code changes to Heroku, run heroku run bundle exec rake db:migrate, and you are done.
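For the database.yml piece the question was unsure about, here is a sketch of what the SQL Server adapter configuration might look like (option names follow the activerecord-sqlserver-adapter/TinyTDS documentation of that era; the server name and credentials are placeholders):

# write a production block pointing at SQL Azure
cat > config/database.yml <<'EOF'
production:
  adapter: sqlserver
  host: yourserver.database.windows.net
  port: 1433
  database: yourdb
  username: youruser@yourserver
  password: yourpassword
  azure: true
EOF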
If you were not using database migrations, your task will be harder. I suggest using PGBackups to export the data from Heroku Postgres (https://devcenter.heroku.com/articles/heroku-postgres-import-export) and finding a way to load that SQL into SQL Azure.
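A sketch of that export step using the Heroku tooling described in the linked article (the app name is a placeholder):

# capture a backup and download it
heroku pgbackups:capture --app your-app
curl -o latest.dump "$(heroku pgbackups:url --app your-app)"
# turn the custom-format archive into plain SQL that can be adapted for SQL Azure
pg_restore --no-owner --no-acl -f latest.sql latest.dump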

SQL Azure Backups

Has anyone come up with a good way to do backups of SQL Azure databases?
The two approaches seem to be SSIS or BCP. SSIS looks like it requires a local install of MS SQL 2008, which I don't have. BCP looks a bit more promising, but I haven't managed to find any examples of using it with SQL Azure yet.
At PDC09 they announced SQL Azure Data Sync, an early preview designed to let you keep your local SQL Server in sync with SQL Azure.
As for database backups for maintenance and so on, that is of course part of the service you pay for with Azure, which Microsoft manages.
The Sync Framework team have a blog covering a number of issues around data synchronisation between Azure and a local DB - http://blogs.msdn.com/sync/default.aspx
My personal favorite solution is to use Azure Mobile Services to do a nightly backup and export from SQL Azure to a .bacpac file hosted in Azure Storage. This solution requires no third-party tool, no bcp or PowerShell; it is 100% cloud and doesn't need a locally hosted SQL Server instance to download, copy or back up anything.
There are about 8 different steps, but they're all pretty easy: http://geekswithblogs.net/BenBarreth/archive/2013/04/15/how-to-create-a-nightly-backup-of-your-sql-azure.aspx
SQL Azure now offers automated and schedulable backups to bacpac files in Azure blob storage
http://blogs.msdn.com/b/sql-bi-sap-cloud-crm_all_in_one_place/archive/2013/07/24/sql-azure-automated-database-export.aspx
We use this to make nightly backups and have the tool keep the most recent 14 backups.
Enzo Backup for SQL Azure is available (full release expected October 1st): http://www.bluesyntax.net/backup.aspx
You can back up a database with transactional consistency and restore it at a later time; it stores its backups in the cloud or on-premises, and it includes a scheduling capability.
I spent some time with BCP and got it working acceptably. It's a bit annoying to have to do the backup/restore table by table, but I'll script it, and that will do until Microsoft brings in a proper SQL Azure backup feature, supposedly due in the first half of 2010.
Backup:
bcp mydb.dbo.customers out customers.dat -n -U user@azureserver -P pass -S tcp:azureserver.database.windows.net
Restore:
bcp mydb.dbo.customers in customers.dat -n -U user@azureserver -P pass -S tcp:azureserver.database.windows.net
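A sketch of what that per-table script might look like (the table list is a placeholder; it assumes bcp is on the PATH):

# export each table in turn with the same flags as above
for t in customers orders products; do
  bcp "mydb.dbo.$t" out "$t.dat" -n -U user@azureserver -P pass -S tcp:azureserver.database.windows.net
done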
We set up a simple solution using Red Gate tools, but it too requires a local SQL instance: http://mooneyblog.mmdbsolutions.com/index.php/2011/01/11/simple-database-backups-with-sql-azure
I'm using www.sqlscripter.com to generate insert/update data scripts (to sync my local DB). Not free (shareware), but worth a try.
You can now use SQL Azure Database Copy to perform backups of your database. More details on this can be found here.
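The copy itself is a single T-SQL statement run against the master database; a sketch using sqlcmd (the server name and credentials are placeholders):

# creates a transactionally consistent copy named mydb_copy
sqlcmd -S tcp:azureserver.database.windows.net -U user@azureserver -P pass -d master -Q "CREATE DATABASE mydb_copy AS COPY OF mydb"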
It seems that Azure Labs has something to offer now:
Azure Labs
related article