I'm trying to install my Ruby on Rails application on a new server.
In the db folder there is my schema.rb file.
But my problem is: how do I run the schema.rb file so that its SQL statements are executed?
You can do this:
rake db:create to create your DB (you do this only once)
rake db:migrate to migrate your DB (do this the first time and every time you want to apply changes, like removing a column)
And
rake db:seed to populate your DB, if you have something in your seeds.rb file
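Put together, the sequence on a fresh server looks like this (a minimal sketch, run from the application root):
rake db:create   # once, to create the database
rake db:migrate  # build the schema; rerun whenever migrations change
rake db:seed     # only if db/seeds.rb contains seed data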
The schema file doesn't populate the data; rather, it shows the structure of the database. You'll have to run:
rake db:create
rake db:migrate
on the new server and then create a dump of the data you wish to import to the new database. Then import the data. Both of these processes can differ widely depending on what kind of database you're using.
For MySQL:
Export and Import all MySQL databases at one time
For PostgreSQL:
import sql dump into postgresql database
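As hedged examples, assuming default hosts and a database named myapp_production:
# MySQL
mysqldump -u root -p myapp_production > dump.sql
mysql -u root -p myapp_production < dump.sql
# PostgreSQL
pg_dump -U postgres myapp_production > dump.sql
psql -U postgres -d myapp_production -f dump.sql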
Related
I have a dev version and a production version running in Django.
I recently started populating it with a lot of data and found that Django's loaddata tries to load everything into memory before adding it to the db, and my files will be too big for that.
What is the proper way to push my data from my dev machine to production?
I did...
pg_dump -U user -W db > ./filename.sql
and then on the production server I did...
psql dbname < filename.sql
It seems like it worked, all the data is there, but it came up with some errors such as
relation xxx already exists
constraint xxx for relation xxx already exists
and there were quite a few of them, but like I said everything appears to be there. Is this the right way to do it?
Edit: On the production machine I have a database with existing information, and I don't want to truncate the tables before the import.
This is the script that I use:
pg_dump -d DATABASE_NAME -U postgres --format plain --inserts > /FILE.sql
Edit: Since you say in the comments that you don't want to truncate the tables before importing, you can't do this type of import into your production database. I suggest emptying your production database before importing the dev database dump.
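A minimal sketch of that approach, reusing the DATABASE_NAME and /FILE.sql from above (dropdb destroys the database, so take a backup first):
dropdb -U postgres DATABASE_NAME
createdb -U postgres DATABASE_NAME
psql -U postgres -d DATABASE_NAME -f /FILE.sql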
I've looked over several other questions on here, and they're vaguely similar, but not exactly what I'm looking for.
What I'm trying to do is import/"convert" a *.sql file which contains 8 tables, each of which contains roughly 24 columns. This file is actually a fairly flat file, seeing as the only queries that worked previously had to do with associating a shared :id between tables (so SELECT * FROM table1, table2 WHERE id = '1' would pull all results, which was fine at the time).
I've searched around, but can't find a clever way to do this, so I'm asking you Rails pros for help now.
I'm assuming what you want is basically to convert your SQL file into a Rails database schema file without having to go through and do this yourself manually.
One quick way to do this would be to manually execute the SQL file, perhaps by logging into your database and loading the file that way, or by doing something like what was done in this question:
ActiveRecord::Base.connection.execute(IO.read("path/to/file"))
Once you have the schema that was defined in your .sql file actually loaded into your database, you will want to follow the steps outlined in this question:
First run rake db:schema:dump, which will generate a db/schema.rb file based on the current state of the database.
From here, you can create a db/migrate/001_original_schema.rb migration that references the schema.rb file as follows:
class OriginalDatabaseMigration < ActiveRecord::Migration
  def self.up
    # contents of schema.rb here
  end

  def self.down
    # drop all the tables
  end
end
If I understood your question right, you need to populate your db from a .sql file. I am doing it this way:
connection = ActiveRecord::Base.connection
sql = File.read('db/some_sql_file.sql')
statements = sql.split(/;$/)
statements.pop # drop the empty trailing fragment after the last semicolon

ActiveRecord::Base.transaction do
  statements.each do |statement|
    connection.execute(statement)
  end
end
Put your SQL file in the db folder.
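For convenience, the same snippet can be wrapped in a custom rake task; this is just a sketch, and the file path and task name are arbitrary:
# lib/tasks/import_sql.rake
namespace :db do
  desc 'Load db/some_sql_file.sql statement by statement'
  task :import_sql => :environment do
    connection = ActiveRecord::Base.connection
    statements = File.read('db/some_sql_file.sql').split(/;$/)
    ActiveRecord::Base.transaction do
      statements.each do |statement|
        connection.execute(statement) unless statement.strip.empty?
      end
    end
  end
end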
One way I was able to do this was using rails dbconsole, which opens the database's own console (the .import meta-command below is SQLite's):
.import FILE TABLE     Import data from FILE into TABLE
And essentially: .import ./path/to/file TABLE_NAME
Works like a champ.
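For example, in a SQLite-backed app (a sketch; the file and table names are hypothetical, and .mode csv tells SQLite how the file is delimited):
$ rails dbconsole
sqlite> .mode csv
sqlite> .import ./db/users.csv users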
I faced the same problem. I just created a script that parsed
all the SQL statements, adding 'execute("' at the beginning and '")' at the end of each line.
Then I created a new migration as usual and pasted all the output into the migration's up method. That works for me.
Be sure to avoid any comments in the SQL file so the parsing will be easier.
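A minimal sketch of such a script, assuming a hypothetical db/dump.sql with one statement per line (String#inspect handles the quoting):
File.readlines('db/dump.sql').each do |line|
  line = line.strip
  next if line.empty?
  puts "execute(#{line.inspect})"
end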
How to export a PostgreSQL db into SQL that can be executed in another pgAdmin?
Exporting as a backup file doesn't work when there's a difference in version.
Exporting as an SQL file does not execute when run on a different pgAdmin.
I tried exporting a DB with pgAdmin III, but when I tried to execute the SQL in another pgAdmin it threw errors, and when I tried to "restore" a backup file, it said there's a difference in version so it can't do the import/restore.
So is there a "safe" way to export a DB into standard SQL that can be executed plainly in the pgAdmin SQL editor, regardless of which version it is?
Don't try to use PgAdmin-III for this. Use pg_dump and pg_restore directly if possible.
Use the version of pg_dump from the destination server to dump the origin server. So if you're going from (say) 8.4 to 9.2, you'd use 9.2's pg_dump to create a dump. If you create a -Fc custom format dump (recommended) you can use pg_restore to apply it to the new database server. If you made a regular SQL dump you can apply it with psql.
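A hedged sketch of that workflow, with hypothetical host names:
# Use the destination's (9.2) pg_dump against the old (8.4) server
pg_dump -Fc -h old-server -U postgres mydb > mydb.dump
# Create the database on the new server, then restore with pg_restore
createdb -h new-server -U postgres mydb
pg_restore -h new-server -U postgres -d mydb mydb.dump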
See the manual on upgrading your PostgreSQL cluster.
Now, if you're trying to downgrade, that's a whole separate mess.
You'll have a hard time creating an SQL dump that'll work in any version of PostgreSQL. Say you created a VIEW that uses a WITH query. That won't work when restored to PostgreSQL 8.3 because it didn't support WITH. There are tons of other examples. If you must support old PostgreSQL versions, do your development on the oldest version you still support and then export dumps of it for newer versions to load. You cannot sanely develop on a new version and export for old versions; it won't work well, if at all.
More troubling, developing on an old version won't always give you code that works on the new version either. Occasionally new keywords are added when support for new specification features is introduced. Sometimes issues are fixed in ways that affect user code. For example, if you were to develop on the (ancient and unsupported) 8.2, you'd have lots of problems with implicit casts to text on 8.3 and above.
Your best bet is to test on all supported versions. Consider setting up automated testing using something like Jenkins CI. Yes, that's a pain, but it's the price for software that improves over time. If Pg maintained perfect backward and forward compatibility it'd never improve.
Export/Import with pg_dump and psql
1. Set PGPASSWORD
export PGPASSWORD='123123123';
2. Export the DB with pg_dump
pg_dump -h <<host>> -U <<username>> <<dbname>> > /opt/db.out
/opt/db.out is the dump path; you can specify your own.
3. Then set PGPASSWORD again for your other host. If the host or the password is the same, this is not required.
4. Import the DB on your other host
psql -h <<host>> -U <<username>> -d <<dbname>> -f /opt/db.out
If the username is different, find and replace it with your local username in the db.out file, making sure only the username is replaced and not data.
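A hedged way to do that replacement, assuming the old role is devuser and the new one produser; plain dumps typically reference the role in OWNER TO lines, so targeting those avoids touching data:
sed -i 's/OWNER TO devuser/OWNER TO produser/g' /opt/db.out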
If you still want to use PGAdmin, see the procedure below.
Export DB with PGAdmin:
Select DB and click Export.
File Options
Name the DB file for your local directory
Select Format - Plain
Ignore Dump Options #1
Dump Options #2
Check Use Insert Commands
Objects
Uncheck any tables you don't want included
Import DB with PGAdmin:
Create New DB.
With the DB still selected, click Menu->Plugins->PSQL Console
Type following command to import DB
\i /path/to/db.sql
If you want to export schema and data separately:
Export Schema
File Options
Name the schema file in your local directory
Select Format - Plain
Dump Options #1
Check Only Schema
Check Blobs (By default checked)
Export Data
File Options
Name the data file in your local directory
Select Format - Plain
Dump Options #1
Check Only Data
Check Blobs (By default checked)
Dump Options #2
Check Use Insert Commands
Check Verbose messages (By default checked)
Note: Export/Import takes time depending on DB size, and going through PGAdmin adds some more time on top.
How do I go up to a specific database version from an empty database in rails?
In my case, I reset the whole database recently, so all tables have already been dropped.
my migration files are as follows:
20111127152636_create_users.rb
20120110100458_create_cars.rb
20120131003026_add_birth_date_to_users.rb
What command do I have to call to get to the second-latest version, which is 20120110100458?
I have tried "rake db:migrate:up version=20120110100458".
Unfortunately, it didn't get me the result I expected; no tables were created at all.
If you want to run the first two migrations, use
rake db:migrate VERSION=20120110100458
(it will run 20111127152636_create_users.rb and 20120110100458_create_cars.rb). Note the uppercase VERSION: rake reads the uppercase environment variable, so the lowercase version= in your attempt was not picked up.
Say for example I've got an SQL schema that is ready to go. How would I import it into my Rails app so that I use my prepared database instead of all those funny migrations.
EDIT: You have all misunderstood my question so far. I'm asking: if I had a working database application in, say, PostgreSQL, how would I use this as the basis of my Rails application?
There's no requirement that you use migrations, "funny" or otherwise. Just start creating models from your tables. The Rails authors are smart enough to recognise the need to support "legacy schemas".
Note that if your primary keys aren't called id then you'll need to define primary keys (see p316 of "Agile Web Development With Rails 3rd edition"):
class LegacyBook < ActiveRecord::Base
  self.primary_key = "isbn"
end
Similarly, if your foreign key names do not follow the AR conventional default style, you'll need to explicitly define them in your relationship definitions.
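A hedged sketch of such a definition, assuming a legacy foreign key column named book_isbn instead of the conventional legacy_book_id:
class Chapter < ActiveRecord::Base
  belongs_to :legacy_book, :foreign_key => "book_isbn"
end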
Out of the box, ActiveRecord doesn't support composite primary keys yet: it kind of assumes something more like 5th Normal Form (PK just a sort of arbitrary number with no meaning in the business domain). There is at least one gem, appropriately named composite_primary_keys (gem install in the usual way) but it may not support AR 2.3 yet (I see v2.2.2 when I gem list --remote composite). There's some discussion on Google Groups.
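With that gem installed, a declaration looks something like this (a sketch based on the gem's set_primary_keys API; the model and columns are hypothetical):
class Membership < ActiveRecord::Base
  set_primary_keys :user_id, :group_id
end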
This should do it
def self.up
  execute <<-EOF
    begin;
    SQL HERE
    commit;
  EOF
end
I would dump the current database into a rails schema, and then use that to generate the migration files. This way you can have more control over your database, in a rails fashion.
For this, you should:
setup your database configuration file (config/database.yml)
run rake db:schema:dump, which will create the schema file db/schema.rb
create your migration file/files using the content of your schema.rb
# db/migrate/001_create_database.rb
class CreateDatabase < ActiveRecord::Migration
  def self.up
    # the content of schema.rb
  end

  def self.down
    # drop all the tables
  end
end
This way you can take advantage of migrations later on, when developing your app.
Provided your database tables follow Rails' ActiveRecord naming conventions, you should be good to go. Run the SQL to create the database objects. Then you can generate your Rails models in the normal fashion, but skip the creation of a migration file.
For example:
script/generate model --skip-migration User name:string age:integer
edit your config/database.yml file like so:
development:
  adapter: postgresql
  database: [your legacy database name]
  host: localhost
You may have to include username and password keys as well; I usually don't for postgres. You'll want to set up a test database in postgres as well.
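For example, a hedged sketch with hypothetical credentials and the matching test database:
development:
  adapter: postgresql
  database: legacy_db
  host: localhost
  username: myuser
  password: secret
test:
  adapter: postgresql
  database: legacy_db_test
  host: localhost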
Once you've configured the development database in this way you'll be able to:
> rake db:schema:dump
and create your test database using
> rake db:schema:load RAILS_ENV=test
with the resulting file.