We have a Rails 3 application with a PostgreSQL database (~10 tables) mapped by ActiveRecord. Everything's working fine.
However, we would also like to use:
a MongoDB database to store images (probably with the mongoid gem).
a Neo4j database (probably with the neo4j-rails gem) instead of PostgreSQL for some tables.
Using a database with one Rails ORM is simple, thanks to database.yml. But when there's more than one ORM, how should we proceed? Is there a good way to do this? For instance, ActiveHash (and ActiveYaml) work well with ActiveRecord, so I think it should be possible to get different ORMs working together. Thanks for any tips.
This really depends on the type of ORM. A great way to do this is by using inheritance. For example, you can define multiple databases and adapters in your database.yml file and talk to them using ActiveRecord's establish_connection method.
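For instance, database.yml can hold an extra, arbitrarily named entry alongside the usual environments. A minimal sketch (the entry and database names here are assumptions):
# config/database.yml
production:
  adapter: postgresql
  database: main_production

users_database:
  adapter: postgresql
  database: users_production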
# A typical ActiveRecord class
class Account < ActiveRecord::Base
  ...
end

# A new database connection
class NewConnection < ActiveRecord::Base
  self.abstract_class = true
  establish_connection "users_database"
end

# A new ActiveRecord class using the new connection
class User < NewConnection
  ...
end
The only downside here is that when you are connecting to multiple ActiveRecord databases, migrations can get a little dicey.
Mixing ORMs
Mixing ORMs is easy. For example, with MongoDB (via mongoid), simply don't inherit from ActiveRecord and instead include Mongoid::Document in the model you want to store in Mongo:
class Vehicle
  include Mongoid::Document

  field :type
  field :name

  has_many :drivers
  belongs_to :account
end
ORMs built on top of ActiveModel play very nicely together. For example, with mongoid you should be able to define relations to ActiveRecord models; this means you can not only have multiple databases, but they can easily communicate via ActiveModel.
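And if the automatic relation ever falls short, you can bridge the two by hand. A minimal sketch, assuming an ActiveRecord Account model and a stored account_id field:
class Vehicle
  include Mongoid::Document
  # Store the ActiveRecord id ourselves and look the record up manually.
  field :account_id, :type => Integer

  def account
    @account ||= Account.find(account_id) if account_id
  end

  def account=(record)
    self.account_id = record.id
    @account = record
  end
end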
Well, I had the same problem today using the neo4j gem. I added require 'active_graph/railtie' to my application.rb.
So, when I want to generate a model with ActiveGraph I use rails generate model Mymodel --orm active_graph; with the --orm option you can specify which ORM to use.
Without the --orm option, it will use ActiveRecord by default.
First off, I strongly recommend you do not try to have multiple ORMs in the same app. Inevitably you'll want your Mongoid object to 'relate' to your ActiveRecord object in some way, and there are ways (see below)... but all of them eventually lead to pain.
You're probably doing something wrong if you think you 'need' to do this. Why do you need MongoDB to store images? And if you're using it just as an image store, why would you need Mongoid or some other ORM (or more accurately, ODM)? If you really, really need to add a second data store and a second ORM/ODM, can you spin it off as a separate app and call it as a service from your first one? Think hard about this.
That said, if you really want to go with "polyglot persistence" (not my term), there is a decent gem: https://github.com/jwood/tenacity. It's no longer actively developed, but the maintainer does fix bugs and quickly responds to inquiries and pull requests.
Related
I'm new to rails and trying to understand the relationship between migrations and models. As far as I can tell the migration seems to only affect the datastore hence after I use scaffolding to create a resource, am I then responsible for keeping the model and migrations in sync? Are there any tools to assist in this?
Sorry if this is an obvious question, I'm still working my way through the docs.
All migrations do is modify the database. Rails handles maintaining the sync between the model and the database.
You can have a users table that has id and first_name, and your model class might look like this:
class User < ActiveRecord::Base
end
As you can see, the model class is empty, yet you can still access attributes on instances of that class like this:
user = User.new
user.first_name = "Leo"
user.save!
and it will know what to do with it.
Migrations are just files that allow you to modify the database in incremental steps while keeping a sane versioning on the database schema.
Of course, Rails will complain if you try to call things from your model that don't exist in the database or the ActiveRecord::Base parent class.
user = User.new
user.awesome
#=> undefined method `awesome' for #<User:some_object_id>
As for the migrations, you can have multiple migrations that affect one table. Your job is only to know what attributes you've added to a model. Rails will do the rest for you.
A general rule of thumb is that migrations are best suited for data definition: columns in a table, their types, constraints, etc. So no, you do not need to keep a migration in sync with your data.
If, in the longer run, your data definition itself changes (a new column or a change in a column type), then just add a new migration specifying the same.
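For example, a hypothetical migration adding a column might look like this:
class AddAgeToUsers < ActiveRecord::Migration
  def self.up
    add_column :users, :age, :integer
  end

  def self.down
    remove_column :users, :age
  end
end
After rake db:migrate, Rails records the version in the schema_migrations table, so each migration runs exactly once per database.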
ActiveRecord models are driven, in general, from the database. Any fields defined in the database will (generally) appear automatically as properties in the ActiveRecord model bound to that table.
Altering the model will not change the schema (generally). To alter the model, one would typically define a migration and run it against the database.
Note that nothing stops you defining additional properties on the model with attr_accessor etc, but these will not be persisted by ActiveRecord if there is no column in the schema to which they are bound.
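A minimal sketch of that point, assuming a nickname attribute with no matching column:
class User < ActiveRecord::Base
  # No "nickname" column exists, so this lives only in memory.
  attr_accessor :nickname
end

user = User.new
user.nickname = "Leo"  # works in memory
user.save              # but nickname is never written to the database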
I've accepted Leo's answer since he helped me to understand things better, and if I had just got to the bottom of the page on migrations I might not have needed to ask: http://guides.rubyonrails.org/migrations.html#what-are-schema-files-for
The mentioned annotate_models gem also sounds useful for improving awareness of the current structure of a model class without having to refer to the schema.
If you want to invert the attr_accessible behavior and, instead of giving a whitelist, give a blacklist, you can do so per model like this:
attr_accessible *(attribute_names - %w(blacklisted attributes))
I have a small table in my Rails app that contains static data (user Roles). It shouldn't ever change. I'm wondering if it's possible to lock the table to keep anyone (namely developers) from accidentally changing it in production.
Or should I have not put that data into the database at all? Should it have been hardcoded somewhere to make editing more difficult and, at least, auditable (git blame)?
The right way to do this is with permissions. Change the ownership of the table to another user, and grant the production database user SELECT only.
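A sketch of how that could be scripted in a migration; the role names (admin, app_user) and the table name are assumptions, and the migration must run as a user privileged enough to change ownership:
class LockDownRoles < ActiveRecord::Migration
  def self.up
    execute "ALTER TABLE roles OWNER TO admin"
    execute "REVOKE ALL ON roles FROM app_user"
    execute "GRANT SELECT ON roles TO app_user"
  end

  def self.down
    execute "GRANT ALL ON roles TO app_user"
  end
end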
I would say the right place for these kinds of things is probably the database. That way if they ever need to change, you can change them in the database, and you don't have to redeploy your application (assuming it will notice the change).
You didn't ask this, but your wording brought it to mind so I'll say it anyway: you should never need to explicitly LOCK a table with PostgreSQL. If you think this is something you need to do, you should make sure what you're worried about can actually happen under MVCC and that transactions aren't going to do the right thing for you.
I would probably make use of attr_accessible.
if you write something like:
class Role < ActiveRecord::Base
  attr_accessible # with no arguments, nothing is mass-assignable
end
you could at least prevent mass assignment from the Rails side, but it does not prevent direct modification by developers with database access.
see also this thread: How I can set 'attr_accessible' in order to NOT allow access to ANY of the fields FOR a model using Ruby on Rails?
You can use a trigger to prevent updates to the table (assuming you can't add a new db user).
Or, use a view and ensure all read requests go through it (probably by removing the ActiveRecord class that corresponds to the table).
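A sketch of the trigger approach, again wrapped in a migration; the function, trigger, and table names are assumptions:
class ProtectRoles < ActiveRecord::Migration
  def self.up
    execute <<-SQL
      CREATE FUNCTION forbid_roles_change() RETURNS trigger AS $$
      BEGIN
        RAISE EXCEPTION 'roles is read-only';
      END;
      $$ LANGUAGE plpgsql;

      CREATE TRIGGER roles_read_only
        BEFORE INSERT OR UPDATE OR DELETE ON roles
        FOR EACH ROW EXECUTE PROCEDURE forbid_roles_change();
    SQL
  end

  def self.down
    execute "DROP TRIGGER roles_read_only ON roles"
    execute "DROP FUNCTION forbid_roles_change()"
  end
end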
I have a Rails 3 applications that uses different databases depending on the subdomain. I do this by using "establish_connection" in the ApplicationController.
Now I'm trying to use the delayed_job gem to do some background processing; however, it uses whichever database connection is active at that moment, so it ends up connecting to the subdomain database.
I'd like to force it to use the "common" database. I've done this for some models calling "establish_connection" in the model like this:
class Customer < ActiveRecord::Base
  establish_connection ActiveRecord::Base.configurations["#{Rails.env}"]
  ...
end
Any idea how can I do this?
Here is what you need to know. When you include the DelayedJob gem in your app you create a migration for it to create the table where the jobs are stored, but you don't create a model. This is because DelayedJob already has a model included in the gem (i.e. Delayed::Job). What you need to do is patch this model slightly, just like you did with your own models. You can do this in an initializer.
You may already have an initializer to configure DelayedJob, if so you can do this in there, if not you need to create one. So, create your initializer (in config/initializers) if you don't have one, we'll call it delayed_job_config.rb, now add the following to it:
Delayed::Job.class_eval do
  establish_connection ActiveRecord::Base.configurations["#{Rails.env}"]
end
We've done to the DelayedJob model the same thing you did to your own models. Now DelayedJob will use that connection to put jobs in the DB.
I am using Ruby on Rails 2.3.8. I have several User model objects stored in memory and several WHERE conditions that I want to check them against. Because all this data is stored in memory, I want to avoid hitting the database to perform these checks. Is there a way to check these models without hitting the database, i.e. some way to validate a SQL WHERE condition against an in-memory model object?
To make things more clear, if I were to actually pull the record from the database I would do something like this:
whereCondition = "name LIKE 'James Smith'"
User.find(:first, :conditions => [whereCondition])
I have several Users and several whereConditions available in memory, and what I'd really like to do is something like this:
someUser.meetsCondition?(whereCondition)
Which would return a boolean. Is there some way of doing this without writing my own SQL parser?
Thanks.
If the User is already in memory, then doing something like the following shouldn't reload the object, should it?
user.name =~ /James Smith/
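And if the only conditions you care about have the simple column LIKE 'pattern' shape from the question, a rough sketch could be (not a general SQL parser; the method name comes from the question):
class User < ActiveRecord::Base
  # Handles only conditions shaped like: column LIKE 'pattern'
  def meets_condition?(where_condition)
    unless where_condition =~ /\A(\w+)\s+LIKE\s+'(.*)'\z/i
      raise ArgumentError, "unsupported condition: #{where_condition}"
    end
    column, pattern = $1, $2
    # Translate SQL wildcards (% and _) into their regexp equivalents.
    regexp = Regexp.new('\A' + Regexp.escape(pattern).gsub('%', '.*').gsub('_', '.') + '\z', Regexp::IGNORECASE)
    !!(send(column).to_s =~ regexp)
  end
end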
There is no way to do this without parsing it yourself.
However, if you wanted to, you could create an in-memory SQLite database, load the records into it, and then run your query there. I don't know how practical this would be, though: it's definitely too much work to be doing on a normal web request, and it could very well be more expensive than just running the query against your real database again. You'll have to experiment.
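A rough sketch of that idea, assuming the sqlite3 gem and only the columns you need to filter on:
require 'sqlite3'

db = SQLite3::Database.new(':memory:')
db.execute("CREATE TABLE users (id INTEGER, name TEXT)")
users.each do |u|
  db.execute("INSERT INTO users (id, name) VALUES (?, ?)", [u.id, u.name])
end

where_condition = "name LIKE 'James Smith'"
matching_ids = db.execute("SELECT id FROM users WHERE #{where_condition}").flatten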
No, there is no built-in mechanism to execute SQL logic against in-memory collections. You would have to write your own SQL parser to do what you are wanting.
Rails has caching settings that you can read about on the Rails Guide site. Some of those may suit what you're trying to accomplish.
You also may want to check out Memcached in conjunction with the cached_model gem. I generally use this and update the cached copy of the object in an after_save model callback.
I made a bad decision as I was designing a MongoDB database to embed a model rather than reference it in an associated model. Now I need to make the embedded model a referenced model, but there is already a healthy amount of data in the database (or document?).
I'm using Mongoid, so I reasoned I could just change embedded_in to referenced_in. Before I start, I figured I'd ask people who know better than I do: how can I transition the embedded data already in the database to documents for the associated model?
class Building
  embeds_many :landlords
  ...
end

class Landlord
  embedded_in :building
  ...
end
Short answer - Incrementally.
1. Create a copy of Landlord; name it Landlord2.
2. Make it referenced in Building.
3. Copy all data from Landlord to Landlord2.
4. Delete Landlord.
5. Rename Landlord2 to Landlord.
Users should not be able to CRUD Landlord during steps 3-5 (ideally). You still can get away with only locking CRUD on 4-5. Just make sure you make all updates that happened during copying, before removing Landlords.
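Step 3 might look roughly like this (a sketch; it assumes Landlord2 is the referenced copy with a building_id field):
# Copy every embedded landlord into the new referenced collection.
Building.all.each do |building|
  building.landlords.each do |embedded|
    Landlord2.create!(embedded.attributes.except('_id').merge('building_id' => building.id))
  end
end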
Just changing the model like you have above will not work; the old data will still be in a different structure in the db.
Very similar to the previous answer: one of the things I have done for this kind of migration before is to do it dynamically, while the system is running and being used by the users.
I had the data layer separated from the logic, which let me add some preprocessors and inject code to do the following.
Let's say we start with the old data model, then release new code that does the following:
On every access to the document, check whether the embedded property exists; if it does, create a new associated entry as a reference, save it to the database, and delete the embedded property from the document. Once this had run for a couple of days, most of my data had been migrated, and then I just ran a similar script for everything that was not touched. This made migrating the data much easier and simpler, and I did not have to run long-running scripts or take the system offline to perform the conversion.
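A rough sketch of that lazy migration with Mongoid; the callback choice and field names are assumptions:
class Building
  include Mongoid::Document
  has_many :landlords   # the new referenced relation

  after_initialize :migrate_embedded_landlords

  private

  # Move any leftover embedded landlords into the referenced collection.
  def migrate_embedded_landlords
    return unless persisted?
    embedded = attributes['landlords']
    return if embedded.blank?
    embedded.each do |doc|
      Landlord.create!(doc.except('_id').merge('building_id' => id))
    end
    remove_attribute(:landlords)  # delete the embedded copy
    save
  end
end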
You may not have that requirement, so pick accordingly.