Let's say I have two pairs of SQL conditions like these:
a = [ "users.accepted = ? AND users.active_at > ?", true, Time.zone.now ]
b = [ "users.accepted = ? AND users.active_at > ?", false, Time.zone.now + 3.days ]
I can use code like User.where(a) to get all rows that satisfy the a condition. How can I use where to get rows that satisfy either the a or the b conditions? The result should be an ActiveRecord::Relation.
There are a couple ways to go about this.
Get meta_where or squeel, depending on your Rails version. These are really great gems that enhance the Arel behavior of ActiveRecord::Relation.
Write the SQL manually and pass it into the where method as a string. You might have to handle SQL injection yourself, but from your example above I didn't see any incoming values that were user-generated strings.
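For the manual-SQL route, here is a minimal sketch that assumes the a and b arrays from the question: the two fragments are wrapped in parentheses and joined with OR, and the bind values are concatenated in the same order, so where can still sanitize them and the result is still a chainable ActiveRecord::Relation.
a = [ "users.accepted = ? AND users.active_at > ?", true, Time.zone.now ]
b = [ "users.accepted = ? AND users.active_at > ?", false, Time.zone.now + 3.days ]

# Wrap each fragment in parentheses and OR them together; keep the bind
# values in the same order as the placeholders.
combined_sql   = "(#{a.first}) OR (#{b.first})"
combined_binds = a.drop(1) + b.drop(1)

users = User.where(combined_sql, *combined_binds)
# users is an ActiveRecord::Relation, so further scopes can still be chained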
Related
In my controller method for the index view I have the following line.
@students_instance = Student.includes(:memo_tests => {:memo_target => :memo_level})
So for each Student I eager-load all necessary info.
Later on in a .map block, I call the .where() method on one of the relations as shown below.
@all_students = @students_instance.map do |student|
...
last_pass = student.memo_tests.where(:result => true).last.created_at.utc
difference_in_weeks = ((last_pass.to_i - current_date.to_i) / 1.week).round
...
end
This leads to a separate SQL query for each student, and since I have over 300 students, that means very slow load times and over 300 SQL queries.
Am I right in thinking that this is caused by the .where() method? I think so because I have checked everything else, and these are the two lines that cause all of the queries.
More importantly, is there a better way to do this that reduces these queries to a single query?
The moment you call where, the statement is translated into a new query. Normally, the result should be SQL-cached...
Anyway, to be sure, you can do the filtering in Ruby instead. That way, you are not requesting a new SQL statement.
last_pass = student.memo_tests.map {|m| m.created_at if m.result}.compact.sort.last
EDIT
I see the OP's question does not require sorting... So, leaving the sorting out:
last_pass = student.memo_tests.map {|m| m.created_at if m.result}.compact.last
compact is required to remove nil results from the array.
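Putting it together, a sketch of how the in-memory filter could slot into the original map block. This assumes the @students_instance relation and the current_date variable from the question, and the nil guard is just a hypothetical way to handle students with no passing tests.
@all_students = @students_instance.map do |student|
  # memo_tests is already loaded by the includes call, so this filters
  # the records in memory instead of issuing one query per student
  last_pass = student.memo_tests.map { |m| m.created_at if m.result }.compact.last
  next if last_pass.nil?

  difference_in_weeks = ((last_pass.to_i - current_date.to_i) / 1.week).round
  # ...
end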
I have this scope in Rails:
scope :by_default_checks, {:conditions => ['cars.sold IS ? AND cars.deactivated IS ?', nil, false]}
@cars = Car.by_title(params[:search][:title_like]).by_greater(params[:search][:amount_gte]).by_smaller(params[:search][:amount_lte]).by_default_checks
and on Heroku I am getting this error:
Completed 500 Internal Server Error in 6ms
... AND cars.sold IS NULL AND cars.deactivated IS 'f')
SELECT "cars".* FROM "cars" WHERE (cars.title LIKE '%iphoe%') AND (cars.sold IS NULL AND cars.deactivated IS 'f')
PG::SyntaxError: ERROR: syntax error at or near "'f'"
This code works on SQLite, but not on PostgreSQL. How can I fix it?
Thanks
You should use = to check for equality with non-null values:
['cars.sold IS ? AND cars.deactivated = ?', nil, false]
You usually use is in is null, is not null, is distinct from, and is not distinct from when you're faced with NULLs and a simple = comparison will not work the way you want it to. You can use is for booleans if you're using the true or false values but not the 't' and 'f' strings that ActiveRecord uses to represent PostgreSQL booleans.
See Comparison Operators in the manual for details.
Alternatively, you could let ActiveRecord build the whole thing instead of using the old-school :conditions stuff:
scope :by_default_checks, where(:sold => nil).where(:deactivated => false)
That way ActiveRecord is responsible for all the native-to-PostgreSQL type conversions and it will choose the correct comparison operators for you.
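For reference, a rough sketch of the corrected scope in context; the exact generated SQL depends on your Rails version and adapter, so treat the output comment as approximate.
class Car < ActiveRecord::Base
  # old-style conditions, with = for the boolean comparison
  scope :by_default_checks, :conditions => ['cars.sold IS ? AND cars.deactivated = ?', nil, false]

  # or, equivalently, letting ActiveRecord build the comparisons:
  # scope :by_default_checks, where(:sold => nil).where(:deactivated => false)
end

Car.by_default_checks.to_sql
# => SELECT "cars".* FROM "cars" WHERE (cars.sold IS NULL AND cars.deactivated = 'f')
# PostgreSQL accepts this, because 'f' is cast to boolean for the = operator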
Also, developing on top of one database (SQLite) and deploying on another (PostgreSQL) is a really bad idea that will just lead to pain, suffering, and hair loss. There are all sorts of differences between databases that no ORM can insulate you from. Please fix this bug and then immediately switch your development environment to PostgreSQL.
I'm using Searchlogic with Rails 2.3.5 and I need to add a GROUP BY clause with 2 columns to my query. I tried:
User.search.group = "column1, column2" # Undefined method 'group'
User.search(:group => "column1, column2") # Searchlogic::Search::UnknownConditionError: The group is not a valid condition. You may only use conditions that map to a named scope
And neither worked. I couldn't find any other ways in Searchlogic's docs. Is there any other way?
According to this page: http://vladzloteanu.wordpress.com/2009/01/25/searchlogic-plugin-activerecord-search-on-steroids/
You can probably use:
User.search.conditions.group("column1,column2")
I have been reading some recipes in the Perl Hacks book. Recipe #24 "Query Databases Dynamically without SQL" looked interesting. The idea is to use SQL-Abstract to generate the SQL statement for you.
The syntax to generate a select statement looks something like this:
my($stmt, @bind) = $sql->select($table, \@fields, \%where, \@order);
To illustrate further, an example could look like this (taken from the perldoc):
my %where = (
requestor => 'inna',
worker => ['nwiger', 'rcwe', 'sfz'],
status => { '!=', 'completed' }
);
my($stmt, @bind) = $sql->select('tickets', '*', \%where);
The above would give you something like this:
$stmt = "SELECT * FROM tickets WHERE
( requestor = ? ) AND ( status != ? )
AND ( worker = ? OR worker = ? OR worker = ? )";
@bind = ('inna', 'completed', 'nwiger', 'rcwe', 'sfz');
Which you could then use in DBI code like so:
my $sth = $dbh->prepare($stmt);
$sth->execute(@bind);
Now, sometimes the order of the columns in the WHERE clause is very important, especially if you want to make good use of indexes.
But, since the columns for the WHERE clause generator in SQL-Abstract are specified by means of a hash - and, as is known, the order in which data is retrieved from Perl hashes cannot be guaranteed - you seem to lose the ability to specify the order of the columns.
Am I missing something? Is there an alternate facility to guarantee the order in which columns appear in the WHERE clause when using SQL-Abstract?
I originally misinterpreted your question.
You can use -and to achieve the desired ordering.
For example:
#!/usr/bin/perl
use strict; use warnings;
use SQL::Abstract;
my $sql = SQL::Abstract->new;
my ($stmt, @bind) = $sql->select(
tickets => '*',
{
-and => [
requestor => 'inna',
status => { '!=', 'completed' },
worker => ['nwiger', 'rcwe', 'sfz'],
],
}
);
print "$stmt\n";
See "Nested conditions, -and/-or prefixes" in the SQL::Abstract documentation.
This module cannot do everything -- it is meant as a convenience for constructing queries that will do the job "most of the time". Sometimes you still may need to write a query by hand. I use SQL::Abstract in my main $work::app and have never run into the situation that you describe. A good SQL engine will know which keys are indexed, and optimize the query to use those first, no matter the ordering you specify. Are you sure that your engine is not the same, and that the order you specify in the query is really significant?
If you really need to order your WHERE clauses in a special order, you may find it easier to write subqueries instead. SQL::Abstract can make this easier too.
I was wondering if there was a way to use "find_by_sql" within a named_scope. I'd like to treat custom sql as named_scope so I can chain it to my existing named_scopes. It would also be good for optimizing a sql snippet I use frequently.
While you can put any SQL you like in the conditions of a named scope, if you then call find_by_sql, the 'scopes' get thrown away.
Given:
class Item < ActiveRecord::Base
# Anything you can put in an sql WHERE you can put here
named_scope :mine, :conditions=>'user_id = 12345 and IS_A_NINJA() = 1'
end
This works (it just sticks the SQL string in there - if you have more than one they get joined with AND)
Item.mine.find :all
=> SELECT * FROM items WHERE ('user_id' = 12345 and IS_A_NINJA() = 1)
However, this doesn't
Item.mine.find_by_sql 'select * from items limit 1'
=> select * from items limit 1
So the answer is "No". If you think about what has to happen behind the scenes then this makes a lot of sense. In order to build the SQL rails has to know how it fits together.
When you create normal queries, the select, joins, conditions, etc are all broken up into distinct pieces. Rails knows that it can add things to the conditions without affecting everything else (which is how with_scope and named_scope work).
With find_by_sql however, you just give rails a big string. It doesn't know what goes where, so it's not safe for it to go in and add the things it would need to add for the scopes to work.
This doesn't address exactly what you asked about, but you might investigate 'construct_finder_sql'. It lets you get the SQL of a named scope.
named_scope :mine, :conditions=>'user_id = 12345 and IS_A_NINJA() = 1'
named_scope :additional, lambda {
  { :conditions => mine.send(:construct_finder_sql, {}) + " AND additional = 'foo'" }
}
Sure, why not:
named_scope :your_scope, :conditions => [ your_sql ]
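For instance, a minimal sketch of wrapping a raw SQL fragment in a named_scope so it stays chainable with other scopes (the Item class mirrors the earlier answer, while the :recent scope is just illustrative, not from the original question):
class Item < ActiveRecord::Base
  # any fragment that is valid inside a WHERE clause can go in :conditions
  named_scope :mine, :conditions => 'user_id = 12345 and IS_A_NINJA() = 1'
  named_scope :recent, lambda { { :conditions => ['created_at > ?', 1.week.ago] } }
end

# the scopes chain, and their conditions are ANDed together into one query
Item.mine.recent.find :all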