I have a table of call data and I want to query all unanswered calls, which means that the call start time is equal to the call end time. I currently use the following plain SQL which works as expected:
select * from calls where calls.start = calls.end
I was wondering if there is a more "rails" way to do this using the ActiveRecord Query Interface. Ideally I'd like to set up a scope in my Call model that returns me all unanswered calls. Something like:
scope :unanswered, -> { where(start: :end) }
The above doesn't work because Rails treats :end as a string instead of the end column in the DB.
I'm using PostgreSQL as my DB engine.
The SQL query
select * from calls where calls.start = calls.end
can be expressed the Rails way with a scope as follows (keep the columns table-qualified, just as in your raw SQL, because end is a reserved word in PostgreSQL):
scope :unanswered, -> { where('calls.start = calls.end') }
Another option is to compare the two columns through Arel, which avoids raw SQL in the scope and quotes both identifiers for you:
scope :unanswered, -> { where(arel_table[:start].eq(arel_table[:end])) }
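For illustration, a minimal sketch of the model with the Arel-based scope and roughly the SQL it produces (the quoting below assumes the Arel version):

class Call < ActiveRecord::Base
  scope :unanswered, -> { where(arel_table[:start].eq(arel_table[:end])) }
end

Call.unanswered.to_sql
# => SELECT "calls".* FROM "calls" WHERE "calls"."start" = "calls"."end"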
I have an API I am pinging which queries a cosmos db to return records.
I can filter on a simple string in my api call like so:
// return objects where '_Subject' field equals "filterTest"
string getUrl = $"...baseApiPath/?$filter=_Subject+eq+'filterTest'";
This is working perfectly.
But I cannot figure out the filter syntax to make my API query use ARRAY_CONTAINS.
// return objects where '_Attachments' field CONTAINS "945afd138aasdf545a2d1";
How would I do that? Is there a general reference for API filter syntax somewhere?
If you're asking about how to query, a query against a property with an array of values looks like this:
SELECT * FROM c WHERE ARRAY_CONTAINS(c._Attachments, "945afd138aasdf545a2d1")
Another example in this answer.
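If you end up running the query yourself against Cosmos DB rather than through your API's $filter parameter, a rough sketch with the .NET SDK (Microsoft.Azure.Cosmos v3; the container variable and parameter name are assumptions, and this belongs inside an async method):

var query = new QueryDefinition(
        "SELECT * FROM c WHERE ARRAY_CONTAINS(c._Attachments, @attachmentId)")
    .WithParameter("@attachmentId", "945afd138aasdf545a2d1");

// 'container' is an already-initialized Container pointing at this collection
using FeedIterator<dynamic> iterator = container.GetItemQueryIterator<dynamic>(query);
while (iterator.HasMoreResults)
{
    foreach (var item in await iterator.ReadNextAsync())
    {
        Console.WriteLine(item);
    }
}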
I'm building a command-line application that needs to connect to various PostgreSQL databases and run different queries against them as prepared statements. In one specific query, I need to use the IN clause together with ActiveRecord's connection_raw method. My code looks like this:
ActiveRecord::Base.connection_raw.prepare('read_publications', "UPDATE publications SET readed = TRUE WHERE id IN ($1);")
Then I try to execute the query:
ActiveRecord::Base.connection_raw.exec_prepared('read_publications', [1,2,3,4])
The problem is that this does not work. The following error is raised when the query runs:
no implicit conversion of Array into Integer
What am I doing wrong? Is there a way to convert this array into a value that the IN clause can understand?
If you are using a raw connection, you can't pass in arrays the way you can with ActiveRecord relations; ActiveRecord does some preprocessing for you. If you need raw SQL, then you need a parameter for each array element.
arr = [1, 2, 3, 4]
placeholders = (1..arr.size).map { |i| "$#{i}" }.join(',')
sql = "UPDATE publications SET readed = TRUE WHERE id IN (#{placeholders});"
ActiveRecord::Base.connection_raw.prepare('read_publications', sql)
ActiveRecord::Base.connection_raw.exec_prepared('read_publications', arr)
However, the documentation says the bind parameters have to be in a certain format:
https://deveiate.org/code/pg/PG/Connection.html#method-i-exec_prepared
params is an array of the optional bind parameters for the SQL query. Each element of the params array may be either:
a hash of the form:
{:value => String (value of bind parameter)
:format => Fixnum (0 for text, 1 for binary)
}
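Alternatively, PostgreSQL can accept the whole list as a single array parameter, which keeps the prepared statement at one placeholder. A rough sketch against the pg connection (reached via connection.raw_connection in stock ActiveRecord; adjust if connection_raw is a helper of your own):

conn = ActiveRecord::Base.connection.raw_connection
conn.prepare('read_publications', 'UPDATE publications SET readed = TRUE WHERE id = ANY($1::int[])')
# the ids are sent as a single Postgres array literal
conn.exec_prepared('read_publications', ['{1,2,3,4}'])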
See similar question: Prepare and execute statements with ActiveRecord using PostgreSQL
I am calling a stored procedure from my Groovy code. The stored proc looks like this
SELECT * FROM blahblahblah
SELECT * FROM suchAndsuch
So basically, two SELECT statements and therefore two ResultSets.
sql.eachRow("dbo.testing 'param1'"){ rs ->
println rs
}
This works fine for a single ResultSet. How can I get the second one (or an arbitrary number of ResultSets for that matter).
You would need callWithAllRows() or its variant.
The return type of this method is List<List<GroovyRowResult>>.
Use this when calling a stored procedure that utilizes both output
parameters and returns multiple ResultSets.
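For example, a rough sketch against the dbo.testing procedure above (same sql instance; the JDBC call syntax and the empty closure, which would receive output parameters if the proc had any, are assumptions):

def resultSets = sql.callWithAllRows("{call dbo.testing(?)}", ['param1']) { }
resultSets.eachWithIndex { rows, i ->
    println "Result set ${i + 1}:"
    rows.each { println it }
}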
This question is kind of old, but I will answer since I came across the same requirement recently, and it may be useful for future reference for me and others.
I'm working on a Spring application with SphinxSearch. When you run a query in Sphinx you get the results, but you need to run a second query to get the metadata (number of records found, etc.).
// the query
String query = """
SELECT * FROM INDEX_NAME WHERE MATCH('SEARCHTERM')
LIMIT 0,25 OPTION MAX_MATCHES=25;
SHOW META LIKE 'total_found';
"""
// create an instance of our groovy sql (sphinx doesn't use a username or password, jdbc url is all we need)
// connection can be created from java, don't have to use groovy for it
Sql sql = Sql.newInstance('jdbc:mysql://127.0.0.1:9306/?characterEncoding=utf8&maxAllowedPacket=512000&allowMultiQueries=true','sphinx','sphinx123','com.mysql.jdbc.Driver')
// create a prepared statement so we can execute multiple resultsets
PreparedStatement ps = sql.getConnection().prepareStatement(query)
// execute the prepared statement
ps.execute()
// get the first result set and pass to GroovyResultSetExtension
GroovyResultSetExtension rs1 = new GroovyResultSetExtension(ps.getResultSet())
rs1.eachRow {
    println it
}
// call getMoreResults on the prepared statement to activate the 2nd set of results
ps.getMoreResults()
// get the second result set and pass to GroovyResultSetExtension
GroovyResultSetExtension rs2 = new GroovyResultSetExtension(ps.getResultSet())
rs2.eachRow {
    println it
}
Just some test code; it needs some improvement. You can loop over the result sets and do whatever processing you need...
Comments should be self-explanatory, hope it helps others in the future!
I am trying to use the Arel#extract method. I have seen an example in a test case in test_extract.rb in the source but when I try to reproduce it in my app, I get undefined method.
table = Arel::Table.new :users
puts table[:created_at].extract('hour').to_sql
=> NoMethodError: undefined method `extract' for #<Arel::Attributes::Attribute:0x7..8>
I am using pg as the database.
Update:
My goal is to end up with this result in sql:
SELECT users.* FROM users WHERE EXTRACT(HOUR from users."created_at") = '1'
I would like to find all users that were created on the hour equal to one of any day. This works in sql but I am wondering how to create it in arel. Here is an example of how it's used in the arel test suite.
extract is a method on Arel nodes; you can call it on any column attribute (such as users[:id]), but not on an Arel::Table instance.
So, to construct your query you need:
get an Arel::Table instance: users = Arel::Table.new(:users), or if you use ActiveRecord, users = User.arel_table
build the SELECT list with the project method: query = users.project(Arel.sql('*'))
add the WHERE clause with the where method: query = query.where(users[:created_at].extract(:hour).eq(1))
In one block:
users = User.arel_table
query = users.
  project(Arel.sql('*')).
  where(users[:created_at].extract(:hour).eq(1))
records = User.find_by_sql(query.to_sql)
# query.to_sql
# => "SELECT * FROM \"users\" WHERE EXTRACT(HOUR FROM \"users\".\"created_at\") = 1"
I've had to perform an EXTRACT(DOW ...) on an instance of Arel::Nodes::NamedFunction, which does not expose the #extract method (as of Arel 6.0). I managed to achieve this by manually creating an instance of Arel::Nodes::Extract. Here is what worked for me, in case anyone has a similar issue:
Arel::Nodes::Extract.new(
Arel::Nodes::NamedFunction.new('some_function_name', [param1, param2, ...]),
:dow
)
You can use an Arel node directly with ActiveRecord's #where instead of building the full query through Arel as exemplified by Alexander Karmes's answer. So here is another way to build the query the question asks for:
User.where(
Arel::Nodes::Extract.new(User.arel_table[:created_at], :hour).eq(1)
)
Which yields:
SELECT "users".* FROM "users" WHERE EXTRACT(HOUR FROM "users"."created_at") = 1
With the added benefit you can keep chaining other scopes defined in your User model.
Setting the DBIC_TRACE environment variable to true:
BEGIN { $ENV{DBIC_TRACE} = 1 }
generates very helpful output, especially showing the SQL query that is being executed, but the SQL query is all on one line.
Is there a way to push it through some kind of "SQL tidy" routine to format it better, perhaps breaking it up over multiple lines? Failing that, could anyone give me a nudge toward where in the code I'd need to hack to add such a hook? And what's the best tool to take a badly formatted SQL query and output a nicely formatted one?
"nice formatting" in this context simply means better than "all on one line". I'm not particularly fussed about specific styles of formatting queries
Thanks!
As of DBIx::Class 0.08124 it's built in.
Just set $ENV{DBIC_TRACE_PROFILE} to console or console_monochrome.
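For example, somewhere that runs before your schema is loaded (a minimal sketch; both profile names come straight from the documentation):

BEGIN {
    $ENV{DBIC_TRACE}         = 1;
    $ENV{DBIC_TRACE_PROFILE} = 'console';   # or 'console_monochrome'
}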
From the documentation of DBIx::Class::Storage
If DBIC_TRACE is set then trace information is produced (as when the
debug method is set). ...
debug Causes trace information to be emitted on the debugobj
object. (or STDERR if debugobj has not specifically been set).
debugobj Sets or retrieves the object used for metric collection.
Defaults to an instance of DBIx::Class::Storage::Statistics that is
compatible with the original method of using a coderef as a callback.
See the aforementioned Statistics class for more information.
In other words, you should set debugobj in that class to an object that subclasses DBIx::Class::Storage::Statistics. In your subclass, you can reformat the query the way you want it to be.
First, thanks for the pointers! Partial answer follows ....
What I've got so far ... first some scaffolding:
# Connect to our db through DBIx::Class
my $schema = My::Schema->connect('dbi:SQLite:/home/me/accounts.db');
# See also BEGIN { $ENV{DBIC_TRACE} = 1 }
$schema->storage->debug(1);
# Create an instance of our subclassed (see below)
# DBIx::Class::Storage::Statistics class
my $stats = My::DBIx::Class::Storage::Statistics->new();
# Set the debugobj object on our schema's storage
$schema->storage->debugobj($stats);
And the definition of My::DBIx::Class::Storage::Statistics being:
package My::DBIx::Class::Storage::Statistics;
use base qw<DBIx::Class::Storage::Statistics>;
use Data::Dumper qw<Dumper>;
use SQL::Statement;
use SQL::Parser;
sub query_start {
    my ($self, $sql_query, @params) = @_;
    print "The original sql query is\n$sql_query\n\n";
    my $parser = SQL::Parser->new();
    my $stmt = SQL::Statement->new($sql_query, $parser);
    #printf "%s\n", $stmt->command;
    print "The parameters for this query are:";
    print Dumper \@params;
}
This solves the problem of how to hook in and get the SQL query for me to "pretty-ify".
Then I run a query:
my $rs = $schema->resultset('SomeTable')->search(
{
'email' => $email,
'others.some_col' => 1,
},
{ join => 'others' }
);
$rs->count;
However SQL::Parser barfs on the SQL generated by DBIx::Class:
The original sql query is
SELECT COUNT( * ) FROM some_table me LEFT JOIN others other_table ON ( others.some_col_id = me.id ) WHERE ( others.some_col_id = ? AND email = ? )
SQL ERROR: Bad table or column name '(others' has chars not alphanumeric or underscore!
SQL ERROR: No equijoin condition in WHERE or ON clause
So ... is there a better parser than SQL::Parser for the job?
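One way to sidestep the parsing problem entirely is a plain formatter rather than a parser, for example SQL::Beautify from CPAN. An untested sketch of query_start using it (method names follow the module's synopsis; the module choice itself is an assumption):

use SQL::Beautify;

sub query_start {
    my ($self, $sql_query, @params) = @_;
    my $beautifier = SQL::Beautify->new;
    $beautifier->query($sql_query);
    print $beautifier->beautify, "\n";
    print "The parameters for this query are:";
    print Dumper \@params;
}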