I want to convert an SQL query into a JSONiq query. Is there already an implementation for this? If not, what do I need to know to be able to create a program that can do this?
I am not aware of an existing implementation; however, it is technically feasible and straightforward. JSONiq takes 90% of its DNA from XQuery, which itself was partly designed by people who were also involved in SQL.
From a data model perspective, a table is mapped to a collection and each row of the table is mapped to a flat JSON object, i.e., all fields are atomic values, like so:
{
  "Name" : "Turing",
  "First" : "Alan",
  "Job" : "Inventor"
}
Then, the mapping is done by converting SELECT-FROM-WHERE queries to FLWOR expressions, which provide a superset of SQL's functionality.
For example:
SELECT Name, First
FROM people
WHERE Job = 'Inventor'
Can be mapped to:
for $person in collection("people")
where $person.Job eq "Inventor"
return project($person, ("Name", "First"))
More complicated queries can also be mapped quite straightforwardly:
SELECT Name, COUNT(*)
FROM people
WHERE Job = 'Inventor'
GROUP BY Name
HAVING COUNT(*) >= 2
to:
for $person in collection("people")
where $person.Job eq "Inventor"
group by $name := $person.Name
where count($person) ge 2
return {
  name: $name,
  count: count($person)
}
Actually, if for had been called from and return had been called select, and if these keywords were written in uppercase, the syntax of JSONiq would be very similar to that of SQL: the differences are only cosmetic.
I have a payment_request model and a payment_detail model. In the payment_request index I need to be able to search by first and last name, which are stored in the payment_details table. I am fairly new to writing SQL and could use some help. I have what I believe to be the correct query below, but I am not sure how to write it in my Rails controller so I can search by name.
SELECT first_name, last_name
FROM payment_details
LEFT OUTER JOIN payment_requests
ON payment_requests.id = payment_details.payment_request_id;
If you're using ActiveRecord models, you can skip all that and build the query with the Active Record Query Interface:
@payment_requests = PaymentRequest.joins(:payment_detail).where(payment_details: { first_name: params[:first_name], last_name: params[:last_name] })
If you intend to show payment_details data on that index page, you should consider including that information in the query, so you avoid N+1 queries:
@payment_requests = PaymentRequest.includes(:payment_detail).where(payment_details: { first_name: params[:first_name], last_name: params[:last_name] })
Note: the above requires an exact match on both fields, so it may not be what you want.
I'd also recommend you use the Ransack gem to build complex queries. It would go something like this:
PaymentRequest.ransack(params[:q])
and in your views:
<%= f.search_field :payment_detail_first_name_or_payment_detail_last_name_cont %>
That would allow you to use just one field to query both columns.
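For completeness, a minimal controller sketch of how those Ransack pieces could fit together; the @q variable and the distinct option are illustrative assumptions, and the search_field above would sit inside a search_form_for @q block in the view:
# Hypothetical PaymentRequestsController#index
@q = PaymentRequest.ransack(params[:q])
@payment_requests = @q.result(distinct: true).includes(:payment_detail)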
You can do the following:
term_to_find = params[:search]
columns_to_search = %w( payment_details.first_name payment_details.last_name )
sql_conditions = []
columns_to_search.each do |column_name|
sql_conditions.push("#{column_name} ILIKE :term_to_find")
end
PaymentRequest.includes(:payment_details)
              .references(:payment_details) # needed in Rails 4+ so the string conditions on payment_details run in a single joined query
              .where(sql_conditions.join(' OR '), term_to_find: "%#{term_to_find}%")
This will find results containing the string you searched for. Example: if you typed "bob" in the search, it could find "bobby" or even "Mr. Bob" (ILIKE is PostgreSQL's case-insensitive LIKE, which makes the search case-insensitive).
I have two queries which look at separate database tables and find items from a JSONB column in each table that are in the format ["tag1","tag2","tag3"], etc. The purpose of the queries is to populate a list for a predictive dropdown, i.e. if the list contains "dog" and the user types "d", "dog" should be returned. Each of these queries works individually; can I easily combine them into a single jOOQ query?
final Field<String> value = field(name("A", "value"), String.class);
final Result<Record1<String>> res1 = sql.dsl()
.selectDistinct(value)
.from(CAMPAIGN, lateral(table("jsonb_array_elements_text({0})", CAMPAIGN.TAGS)).as("A"))
.where(CAMPAIGN.STORE_KEY.equal(campaign.getStoreKey()))
.and(CAMPAIGN.CAMPAIGN_KEY.notEqual(campaignKey))
.and(value.like(search + "%%"))
.fetch();
final Result<Record1<String>> res2 = sql.dsl()
.selectDistinct(value)
.from(STOREFRONT, lateral(table("jsonb_array_elements_text({0})", STOREFRONT.TAGS)).as("A"))
.where(STOREFRONT.STORE_KEY.equal(campaign.getStoreKey()))
.and(value.like(search + "%%")).fetch();
Sure! In SQL, "combining" two queries is mostly implemented using UNION [ ALL ] (where ALL indicates that you want to maintain duplicates). In your case, write the following:
final Result<Record1<String>> result =
sql.dsl()
.select(value)
.from(
CAMPAIGN,
lateral(table("jsonb_array_elements_text({0})", CAMPAIGN.TAGS)).as("A"))
.where(CAMPAIGN.STORE_KEY.equal(campaign.getStoreKey()))
.and(CAMPAIGN.CAMPAIGN_KEY.notEqual(campaignKey))
.and(value.like(search + "%%"))
.union(
select(value)
.from(
STOREFRONT,
lateral(table("jsonb_array_elements_text({0})", STOREFRONT.TAGS)).as("A"))
.where(STOREFRONT.STORE_KEY.equal(campaign.getStoreKey()))
.and(value.like(search + "%%")))
.fetch();
Note that I have replaced selectDistinct() by select(), because the UNION operation already removes duplicates, so there's no need to remove duplicates in each individual union subquery.
How can I convert this code to raw SQL and use it in Rails? When I deploy this code to Heroku, there is a request timeout error, and I think this will be faster if I use raw SQL.
@payments = PaymentDetail.joins(:project).order('payment_details.created_at desc')
@payment_errors = PaymentError.joins(:project).order('payment_errors.created_at desc')
@all_payments = (@payments + @payment_errors)
You can do this:
sql = "Select * from ... your sql query here"
records_array = ActiveRecord::Base.connection.execute(sql)
records_array would then hold the result of your SQL query in an array-like result object which you can iterate through.
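As a minimal sketch, assuming PostgreSQL (where each row of the result comes back as a hash keyed by column name) and the payment_details table from the question:
sql = "SELECT * FROM payment_details ORDER BY created_at DESC"
records_array = ActiveRecord::Base.connection.execute(sql)
records_array.each do |row|
  # row is a Hash like { "id" => 1, "created_at" => "..." }
  puts row["id"]
end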
I know this is old... But I was having the same problem today and found a solution:
Model.find_by_sql
If you want to instantiate the results:
Client.find_by_sql("
SELECT * FROM clients
INNER JOIN orders ON clients.id = orders.client_id
ORDER BY clients.created_at desc
")
# => [<Client id: 1, first_name: "Lucas" >, <Client id: 2, first_name: "Jan">...]
Model.connection.select_all('sql').to_hash
If you just want the values as an array of hashes:
Client.connection.select_all("SELECT first_name, created_at FROM clients
WHERE id = '1'").to_hash
# => [
{"first_name"=>"Rafael", "created_at"=>"2012-11-10 23:23:45.281189"},
{"first_name"=>"Eileen", "created_at"=>"2013-12-09 11:22:35.221282"}
]
Result object:
select_all returns a result object. You can do magic things with it.
result = Post.connection.select_all('SELECT id, title, body FROM posts')
# Get the column names of the result:
result.columns
# => ["id", "title", "body"]
# Get the record values of the result:
result.rows
# => [[1, "title_1", "body_1"],
[2, "title_2", "body_2"],
...
]
# Get an array of hashes representing the result (column => value):
result.to_hash
# => [{"id" => 1, "title" => "title_1", "body" => "body_1"},
{"id" => 2, "title" => "title_2", "body" => "body_2"},
...
]
# ActiveRecord::Result also includes Enumerable.
result.each do |row|
puts row['title'] + " " + row['body']
end
Sources:
ActiveRecord - Finding by SQL
Ruby on Rails - ActiveRecord::Result
You can execute a raw query using ActiveRecord, and I would suggest going with a SQL heredoc block:
query = <<-SQL
SELECT *
FROM payment_details
INNER JOIN projects
ON projects.id = payment_details.project_id
ORDER BY payment_details.created_at DESC
SQL
result = ActiveRecord::Base.connection.execute(query)
You can use direct SQL to get a single query for both tables. I'll provide a sanitized query example to hopefully keep people from putting variables directly into the string itself (an SQL injection risk), even though this example didn't call for it:
@results = []
ActiveRecord::Base.connection.select_all(
ActiveRecord::Base.send(:sanitize_sql_array,
["... your SQL query goes here and ?, ?, ? are replaced...;", a, b, c])
).each do |record|
# instead of an array of hashes, you could put in a custom object with attributes
@results << {col_a_name: record["col_a_name"], col_b_name: record["col_b_name"], ...}
end
Edit: as Huy said, a simple way is ActiveRecord::Base.connection.execute("..."). Another way is ActiveRecord::Base.connection.exec_query('...').rows. You can also use native prepared statements; e.g. if you're using Postgres, a prepared statement can be done with raw_connection, prepare, and exec_prepared, as described in https://stackoverflow.com/a/13806512/178651
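A minimal sketch of that prepared-statement route, assuming PostgreSQL and the pg gem (the statement name and query are illustrative only):
conn = ActiveRecord::Base.connection.raw_connection
conn.prepare('recent_payments', 'SELECT * FROM payment_details WHERE created_at > $1')
result = conn.exec_prepared('recent_payments', [1.week.ago.to_s])
result.each { |row| puts row['id'] }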
You can also put raw SQL fragments into ActiveRecord relational queries (http://guides.rubyonrails.org/active_record_querying.html) and into associations, scopes, etc. You could probably construct the same SQL with ActiveRecord relational queries, and you can do cool things with Arel, as Ernie mentions in http://erniemiller.org/2010/03/28/advanced-activerecord-3-queries-with-arel/. And, of course, there are other ORMs, gems, etc.
If this is going to be used a lot and adding indices won't cause other performance/resource issues, consider adding an index in the DB for payment_details.created_at and for payment_errors.created_at.
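If you do add them, a minimal migration sketch (the migration class name is hypothetical) could be:
class AddCreatedAtIndexes < ActiveRecord::Migration
  def change
    add_index :payment_details, :created_at
    add_index :payment_errors, :created_at
  end
end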
If there are lots of records and not all of them need to show up at once, consider using pagination (a sketch follows the links below):
https://www.ruby-toolbox.com/categories/pagination
https://github.com/mislav/will_paginate
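For example, a minimal sketch using will_paginate (assumed from the link above; the per_page value is arbitrary):
@payments = PaymentDetail.joins(:project)
                         .order('payment_details.created_at DESC')
                         .paginate(page: params[:page], per_page: 25)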
If you need to paginate, consider creating a view in the DB first called payment_records which combines the payment_details and payment_errors tables, then have a model for the view (which will be read-only). Some DBs support materialized views, which might be a good idea for performance.
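A sketch of what that view-backed, read-only model could look like; the payment_records view name comes from the suggestion above, and the rest is an assumption:
class PaymentRecord < ActiveRecord::Base
  self.table_name = 'payment_records'  # backed by a database view, not a real table

  def readonly?
    true
  end
end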
Also consider hardware or VM specs on Rails server and DB server, config, disk space, network speed/latency/etc., proximity, etc. And consider putting DB on different server/VM than the Rails app if you haven't, etc.
I prefer to work with exec_query on ActiveRecord, because it returns the query result mapped into an ActiveRecord::Result object, which makes it very practical and productive to iterate over when working with raw SQL.
Example:
values = ActiveRecord::Base.connection.exec_query("select * from clients")
p values
which returns the complete result:
[{"id": 1, "name": "user 1"}, {"id": 2, "name": "user 2"}, {"id": 3, "name": "user 3"}]
To get only the list of values:
p values.rows
[[1, "user 1"], [2, "user 2"], [3, "user 3"]]
To get only the column names:
p values.columns
["id", "name"]
You can also mix raw SQL with ActiveRecord conditions, for example if you want to call a function in a condition:
my_instances = MyModel.where.not(attribute_a: nil) \
.where('crc32(attribute_b) = ?', slot) \
.select(:id)
I'm trying to write a relatively simple algorithm to search for a string across several attributes.
Given some data:
1: name: 'Josh', location: 'los angeles'
2: name: 'Josh', location: 'york'
search string: "josh york"
The results should be [2, 1] because that query string hits the 2nd record twice, and the 1st record once.
It's safe to assume case-insensitivity here.
So here's what I have so far, in ruby/active record:
query_string = "josh new york"
some_attributes = [:name, :location]
results = {}
query_string.downcase.split.each do |query_part|
  some_attributes.each do |attribute|
    find(:all, :conditions => ["#{attribute} like ?", "%#{query_part}%"]).each do |result|
      if results[result]
        results[result] += 1
      else
        results[result] = 1
      end
    end
  end
end
results.sort { |a, b| b[1] <=> a[1] }
The issue I have with this method is that it produces a large number of queries (query_string.split.length * some_attributes.length).
Can I make this more efficient somehow by reducing the number of queries?
I'm okay with sorting within ruby, although if that can somehow be jammed into the SQL that'd be nice too.
Why aren't you using something like Ferret? Ferret is a Ruby + C extension that builds a full-text index. Since you seem to be using ActiveRecord, there's also acts_as_ferret.
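A minimal sketch of what acts_as_ferret usage could look like; the Person model name and the exact options are assumptions based on the gem's typical usage, not part of the original answer:
class Person < ActiveRecord::Base
  # Index both attributes so a query like "josh york" can match either field.
  acts_as_ferret :fields => [:name, :location]
end

# Ferret ranks results by relevance, so a record matching both terms
# should score higher than one matching only a single term.
people = Person.find_with_ferret("josh york")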