Is there a way to dynamically add X number of nested form fields? For example if we have a select menu:
Select Menu
-1
-2
-3
-4
And the user selects 3, then create 3 nested form fields.
I have watched the Railscast on nested model forms, but that approach already has the one set of fields_for created and simply inserts it each time the link is clicked. I would like to dynamically insert X fields each time the select menu changes.
Here is some code from the Railscast:
def link_to_add_fields(name, f, association)
  new_object = f.object.send(association).klass.new
  id = new_object.object_id
  fields = f.fields_for(association, new_object, child_index: id) do |builder|
    render(association.to_s.singularize + "_fields", f: builder)
  end
  link_to(name, '#', class: "add_fields", data: {id: id, fields: fields.gsub("\n", "")})
end
Sorry for the late answer, but I was just looking around for the same thing. Have you checked this gem?
https://github.com/nathanvda/cocoon
I have an article model and a comments model. How do I get a list of articles that do not have any comments, using ActiveRecord?
Model Columns:
Article: body:string (has many comments)
Comment: body:string, article_id:integer (belongs to article)
If you want to get the result using single query and want the result to be an activerecord relation, use:
Article.where('id NOT IN (SELECT DISTINCT(article_id) FROM comments)')
This is the same, but written in a more Rails-like way:
Article.where.not('id IN (SELECT DISTINCT(article_id) FROM comments)')
Try the code below to fetch all articles with no comments:
Article.includes(:comments).where(comments: {id: nil})
OR
data = []
Article.all.each do |a|
data << a if a.comments.blank?
end
puts data
OR
ids = Comment.all.pluck(:article_id)
data = Article.where.not(id: ids)
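If you prefer to keep it in a single query without first loading the comment IDs into Ruby, you can pass a relation as a subquery. A sketch, assuming Rails 4+ and that comments.article_id is never NULL:
data = Article.where.not(id: Comment.select(:article_id))
# generates roughly: SELECT articles.* FROM articles
#   WHERE articles.id NOT IN (SELECT comments.article_id FROM comments)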
I have an Item model with a name and user_id. I would like to search all Items and group by User so I can display each user with their items:
User 1:
Item A
Item B
User 2:
Item C
User 3:
Item D
Item E
Item D
...
In the console, I try this: (From the documentation)
Item.search({group_by: :user_id, limit: 50}).all
And I get this:
Sphinx Query (0.4ms)
Sphinx Caught Sphinx exception: can't dup Symbol (0 tries left)
TypeError: can't dup Symbol
from /Users/pinouchon/.rvm/gems/ruby-1.9.3-p392@gemset/gems/riddle-1.5.6/lib/riddle/client/message.rb:18:in `dup'
Same error with this:
Item.search({group_by: :user_id, order_group_by: '@count desc'}).each_with_group
Search with no group by returns results without any problem.
What's wrong?
The quick answer: try sending through the attribute name as a string, not a symbol.
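For example, a sketch of the same call with the attribute name as a string:
Item.search({group_by: 'user_id', limit: 50}).all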
The longer answer: that query isn't going to give you the results you want – it'll return one item per user. You'd be better served sorting by user_id instead:
items = Item.search(
  :order     => 'user_id ASC, @weight DESC',
  :sort_mode => :extended,
  :limit     => 50
)
From there, you can group the items by user using Ruby/Rails:
items.group_by(&:user)
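group_by returns a hash of user => items, so a rough sketch of displaying it (assuming User responds to name):
items.group_by(&:user).each do |user, user_items|
  puts user.name                                  # or render a heading in the view
  user_items.each { |item| puts "  #{item.name}" }
end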
I have been having a problem with the Rails 3 Active Record Query Interface. I have a lookup table (lookups), a Main table (through_references), and a through/join table called through_tables. Thus this is a HABTM configuration that I have set up using has_many :through.
Update: Of special note here is that when I am doing these joins, I have been joining on IDs, to provide filtering of records. It seems that this does not work with Active Record Query Interface. If you do not want to see the gory details of my travails, you can skip down to see my workaround below.
Each Main Item (through_references table) should be able to have any combination of lookup items, and the user should be able to conveniently select the relevant lookup items, say through check boxes.
I have posted the code on GitHub. There are quite a lot more explanations in the GitHub source code. To see the results, go to the lookups index page. Note that you will need to create the records using the scaffold code.
I also have the code up and running on heroku, with more explanations and examples.
class Lookup < ActiveRecord::Base
  has_many :fk_references
  has_many :through_tables
  has_many :through_references, :through => :through_tables
  attr_accessible :name, :value
end

class ThroughTable < ActiveRecord::Base
  belongs_to :through_reference
  belongs_to :lookup
  attr_accessible :description, :through_reference_id, :lookup_id
end

class ThroughReference < ActiveRecord::Base
  has_many :through_tables
  has_many :lookups, :through => :through_tables
  attr_accessible :description
end
If we want to have a listing of all the lookup items and the Main Items that correspond to them, we can LEFT JOIN the lookups table with the Main Items (through_references) table.
Corresponding SQL:
SELECT * FROM lookups
LEFT OUTER JOIN through_tables ON (lookups.id = through_tables.lookup_id AND through_tables.through_reference_id = 1)
LEFT OUTER JOIN through_references ON through_references.id = through_tables.through_reference_id
ORDER BY lookups.id
Returned records:
1;“Lookup Item 1”;“1”;“2012-06-06 17:14:40.819791”;“2012-06-06 17:14:40.819791”;1;1;1;“Main Item 1 has Lookup item 1”;“2012-06-06 17:17:31.355425”;“2012-06-06 17:17:31.355425”;1;“Main Item 1”;“2012-06-06 17:16:30.004375”;“2012-06-06 17:16:30.004375”
2;“Lookup Item 2”;“2”;“2012-06-06 17:14:59.584756”;“2012-06-06 17:14:59.584756”;;;;“”;“”;“”;;“”;“”;“”
3;“Lookup Item 3”;“3”;“2012-06-06 17:15:14.700239”;“2012-06-06 17:15:14.700239”;2;1;3;“Main Item 1 has Lookup item 3”;“2012-06-06 17:17:53.169715”;“2012-06-06 17:17:53.169715”;1;“Main Item 1”;“2012-06-06 17:16:30.004375”;“2012-06-06 17:16:30.004375”
This is what I expected.
=== Active Record Query Interface using custom left join
Lookup.joins("LEFT OUTER JOIN through_tables ON (lookups.id = through_tables.lookup_id AND through_tables.through_reference_id = 1)").includes(:through_references).order('lookups.id')
What is returned from Active Record Query Interface (note I navigate down through the Active Record hierarchy):
Lookup ID Lookup Name Lookup Value Through Table ID Through Table Description Main Item ID Main Item Description
1 Lookup Item 1 1 1 Main Item 1 has Lookup item 1 1 Main Item 1
1 Lookup Item 1 1 3 Main Item 2 has Lookup item 1 2 Main Item 2
2 Lookup Item 2 2 4 Main Item 2 has Lookup item 2 2 Main Item 2
3 Lookup Item 3 3 2 Main Item 1 has Lookup item 3 1 Main Item 1
This is NOT what I expected.
What we have here is identical to the simple left join (without the AND clause). This tells me that the AND clause is being ignored in the Active Record Query Interface.
=== Active Record Query Interface using find_by_sql approach
Lookup.find_by_sql("SELECT * FROM lookups LEFT OUTER JOIN through_tables ON (through_tables.lookup_id = lookups.id AND through_tables.through_reference_id = 1) LEFT OUTER JOIN through_references ON through_references.id = through_tables.through_reference_id ORDER BY lookups.value, through_references.id" )
What is returned from Active Record Query Interface (note I navigate down through the Active Record hierarchy):
Lookup ID Lookup Name Lookup Value Through Table ID Through Table Description Main Item ID Main Item Description
1 Lookup Item 1 1 3 Main Item 2 has Lookup item 1 2 Main Item 2
1 Lookup Item 1 1 1 Main Item 1 has Lookup item 1 1 Main Item 1
Lookup Item 2 2 No through_tables entry
1 Lookup Item 3 3 3 Main Item 2 has Lookup item 1 2 Main Item 2
1 Lookup Item 3 3 1 Main Item 1 has Lookup item 1 1 Main Item 1
The results here are crazy!
Is this a BUG, is this the intended behaviour, or am I missing something?
I hope there is a clean way of doing this, without having to generate two result sets, and merge them by code.
I have found a work-around. The issue seems to be that Active Record will not recognize joins that filter on an ID (LEFT OUTER JOIN xyz ON xyz.id = ID).
My work-around involves creating a stored procedure or function that takes the ID in as a parameter, does the join in the Database, and returns a nice flat recordset.
see: Heroku demo page (skip to bottom)
Note, I am not marking this as a solution, because this is a work-around, and nothing to do with active record.
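For illustration only, calling such a database function from ActiveRecord could look roughly like this; the function name lookups_with_main_items and its signature are hypothetical, not from the actual project:
Lookup.find_by_sql(["SELECT * FROM lookups_with_main_items(?)", main_item_id])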
Well, reading the github project, I see this:
What I really want to do is have a list of all of the lookup items,
and if there are matching Main Items, have them appended on to the
returned record, and if not, I want nulls. This is a technique that I
have used for over 10 years.
I'm thinking the problem is exactly that you want to do it that way, when it would be more natural to let Rails eager loading handle it; you've gotten fixated on fetching everything in a single massive join.
What I would do is something like:
Lookup.where( .. insert any needed conditions here ...).includes(:through_tables)
ActiveRecord will then fetch all the Lookups in one query, and use eager loading to fetch any associations named in the includes statement, one query per association.
Note I'm not saying that joins are bad, just saying that this is a more natural way to do it in rails. I like to use the Preloader http://apidock.com/rails/ActiveRecord/Associations/Preloader to separate out the decision about what to eager load from the decision about which data to fetch. I find that helpful in controllers - let the model decide what the conditions are, but let the controller decide which objects it'll need to eager load.
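As a sketch of what that looks like in Rails 3.1-4.0 (the API changed to Preloader.new.preload(records, associations) in 4.1+, so adjust for your version):
lookups = Lookup.where(:name => 'Lookup Item 1').to_a   # any conditions you need
ActiveRecord::Associations::Preloader.new(lookups, :through_references).run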
HTH
I've just followed the Railscast tutorial:
http://railscasts.com/episodes/262-trees-with-ancestry
Is it possible to paginate results from Ancestry which have been arranged?
eg: Given I have the following in my Message controller:
def index
  @messages = Message.arrange(:order => :name)
end
Then how would I paginate this as it's going to result in a hash?
Update
I found that if I use .keys then it will paginate, but only the top level not the children.
Message.scoped.arrange(:order => :name).keys
Update
Each message has a code and some content. I can have nested messages
Suppose I have
code - name
1 - Test1
1 - test1 sub1
2 - test1 sub2
2 - Test2
1 - test2 sub1
2 - test2 sub2
3 - test2 sub3
This is how I want to display the listing, but I also want to paginate this sorted tree.
It is possible but I've only managed to do it using two database trips.
The main issue stems from not being able to set limits on a node's children, which leads to either a node's children being truncated or children being orphaned on subsequent pages.
An example:
id: 105, Ancestry: Null
id: 117, Ancestry: 105
id: 118, Ancestry: 105/117
id: 119, Ancestry: 105/117/118
A LIMIT 0,3 (for the sake of the example above) would return the first three records, which will render all but id:119. The subsequent LIMIT 3,3 will return id: 119 which will not render correctly as its parents are not present.
One solution I've employed is using two queries:
The first returns root nodes only. These can be sorted and it is this query that is paginated.
A second query is issued, based on the first, which returns all children of the paginated parents. You should be able to sort children per level.
In my case, I have a Post model (which has_ancestry). Each post can have any level of replies. Also, a post object has a replies_count column, which is a counter cache for its immediate children.
In the controller:
roots = @topic.posts.roots_only.paginate :page => params[:page]
@posts = Post.fetch_children_for_roots(@topic, roots)
In the Post model:
named_scope :roots_only, :conditions => 'posts.ancestry is null'

def self.fetch_children_for_roots(postable, roots)
  unless roots.blank?
    condition = roots.select { |r| r.replies_count > 0 }.collect { |r| "(ancestry like '#{r.id}%')" }.join(' or ')
    unless condition.blank?
      children = postable.posts.scoped(:from => 'posts FORCE INDEX (index_posts_on_ancestry)', :conditions => condition).all
      roots.concat children
    end
  end
  roots
end
Some notes:
MySQL will stop using the ancestry column index if multiple LIKE statements are used. The FORCE INDEX forces MySQL to use the index and prevents a full table scan.
LIKE statements are only built for nodes with direct children, so that replies_count column came in handy
What the class method does is append the children to roots, which is a WillPaginate::Collection.
Finally, these can be managed in your view:
= will_paginate @posts
- Post.arrange_nodes(@posts).each do |post, replies|
  = do stuff here
The key method here is arrange_nodes which is mixed in from the ancestry plugin and into your model. This basically takes a sorted Array of nodes and returns a sorted and hierarchical Hash.
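To actually display the arranged hash, a small recursive walk is usually enough. A sketch (it assumes each post responds to body, and uses puts in place of real view rendering):
def print_tree(nodes, depth = 0)
  nodes.each do |post, replies|
    puts "#{'  ' * depth}#{post.body}"   # render the node, indented by depth
    print_tree(replies, depth + 1)       # then recurse into its children
  end
end

print_tree(Post.arrange_nodes(@posts))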
I appreciate that this method does not directly address your question but I hope that the same method, with tweaks, can be applied for your case.
There is probably a more elegant way of doing this but overall I'm happy with the solution (until a better one comes along).
In my application, say, animals have many photos. I'm querying photos such that all photos of all animals are displayed; however, I want each animal to appear as a photo once before any repetition occurs.
Example:
animal instance 1, 'cat', has four photos,
animal instance 2, 'dog', has two photos:
photos should appear ordered as so:
#photo belongs to #animal
tiddles.jpg, cat
fido.jpg, dog
meow.jpg, cat
rover.jpg, dog
puss.jpg, cat
felix.jpg, cat (no more dogs so two consecutive cats)
Pagination is required, so I can't order on an array.
Filename structure/convention provides no help, though the animal_id exists on each photo.
Though there are two types of animal in this example, this is an ActiveRecord model with hundreds of records.
Animals may be selectively queried.
If this isn't possible with active_record then I'll happily use sql; I'm using postgresql.
My brain is frazzled so if anyone can come up with a better title, please go ahead and edit it or suggest in comments.
Here is a PostgreSQL specific solution:
batch_id_sql = "RANK() OVER (PARTITION BY animal_id ORDER BY id ASC)"
Photo.paginate(
  :select => "DISTINCT photos.*, (#{batch_id_sql}) batch_id",
  :order  => "batch_id ASC, photos.animal_id ASC",
  :page   => 1)
Here is a DB agnostic solution:
batch_id_sql = "
  SELECT COUNT(*)
  FROM photos bm
  WHERE bm.animal_id = photos.animal_id AND
        bm.id <= photos.id
"

Photo.paginate(
  :select => "photos.*, (#{batch_id_sql}) batch_id",
  :order  => "batch_id ASC, photos.animal_id ASC",
  :page   => 1)
Both queries work even when you have a where condition. Benchmark the query with a realistic data set to check whether it meets your throughput and latency requirements.
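For instance, restricting the result to selected animals might look like this; a sketch that reuses batch_id_sql from above and assumes will_paginate's finder-style :conditions option plus a selected_animal_ids array you supply:
Photo.paginate(
  :select     => "DISTINCT photos.*, (#{batch_id_sql}) batch_id",
  :conditions => ["photos.animal_id IN (?)", selected_animal_ids],
  :order      => "batch_id ASC, photos.animal_id ASC",
  :page       => 1)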
Reference
PostgreSQL Window function
I have no experience with ActiveRecord, but using plain PostgreSQL I would try something like this: define a window function over all previous rows which counts how many times the current animal has appeared, then order by this count.
SELECT
filename,
animal_id,
COUNT(*) OVER (PARTITION BY animal_id ORDER BY filename) AS cnt
FROM
photos
ORDER BY
cnt,
animal_id,
filename
Filtering on certain animal_id's will work. This will always order the same way. I don't know if you want something random in there, but it should be easily added.
New solution
Add an integer column called batch_id to the photos table.
class AddBatchIdToPhotos < ActiveRecord::Migration
  def self.up
    add_column :photos, :batch_id, :integer
    set_batch_id
    change_column :photos, :batch_id, :integer, :null => false
    add_index :photos, :batch_id
  end

  def self.down
    remove_column :photos, :batch_id
  end

  def self.set_batch_id
    # set the batch id on existing rows
    # implement this
  end
end
Now add a before_create on the Photo model to set the batch id.
class Photo < ActiveRecord::Base
  belongs_to :animal

  before_create :batch_photo_add
  after_update  :batch_photo_update
  after_destroy :batch_photo_remove

  private

  def batch_photo_add
    self.batch_id = next_batch_id_for_animal(animal_id)
    true
  end

  def batch_photo_update
    return true unless animal_id_changed?
    batch_photo_remove(batch_id, animal_id_was)
    batch_photo_add
  end

  def batch_photo_remove(b_id = batch_id, a_id = animal_id)
    Photo.update_all("batch_id = batch_id - 1",
                     ["animal_id = ? AND batch_id > ?", a_id, b_id])
    true
  end

  def next_batch_id_for_animal(a_id)
    (Photo.maximum(:batch_id, :conditions => {:animal_id => a_id}) || 0) + 1
  end
end
Now you can get the desired result by issuing a simple paginate command:
@animal_photos = Photo.paginate(:page => 1, :per_page => 10,
                                :order => :batch_id)
How does this work?
Let's say we have the data set given below:
id   Photo Description   Batch Id
1    Cat_photo_1         1
2    Cat_photo_2         2
3    Dog_photo_1         1
2    Cat_photo_3         3
4    Dog_photo_2         2
5    Lion_photo_1        1
6    Cat_photo_4         4
Now if we execute a query ordered by batch_id, we get this:
# batch 1 (cat, dog, lion)
Cat_photo_1
Dog_photo_1
Lion_photo_1
# batch 2 (cat, dog)
Cat_photo_2
Dog_photo_2
# batch 3,4 (cat)
Cat_photo_3
Cat_photo_4
The batch distribution is not random; the animals are filled from the top. The number of records displayed on a page is governed by the per_page parameter passed to paginate (not the batch size).
Old solution
Have you tried this?
If you are using the will_paginate gem:
# assuming you want to order by animal name
animal_photos = Photo.paginate(:include => :animal, :page => 1,
                               :order => "animals.name")

animal_photos.each do |animal_photo|
  puts animal_photo.file_name
  puts animal_photo.animal.name
end
I'd recommend something hybrid/corrected based on KandadaBoggu's input.
First off, the correct way to do it on paper is with row_number() over (partition by animal_id order by id). The suggested rank() will generate a global row number, but you want the one within its partition.
Using a window function is also the most flexible solution (in fact, the only solution) if you want to plan to change the sort order here and there.
Take note that this won't necessarily scale well, however, because in order to sort the results you'll need to:
fetch the whole result set that matches your criteria
sort the whole result set to create the partitions and obtain a rank_id
top-n sort/limit over the result set a second time to get them in their final order
The correct way to do this in practice, if your sort order is immutable, is to maintain a pre-calculated rank_id. KandadaBoggu's other suggestion points in the correct direction in this sense.
When it comes to deletes (and possibly updates, if you don't want them sorted by id), you may run into issues because you end up trading faster reads for slower writes. If deleting the cat with an index of 1 leads to updating the next 50k cats, you're going to be in trouble.
If you have very small sets, the overhead might be very acceptable (don't forget to index animal_id).
If not, there's a workaround if you find the order in which specific animals appear is irrelevant. It goes like this:
Start a transaction.
If the rank_id is going to change (i.e. insert or delete), obtain an advisory lock to ensure that two sessions can't impact the rank_id of the same animal class, e.g.:
SELECT pg_try_advisory_lock('the_table'::regclass, the_animal_id);
(Sleep for .05s if you don't obtain it.)
On insert, find max(rank_id) for that animal_id. Assign it rank_id + 1. Then insert it.
On delete, select the animal with the same animal_id and the largest rank_id. Delete your animal, and assign its old rank_id to the fetched animal (unless you were deleting the last one, of course).
Release the advisory lock.
Commit the work.
Note that the above will make good use of an index on (animal_id, rank_id) and can be done using plpgsql triggers:
create trigger "__animals_rank_id__ins"
before insert on animals
for each row execute procedure lock_animal_id_and_assign_rank_id();
create trigger "_00_animals_rank_id__ins"
after insert on animals
for each row execute procedure unlock_animal_id();
create trigger "__animals_rank_id__del"
before delete on animals
for each row execute procedure lock_animal_id();
create trigger "_00_animals_rank_id__del"
after delete on animals
for each row execute procedure reassign_rank_id_and_unlock_animal_id();
You can then create a multi-column index on your sort criteria if you're not joining all over the place, e.g. (rank_id, name). And you'll end up with a snappy site for reads and writes.
You should be able to get the pictures (or filenames, anyway) using ActiveRecord, ordered by name.
Then you can use Enumerable#group_by and Enumerable#zip to zip all the arrays together.
If you give me more information about how your filenames are really arranged (i.e., are they all for sure with an underscore before the number and a constant name before the underscore for each "type"? etc.), then I can give you an example. I'll write one up momentarily showing how you'd do it for your current example.
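As a rough sketch of the group_by/zip idea (note it loads every photo into memory, so it does not play well with the pagination requirement):
groups  = Photo.all.group_by(&:animal_id).values
longest = groups.max_by(&:length)
# zip pads the shorter groups with nil, so flatten and drop the nils
interleaved = longest.zip(*(groups - [longest])).flatten.compact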
You could run two sorts and build one array as follows:
result1 = the first of each animal type only. Use the Ruby "find" method for this search.
result2 = all animals, sorted by group. Use "find" again to find the first occurrence of each animal and then use "drop" to remove those "first occurrences" from result2.
Then:
markCustomResult = result1 + result2
Then:
You can use will_paginate on markCustomResult.