I currently have a Postgres DB with a SHOWS table that looks like this:
id | date       | venue                       | price | is_sold | ages | time   | multi_day | pit
13 | 2016-01-02 | 924 Gilman Street, Berkeley | $8    | FALSE   | a/a  | 7:30pm | FALSE     | FALSE
I have a query that orders everything in this table by date (which is of timestamptz type):
SELECT shows.* FROM shows ORDER BY date
Right now, running this query with the pg-promise library produces an array of objects that look like this:
[
{ id: 1,
date: Thu Oct 20 2016 17:00:00 GMT-0700 (PDT),
venue: 'Regency Ballroom, S.F.',
price: null,
is_sold: false,
ages: 'a/a',
time: null,
multi_day: false,
pit: false },
{ id: 2,
date: Thu Oct 20 2016 17:00:00 GMT-0700 (PDT),
venue: 'Great American Music Hall.',
price: null,
is_sold: false,
ages: 'a/a',
time: null,
multi_day: false,
pit: false } ... ]
I want to return an array that groups this data by date column. Example output:
[
{date: Thu Oct 20 2016 17:00:00 GMT-0700 (PDT),
shows: [1, 2]}, // or however this should be structured; I just want a way to get multiple show ids from a single date.
...
]
which I assume I could then INNER JOIN to get the rest of the show details that I need.
How do I adjust my query to return this sort of grouping?
Just to answer the question, use:
SELECT date, string_agg(id::text, ', ') AS show_ids
FROM shows
GROUP BY date
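The cast to text is needed because id is an integer column. If you would rather get the ids back as an actual array (which pg-promise should hand to you as a JavaScript array), array_agg is an alternative; a sketch against the same table:
SELECT date, array_agg(id ORDER BY id) AS show_ids
FROM shows
GROUP BY date
ORDER BY date
From there you can join back to shows for the remaining columns, or skip the aggregation entirely and group the flat rows on the JavaScript side.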
Related
I have survey responses that I need to display in 2 week increments based on the created_at date. The output should be something like:
{10/1 : 4
10/15: 6
10/29: 3}
...where the first period starts from the earliest created_at date in the survey responses and the last one ends at the latest created_at. I've seen things like group_by { |s| s.created_at.month } but nothing for every other week, starting on the Monday of the week. Any help would be much appreciated!
You could calculate the number of days between each record and the oldest one, take that offset modulo 14, and subtract it from the record's date; that gives the first day of the record's two-week period, which works as the grouping key:
oldest_date = YourModel.minimum(:created_at).to_date
your_relation.group_by { |record|
  record_date = record.created_at.to_date
  # subtracting the day offset modulo 14 lands on the first day of the
  # record's two-week period
  record_date - (record_date - oldest_date).to_i.modulo(14)
}
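To get counts per period like the ones in your desired output, you can then collapse each group to its size. A sketch, assuming grouped holds the Hash returned by the group_by above (Hash#transform_values needs Ruby 2.4+; ActiveSupport also provides it):
counts = grouped.transform_values(&:size)
# => { <start of first two-week period> => 4, <start of next period> => 6, ... }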
You could define a method returning the year and the week range, for example:
def by_year_and_by_two_weeks(my_date)
wk = my_date.strftime("%W").to_i/2
wks = [wk, wk.odd? ? wk+1 : wk - 1].sort # <== adjust this
[my_date.year, wks]
end
The %W directive in strftime uses Monday as the first day of the week.
So, when you have your object:
object.created_at #=> 2021-09-19 08:58:16.78053 +0200
by_year_and_by_two_weeks(object.created_at) #=> [2021, [17, 18]]
Then you can use the method for grouping.
objects.group_by { |d| by_year_and_by_two_weeks(d.created_at) }
This is an example of the result, after transforming the values to make them readable:
{[2021, [20, 21]]=>[2021-10-14 09:00:17.421142 +0200, 2021-10-15 09:00:17.421224 +0200, 2021-10-06 09:00:17.421276 +0200, 2021-10-10 09:00:17.421328 +0200], [2021, [18, 19]]=>[2021-09-22 09:00:17.421385 +0200]}
Of course you can change the by_year_and_by_two_weeks return value to whatever best fits your needs.
Your requirements:
You want to group on the monday of the starting biweekly period.
You want the Hash key of the group to be the date of that monday.
I will also add another future-proofing requirement:
The start and end dates can be anywhere, even across year boundaries.
If we take these requirements, and then utilize the modulo idea from spickermann, we can build it like this:
start_date = first_item.created_at.to_date.prev_occurring(:monday)
your_items.group_by { |item|
item_date = item.created_at.to_date
days_from_start = item_date - start_date
biweekly_offset = days_from_start.modulo(14)
biweekly_monday = item_date - biweekly_offset
biweekly_monday
}
Example:
test_dates = [
Date.new(2021, 10, 1),
Date.new(2021, 10, 6),
Date.new(2021, 10, 10),
Date.new(2021, 10, 13),
Date.new(2021, 10, 20),
Date.new(2021, 10, 31)
]
start = test_dates.first.prev_occurring(:monday)
pp test_dates.group_by { |date|
days_from_start = date - start
biweekly_offset = days_from_start.modulo(14)
biweekly_monday = date - biweekly_offset
biweekly_monday
}
Output:
{ Mon, 27 Sep 2021 => [Fri, 01 Oct 2021, Wed, 06 Oct 2021, Sun, 10 Oct 2021],
Mon, 11 Oct 2021 => [Wed, 13 Oct 2021, Wed, 20 Oct 2021],
Mon, 25 Oct 2021 => [Sun, 31 Oct 2021] }
ruby "2.6.3"
gem "audited", "4.8.0"
gem "pg", "0.18.4"
I need to be able to:
search revisions by date range
search all the audits by date range then get the latest of each record
Currently the record's revisions have nil for the created_at and updated_at columns. I am able to call report.revision_at(yesterday), but there is no equivalent for a date range.
So I tried looking into the audits instead. I am grouping them by auditable_id, which appears to group them by record, and then I take the latest. But this only shows me what was changed:
Audited::Audit.where(auditable_type: "Report")
.where(created_at: 7.months.ago..1.month.ago)
.group_by(&:auditable_id)
.map { |_, v| v.max_by(&:created_at) }
Ideally I want to do something like this to get the latest revisions within the time range I am looking for:
Reports.all.map do |report|
report.revisions
.where(created_at: 7.months.ago..1.month.ago)
.sort_by(&:created_at).max
end
Here's what I get when I look at report.revisions. As you can see, created_at comes back nil on the revisions:
[#<Report:0x000
id: "100f5d9d-",
device: "1-device-118",
advisor: "system",
created_at: nil,
updated_at: nil>]
The original record in the Report table does have created_at and updated_at:
#<Report:0x0
id: "100f5d9d-",
device: "1-device-118",
advisor: "system",
created_at: Fri, 07 Dec 2018 03:15:37 UTC +00:00,
updated_at: Fri, 07 Dec 2018 03:15:37 UTC +00:00>
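Since the reconstructed revisions come back with nil timestamps, one way forward is to lean on the audits' own created_at, which is what your grouping attempt already does. A sketch along those lines, assuming the audited 4.x API where Audited::Audit#revision rebuilds the record as of that audit:
latest_revisions =
  Audited::Audit
    .where(auditable_type: "Report", created_at: 7.months.ago..1.month.ago)
    .group_by(&:auditable_id)
    .map { |_, audits| audits.max_by(&:created_at).revision }
Each element is the Report as it looked at its most recent audit inside the range, while the audit itself still carries the created_at you can filter and sort on.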
I have a query to list unique records based on the project_parent_id field. The records should be listed based on the last updated_at time. That means only one record can appear per day for a given project_parent_id.
The query I'm using is:
Project.unscoped
       .group("updated_at, project_parent_id, id")
       .select("distinct project_parent_id, id, updated_at")
       .order(updated_at: :desc)
       .where('Date(projects.updated_at) >= ? and Date(projects.updated_at) <= ? and projects.last_modified_by = ? and projects.project_parent_id is not ?',
              start_date, end_date, user_id, nil)
The result obtained is:
Project:0x0000000bd2a568 id: 2973, updated_at: Wed, 19 Sep 2018 10:03:27 UTC +00:00, project_parent_id: 2966,
Project:0x0000000bd2a400 id: 2972, updated_at: Wed, 19 Sep 2018 09:45:03 UTC +00:00, project_parent_id: 2964,
Project:0x0000000bd2a298 id: 2971, updated_at: Wed, 19 Sep 2018 09:44:30 UTC +00:00, project_parent_id: 2966
But the last record is not expected in the result. The last record and the first record have the same project_parent_id: 2966. Only the first two records are needed.
Can you please help?
You are selecting distinct combinations of project_parent_id, id, and updated_at. The last record is returned because its id and updated_at differ from the first record's, so the combination is distinct even though the project_parent_id is the same. As you said:
That means only one record can appear per day for a given project_parent_id.
So, in terms of what the query is written to return, there is no mistake in your result set.
Project.unscoped.group("updated_at, project_parent_id, id")
       .select("project_parent_id, id, updated_at")
       .order(updated_at: :desc)
       .where("Date(projects.updated_at) >= ? AND Date(projects.updated_at) <= ?", start_date, end_date)
       .where("projects.last_modified_by = ? AND projects.project_parent_id IS NOT NULL", user_id)
       .uniq
Use DISTINCT ON to get just the first row for each combination of project_parent_id and the date of updated_at.
Project.unscoped
       .select("DISTINCT ON (project_parent_id, DATE(updated_at)) project_parent_id, id, updated_at")
       .order("project_parent_id, DATE(updated_at), updated_at DESC")
       .where('Date(projects.updated_at) >= ? AND Date(projects.updated_at) <= ? AND projects.last_modified_by = ? AND projects.project_parent_id IS NOT NULL',
              start_date, end_date, user_id)
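Two details worth noting: Postgres requires the ORDER BY to start with the DISTINCT ON expressions, which is why the order clause leads with project_parent_id and DATE(updated_at) before updated_at DESC, and the extra group call from the original query is unnecessary once DISTINCT ON picks one row per combination. The SQL this aims to generate looks roughly like this (a sketch, with :start_date, :end_date and :user_id standing in for the bound values):
SELECT DISTINCT ON (project_parent_id, DATE(updated_at))
       project_parent_id, id, updated_at
FROM projects
WHERE DATE(updated_at) BETWEEN :start_date AND :end_date
  AND last_modified_by = :user_id
  AND project_parent_id IS NOT NULL
ORDER BY project_parent_id, DATE(updated_at), updated_at DESC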
I have some activity occurrences with the date range they occur across:
ActivityOccurrence:
ID: 1, ActivityID: 1, StartDate: 2018-05-01, EndDate: 2018-06-30
ID: 2, ActivityID: 2, StartDate: 2018-06-01, EndDate: 2018-07-31
ID: 3, ActivityID: 3, StartDate: 2018-07-01, EndDate: 2018-08-31
Each activity has a price which applies within an effective period:
EffectivePeriod
ID: 1, ActivityID: 1, ValidFrom: 2018-01-01, ValidTo: 2018-06-30, Price: 50
ID: 2, ActivityID: 2, ValidFrom: 2018-01-01, ValidTo: 2018-06-30, Price: 100
ID: 3, ActivityID: 3, ValidFrom: 2018-01-01, ValidTo: 2018-06-30, Price: 70
ID: 4, ActivityID: 1, ValidFrom: 2018-07-01, ValidTo: 2018-12-31, Price: 55
ID: 5, ActivityID: 2, ValidFrom: 2018-07-01, ValidTo: 2018-12-31, Price: 120
ID: 6, ActivityID: 3, ValidFrom: 2018-07-01, ValidTo: 2018-12-31, Price: 80
I'd like to link the Activity Occurrences with their correct rates. So:
ActivityOccurrence ID of 1 would link with EffectivePeriod ID of 1, spanning only the first effective period.
ActivityOccurrence ID of 2 would link with both EffectivePeriod ID of 2 and 5 as it spans across 2 effective periods.
ActivityOccurrence ID of 3 would link with EffectivePeriod ID of 6, spanning only the second effective period.
Doing a standard JOIN gets both effective periods for all 3 activity occurrences which I don't want. Using StartDate >= ValidFrom is correct for the first activity occurrence, but not the second and third. Using StartDate <= ValidTo means the first one is wrong, but the second and third are correct. Switching StartDate to EndDate also has some issues.
SQLFiddle: http://sqlfiddle.com/#!18/576c6/6
I feel like I'm missing something and the answer is very simple but I can't figure out what it is.
Are you trying to make sure that each ActivityOccurrence is joined with EACH EffectivePeriod that overlaps its date range?
What I use in such cases is to make sure that either the start or the end date of one table falls between the start and end dates of the other:
SELECT ao.ActivityID, ao.StartDate, ao.EndDate, ep.Price
FROM ActivityOccurrence ao
JOIN EffectivePeriod ep ON ao.ActivityID = ep.ActivityID
AND
(
(ao.StartDate between ep.ValidFrom and ep.ValidTo)
OR
(ao.EndDate between ep.ValidFrom and ep.ValidTo)
)
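One caveat: an OR of two BETWEENs misses the case where an occurrence starts before an effective period and ends after it, i.e. fully contains it. That cannot happen with the sample data above, but if it can in yours, the standard interval-overlap test covers every case:
SELECT ao.ActivityID, ao.StartDate, ao.EndDate, ep.Price
FROM ActivityOccurrence ao
JOIN EffectivePeriod ep
  ON ao.ActivityID = ep.ActivityID
 AND ao.StartDate <= ep.ValidTo
 AND ao.EndDate >= ep.ValidFrom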
I'm a newbie who's really getting stuck with the different time and date formats in Ruby.
I've figured out how to work around them until now. In this case I'm using
find_or_initialize_by_sessiondate_and_person_id() and it always creates a new record because (apparently) it doesn't recognize the date being pulled out of the database as the same as the one being provided. I'm reading data in from a CSV file and the input looks like:
Fri Apr 22 15:09:00 2011
ActiveRecord doesn't complain about this format -- it puts it into a datetime field in the database (SQLite, but I'd like the solution to work on other DBs) just fine. I tried adding a timezone to the input like this:
createdate = DateTime.strptime(record_split[4].chomp + " UTC", "%a %b %e %T %Y %Z")
session = Session.find_or_initialize_by_sessiondate_and_person_id(createdate, person.id)
if session.new_record?
puts "new record for #{person.name}, #{createdate.inspect}, #{session.sessiondate.inspect}"
The output looks like:
new record for Mickey Mouse, Fri, 22 Apr 2011 15:09:00 +0000, Fri, 22 Apr 2011 15:09:00 UTC +00:00
This certainly looks like the same date/time to me, why is ActiveRecord creating a new record?
I'm a bit confused about what's going on. From your example, the thing that concerns me most is the fact that "Tue Oct 5 19:43:23 2010" is being converted to "Fri, 22 Apr 2011 15:09:00 +0000" (unless you're using a different date for each example...)
To make it easier on yourself, you may be better off simply using Time.parse rather than DateTime.strptime:
ree-1.8.7-2010.02 > Time.parse("Tue Oct 5 19:43:23 2010")
=> Tue Oct 05 19:43:23 +1100 2010
The other thing to consider is that internally, Rails 3 stores all timestamps as UTC; it then displays that time in the default time-zone on-the-fly.
Finally, depending on how the original timestamp field was generated (i.e. Time.now vs parsing a string), you might be running into issues with precision. If createdate was populated using Time.now, you might find that it has been persisted with a value that is correct to the microsecond (e.g. '19:43:23.34131') which, when compared to your parsed time ('19:43:23.00000'), is not strictly equal.
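If precision does turn out to be the culprit, one workaround is to stop relying on exact equality and look the session up within a one-second window instead. A sketch only, reusing Session, sessiondate, record_split and person from the question:
createdate = Time.parse(record_split[4].chomp + " UTC")
# match anything stored within the same second instead of requiring exact
# (sub-second) equality
session = Session.where("sessiondate >= ? AND sessiondate < ? AND person_id = ?",
                        createdate, createdate + 1, person.id).first
session ||= Session.new(:sessiondate => createdate, :person_id => person.id)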
Maybe because person_id is different?
Also, these DateTimes are a little different:
Fri, 22 Apr 2011 15:09:00 +0000
vs
Fri, 22 Apr 2011 15:09:00 UTC +00:00
It's always a good idea to store dates in UTC, so you can have something like this in your model:
def sessiondate=(dt)
write_attribute(:sessiondate, Time.parse(dt).utc)
end
Your DateTime.new ... doesn't work on my Rails 3.0.7, but I can advise you of another way. These are my tests in the Rails console:
ruby-1.9.2-p136 :028 > v = Video.new :title => "toto"
=> #<Video id: nil>
ruby-1.9.2-p136 :029 > v.save
=> true
ruby-1.9.2-p136 :030 > v
=> #<Video id: 507>
v.id = 507
ruby-1.9.2-p136 :032 > v.created_at = "Tue Oct 5 19:43:23 2010" #I use your datetime
=> "Tue Oct 5 19:43:23 2010"
ruby-1.9.2-p136 :033 > v.save
=> true
ruby-1.9.2-p136 :034 > v.created_at
=> Tue, 05 Oct 2010 19:43:23 UTC +00:00
ruby-1.9.2-p136 :035 > v2 = Video.find_or_initialize_by_created_at("Tue Oct 5 19:43:23 2010")
=> #<Video id: nil, ... , created_at: "2010-10-05 19:43:23", ... >
ruby-1.9.2-p136 :039 > v2.save
=> true
ruby-1.9.2-p136 :040 > v2
=> #<Video id: 508>
A new record: v.id = 507 and v2.id = 508!
Now with the String#to_datetime method on your date string:
... > v3 = Video.find_or_initialize_by_created_at("Tue Oct 5 19:43:23 2010".to_datetime)
=> #<Video id: 507> # We get v !!!
ruby-1.9.2-p136 :043 > v == v3
=> true
P.S.: It is an I18n app; that's why you don't see the title attribute.