I'm a beginner with SQL. I'm creating a new column
x1 = date_order - date_Delivery
to measure how long clients wait. My column x1 contains values such as 00:10:24.949131, 01:00:01 and 2 days 00:40:45. I want to convert these durations to seconds; in this example the second value would become 3601.
Can anyone help me please?
You can do:
extract(epoch from date_order - date_Delivery)
extract() with epoch converts the interval to a number of seconds (3601 for 01:00:01); divide the result by 60 if you want the corresponding number of minutes instead.
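A minimal sketch of the full query, assuming a hypothetical table named orders that holds the two timestamp columns:
select date_order,
       date_Delivery,
       extract(epoch from date_order - date_Delivery) as wait_seconds  -- 3601 for 01:00:01
from orders;  -- "orders" is an assumed name; substitute your own table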
I have a column 'Duration' that holds the time the therapist spent with the client.
This is always entered as minutes, so if the time was 3 hours it is entered as 180; I would like the query to show that as 3:00.
This is how it reports from a canned report: total_duration_num is the column as entered, defined as int, null. I would like to do this calculation and formatting in the SQL for the shown column 'total_duration'.
total_duration_num total_duration
10 0:10
120 2:00
30 0:30
5 0:05
60 1:00
One means of achieving this is to use:
the floor function to round down when dividing the minutes by 60 (for whole hours)
the mod function to get the number of minutes left over once those whole 60-minute hours are taken out
the lpad function to put a leading zero before that number of minutes, if <10, so that you see :05 rather than :5 for example
The query would look like this:
select duration,
concat( floor(duration/60) , ':' , lpad(mod(duration,60),2,'0') ) as hrs_mins
from duration_table;
This is a demonstration:
http://sqlfiddle.com/#!9/a52b6c/1/0
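In case you are on Postgres rather than the engine used in the fiddle, a sketch of the same idea (Postgres' lpad() expects text, so the minutes are cast; table and column names taken from the answer above):
select duration,
       floor(duration / 60) || ':' || lpad(mod(duration, 60)::text, 2, '0') as hrs_mins
from duration_table;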
I need a random interval of time between 0 and (10 days and 5 hours).
My code:
select random() * (interval '10 days 5 hours')
from generate_series(1, 50)
It works as it should, except for a few strange results, like:
0 years 0 mons 7 days 26 hours 10 mins 1.353353 secs
The problem is the 26 hours; it shouldn't be more than 23. And I never get 10 days, which I'd like to.
Intervals in Postgres are quite flexible, so hour values greater than 23 do not necessarily roll over to days. Use justify_interval() to return them to the normal "days" and "hours".
So:
select justify_interval(random() * interval '10 day 5 hour')
from generate_series(1, 200)
order by 1 desc;
will return intervals with appropriate values for days, hours, minutes, and seconds.
Now, why aren't you getting intervals with more than 10 days? This is simple randomness. If you increase the number of rows to 200 (as above), you'll see them (in all likelihood). If you run the code multiple times, sometimes you'll see none in that range; sometimes you'll see two.
Why? You are asking how often you get a value of 240+ hours in a range of 245. Those top 5 hours account for about 2% of the range (roughly 1/50). In other words, a sample of 50 is not that big -- any given sample of 50 random values can easily miss a particular 5-hour slice of the range.
Plus, without justify_interval(), you are likely to miss those anyway because they may show up as 9 days with an hours component larger than 23.
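A quick way to see this for yourself (just a sketch): count how many of 50 random draws land in the top 5 hours of the 245-hour range; run it a few times and the count will usually come out as 0, 1 or 2.
select count(*) as draws_in_top_5_hours
from (select random() as r from generate_series(1, 50)) as t
where r > 240.0 / 245.0;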
Try this:
select justify_hours(random() * (interval '245 hours'))
FROM generate_series(1, 50)
See Postgres Documentation for an explanation of the justify_* functions.
One option would be to use an interval of one hour and multiply it by 245 and by a random number between 0 and 1, generated per row of the series:
select random() * 245 * interval '1 hour'
from generate_series(1, 50);
I can see that the other answers suggest using justify_interval. If you just want a series of intervals between 0 and 245 hours (245 hours corresponding to 10 days and 5 hours), then my answer should suffice.
I am currently working on a Firebird database which stores time slots (like appointments). In the data, the time slot is stored as an integer, but the value is not the slot time itself.
Example:
Slot Time: 11am
Database Value: 660
The database value represents the number of minutes since midnight, so 11am is 660 minutes from midnight, and 12 noon is 720.
Question: How, in Firebird, can I convert this 660 to display as 1100 (still an integer), or 540 as 900 (basically 9am)?
What you have stored in the database is minutes since the start of the day, so you just divide by 60 to get hours. Keep in mind that in SQL, if both operands of a division are integers, you get an integer answer, not a decimal! Something like the following should work for you (assuming the number of minutes is stored in a timeslot field):
SELECT
    -- whole hours (integer division, since timeslot is an integer)
    cast(timeslot / 60 as varchar(2)) ||
    -- remaining minutes, zero-padded to two digits
    case
        when mod(timeslot, 60) < 10 then '0' || mod(timeslot, 60)
        else mod(timeslot, 60)
    end
FROM t
This should give you 1130 for 11:30 am, not 1150 (i.e. 11.5 hours).
Also see the DATEADD function.
First off, I'd store it in the database as a TIME column instead of a number of minutes. Note that on a technical level Firebird stores TIME as the number of 100-microsecond units since midnight. However, if you really want to store it as a number of minutes, then you can use:
From minutes to TIME using DATEADD:
DATEADD(x MINUTE TO TIME'00:00')
Time to minutes using DATEDIFF:
DATEDIFF(MINUTE FROM TIME'00:00' TO y)
So:
SELECT
DATEADD(660 MINUTE TO TIME'00:00'),
DATEDIFF(MINUTE FROM TIME'00:00' TO TIME'11:00')
FROM RDB$DATABASE
Will return: 11:00:00.000, 660
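If the result really has to stay an integer (660 displayed as 1100, as in the question), a sketch using plain integer arithmetic, assuming the minutes are in a timeslot column as in the earlier answer:
select timeslot / 60 * 100 + mod(timeslot, 60) as hhmm
from t;
-- 660 -> 1100, 540 -> 900, 690 -> 1130 (integer division drops the remainder)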
I am using Postgres v9.2.6.
I have a system with lots of devices that take measurements. These measurements are stored in a table with three fields:
device_id
measurement (Indexed)
time (Indexed)
There could be 10 million measurements in a single year. Most of the time the user is only interested in 100 min/max pairs over equal intervals within a certain period, for example the last 24 hours or the last 53 weeks. To get these 100 mins and maxes, the period is divided into 100 equal intervals and the min and max are extracted from each interval. What would you recommend as the most efficient approach to query this data? So far I have tried the following query:
WITH periods AS (
    SELECT time.start AS st,
           time.start + (interval '1 year' / 100) AS en
    FROM generate_series(now() - interval '1 year', now(), interval '1 year' / 100) AS time(start)
)
SELECT *
FROM sample_data
JOIN periods
  ON created_at BETWEEN periods.st AND periods.en
 AND customer_id = 23
WHERE sample_data.id = (SELECT id
                        FROM sample_data
                        WHERE created_at BETWEEN periods.st AND periods.en
                        ORDER BY sample ASC
                        LIMIT 1)
This test approach took over a minute for 1 million points on a MacBook Pro.
Thanks...
Sorry about that. It was actually my question, and it looks like the author of this post caught a cold, so I cannot ask him to edit it. I've posted a better question here - Slow PostgreSQL Query for Mins and Maxs within equal intervals in a time period. Could you please close this question?
Possible Duplicate:
What is the fastest way to truncate timestamps to 5 minutes in Postgres?
Postgresql SQL GROUP BY time interval with arbitrary accuracy (down to milli seconds)
I want to aggregate data at 5-minute intervals in PostgreSQL. If I use the date_trunc() function, I can aggregate data at hourly, monthly, daily, weekly, etc. intervals, but not at a specific interval like 5 minutes or 5 days.
select date_trunc('hour', date1), count(*) from table1 group by 1;
How can we achieve this in PostgreSQL?
SELECT date_trunc('hour', date1) AS hour_stump
, (extract(minute FROM date1)::int / 5) AS min5_slot
, count(*)
FROM table1
GROUP BY 1, 2
ORDER BY 1, 2;
You could GROUP BY two columns: a timestamp truncated to the hour and a 5-minute-slot.
The example produces slots 0 - 11. Add 1 if you prefer 1 - 12.
I cast the result of extract() to integer, so the division / 5 truncates fractional digits. The result:
minute 0 - 4 -> slot 0
minute 5 - 9 -> slot 1
etc.
This query only returns values for those 5-minute slots where values are found. If you want a value for every slot or if you want a running sum over 5-minute slots, consider this related answer:
PostgreSQL: running count of rows for a query 'by minute'
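If you'd rather see each 5-minute slot as a single timestamp instead of the (hour, slot) pair, a possible variation of the query above (same table1/date1 names; a sketch, not tested against your data):
SELECT date_trunc('hour', date1)
       + (extract(minute FROM date1)::int / 5) * interval '5 min' AS slot_start
     , count(*)
FROM   table1
GROUP  BY 1
ORDER  BY 1;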
Here's a simple query you can either wrap in a function or cut and paste all over the place:
select now()::timestamp(0), (extract(epoch from now()::timestamptz(0)-date_trunc('d',now()))::int)/60;
It'll give you the current time, and a bucket number: the seconds since the start of the day divided by 60, so each bucket is one minute wide. To bucket every 5 minutes, divide by 300 instead, and so on. To make it group by seconds since the start of the year, the hour, or whatever else, change the 'd' in date_trunc.
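Spelled out as a 5-minute grouping, a sketch using the table1/date1 names from the question: the seconds are floored into 300-second buckets and turned back into a timestamp for readability.
select to_timestamp(floor(extract(epoch from date1) / 300) * 300) as bucket_start,
       count(*)
from table1
group by 1
order by 1;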