I am using PostgreSQL version 8.1. I have a table as follows:
datetime | usage
-----------------------+----------
2015-12-16 02:01:45+00 | 71.615
2015-12-16 03:14:42+00 | 43.000
2015-12-16 01:51:43+00 | 25.111
2015-12-17 02:05:26+00 | 94.087
I would like to sum the values in the usage column based on the date in the datetime column.
Simply, I would like the output to look as below:
datetime | usage
-----------------------+----------
2015-12-16 | 139.726
2015-12-17 | 94.087
I have tried SELECT dateTime::DATE, usage, SUM(usage) FROM tableName GROUP BY dateTime::DATE, lngusage; which does not perform as expected. Any assistance would be appreciated. Thanks in advance.
The query below should give you the desired result:
select to_char(datetime, 'YYYY-MM-DD') as day, sum(usage)
from tableName
group by day
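Equivalently, a plain cast is enough in PostgreSQL; a minimal sketch using the question's own names:
select datetime::date as day, sum(usage)
from tableName
group by datetime::date;
This is also the fix for the query in the question: group by the cast expression only, and drop the bare usage column from the select list.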
This one is for PostgreSQL; I see you added a MySQL tag as well.
SELECT
    dt,
    SUM(usage)
FROM (
    SELECT
        DATE_TRUNC('day', datetime) AS dt,
        usage
    FROM
        tableName
) t
GROUP BY
    dt
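Since the MySQL tag was added, here is a minimal MySQL sketch of the same aggregation (usage is a reserved word in MySQL, hence the backticks):
SELECT DATE(`datetime`) AS dt, SUM(`usage`) AS `usage`
FROM tableName
GROUP BY DATE(`datetime`);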
SELECT to_char(datetime, 'YYYY-MM-DD'), sum(usage)
FROM tableName
GROUP BY to_char(datetime, 'YYYY-MM-DD')
In addition, you could use a window function:
SELECT datetime,
       SUM(usage) OVER (PARTITION BY CAST(datetime AS DATE)) AS usage
FROM TableName
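Note that the window version returns one row per source row, each carrying its day's total. A sketch to collapse it to one row per day, assuming the same table:
SELECT DISTINCT CAST(datetime AS DATE) AS day,
       SUM(usage) OVER (PARTITION BY CAST(datetime AS DATE)) AS usage
FROM TableName;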
I need to find the average of the order timestamps:
Order_Date
2022-06-02 15:40:00 UTC
2022-06-07 11:01:00 UTC
2022-06-21 10:55:00 UTC
2022-06-23 14:44:00 UTC
Outcome:
the average Order_Date
Just apply the AVG() aggregate function over your entire table:
SELECT AVG(Order_Date) AS Avg_Order_Date
FROM yourTable;
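Be aware that AVG() directly over a timestamp column is not accepted by every engine. If yours rejects it, a hedged PostgreSQL-style sketch is to average the epoch seconds and convert back:
SELECT to_timestamp(AVG(EXTRACT(EPOCH FROM Order_Date))) AS Avg_Order_Date
FROM yourTable;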
An average timestamp is an unusual ask! But anyway, formally you can do the following:
select
timestamp_seconds(cast(avg(unix_seconds(timestamp(Order_date))) as int64)) as average_Order_Date
from your_table
If applied to the sample data in your question, the output is 2022-06-13 19:05:00 UTC.
Note: the supported signatures for AVG are AVG(INT64), AVG(FLOAT64), AVG(NUMERIC), AVG(BIGNUMERIC), and AVG(INTERVAL); that is why you need all this back-and-forth "translation".
If what you actually want is the average time between consecutive orders, you can use LAG():
WITH CTE as
(
    SELECT Order_Date, LAG(Order_Date, 1) OVER(ORDER BY Order_Date ASC) as Datelag
    FROM your_table
),
CTE2 as
(
    SELECT Order_Date, timestamp_diff(Order_Date, Datelag, hour) as Datedif
    FROM CTE
)
SELECT AVG(Datedif)
FROM CTE2
I have a table in Postgres with timestamps:
timestamp
2022-01-01 00:52:53
2022-01-01 00:57:12
...
2022-02-13 11:00:31
2022-02-13 16:45:10
How can I select the timestamp closest to max timestamp? Meaning, I want the timestamp 2022-02-13 11:00:31.
I am looking for something like max(timestamp) - 1 so I can do this on a recurring basis. Thank you
You can do:
select *
from (
select *,
rank() over(order by timestamp desc) as rk
from t
) x
where rk = 2
See running example at DB Fiddle.
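If the maximum timestamp can occur more than once, ties would make rank() skip 2 entirely; a sketch using dense_rank() instead:
select *
from (
  select *,
         dense_rank() over(order by timestamp desc) as rk
  from t
) x
where rk = 2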
I think the following query might meet your requirements:
SELECT MAX(date_col) FROM test WHERE date_col < (SELECT MAX(date_col) from test);
See DB Fiddle
I have a Hive table that has timestamps in string format, as below:
20190516093836, 20190304125015, 20181115101358
I want to get row count with an aggregate timestamp into hourly as below
date_time           | count
--------------------+------
2019-05-16 00:00:00 | 23
2019-05-16 01:00:00 | 64
I followed several links like this one but have been unable to generate the desired results.
This is my final query:
SELECT
DATE_PART('day', b.date_time) AS date_prt,
DATE_PART('hour', b.date_time) AS hour_prt,
COUNT(*)
FROM
(SELECT
from_unixtime(unix_timestamp(`timestamp`, "yyyyMMddHHmmss")) AS date_time
FROM table_name
WHERE from_unixtime(unix_timestamp(`timestamp`, "yyyyMMddHHmmss"))
BETWEEN '2018-12-10 07:02:30' AND '2018-12-12 08:02:30') b
GROUP BY
date_prt, hour_prt
I am hoping for some guidance; thanks in advance.
You can extract date_time already in the required 'yyyy-MM-dd HH:00:00' format. I prefer using regexp_replace:
SELECT
date_time,
COUNT(*) as `count`
FROM
(SELECT
regexp_replace(`timestamp`, '^(\\d{4})(\\d{2})(\\d{2})(\\d{2})(\\d{2})(\\d{2})$','$1-$2-$3 $4:00:00') AS date_time
FROM table_name
WHERE regexp_replace(`timestamp`, '^(\\d{4})(\\d{2})(\\d{2})(\\d{2})(\\d{2})(\\d{2})$','$1-$2-$3 $4:$5:$6')
BETWEEN '2018-12-10 07:02:30' AND '2018-12-12 08:02:30') b
GROUP BY
date_time
This will also work:
from_unixtime(unix_timestamp('20190516093836', "yyyyMMddHHmmss"),'yyyy-MM-dd HH:00:00') AS date_time
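Putting that into the full aggregation, a sketch of the complete query (older Hive versions do not allow grouping by a select alias, so the expression is repeated):
SELECT from_unixtime(unix_timestamp(`timestamp`, 'yyyyMMddHHmmss'), 'yyyy-MM-dd HH:00:00') AS date_time,
       COUNT(*) AS `count`
FROM table_name
GROUP BY from_unixtime(unix_timestamp(`timestamp`, 'yyyyMMddHHmmss'), 'yyyy-MM-dd HH:00:00')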
I need two columns: one showing 'date' and the other showing 'maximum date in table - date in row'.
I kept getting a zero in the 'datediff' column, and thought a nested select would work.
SELECT date, DATEDIFF(max_date, date) AS datediff
(SELECT MAX(date) AS max_date
FROM mytable)
FROM mytable
GROUP BY date
Currently getting this error from the above code : mismatched input '(' expecting {, ';'}(line 2, pos 2)
Correct format in the end would be:
date | datediff
--------------------------
2021-08-28 | 0
2021-07-26 | 28
2021-07-23 | 31
2021-08-11 | 17
If you want the date difference, you can use:
SELECT date, DATEDIFF(MAX(date) OVER (), date) AS datediff
FROM mytable
GROUP BY date
You can do this using the analytic function MAX() OVER():
SELECT date, MAX(date) OVER() - date FROM mytable;
Tried this here on sqlfiddle
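For completeness, the nested SELECT from the question does work once it is moved into the select list as a scalar subquery; a sketch, assuming Spark SQL (which the error message suggests):
SELECT date,
       DATEDIFF((SELECT MAX(date) FROM mytable), date) AS datediff
FROM mytable
GROUP BY date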
I have the following need:
I need to count how many activations occurred on each date.
Let's say the table looks like this:
tbl_activates (
    PersonId int,
    ActivatedDate datetime
)
The result set should look something like this:
counted_activation | ActivatedDate
5 | 2009-04-30
7 | 2009-04-29
5 | 2009-04-28
7 | 2009-04-27
... and so on
Does anyone know the best way to do this? The dates come in the format '2011-09-06 15:47:52.110'; I need to group on the date alone, without the time (a summary for each date).
You can use count(distinct ..), and if ActivatedDate is a datetime you can take just the date part:
select Cast(ActivatedDate AS date), count(distinct PersonId)
from my_table
group by Cast(ActivatedDate AS date)
You can use the to_char function to remove the time from the date:
select count(*) counted_activation,
       to_char(activatedDate, 'yyyy-mm-dd') ActDate
from table1
group by to_char(activatedDate, 'yyyy-mm-dd');
Use GROUP BY and COUNT, and use CONVERT to reduce the datetime to a date only:
SELECT CONVERT(DATE, ActivatedDate), COUNT(PersonId)
FROM [table]
GROUP BY CONVERT(DATE, ActivatedDate)
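A consolidated sketch matching the sample output (SQL Server-style CONVERT; ORDER BY added on the assumption that newest dates come first, as in the example):
SELECT COUNT(*) AS counted_activation,
       CONVERT(DATE, ActivatedDate) AS ActivatedDate
FROM tbl_activates
GROUP BY CONVERT(DATE, ActivatedDate)
ORDER BY CONVERT(DATE, ActivatedDate) DESC;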