I wonder how to calculate the interval between these datetimes. I've been trying the DATETIME_DIFF() function in BigQuery SQL, but unsuccessfully.
started_at              | ended_at
------------------------|------------------------
2020-04-26 17:45:00 UTC | 2020-04-26 18:12:00 UTC
2020-04-17 17:08:00 UTC | 2020-04-17 17:17:00 UTC
2020-04-01 17:54:00 UTC | 2020-04-01 18:08:00 UTC
I would like to add a new column with the duration of these trips.
started_at and ended_at look like TIMESTAMP type, so use TIMESTAMP_DIFF() like below.
WITH sample_data AS (
SELECT TIMESTAMP '2020-04-26 17:45:00 UTC' started_at, TIMESTAMP '2020-04-26 18:12:00 UTC' ended_at UNION ALL
SELECT '2020-04-17 17:08:00 UTC', '2020-04-17 17:17:00 UTC' UNION ALL
SELECT '2020-04-01 17:54:00 UTC', '2020-04-01 18:08:00 UTC'
)
SELECT TIMESTAMP_DIFF(ended_at, started_at, MINUTE) AS duration FROM sample_data;
If they are STRING type by any chance, cast them to TIMESTAMP like below instead.
TIMESTAMP_DIFF(TIMESTAMP(ended_at), TIMESTAMP(started_at), MINUTE)
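The same arithmetic can be sanity-checked outside BigQuery. Below is a minimal Python sketch (not BigQuery itself) that reproduces what TIMESTAMP_DIFF(..., MINUTE) returns for the sample rows, assuming the values are UTC timestamps:

```python
from datetime import datetime, timezone

# The sample trips from the question (assumed UTC timestamps).
trips = [
    ("2020-04-26 17:45:00", "2020-04-26 18:12:00"),
    ("2020-04-17 17:08:00", "2020-04-17 17:17:00"),
    ("2020-04-01 17:54:00", "2020-04-01 18:08:00"),
]

def minutes_between(start: str, end: str) -> int:
    """Whole minutes between two UTC timestamps, like TIMESTAMP_DIFF(end, start, MINUTE)."""
    fmt = "%Y-%m-%d %H:%M:%S"
    s = datetime.strptime(start, fmt).replace(tzinfo=timezone.utc)
    e = datetime.strptime(end, fmt).replace(tzinfo=timezone.utc)
    return int((e - s).total_seconds() // 60)

durations = [minutes_between(s, e) for s, e in trips]
print(durations)  # [27, 9, 14]
```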
Related
I have table in Oracle SQL like below:
ID | date | place
-----------------------------
123 | 1610295784376 | OBJ_1
444 | 1748596758291 | OBJ_1
567 | 8391749204754 | OBJ_2
888 | 1747264526789 | OBJ_3
ID - ID of client
date - date in Unix timestamp in UTC
place - place of contact with client
And I need to aggregate the above data to achieve the results below, so I need to:
convert the Unix timestamp (UTC) in column "date" to a normal date as below
calculate the min and max date for each value of column "place"
min_date   | max_date   | distinct_place
-----------|------------|---------------
2022-01-05 | 2022-02-15 | OBJ_1
2022-02-10 | 2022-03-20 | OBJ_2
2021-10-15 | 2021-11-21 | OBJ_3
You can use:
SELECT TIMESTAMP '1970-01-01 00:00:00 UTC'
+ MIN(date_column) * INTERVAL '0.001' SECOND(3)
AS min_date,
TIMESTAMP '1970-01-01 00:00:00 UTC'
+ MAX(date_column) * INTERVAL '0.001' SECOND(3)
AS max_date,
place
FROM table_name
GROUP BY place;
Note: the (3) after SECOND is optional; it just explicitly specifies the precision of the fractional seconds.
or:
SELECT TIMESTAMP '1970-01-01 00:00:00 UTC'
+ NUMTODSINTERVAL( MIN(date_column) / 1000, 'SECOND')
AS min_date,
TIMESTAMP '1970-01-01 00:00:00 UTC'
+ NUMTODSINTERVAL( MAX(date_column) / 1000, 'SECOND')
AS max_date,
place
FROM table_name
GROUP BY place;
Which, for the sample data:
CREATE TABLE table_name (ID, date_column, place) AS
SELECT 123, 1610295784376, 'OBJ_1' FROM DUAL UNION ALL
SELECT 444, 1748596758291, 'OBJ_1' FROM DUAL UNION ALL
SELECT 567, 1391749204754, 'OBJ_2' FROM DUAL UNION ALL -- Fixed leading digit
SELECT 888, 1747264526789, 'OBJ_3' FROM DUAL;
Both output:
MIN_DATE                          | MAX_DATE                          | PLACE
----------------------------------|-----------------------------------|------
2021-01-10 16:23:04.376000000 UTC | 2025-05-30 09:19:18.291000000 UTC | OBJ_1
2014-02-07 05:00:04.754000000 UTC | 2014-02-07 05:00:04.754000000 UTC | OBJ_2
2025-05-14 23:15:26.789000000 UTC | 2025-05-14 23:15:26.789000000 UTC | OBJ_3
db<>fiddle here
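As a cross-check of the epoch arithmetic, here is a small Python sketch (standing in for the Oracle expression) that applies the same conversion the INTERVAL multiplication performs, using the sample values above:

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def ms_to_utc(ms: int) -> datetime:
    # Mirrors: TIMESTAMP '1970-01-01 00:00:00 UTC' + ms * INTERVAL '0.001' SECOND
    # timedelta handles the milliseconds exactly, avoiding float rounding.
    return EPOCH + timedelta(milliseconds=ms)

print(ms_to_utc(1610295784376).isoformat())
# 2021-01-10T16:23:04.376000+00:00
```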
I have a use case where I want to order by time, but at a certain resolution. For example my schema saves timestamps out to 9 decimals (nanosecond precision), but I only want to order by minutes and use a different field to order within that minute. I tried this
select * from myTable order by (cast(myTimeStamp at time zone 'UTC' as timestamp) - to_timestamp('01-JAN-01'))/1000000000*60 desc, id desc;
To convert the timestamp into epoch and then divide to get minute precision. But this gives the wrong ordering. Also, when I do a dump on the above command to understand the returned data type, I see data type: typ=190, and I can't find that type in the Oracle docs, which adds to my confusion.
So I'm wondering what I'm missing. It should be possible to order by a timestamp truncated to the minute; any help is appreciated.
Convert the TIMESTAMP WITH TIME ZONE to the UTC time zone so that you can compare identical times and then TRUNCate it back to the start of the minute:
SELECT *
FROM myTable
ORDER BY
TRUNC(myTimeStamp at time zone 'UTC', 'MI') DESC,
id DESC;
Which, for the sample data:
CREATE TABLE myTable(
id NUMBER,
myTimestamp TIMESTAMP WITH TIME ZONE
);
INSERT INTO myTable(id, myTimestamp)
SELECT 1, TIMESTAMP '1970-01-01 00:00:00 UTC' FROM DUAL UNION ALL
SELECT 2, TIMESTAMP '1970-01-01 00:00:00 America/New_York' FROM DUAL UNION ALL
SELECT 3, TIMESTAMP '1970-01-01 00:00:00 Asia/Hong_Kong' FROM DUAL UNION ALL
SELECT 4, TIMESTAMP '1970-01-01 00:00:00 Europe/Paris' FROM DUAL UNION ALL
SELECT 5, TIMESTAMP '1970-01-01 01:00:00 UTC' FROM DUAL UNION ALL
SELECT 6, TIMESTAMP '1970-01-01 01:00:00 America/New_York' FROM DUAL UNION ALL
SELECT 7, TIMESTAMP '1970-01-01 01:00:00 Europe/Berlin' FROM DUAL;
Outputs:
ID | MYTIMESTAMP
---|-------------------------------------------
 6 | 01-JAN-70 01.00.00.000000 AMERICA/NEW_YORK
 2 | 01-JAN-70 00.00.00.000000 AMERICA/NEW_YORK
 5 | 01-JAN-70 01.00.00.000000 UTC
 7 | 01-JAN-70 01.00.00.000000 EUROPE/BERLIN
 1 | 01-JAN-70 00.00.00.000000 UTC
 4 | 01-JAN-70 00.00.00.000000 EUROPE/PARIS
 3 | 01-JAN-70 00.00.00.000000 ASIA/HONG_KONG
If you want to see the values converted to UTC that are being used in the sorting process then just add it in the output:
SELECT t.*,
TO_CHAR(TRUNC(myTimeStamp at time zone 'UTC', 'MI'), 'YYYY-MM-DD HH24:MI:SS')
AS converted_ts
FROM myTable t
ORDER BY
TRUNC(myTimeStamp at time zone 'UTC', 'MI') DESC,
id DESC;
Which outputs:
ID | MYTIMESTAMP                                | CONVERTED_TS
---|--------------------------------------------|--------------------
 6 | 01-JAN-70 01.00.00.000000 AMERICA/NEW_YORK | 1970-01-01 06:00:00
 2 | 01-JAN-70 00.00.00.000000 AMERICA/NEW_YORK | 1970-01-01 05:00:00
 5 | 01-JAN-70 01.00.00.000000 UTC              | 1970-01-01 01:00:00
 7 | 01-JAN-70 01.00.00.000000 EUROPE/BERLIN    | 1970-01-01 00:00:00
 1 | 01-JAN-70 00.00.00.000000 UTC              | 1970-01-01 00:00:00
 4 | 01-JAN-70 00.00.00.000000 EUROPE/PARIS     | 1969-12-31 23:00:00
 3 | 01-JAN-70 00.00.00.000000 ASIA/HONG_KONG   | 1969-12-31 16:00:00
If you just use TRUNC without converting to a common time zone then it will order based on the date and time components without considering the relative difference in the time zones.
db<>fiddle here
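The sort logic itself (normalize to UTC, truncate to the minute, then break ties on id) can be sketched in Python as a cross-check. This is a hypothetical illustration using fixed UTC offsets in place of the named time zones, with a subset of the sample rows:

```python
from datetime import datetime, timedelta, timezone

# Subset of the sample rows; -5 stands in for America/New_York in January 1970.
rows = [
    (1, datetime(1970, 1, 1, 0, 0, tzinfo=timezone.utc)),
    (2, datetime(1970, 1, 1, 0, 0, tzinfo=timezone(timedelta(hours=-5)))),
    (5, datetime(1970, 1, 1, 1, 0, tzinfo=timezone.utc)),
]

def minute_key(ts: datetime) -> datetime:
    # AT TIME ZONE 'UTC' then TRUNC(..., 'MI'):
    # normalize to UTC, zero out seconds and fractional seconds.
    u = ts.astimezone(timezone.utc)
    return u.replace(second=0, microsecond=0)

# ORDER BY TRUNC(... AT TIME ZONE 'UTC', 'MI') DESC, id DESC
ordered = sorted(rows, key=lambda r: (minute_key(r[1]), r[0]), reverse=True)
print([r[0] for r in ordered])  # [2, 5, 1]
```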
Why don't you then just truncate the timestamp to minutes?
SQL> alter session set nls_date_format = 'dd.mm.yyyy hh24:Mi:ss';
Session altered.
SQL> select systimestamp col_1,
2 trunc(systimestamp, 'mi') col_2
3 from dual;
COL_1 COL_2
---------------------------------------- -------------------
29.12.21 20:32:50,178000 +01:00 29.12.2021 20:32:00
SQL>
Then you'd
order by trunc(timestamp_column, 'mi'),
yet_another_column
I have a date something like below :
Thu Nov 29 18:00:00 CST 2018
Thu Apr 26 01:00:00 BST 2018
I need to convert it to 8AM UTC in Oracle.
How do I do this?
It is a string, not a date.
The referred link deals with proper dates and has no accepted answer.
Thanks in advance!
Since it's a string you could use regexp_replace.
regexp_replace(nmuloc, '[[:digit:]]{2}:[[:digit:]]{2}:[[:digit:]]{2} [A-Z]{3}', '08:00:00 GMT')
Oracle Setup
CREATE TABLE table_name ( datetime TIMESTAMP WITH TIME ZONE );
INSERT INTO table_name
SELECT TIMESTAMP '2018-11-29 18:00:00 CST' FROM DUAL UNION ALL
SELECT TIMESTAMP '2018-04-26 01:00:00 Europe/London' FROM DUAL UNION ALL
SELECT TIMESTAMP '2018-06-26 00:00:00 Europe/London' FROM DUAL;
Query 1:
Use datetime AT TIME ZONE 'UTC' to convert it from your time zone to UTC.
Then use TRUNC() to truncate it back to the start of the UTC day (which also casts it to a DATE).
Because it's now a DATE, use CAST( ... AS TIMESTAMP ) to get it back to a timestamp.
Then use FROM_TZ( ..., 'UTC' ) to make it a timestamp in the UTC time zone.
Then add INTERVAL '8' HOUR to make it 8am.
Like this:
SELECT FROM_TZ(
CAST(
TRUNC( datetime AT TIME ZONE 'UTC' )
AS TIMESTAMP
),
'UTC'
) + INTERVAL '8' HOUR AS utc_date_at_8am_utc
FROM table_name;
Output:
UTC_DATE_AT_8AM_UTC
--------------------------------
30-NOV-18 08.00.00.000000 AM UTC
26-APR-18 08.00.00.000000 AM UTC
25-JUN-18 08.00.00.000000 AM UTC
Note: this translates 2018-06-26 00:00:00 BST to 2018-06-25 23:00:00 UTC before truncating. So it will be the same UTC day (but not necessarily the same day in the local time zone).
Query 2
If this is an issue then just remove the initial time zone conversion:
SELECT FROM_TZ(
CAST(
TRUNC( datetime )
AS TIMESTAMP
),
'UTC'
) + INTERVAL '8' HOUR AS date_at_8am_utc
FROM table_name
Output:
DATE_AT_8AM_UTC
--------------------------------
29-NOV-18 08.00.00.000000 AM UTC
26-APR-18 08.00.00.000000 AM UTC
26-JUN-18 08.00.00.000000 AM UTC
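The same pipeline from Query 1 (convert to UTC, truncate to the UTC day, add 8 hours) can be sketched in Python as a cross-check. The -6 offset is an assumption standing in for 'CST' in the sample string:

```python
from datetime import datetime, time, timedelta, timezone

def utc_day_at_8am(ts: datetime) -> datetime:
    # Mirrors Query 1: AT TIME ZONE 'UTC' -> TRUNC to the day -> + INTERVAL '8' HOUR.
    u = ts.astimezone(timezone.utc)
    return datetime.combine(u.date(), time(8, 0), tzinfo=timezone.utc)

cst = timezone(timedelta(hours=-6))  # stand-in for 'CST' in the sample string
ts = datetime(2018, 11, 29, 18, 0, tzinfo=cst)
print(utc_day_at_8am(ts).isoformat())  # 2018-11-30T08:00:00+00:00
```

Note how 18:00 CST crosses midnight UTC, which is exactly why Query 1 lands on 30-NOV while Query 2 keeps 29-NOV.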
Suppose I have this query:
SELECT ga_channelGrouping, ga_sourceMedium,ga_campaign, SUM(ga_sessions) as sessions,
SUM(ga_sessionDuration)/SUM(ga_sessions) as avg_sessionDuration,
SUM(ga_users)as Users, SUM(ga_newUsers)as New_Users, SUM(ga_bounces)/SUM(ga_sessions)
AS ga_bounceRate, SUM(ga_pageviews)/SUM(ga_sessions)as pageViews_per_sessions,
SUM( ga_transactions)/SUM(ga_sessions) AS ga_conversionRate
FROM db.table
group by ga_channelGrouping, ga_sourceMedium,ga_campaign
How do I find rolling 30 days of data in BigQuery? My DATE column value is of this format: 2018-06-19 11:00:00 UTC
You can use the DATE_ADD or DATE_SUB functions to shift date values and TIMESTAMP_ADD, TIMESTAMP_SUB to shift timestamp values.
So you could try:
SELECT ga_channelGrouping, ga_sourceMedium,ga_campaign, SUM(ga_sessions) as sessions,
SUM(ga_sessionDuration)/SUM(ga_sessions) as avg_sessionDuration,
SUM(ga_users)as Users, SUM(ga_newUsers)as New_Users, SUM(ga_bounces)/SUM(ga_sessions)
AS ga_bounceRate, SUM(ga_pageviews)/SUM(ga_sessions)as pageViews_per_sessions,
SUM( ga_transactions)/SUM(ga_sessions) AS ga_conversionRate
FROM db.table
WHERE your_date_column >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 24*30 HOUR)
group by ga_channelGrouping, ga_sourceMedium,ga_campaign
Note: TIMESTAMP_SUB does not accept date parts larger than DAY (such as MONTH or YEAR); the 24*30 hours here is equivalent to going back 30 days.
EDIT: If you want to roll back 30 days regardless of the time of the day you can do the following:
WHERE your_date_column >= TIMESTAMP_TRUNC(TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 24*30 HOUR), DAY)
OR
WHERE CAST(your_date_column AS DATE) >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
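The cutoff these WHERE clauses compute can be mirrored in Python as a sketch of the same arithmetic (not BigQuery itself): go back 30 days from now, then truncate to the start of that UTC day.

```python
from datetime import datetime, timedelta, timezone

now = datetime.now(timezone.utc)

# Equivalent of TIMESTAMP_TRUNC(TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 24*30 HOUR), DAY):
# subtract 30 days, then zero out the time-of-day components.
cutoff = (now - timedelta(days=30)).replace(hour=0, minute=0, second=0, microsecond=0)

def in_window(ts: datetime) -> bool:
    """True if ts falls inside the trailing 30-day window."""
    return ts >= cutoff
```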
How do I find rolling 30 days of data from Big Query. My DATE column value is of this format: 2018-06-19 11:00:00 UTC
First, I wanted to point out that aggregating the last 30 days is quite different from a rolling 30 days - so the answer below focuses on rolling 30 days rather than just the last 30 days.
Below is for BigQuery Standard SQL and assumes that your date column is named your_date_column and is of TIMESTAMP data type
#standardSQL
SELECT
your_date_column, -- data type of TIMESTAMP with value like 2018-06-19 11:00:00 UTC
ga_channelGrouping,
ga_sourceMedium,
ga_campaign,
SUM(ga_sessions) OVER(win) AS sessions,
(SUM(ga_sessionDuration) OVER(win))/(SUM(ga_sessions) OVER(win)) AS avg_sessionDuration,
SUM(ga_users) OVER(win) AS Users,
SUM(ga_newUsers) OVER(win) AS New_Users,
(SUM(ga_bounces) OVER(win))/(SUM(ga_sessions) OVER(win)) AS ga_bounceRate,
(SUM(ga_pageviews) OVER(win))/(SUM(ga_sessions) OVER(win)) AS pageViews_per_sessions,
(SUM(ga_transactions) OVER(win))/(SUM(ga_sessions) OVER(win)) AS ga_conversionRate
FROM `project.dataset.table`
WINDOW win AS (
PARTITION BY ga_channelGrouping, ga_sourceMedium, ga_campaign
ORDER BY UNIX_DATE(DATE(your_date_column))
RANGE BETWEEN 29 PRECEDING AND CURRENT ROW
)
To understand how it works, try playing with the dummy example below (for simplicity it does rolling 3 days):
#standardSQL
WITH `project.dataset.table` AS (
SELECT 1 value, TIMESTAMP '2018-06-19 11:00:00 UTC' your_date_column UNION ALL
SELECT 2, '2018-06-20 11:00:00 UTC' UNION ALL
SELECT 3, '2018-06-21 11:00:00 UTC' UNION ALL
SELECT 4, '2018-06-22 11:00:00 UTC' UNION ALL
SELECT 5, '2018-06-23 11:00:00 UTC' UNION ALL
SELECT 6, '2018-06-24 11:00:00 UTC' UNION ALL
SELECT 7, '2018-06-25 11:00:00 UTC' UNION ALL
SELECT 8, '2018-06-26 11:00:00 UTC' UNION ALL
SELECT 9, '2018-06-27 11:00:00 UTC' UNION ALL
SELECT 10, '2018-06-28 11:00:00 UTC'
)
SELECT
your_date_column,
value,
SUM(value) OVER(win) rolling_value
FROM `project.dataset.table`
WINDOW win AS (ORDER BY UNIX_DATE(DATE(your_date_column)) RANGE BETWEEN 2 PRECEDING AND CURRENT ROW)
ORDER BY your_date_column
where result is
Row your_date_column value rolling_value
1 2018-06-19 11:00:00 UTC 1 1
2 2018-06-20 11:00:00 UTC 2 3
3 2018-06-21 11:00:00 UTC 3 6
4 2018-06-22 11:00:00 UTC 4 9
5 2018-06-23 11:00:00 UTC 5 12
6 2018-06-24 11:00:00 UTC 6 15
7 2018-06-25 11:00:00 UTC 7 18
8 2018-06-26 11:00:00 UTC 8 21
9 2018-06-27 11:00:00 UTC 9 24
10 2018-06-28 11:00:00 UTC 10 27
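The rolling window can be cross-checked with a small Python sketch that mirrors RANGE BETWEEN 2 PRECEDING AND CURRENT ROW over day numbers, reproducing the rolling_value column above:

```python
from datetime import date, timedelta

# The dummy data: value i lands on 2018-06-(18+i), one row per day.
rows = [(date(2018, 6, 18) + timedelta(days=i), i) for i in range(1, 11)]

def rolling_sum(rows, days):
    # For each row, sum values whose day is within (days-1) days before it,
    # i.e. RANGE BETWEEN (days-1) PRECEDING AND CURRENT ROW over UNIX_DATE(day).
    out = []
    for d, _ in rows:
        total = sum(v for dd, v in rows if 0 <= (d - dd).days <= days - 1)
        out.append(total)
    return out

print(rolling_sum(rows, 3))  # [1, 3, 6, 9, 12, 15, 18, 21, 24, 27]
```

Unlike ROWS frames, the RANGE frame here is keyed on the day number, so gaps in the calendar would shrink the window rather than pull in older rows.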
I'm trying to insert some data into a table, but I am receiving data compatibility errors. The column I am inserting into is a TIMESTAMP(6) column, while the column I am drawing data from is a TIMESTAMP WITH TIME ZONE column. I know how to use CAST to convert a TIMESTAMP to a TIMESTAMP WITH TIME ZONE, but not the inverse. Is there a way I can just strip out the 'UTC'?
Date I am starting out with:
'20-MAY-18 09.00.00.000000000 AM UTC'
Date I want to end up with:
'20-MAY-18 09.00.00.000000000 AM'
What I have tried thus far:
select to_date('20-MAY-18 09.00.00.000000000 AM UTC', 'dd-mon-yy hh:mi:ss A.M.') from dual;
However, I receive an error and just can't seem to figure out what I'm doing wrong. Thanks in advance!
You can cast a timestamp with time zone to a plain timestamp:
cast(<your_value> as timestamp)
so with your value:
select cast(
to_timestamp_tz('20-MAY-18 09.00.00.000000000 AM UTC', 'DD-MON-RR HH:MI:SS.FF AM TZR')
as timestamp)
from dual;
CAST(TO_TIMESTAMP_T
-------------------
2018-05-20 09:00:00
If you insert a timestamp with time zone value into a plain timestamp column then it will be converted automatically, just losing its time zone information.
If the values might not always be UTC then you can convert them to UTC and to a plain timestamp in one go with sys_extract_utc():
with cte (tsz) as (
select timestamp '2018-05-20 09:00:00.0 UTC' from dual
union all select timestamp '2018-05-20 13:00:00.0 America/New_York' from dual
)
select tsz, cast(tsz as timestamp) as ts, sys_extract_utc(tsz) utc
from cte;
TSZ TS UTC
------------------------------ ------------------- -------------------
2018-05-20 09:00:00.000 +00:00 2018-05-20 09:00:00 2018-05-20 09:00:00
2018-05-20 13:00:00.000 -04:00 2018-05-20 13:00:00 2018-05-20 17:00:00
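For reference, SYS_EXTRACT_UTC's behavior can be mirrored in Python (an illustrative sketch, not Oracle): shift the value to UTC, then drop the zone to get a plain timestamp. The -4 offset is an assumption standing in for America/New_York in May.

```python
from datetime import datetime, timedelta, timezone

def sys_extract_utc_like(tsz: datetime) -> datetime:
    # Like SYS_EXTRACT_UTC: convert to UTC, then strip the time zone
    # so the result behaves like a plain TIMESTAMP.
    return tsz.astimezone(timezone.utc).replace(tzinfo=None)

ny = timezone(timedelta(hours=-4))  # stand-in for America/New_York (EDT)
print(sys_extract_utc_like(datetime(2018, 5, 20, 13, 0, tzinfo=ny)))
# 2018-05-20 17:00:00
```

Compare with a bare CAST(... AS TIMESTAMP), which keeps the local wall-clock time (13:00) instead of shifting it to 17:00 UTC.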