SELECT average timestamp difference in JPA with Postgres and HSQL

I have a table with two timestamp columns, startTime and stopTime, and I would like to calculate the average difference of these timestamps in my table. I have solutions that work in Postgres and in HSQLDB (which I use for local unit testing), but not one that works in both, and I'm having trouble figuring out a common solution.
Postgres:
select EXTRACT(EPOCH FROM(avg(m.stopTime - m.startTime))) from Measures m
HSQL:
select avg(FUNC('UNIX_TIMESTAMP', m.stopTime) - FUNC('UNIX_TIMESTAMP', m.startTime)) from Measures m
Is there a way to use the same query for both databases? All of the functions I've found seem to only be supported in one database or the other.
I think my main problem is that there isn't a common function to convert a timestamp to seconds in order to perform the calculation. EPOCH is only compatible with Postgres and UNIX_TIMESTAMP is only compatible with HSQL.

The crux of your problem is converting the dates and timestamps down to a number of seconds. Instead of using epoch, I'll use a Julian date for the date part: convert the Julian date to seconds, then add the number of seconds since midnight for each timestamp being compared. The following query does not calculate the difference; it simply converts a timestamp to a number that behaves the same on both platforms, so you'll have to apply it once to each timestamp being compared. Note: replace current_timestamp with m.startTime and m.stopTime respectively.
select
  (to_number(to_char(current_timestamp,'J'),'99999999999999999999') * 86400) /* convert from Julian days to Julian seconds */
+ (to_number(to_char(current_timestamp,'HH24'),'99') * 3600)                 /* hours to seconds */
+ (to_number(to_char(current_timestamp,'MI'),'99') * 60)                     /* minutes to seconds */
+  to_number(to_char(current_timestamp,'SS'),'99')                           /* add in the seconds */
Ugly as sin, I know; perhaps you can rewrite it more cleanly as a function, but as I don't know HSQL's full feature set, I'll leave it in this form rather than using a CTE or function.
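Putting that together for the original question, a sketch of the full average-difference query in plain SQL might look like the following; it assumes TO_CHAR/TO_NUMBER and the 'J', 'HH24', 'MI' and 'SS' format elements behave the same way on both Postgres and HSQLDB, which is worth verifying:
select avg(
      /* stopTime converted to seconds: Julian days * 86400 + seconds since midnight */
      (to_number(to_char(m.stopTime, 'J'), '99999999999999999999') * 86400
       + to_number(to_char(m.stopTime, 'HH24'), '99') * 3600
       + to_number(to_char(m.stopTime, 'MI'), '99') * 60
       + to_number(to_char(m.stopTime, 'SS'), '99'))
      /* minus startTime converted the same way */
    - (to_number(to_char(m.startTime, 'J'), '99999999999999999999') * 86400
       + to_number(to_char(m.startTime, 'HH24'), '99') * 3600
       + to_number(to_char(m.startTime, 'MI'), '99') * 60
       + to_number(to_char(m.startTime, 'SS'), '99'))
) as avg_diff_seconds
from Measures m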

Related

What do I have to change in my syntax for it to work in MS SQL Server from Informix

I'm currently updating a lot of our queries to work in our new ERP release, where we will be using MS SQL Server instead of our current Informix database. Many are just simple date-format changes, but this one I am unable to translate.
It really only is the following line:
round((GETDATE() - max(l105.dataen))::interval second(9) to second::char(10)::int
/ 60 / 60, 3)
I simply can't grasp what the part starting at the colons (::) is doing or what function it is.
I hope someone maybe can identify it.
In Informix, subtracting two DATETIME values results in an INTERVAL. The <value>::<type> notation is a shorthand for CAST(<value> AS <type>). Therefore, there are three consecutive casts:
::interval second(9) to second
::char(10)
::int
By default, if you subtract two DATETIME YEAR TO SECOND values, you will get an INTERVAL DAY(n) TO SECOND value. The first cast converts that to INTERVAL SECOND(9) TO SECOND — a number of seconds; the second cast converts the result to CHAR(10) because there isn't a direct conversion from INTERVAL to numeric types; the third cast converts the string to an INT. That gives the integer number of seconds; it is divided by 60 twice (effectively, divided by 3600) to convert the seconds into hours.
The result is then rounded to 3 decimal places.
So, the overall operation calculates the number of hours between two times.
The two times are the current time and the most recent value in the l105.dataen column (the MAX expression). Presumably, there is a GROUP BY clause somewhere in the SELECT statement that this is a part of.
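As a quick worked example: if max(l105.dataen) were 90 minutes before the current time, the three casts would yield 5400 (seconds), and 5400 / 60 / 60 rounded to three places gives 1.5 hours.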
You will likely need to use a 'time difference' function in MS SQL Server, and maybe the function allows you to control the presentation of the result as a number of hours and fractions of an hour.
Judging from the DATEDIFF function, you will need to use something like:
DATEDIFF(hh, MAX(l105.dataen), GETDATE())
However, that returns an integer value for the difference in hours. You may prefer to get the time in seconds and divide by 3600 to get a fractional value:
DATEDIFF(ss, MAX(l105.dataen), GETDATE()) / 3600.0
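To mirror the rounding to three decimal places in the original expression, that could presumably be wrapped in T-SQL's ROUND:
ROUND(DATEDIFF(ss, MAX(l105.dataen), GETDATE()) / 3600.0, 3)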
No database server was consulted to ensure the veracity of the suggested translation.

SQLite strftime() returning null with integers

I have a column named "date" in my table "productions", stored as an integer holding a Unix timestamp (example: 1548263300000), and I want to retrieve only the year. When I do:
SELECT strftime('%Y ',date) as year
FROM productions
it returns null.
When I change the column type to TEXT and store the date in string format (example: 2020-05-01), the same SQL returns "2020", which is correct and what I was looking for.
Why doesn't strftime() work with integers, since the SQLite documentation says you can work with TEXT, INTEGER and REAL for dates? How do I use date functions with integers?
Extra information:
In this tutorial, they also use strftime() with integers and it seems to work for them, so I understood that the functions are available no matter which type you use (text, int, real): link
When I use:
SELECT strftime('%Y',DATETIME(ROUND(date/ 1000), 'unixepoch'))
FROM productions;
it works fine, but I don't understand why I have to do all this with integers when text works directly.
You can use strftime() but you have to add the 'unixepoch' modifier:
strftime('%Y', date / 1000, 'unixepoch')
so your date / 1000 is recognized as the number of seconds since 1970-01-01.
From Date And Time Functions:
The "unixepoch" modifier (11) only works if it immediately follows a timestring in the DDDDDDDDDD format. This modifier causes the DDDDDDDDDD to be interpreted not as a Julian day number as it normally would be, but as Unix Time - the number of seconds since 1970.
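Applied to the table from the question, the full query would then be something along these lines (assuming the column really holds milliseconds, hence the division by 1000):
SELECT strftime('%Y', date / 1000, 'unixepoch') AS year
FROM productions;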

Why do I get an incompatible value type for my column?

I am trying to calculate the difference between two dates in an Oracle database using a JDBC connection. I followed the advice from this question, using a query like this:
SELECT CREATE_DATE - CLOSED
FROM TRANSACTIONS;
and I get the following error:
Incompatible value type specified for
column:CREATE_DATE-CLOSED. Column Type = 11 and Value Type =
8.[10176] Error Code: 10176
What should I change so I can successfully calculate the difference between the dates?
Note: CREATE_DATE and CLOSED both have the TIMESTAMP type.
The answer you found relates to the DATE datatype, but you are dealing with timestamps. While subtracting two Oracle dates returns a number, subtracting timestamps produces an INTERVAL datatype. This is probably not what you want, and, apparently, your driver does not handle this datatype properly.
For this use case, one solution is to cast the timestamps to dates before subtracting them:
select cast(create_date as date) - cast(closed as date) from transactions;
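Since subtracting two DATE values gives the difference as a number of days, multiplying by 86400 would turn that into seconds if that is what the application expects; a rough sketch:
select (cast(create_date as date) - cast(closed as date)) * 86400 as diff_seconds
from transactions;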
As was mentioned, it seems that JDBC cannot work with the INTERVAL datatype. What about extracting the expected output as a number with the EXTRACT function? If you want the number of seconds between those two timestamps, it would be:
SELECT EXTRACT(SECOND FROM (CREATE_DATE - CLOSED)) FROM TRANSACTIONS;
Here is a list of options that might be used instead of SECOND:
https://docs.oracle.com/database/121/SQLRF/functions067.htm#SQLRF00639
When we subtract one date from another, Oracle gives us the difference as a number: it's straightforward arithmetic. But when we subtract one timestamp from another - which is what you're doing - the result is an INTERVAL. Older versions of JDBC don't like the INTERVAL datatype (docs).
Here are a couple of workarounds, depending on what you want to do with the result. The first is to calculate the number of seconds from the interval result. extract second from ... only gives us the seconds component of the interval, which is fine provided none of your intervals is longer than fifty-nine seconds; longer intervals require us to extract the minutes, hours and even days as well. So that solution would be:
select t.*
, extract (day from (t.closed - t.create_date)) * 86400
+ extract (hour from (t.closed - t.create_date)) * 3600
+ extract (minute from (t.closed - t.create_date)) * 60
+ extract (second from (t.closed - t.create_date)) as no_of_secs
from transactions t
A second solution is to follow the advice in the JDBC mapping guide and turn the interval into a string:
select t.*
, cast ((t.closed - t.create_date) as varchar2(128 char)) as intrvl_str
from transactions t
The string format of an interval is verbose: INTERVAL'+000000001 04:40:59.710000'DAY(9)TO SECOND. This may not be useful on the Java side of the application, but with a regex we can turn it into a string that can be converted into a Java 8 Duration object (docs): PnDTnHnMn.nS.
select t.id
, regexp_replace(cast ((t.closed - t.create_date) as varchar2(128 char))
, 'INTERVAL''\+([0-9]+) ([0-9]{2}):([0-9]{2}):([0-9]{2})\.([0-9]+)''DAY\(9\)TO SECOND'
, 'P\1DT\2H\3M\4.\5S')
as duration
from transactions t
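On the Java side, a string in that PnDTnHnMn.nS shape should then be accepted by java.time.Duration.parse().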
There is a demo on db<>fiddle

How to convert an Epoch timestamp to a Date in Standard SQL

I didn't find any simple answer to this while I was looking around, so I thought I'd put it up here in case anyone was having the same problem as me with what should have been a trivial issue.
I was using ReDash analytics with Google's BigQuery and had turned on Standard SQL in the datasource settings. For the purposes of my query, I needed to convert a timestamp - unix time in milliseconds, as a string - to a Date format so that I could use the DATE_DIFF method.
As an example... "1494865480000" to "2017-05-15"
The difficulty was that casting and conversion were excessively strict and there seemed to be no adequate way to make it parse. See my answer down below!
(Though let me know if some SQL sensei knows a more eloquent way!)
In Standard SQL, use the TIMESTAMP_MILLIS function together with EXTRACT(DATE FROM <timestamp>):
SELECT EXTRACT(DATE FROM TIMESTAMP_MILLIS(1494865480000))
A simpler way, using DATE() with TIMESTAMP_MILLIS():
#standardSQL
SELECT DATE(TIMESTAMP_MILLIS(CAST("1494865480000" AS INT64)))
2017-05-15
After much trial and error, this was my solution:
DATE_ADD( DATE '1970-01-01', INTERVAL DIV( CAST( epochTimestamp AS INT64 ), 86400000 ) DAY ) AS convertedDate
That is, I took the string, cast it to an integer, divided it (using integer division) by the number of milliseconds in a day, and then used DATE_ADD to add that many days to the start of epoch time, giving the resulting date.
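A quick check of the arithmetic for the example value: 1494865480000 / 86400000 is about 17301.68, integer division gives 17301 days, and DATE '1970-01-01' plus 17301 days is 2017-05-15. (A plain CAST of the quotient would round 17301.68 up to 17302 and land on 2017-05-16, which is why the division has to truncate.)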
I hope this saves another junior some time!
Use UTC_USEC_TO_TIMESTAMP() (note that this is a legacy SQL function):
select UTC_USEC_TO_TIMESTAMP(postedon * 1000)
You can then extract the date using date():
select DATE(UTC_USEC_TO_TIMESTAMP(postedon * 1000))
This doesn't require knowing the internal format of Unix timestamps.

Date comparison in Hive

I'm working with Hive and I have a table structured as follows:
CREATE TABLE t1 (
id INT,
created TIMESTAMP,
some_value BIGINT
);
I need to find every row in t1 that is less than 180 days old. The following query yields no rows even though there is data present in the table that matches the search predicate.
select *
from t1
where created > date_sub(from_unixtime(unix_timestamp()), 180);
What is the appropriate way to perform a date comparison in Hive?
How about:
where unix_timestamp() - created < 180 * 24 * 60 * 60
Date math is usually simplest if you can just do it with the actual timestamp values.
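Spelled out against the table above, one way to write that might be the following sketch; it wraps the column as unix_timestamp(created) to make the conversion explicit rather than relying on an implicit cast:
select *
from t1
where unix_timestamp() - unix_timestamp(created) < 180 * 24 * 60 * 60;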
Or do you want it to only cut off on whole days? Then I think the problem is with how you are converting back and forth between ints and strings. Try:
where created > unix_timestamp(date_sub(from_unixtime(unix_timestamp(),'yyyy-MM-dd'),180),'yyyy-MM-dd')
Walking through each UDF:
unix_timestamp() returns an int: current time in seconds since epoch
from_unixtime(,'yyyy-MM-dd') converts to a string of the given format, e.g. '2012-12-28'
date_sub(,180) subtracts 180 days from that string, and returns a new string in the same format.
unix_timestamp(,'yyyy-MM-dd') converts that string back to an int
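To make that concrete, here is roughly how the chain would evaluate if it were run on 2012-12-28 (UTC, purely illustrative values):
unix_timestamp()                            -- e.g. 1356652800 (2012-12-28 00:00:00 UTC)
from_unixtime(1356652800, 'yyyy-MM-dd')     -- '2012-12-28'
date_sub('2012-12-28', 180)                 -- '2012-07-01'
unix_timestamp('2012-07-01', 'yyyy-MM-dd')  -- 1341100800
so the where clause ends up comparing created against 1341100800.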
If that's all getting too hairy, you can always write a UDF to do it yourself.
Alternatively, you may also use datediff. Then the where clause would be:
In the case of a string timestamp (JDBC format):
datediff(from_unixtime(unix_timestamp()), created) < 180;
In the case of Unix epoch time:
datediff(from_unixtime(unix_timestamp()), from_unixtime(created)) < 180;
I think maybe it's a Hive bug dealing with the timestamp type. I've been trying to use it recently and getting incorrect results.
If I change your schema to use a string instead of a timestamp, and supply values in the
yyyy-MM-dd HH:mm:ss
format, then the select query works for me.
According to the documentation, Hive should be able to convert a BIGINT representing epoch seconds to a timestamp, and all existing datetime UDFs should work with the timestamp data type.
With this simple query:
select from_unixtime(unix_timestamp()), cast(unix_timestamp() as timestamp) from test_tt limit 1;
I would expect both fields to be the same, but I get:
2012-12-29 00:47:43 1970-01-16 16:52:22.063
I'm seeing other weirdness as well.
TIMESTAMP is in milliseconds.
unix_timestamp() is in seconds.
You need to multiply the RHS by 1000.
where created > 1000 * date_sub(from_unixtime(unix_timestamp()), 180);
After reviewing this and referring to Date Difference less than 15 minutes in Hive I came up with a solution. While I'm not sure why Hive doesn't perform the comparison effectively on dates as strings (they should sort and compare lexicographically), the following solution works:
FROM (
    SELECT id, value,
           unix_timestamp(created) c_ts,
           unix_timestamp(date_sub(from_unixtime(unix_timestamp()), 180), 'yyyy-MM-dd') c180_ts
    FROM t1
) x
JOIN t1 t ON x.id = t.id
SELECT to_date(t.Created),
       x.id, AVG(COALESCE(x.HighestPrice, 0)), AVG(COALESCE(x.LowestPrice, 0))
WHERE unix_timestamp(t.Created) > x.c180_ts
GROUP BY to_date(t.Created), x.id;