Subtraction of dates with hours and minutes (result in float) - sql

I would like some help with an SSIS problem.
I have two columns, one with the date when a demand was opened and another with the date when it was responded to.
My dates look like this:
DT_ANSWERED_DATE         DT_CREATED_DATE
2021-02-04 19:48:00.000  2021-02-04 19:44:00.000
I would like to subtract DT_ANSWERED_DATE minus DT_CREATED_DATE, but I would like the result to be a float number, like the one I get when I do the subtraction in Excel:
DT_ANSWERED_DATE         DT_CREATED_DATE          DT_ANSWERED_DATE minus DT_CREATED_DATE
2021-02-04 19:48:00.000  2021-02-04 19:44:00.000  0,00277777777228039
I would like to do the same thing in a derived column in SSIS (Microsoft Visual Studio).
Thanks in advance for any responses.

It looks like your granularity is in minutes. This should get you the decimal number you are looking for (note the 60.0, which forces floating-point division; dividing two integers in an SSIS expression truncates, and would give you 0 here):
DATEDIFF("mi", DT_CREATED_DATE, DT_ANSWERED_DATE) / (60.0 * 24)
(60 minutes per hour * 24 hours in a day)
Microsoft documentation... https://learn.microsoft.com/en-us/sql/integration-services/expressions/datediff-ssis-expression?view=sql-server-ver16
In your example above this results in:
4 min / (60*24) = 0.00277777777
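The arithmetic can be sanity-checked outside SSIS; a minimal Python sketch of the same calculation, using the timestamps from the question:

```python
from datetime import datetime

# The two timestamps from the question
created = datetime(2021, 2, 4, 19, 44, 0)
answered = datetime(2021, 2, 4, 19, 48, 0)

# Difference in minutes, then expressed as a fraction of a day,
# mirroring DATEDIFF("mi", ...) / (60.0 * 24) in the SSIS expression
minutes = (answered - created).total_seconds() / 60
fraction_of_day = minutes / (60 * 24)

print(fraction_of_day)  # 4 / 1440 ≈ 0.002777...
```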
Note:
I highly recommend using decimal rather than float, unless you really, really have a reason not to. An equality test that looks like 1 = 1 is often false when the operands are computed float values; it is always reliable with integers or decimals.
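The equality pitfall that note warns about is easy to reproduce; a small Python illustration (the same effect occurs with SQL float columns):

```python
from decimal import Decimal

# Two float expressions that are mathematically equal...
a = 0.1 + 0.2
b = 0.3
print(a == b)  # False: binary floats cannot represent 0.1 exactly

# ...but the same comparison with exact decimals behaves as expected
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```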

Related

Going more than 24hrs and adding days

I have an expression in a table that calculates the total of a column containing different times. But once the total goes over 24 hours it resets. I want it to also show days when the total exceeds 24 hours. So far I have added days to the format string, but this means that even with 0 days it shows 01 days, since it is reading the day-of-month part of the date. This is wrong, and I either want to subtract one from it or have some sort of counter to count the days. This is the expression I have so far:
=format(DateAdd("s", SUM(Fields!TotalDowntime.Value), "00:00:00"), "dd 'days' HH 'hrs' mm 'mins' ss 'secs'")
I have tried to format and using dateadd function to see if this can be done in a different way
The following isn't a full answer, but I'm hoping it will help you towards one. I think I've understood from your question that the field which records your TotalDowntime figure records this value in seconds.
SELECT
    TotalDowntime / (24 * 60 * 60) AS [days],
    TotalDowntime / (60 * 60) % 24 AS [hours],
    TotalDowntime / 60 % 60 AS [minutes],
    TotalDowntime % 60 AS [seconds]
FROM
    MyTable
You'll see that each part simply cuts off any decimals that result from the division. % is SQL Server's modulo operator, which returns the remainder on division.
You should be able to put code similar to the above server side and concatenate these columns in Report Builder (or server side if you wish).
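The integer-division-and-modulo breakdown in the SQL above can be sketched in Python to show the mechanics (the 90061-second input is just an illustrative value):

```python
def breakdown(total_seconds: int):
    """Split a duration in seconds into days, hours, minutes, seconds,
    mirroring the integer division / modulo pattern in the SQL."""
    days = total_seconds // (24 * 60 * 60)
    hours = total_seconds // (60 * 60) % 24
    minutes = total_seconds // 60 % 60
    seconds = total_seconds % 60
    return days, hours, minutes, seconds

# 90061 s = 1 day, 1 hour, 1 minute, 1 second
print(breakdown(90061))  # (1, 1, 1, 1)
```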

extracting HOUR from an interval in spark sql

I was wondering how to properly extract the number of hours between two given timestamp objects.
For instance, when the following SQL query gets executed:
select x, extract(HOUR FROM x) as result
from
(select (TIMESTAMP'2021-01-22T05:00:00' - TIMESTAMP'2021-01-01T09:00:00') as x)
The result value is 20, while I'd expect it to be 500.
That seems odd to me, considering that the value of x itself shows the expected interval.
Can anyone please explain what I'm doing wrong, and perhaps suggest another way to write the query so that the desired result is returned?
Thanks in advance!
I think you have to do the maths with this one, as datediff in Spark SQL only supports days. This worked for me:
SELECT (unix_timestamp(to_timestamp('2021-01-22T05:00:00') ) - unix_timestamp(to_timestamp('2021-01-01T09:00:00'))) / 60 / 60 diffInHours
My results (in Synapse Notebook, not Databricks but I expect it to be the same):
The unix_timestamp function converts the timestamp to a Unix timestamp (in seconds), and then you can apply date maths to it. Subtracting them gives the number of seconds between the two timestamps. Divide by 60 for the number of minutes between the two dates, and by 60 again for the number of hours between them.
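The same seconds-based arithmetic can be checked in plain Python; this sketch mirrors the unix_timestamp subtraction (assuming both timestamps are interpreted in the same timezone):

```python
from datetime import datetime

start = datetime.fromisoformat("2021-01-01T09:00:00")
end = datetime.fromisoformat("2021-01-22T05:00:00")

# Subtract, take total seconds, then divide by 60 twice,
# just like the unix_timestamp expression above
diff_in_hours = (end - start).total_seconds() / 60 / 60
print(diff_in_hours)  # 500.0
```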

Oracle SQL: Is There a Maximum Date Difference Oracle Can Interpret

I'm working on SQL that looks for rows in a table where the row's last_run date plus its frequency (in minutes) is greater than the current date/time. I've noticed that there appears to be an upper bound to the date comparisons Oracle can make sense of.
For example this query;
with tests as
(
  select
    'TEST 1' as code,
    99999999 as frequency,
    sysdate as last_run
  from dual
  union
  select
    'TEST 2' as code,
    99999999999 as frequency,
    sysdate as last_run
  from dual
)
select
  p.*,
  (p.last_run + p.frequency / 24 / 60) as next_run
from tests p
where (p.last_run + p.frequency / 24 / 60 < sysdate or p.last_run is null)
I would expect this query to return no rows, but instead it returns:
CODE    FREQUENCY    LAST_RUN                 NEXT_RUN
TEST 2  99999999999  05-OCT-2021 10:15:46 AM  15-APR-4455 08:54:46 PM
I can solve the problem by setting frequency = null and my other code will recognize that the row need not be considered, but it seems strange to me that Oracle can't recognize that the year 4455 > 2021.
Is there some maximum conceivable date in Oracle that I'm unaware of?
I'm running this in Oracle SQL Developer Version 18.2.0.183 and Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production.
it seems strange to me that Oracle can't recognize that the year 4455 > 2021
It can. The problem is that your year isn't 4455; it's -4455. See this db<>fiddle, showing the result (in a different timezone) with default DD-MON-RR format, your output format, and ISO format with the year sign included (S format element).
CODE    FREQUENCY    LAST_RUN             NEXT_RUN
TEST 2  99999999999  2021-10-05 17:16:21  -4454-03-12 03:55:21
With your frequency of 99999999999 the value you are adding to the current date is 69444444 days, which is (very roughly) 190128 years - clearly that's going to put you well past the maximum date of 9999-12-31; and indeed with a different value like 9999999999 (one less digit), which is 6944444 days or roughly 19012 years, you get an error - also shown in that db<>fiddle.
The issue seems to be how Oracle manipulates its internal representation when it does the calculation; in adding that large value it appears that the year - which is stored in two bytes - is overflowing and wrapping.
Using the type-13 version, 190128+2021 = 192149, which is (256 * 750) + 149. 750 doesn't fit in one byte, so you get the modulus, which is 238. That would make the first two bytes of the calculated date come out as 149,238. That actually corresponds to year -4459:
select dump(date '-4459-01-01') from dual;
Typ=13 Len=8: 149,238,1,1,0,0,0,0
which is close enough to demonstrate that's what's happening - given that the calculation is outside the expected range and it's probably trying to do invalid leap day calculations in there somewhere. The point, though, is that the generated, wrapped, value represents a valid year in that internal notation.
With the lower value, 19012+2021 = 21033, which is (256 * 82) + 41. Now there is no wrapping, so the first two bytes of the calculated date come out as 41,82. That is not a valid year, so Oracle knows to throw the ORA-01841 exception.
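The wrap-around described above can be mimicked in Python by reducing the out-of-range year modulo 2^16 and reinterpreting it as a signed 16-bit value; this is a simplification of Oracle's actual type-13 storage, but it reproduces the -4459 figure from the dump:

```python
def wrapped_year(year: int) -> int:
    """Reduce an out-of-range year modulo 2**16 and reinterpret the
    result as a signed 16-bit integer, mimicking the two-byte overflow
    described above (a simplification of Oracle's internal storage)."""
    w = year % 65536
    return w - 65536 if w >= 32768 else w

# 190128 years added to 2021 overflows the two bytes and wraps
# to a negative - but internally "valid" - year
print(wrapped_year(190128 + 2021))  # -4459

# the smaller value does not wrap
print(wrapped_year(19012 + 2021))   # 21033
```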
So, you need to limit the frequency value to a number that won't ever go past 9999-12-31, or test it at run time against 9999-12-31 minus the current date - and if it's too big, ignore it. That's if you want what appears to be a magic number at all.
There is a maximum date in Oracle: it is 9999-12-31 23:59:59 in YYYY-MM-DD HH24:MI:SS format.
Here is the Oracle Documentation which talks about the valid date values (LINK)
The problem is that you are adding roughly 190,258 years with your second query, likely overflowing the internal representation many times over; it just so happens that you ended up at the value you did.

Ingres multiplication gives wrong result

I have an Ingres table with following columns
from_date ingresdate
to_date ingresdate
model_amt money
The dates can reflect a period of any number of days, and model_amt is always a weekly figure. I need to work out the total model_amt for the period.
To do this I need to know how many days are covered by the period, and then divide model_amt by 7, and multiply it by the number of days
however, I am getting incorrect results using the code below
select model_amt, date_part('day',b.to_date - b.from_date),
model_amt / 7 * int4( (date_part('day',b.to_date - b.from_date)) )
from table
For example, where model_amt = 88.82 and the period is for 2 weeks, I get the following output
+--------------------+-------------+--------------------+
¦model_amt           ¦col2         ¦col3                ¦
+--------------------+-------------+--------------------+
¦              #88.82¦           14¦             #177.66¦
+--------------------+-------------+--------------------+
But 88.82 / 7 * 14 = 177.64, not 177.66?
Any ideas what is going on? The same issue happens regardless of whether I include the int4 function around the date_part.
* Update 15:28 *
The solution was to add a float8 function around the model_amt
float8(model_amt)/ 7 * interval('days', to_date - from_date)
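One way to see why the float8 cast helps: if the intermediate quotient is rounded to whole cents after the division by 7 (as a fixed-point money type might do), the two-cent discrepancy falls out exactly. A hedged Python sketch of that hypothesis using Decimal:

```python
from decimal import Decimal, ROUND_HALF_UP

model_amt = Decimal("88.82")
days = 14

# If the intermediate quotient is rounded to cents, the result
# matches the surprising 177.66 from the question...
weekly = (model_amt / 7).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)
print(weekly * days)  # 177.66

# ...whereas carrying full precision through gives the expected 177.64
exact = (model_amt / 7 * days).quantize(Decimal("0.01"),
                                        rounding=ROUND_HALF_UP)
print(exact)          # 177.64
```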
Thanks for the responses.
In computers, floating point numbers are notoriously imprecise. You can do all kinds of basic mathematical calculations on floating point numbers and they'll be off by a few decimal places.
Some information can be found here, but it's very googleable: http://effbot.org/pyfaq/why-are-floating-point-calculations-so-inaccurate.htm
Generally, to avoid inaccuracies you need to use a language-specific feature (e.g. BigDecimal in Java) to store the decimals exactly. Alternatively, you can represent decimals as separate integers (e.g. the whole part as one integer and the fractional part as another) and combine them later.
So, I suspect this is just Ingres showing the normal floating point inaccuracies, and that there are known workarounds for it in that database.
Update
Here's a support article from Actian specifically about ingres floating point issues which seems useful: https://communities.actian.com/s/article/Floating-Point-Numbers-Causes-of-Imprecision.

SQL Server adding two time columns in a single table and putting result into a third column

I have a table containing two time columns like this:
Time1 Time2
07:34:33 08:22:44
I want to add the times in these two columns and put the result of the addition into a third column, say Time3.
Any help would be appreciated. Thanks
If the value you expect as the result is 15:57:17 then you can get it by calculating for instance the number of seconds from midnight for Time1 and add that value to Time2:
select dateadd(second,datediff(second,0,time1),time2) as Time3
from your_table
I'm not sure how meaningful adding two discrete time values together is, though, unless they are meant to represent durations. In that case the time datatype might not be the best choice: it is meant for time-of-day data, only has a range of 00:00:00.0000000 through 23:59:59.9999999, and an addition could overflow (and hence wrap around).
If the result you want isn't 15:57:17 then you should clarify the question and add the desired output.
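The seconds-from-midnight trick in the answer above can be checked with a small Python sketch, using the times from the question:

```python
from datetime import datetime, timedelta

time1 = "07:34:33"
time2 = "08:22:44"

# Seconds from midnight for time1, mirroring datediff(second, 0, time1)
h, m, s = (int(p) for p in time1.split(":"))
offset = timedelta(hours=h, minutes=m, seconds=s)

# Add that offset to time2, mirroring dateadd(second, ..., time2)
t2 = datetime.strptime(time2, "%H:%M:%S")
time3 = (t2 + offset).strftime("%H:%M:%S")
print(time3)  # 15:57:17
```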
The engine doesn't understand addition of two time values, because it thinks you can't add two times of day. You get:
Msg 8117, Level 16, State 1, Line 8
Operand data type time is invalid for add operator.
If these are elapsed times, not times of day, you could take them apart with DATEPART, but in SQL Server 2008 you will have to use a CONVERT to put the value back together, plus have all the gymnastics to do base 60 addition.
If you have the option, it would be best to store the time values as NUMERIC with a positive scale, where the unit of measure is hours, and then break them down when finally reporting them. Something like this:
DECLARE
    @r NUMERIC(7, 5);
SET @r = 8.856;
SELECT FLOOR(@r) AS Hours,
       FLOOR(60 * (@r - FLOOR(@r))) AS Minutes,
       60 * ((60 * @r) - FLOOR(60 * @r)) AS Seconds
Returns
Hours Minutes Seconds
8 51 21.60000
There is an advantage to writing a user-defined function to do this, to eliminate the repeated 60 * @r calculations.
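The hour-fraction breakdown can be mirrored in Python; Decimal keeps the arithmetic exact for the 8.856 example:

```python
from decimal import Decimal

r = Decimal("8.856")  # duration in hours, as in the SQL example

hours = int(r)                         # FLOOR(@r)
minutes = int(60 * (r - hours))        # FLOOR(60 * (@r - FLOOR(@r)))
seconds = 60 * (60 * r - int(60 * r))  # 60 * ((60*@r) - FLOOR(60*@r))

print(hours, minutes, seconds)  # 8 51 21.60
```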