Converting TIMESTAMPL (long timestamp) to TIMESTAMP ends with 60 - abap

I have an OData service returning some DateTime values. They are saved in a table in the back end as TIMESTAMPL (along with some other data).
Now there is the value 20160630084459.5000. With MOVE-CORRESPONDING it is moved into et_entityset, where the field is a TIMESTAMP. Because of rounding it becomes 20160630084460, and since the seconds must be between 00 and 59, this is not a valid value to return.
My main problem is that the table has a very large number of entries, so I need a performant way to fix this error.

Here is a way to convert it to what you want.
REPORT zzy NO STANDARD PAGE HEADING.

FORM convert_timestamp.
  DATA(l_t1) = CONV timestampl('20160630084459.5000').
  DATA: l_t2 TYPE timestamp.

  " Plain assignment: the fractional seconds are rounded, giving ...60
  l_t2 = l_t1.
  WRITE / : l_t1, l_t2.

  " Going through date and time drops the fraction instead of rounding it
  CONVERT TIME STAMP l_t1 TIME ZONE sy-zonlo INTO DATE DATA(l_date) TIME DATA(l_time).
  CONVERT DATE l_date TIME l_time INTO TIME STAMP l_t2 TIME ZONE sy-zonlo.
  WRITE / l_t2.
ENDFORM.

START-OF-SELECTION.
  PERFORM convert_timestamp.
Here is the output.
20.160.630.084.459,5000000
20.160.630.084.460
20.160.630.084.459

You mention floor in your question but that is not what is happening. The value is rounded. If you simply use FLOOR in your assignment from TIMESTAMPL to TIMESTAMP you will get the answer you want. If you have to use MOVE-CORRESPONDING, just do that first and then do a separate assignment for the timestamp.
However, this means that 0:59.9 will get translated to 0:59 and not 1:00. If that missing second is OK for your application then just use FLOOR. If not, it becomes more complicated and you will take a performance hit.

Related

Issues while converting timestamp to specific timezone and then converting it to date in bigquery

I am doing a simple conversion of a timestamp column value to a specific timezone and then getting the date out of it, to create analytical charts based on the output of the query.
The column is of type timestamp in BigQuery and its value is in UTC. Now I need to convert that to PST (which is GMT-8:00). The conversion looked straightforward, but I am seeing some dates shifted up and down in the output I get.
From the output I was getting, I took one abnormal value and wrote a query out of it as below:
select "2021-05-27 18:10:10" as timestampvalue ,
Date(Timestamp("2021-05-27 18:10:10" ,"-8:00")) as completed_date1,
Date(Timestamp("2021-05-27 18:10:10","America/Los_Angeles")) as completed_date2,
Date(TIMESTAMP_SUB("2021-05-27 18:10:10", INTERVAL 8 hour)) as completed_date3,
Date(Timestamp("2021-05-27 18:10:10","America/Tijuana")) as completed_date4
In the output I get (screenshot not reproduced here), completed_date3 seems to show the correct value: based on my understanding I need to subtract 8 hours from the time in order to get the timestamp value for the timezone I wanted, and that is what completed_date3 does. But if I use the other timezone conversions suggested in the Google documentation, the output changes to 2021-05-28, and I am not able to understand how that can happen.
Can anyone let me know what I am doing wrong?
I was actually using it the wrong way. I need to use it as below:
select "2021-05-27 18:10:10" as timestampvalue ,
Date(Timestamp("2021-05-27 18:10:10") ,"-8:00") as completed_date1,
Date(Timestamp("2021-05-27 18:10:10"),"America/Los_Angeles") as completed_date2,
Date(TIMESTAMP_SUB("2021-05-27 18:10:10", INTERVAL 8 hour)) as completed_date3,
Date(Timestamp("2021-05-27 18:10:10"),"America/Tijuana") as completed_date4
Initially I was converting the string to a timestamp interpreted in a specific timezone, and that is not what I wanted.
If I instead convert the string to a timestamp without the timezone parameter, and only apply the timezone parameter when getting the date value out of it, it returns the correct date.
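To make the difference concrete, here is a minimal side-by-side sketch of the two forms (the column aliases are mine, and the commented dates are what the two expressions should produce for this input):
select
Date(Timestamp("2021-05-27 18:10:10", "America/Los_Angeles")) as parsed_as_la_time, -- 2021-05-28: the string is read as Los Angeles local time, shifted to UTC, and the date is taken in UTC
Date(Timestamp("2021-05-27 18:10:10"), "America/Los_Angeles") as converted_to_la_time -- 2021-05-27: the string is read as UTC and the date is taken in Los Angeles local time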

How to use Column value as Interval to increase date in Oracle SQL?

I have a table named REFERENCE with columns TIME (timestamp) and ADJUST, which stores varchar seconds values: 10, 13, 55, etc.
I want to use the field ADJUST to increase the value of TIME, while making sure that the ADJUST values are interpreted as seconds and not minutes or other units, something like:
SELECT
TIME AS START,
TIME + ADJUST AS END
FROM REFERENCE;
How to do that?
I've tried using INTERVAL, but it works only with explicit values, for example:
TIME + INTERVAL '13' SECOND AS END
You can use numToDSInterval():
time + numToDSInterval(adjust, 'second')
It might be cleaner to explicitly convert the string to a number:
time + numToDSInterval(to_number(adjust), 'second')
If adjust always falls in the range 00-59, you can also use to_dsinterval():
time + to_dsinterval('0 00:00:' || adjust)
Seconds should be stored as a number; why are they varchar (or, more likely, varchar2)?
Other than that, the trick is to take an interval literal (which indeed requires a hard-coded string) and use arithmetic operations on it.
So, let's say time and adjust are your columns; time is of the timestamp data type, and adjust is a number, measured in seconds. (If it's a string, you can convert it to a number explicitly by wrapping it in to_number(); I will leave that out, since adjust should really be a number data type to begin with.)
Then you can do something like this:
..... time + adjust * interval '1' second.
Here adjust can be 300, as in the example you gave under GMB's answer.
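Putting it together with the table and columns from the question, a full query could look like the sketch below (the aliases start_time and end_time are names I made up, chosen to avoid any clash with reserved words):
select
time as start_time,
time + numtodsinterval(to_number(adjust), 'second') as end_time
from reference;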

Convert DOUBLE column to TIMESTAMP in Firebird database

I have a Firebird database that saves the datetime field as a DOUBLE. I have created a ColdFusion datasource connection, so I can query the data remotely. While the rest of the data is being returned correctly, the datetime field is unreadable. I have tried using CAST and CONVERT to no avail. How can I convert this to a timestamp?
An example of the data stored is: 43016.988360
You can't just convert a DOUBLE PRECISION to a TIMESTAMP, not without explicitly defining how you want it mapped and writing that conversion yourself (or hoping there is an existing third-party UDF that does this for you).
A TIMESTAMP in Firebird is a date + time represented as an 8-byte value, where the date range is from January 1, 1 AD to December 31, 9999 AD and the time range is 00:00 to 23:59:59.9999 (so, 100 microsecond precision).
A DOUBLE PRECISION is - usually - the wrong type for storing date and time information, and as you haven't provided how that double value should be interpreted, we can't help you other than saying: there is no default method in Firebird to do this.
Based on the comments below, it looks like the value is a ColdFusion date value stored as double precision with the number of days since December 30th 1899, see also why is ColdFusion's Epoch Time Dec 30, 1899?. If this is really the case, then you can use the following for conversion to a TIMESTAMP:
select timestamp'1899-12-30 00:00' + 43016.988360 from rdb$database
Which will yield the value 2017-10-08 23:43:14.304. Using the value 43182.4931754 from the comments will yield 2018-03-23 11:50:10.354. That is a millisecond off from your expectation, but that might be a rounding/presentation issue; e.g. I get the exact expected date if I use 43182.49317539 instead.
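Applied to a table column, the same expression can be used directly in a query. A sketch, assuming the DOUBLE PRECISION column is called DATE_VALUE and lives in a table called MY_TABLE (both names are made up here):
select timestamp '1899-12-30 00:00' + date_value as converted_ts
from my_table;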
I would strongly suggest you carefully test this with known values.

Parse time strings with different formats and compare them

I am running a query across several tables and I am running into an issue comparing two time columns on separate tables: "rc1_time" is in a string format and "osemplog_time" is in a time format. Both are time only, with no date.
rc1_time's contents look like this: '10560684', which corresponds to HH24MISSMS.
osemplog_time's contents look like this: 07:57:02.917455
How do I format rc1_time into a "time format" with no date?
What are some options for comparing the two times?
I am a newbie at this, so exposition in your answers would be welcome.
Below is my query:
SELECT
"public".payroll_master.prm1_name,
"public".payroll_master.prm1_oe_init,
"public".receipt.rc1_init,
"public".employee_log.osemplog_ipaddress,
"public".employee_log.osemplog_event,
"public".receipt.rc1_date,
"public".employee_log.osemplog_logdate,
"public".receipt.rc1_code,
"public".employee_log.osemplog_logname,
"public".oslogname.lognm_empname,
"public".receipt.rc1_arname,
"public".receipt.rc1_arnum,
"public".receipt.rc1_time,
"public".employee_log.osemplog_logtime
FROM
"public".receipt
INNER JOIN "public".employee_log ON "public".receipt.rc1_date = "public".employee_log.osemplog_logdate
INNER JOIN "public".payroll_master ON "public".payroll_master.prm1_oe_init = "public".receipt.rc1_init
INNER JOIN "public".oslogname ON "public".oslogname.lognm_empname = "public".payroll_master.prm1_name AND "public".oslogname.lognm_name = "public".employee_log.osemplog_logname
WHERE
"public".receipt.rc1_code = 'CA'
AND
"public".employee_log.osemplog_logdate = "public".receipt.rc1_date
ORDER BY
"public".receipt.rc1_init ASC
Question as stated
You can represent a time without a date using the time data type. To convert a string from a given format into one, you can go through the to_timestamp function and then cast to time:
SELECT to_timestamp('10560684', 'HH24MISSUS')::time;
SELECT to_timestamp('07:57:02.917455', 'HH24:MI:SS.US')::time;
The basic idea is that you parse the time string using to_timestamp. The resulting timestamp will have a default date, and casting to time will remove the date, leaving only the parsed out time portion.
Assumptions:
Your hours are in 24-hour clock format (13-23 for 1 PM to 11 PM and 00 for midnight). If they are not 24 hour times, then you are missing the AM/PM designation and will need to sort that out.
The second "SS" you mention in your first pattern is actually a fractional part of seconds. If not, you'll need to adjust the pattern. If you don't care about the fractional seconds, you might consider just leaving the US and the .US off entirely and working only at the seconds level. Note that US interprets 84 to be 0.84 seconds, not actually 84 microseconds (0.000084 seconds).
Ultimately, you will need to either provide much more precise details about the format or figure out the correct format string yourself. Rather than worry about those details, I've tried to exemplify the general mechanism and leave those to you.
Comparison is then trivial. You just use PostgreSQL's operators (<, >, =, etc.):
SELECT to_timestamp('07:57:02.917455', 'HH24:MI:SS.US')::time < to_timestamp('10560684', 'HH24MISSUS')::time;
Other considerations
Be aware of time zone issues if you are working across them. You'll want to look at timetz (short form of time with time zone) or timestamptz (short form of timestamp with time zone) if you need to deal with time zones. Generally, I would recommend including time zone handling up front in case it becomes a problem later.
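As a small illustration, a time-of-day value with an explicit offset can be cast to timetz (the -08 offset here is just an example, not something taken from your data):
SELECT '07:57:02.917455-08'::timetz;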
In this case, why not build a complete timestamp? You already have the dates: "public".receipt.rc1_date and "public".employee_log.osemplog_logdate.
You don't specify the data types, but whatever the forms of those are, it should be possible. For example, if they are actual date objects, then:
SELECT to_timestamp(to_char("public".receipt.rc1_date, 'YYYY-MM-DD')||' '||"public".receipt.rc1_time, 'YYYY-MM-DD HH24MISSMS');
If they are strings of the form 'YYYY-MM-DD', then:
SELECT to_timestamp("public".receipt.rc1_date||' '||"public".receipt.rc1_time, 'YYYY-MM-DD HH24MISSMS');
And so on. Now you have a real timestamp, which makes a simple greater-than/less-than comparison much, much easier.
In my experience, it's extremely rare that you actually want to test time stamps with fractional second precision for equality. You might want a more tolerant equality check, something like SELECT t1 - t2 < interval '5 seconds', but this is really up to the application.
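As a sketch of such a tolerant check, reusing the parsing from above (the five-second window is arbitrary, and taking the absolute value of the difference in seconds keeps the test symmetric regardless of which time is larger):
SELECT abs(extract(epoch from (
to_timestamp('07:57:02.917455', 'HH24:MI:SS.US')::time
- to_timestamp('10560684', 'HH24MISSUS')::time
))) < 5 AS roughly_equal;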

QlikView timestamp formatting up to microseconds?

In QlikView I can get a timestamp in milliseconds by setting the timestamp format as:
SET TimestampFormat='MM/DD/YYYY hh:mm:ss.fff';
I want to know if there is a way to get a timestamp in QlikView up to microseconds.
Formula to obtain microseconds from TimeField:
((frac(TimeField) * 86400000) - floor(frac(TimeField) * 86400000)) * 1000 as Micro
And I would use this formula for formatting:
Timestamp(TimeField - (Micro/86400000000)) & Num(floor(Micro), '000') as TimeStamp
As far as I can determine from the QlikView help, there is no format specifier for microseconds, only for milliseconds.
If you need to obtain the microsecond value from a time, I quickly threw the below together (it can probably be done a bit neater). Here I assume your input time field is called TimeField. We can obtain the number of milliseconds using:
=((TimeField-num(date(floor(TimeField)))-
num(maketime(hour(TimeField),minute(TimeField),second(TimeField))))*24*60*60)*1000
For the sake of simplicity, I will call the above formula MillisecondCount. Then, using this field, we can calculate the number of microseconds:
=floor(((MillisecondCount)-floor(MillisecondCount))*1000)
Finally, the full formula to obtain microseconds becomes:
floor(((((TimeField-num(date(floor(TimeField)))-
num(maketime(hour(TimeField),minute(TimeField),second(TimeField))))*24*60*60)*1000)
-floor(((TimeField-num(date(floor(TimeField)))-
num(maketime(hour(TimeField),minute(TimeField),second(TimeField))))*24*60*60)*1000))*1000)
You can then just format this with num() and append it to your time-stamp string.