I have a UI in which we select a date range and then perform a query to check orders within that range.
The valid formats accepted are:
a) 2013-04-01 17:00:00 - 2013-04-16 18:00:00
b) 2013-04-01 17:00:00 - 2013-04-16
c) 2013-04-01 - 2013-04-16
I split it in Ruby to give me start_date and end_date. I don't have to touch the start date, as it is ready for the SQL query as-is.
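Roughly, the split step looks like this (range_str here is just an illustrative name for the raw value from the UI, not my exact code):
# e.g. range_str = "2013-04-01 17:00:00 - 2013-04-16"
start_date, end_date = range_str.split(" - ").map(&:strip)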
But the end_date has to be modified, because a ranged query like created_at BETWEEN '2013-04-01 17:00:00' AND '2013-04-16' does not return the results from the 16th, which should be included in the result set (it compares against 2013-04-16 00:00:00).
This is the working solution I came up with:
end_date = Time.parse(end_date).strftime("%T")=="00:00:00" ? (end_date.to_s) + " 23:59:59" : end_date.to_s
Is there a better way to do this, as the above looks quite confusing? (Looking for 1-2 line answers, not more.)
You could use the DateTime.parse(str) method:
date_str = "2013-04-16"
date = DateTime.parse(date_str)
#=> Tue, 16 Apr 2013 00:00:00 +0000
date_time_str = "2013-04-16 18:00:00"
date = DateTime.parse(date_time_str)
#=> Tue, 16 Apr 2013 18:00:00 +0000
And then you could test the .hour (or .minute) and set the end_date to end_of_day if no time was selected:
end_date = (date.hour == 0 && date.minute == 0) ? date.end_of_day : date
A little improvement: you could test whether the parsed date is equal to the beginning of that day (no hours/minutes/seconds):
date = DateTime.parse("2013-12-12")
#=> Thu, 12 Dec 2013 00:00:00 +0000
date.beginning_of_day
#=> Thu, 12 Dec 2013 00:00:00 +0000
end_date = (date.beginning_of_day == date) ? date.end_of_day : date
#=> Thu, 12 Dec 2013 23:59:59 +0000
To handle the case where the user explicitly enters a DateTime like 2013-04-16 00:00:00, you can use .include? to check whether the string contains the '00:00:00' part:
date_str = "2013-04-16 00:00:00"
date = DateTime.parse(date_str)
end_date = date if date_str.include?('00:00:00') # means the user explicitly wants the time at 00:00 and 00 second
end_date ||= (date.beginning_of_day == date) ? date.end_of_day : date # set end_date if end_date.nil?
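Putting the two checks together, a rough consolidated sketch (this assumes ActiveSupport is loaded, for beginning_of_day/end_of_day, and that end_date still holds the raw string):
parsed = DateTime.parse(end_date)
# bump to end of day only when the user gave a bare date, not an explicit 00:00:00
end_date = (parsed == parsed.beginning_of_day && !end_date.include?('00:00:00')) ? parsed.end_of_day : parsed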
On RHEL, the date/time at the command prompt is in UTC.
I query my table for dates in the last hour, but it returns other dates/times as well. I don't know if it is timezone related, but I am super confused about how to work with it.
select to_char(update_dt, 'DD Mon YYYY HH12:MI'), data
from mytable
where update_dt > current_date - interval '1' hour
to_char | data
-------------------+----------------
07 May 2020 12:37 | blah
07 May 2020 12:37 | blah
07 May 2020 12:37 | blah
07 May 2020 12:37 | blah
07 May 2020 12:37 | blah
07 May 2020 05:23 | huh
07 May 2020 05:23 | huh
07 May 2020 05:22 | huh
[root@ip-172-31-1-28 ~]# date
Thu May 7 13:25:03 UTC 2020
This expression:
where update_dt > current_date - interval '1' hour
filters from 23:00 yesterday, because current_date is only the date, which behaves as midnight (the beginning of today), so subtracting an hour lands late yesterday.
You seem to want the time included:
where update_dt > now() - interval '1' hour
You can also use:
where update_dt > current_timestamp - interval '1' hour
but now() is easier to type.
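For illustration, the difference between the two starting points (the values shown are examples based on the server clock from the question):
select current_date;                      -- 2020-05-07            (a date only; behaves as midnight)
select current_date - interval '1' hour;  -- 2020-05-06 23:00:00   (almost a full day back)
select now();                             -- 2020-05-07 13:25:03+00
select now() - interval '1' hour;         -- 2020-05-07 12:25:03+00 (exactly one hour ago)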
I'm wondering if it is possible to build an Oracle SQL function to return a mid-week date rather than using a joining table.
The week ends on a Wednesday, so if we had a date of Friday 5th Jan then the week-ending date would be the following Wednesday, 10/01/2018:
MON 1ST JAN - 03/01/18
TUES 2ND JAN - 03/01/18
WED 3RD JAN - 03/01/18
THURS 4TH JAN - 10/01/18
FRI 5TH JAN - 10/01/18
SAT 6TH JAN - 10/01/18
SUN 7TH JAN - 10/01/18
MON 8TH JAN - 10/01/18
TUES 9TH JAN - 10/01/18
WED 10TH JAN - 10/01/18
THURS 11TH JAN - 17/01/18
The reason I'm investigating the function route is to future-proof the process, so I don't have to maintain a second lookup table and can get rid of unnecessary joins.
I've managed to Google a workaround:
SELECT NEXT_DAY('25-JAN-2018 00.00', 'WEDNESDAY') FROM DUAL;
You may need something like the following:
select d,
       case
         when trunc(d, 'IW') + 2 >= d
         then trunc(d, 'IW') + 2
         else trunc(d, 'IW') + 9
       end as next_wed
from (
  select date '2018-01-01' + level - 1 as d
  from dual
  connect by level <= 11
)
This gets the Monday of the ISO week containing the date and then adds either 2 or 9, depending on whether that Monday + 2 (the Wednesday of the same week) falls on or after the input date.
Another way could be to check whether the input date falls on or before the Wednesday of its week (a complete version of this fragment is sketched below):
case when to_char(d, 'D') <= 3
then trunc(d, 'IW') +2
else trunc(d, 'IW') +9
end as next_wed
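For completeness, a runnable version of that variant against the same generated dates (note that the to_char(d, 'D') numbering depends on NLS_TERRITORY; Monday = 1 is assumed here, which is why the trunc(d, 'IW') arithmetic above is the safer option):
select d,
       case when to_char(d, 'D') <= 3
            then trunc(d, 'IW') + 2
            else trunc(d, 'IW') + 9
       end as next_wed
from (
  select date '2018-01-01' + level - 1 as d
  from dual
  connect by level <= 11
)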
I'd like to compare and filter NSDates to determine whether there are any for today in the user's current time zone.
Let's say I have a user located in Los Angeles (UTC-8) and the following dates:
TargetDate
UTC: 2 pm (14:00) - 12. Feb 2017
LocalTime: 10 pm (22:00) - 12. Feb 2017
Now
UTC: 10 pm (22:00) - 11. Feb 2017 // Change of date!
LocalTime: 6 am (06:00) - 12. Feb 2017
Today
UTC: 00 am (00:00) - 11. Feb 2017 // Begin of today
Tomorrow
UTC: 00 am (00:00) - 12. Feb 2017
In the next step I'd like to compare TargetDate, Today and Tomorrow to determine whether TargetDate lies between Today and Tomorrow. This is where the problem is: when I compare the dates, the answer is of course that it is not between these dates.
+ (BOOL)date:(NSDate *)date isBetweenDate:(NSDate *)beginDate andDate:(NSDate *)endDate
{
    if ([date compare:beginDate] == NSOrderedAscending)
        return NO;
    if ([date compare:endDate] == NSOrderedDescending)
        return NO;
    return YES;
}
What I could do is convert the UTC TargetDate to the local time zone, but I'm not sure whether this is the best solution. In this post it's mentioned that you shouldn't do this because it confuses the whole problem.
Does anyone have an idea how to solve this problem?
The problem you are having is actually here:
Today
UTC: 00 am (00:00) - 11. Feb 2017 // Begin of today
Tomorrow
UTC: 00 am (00:00) - 12. Feb 2017
You're deciding that the "day" is the UTC day. Instead, you need to determine the day in the target time zone.
Today
LocalTime: 00 am (00:00) - 11. Feb 2017 // Begin of today
UTC: 08 am (08:00) - 11. Feb 2017 // Equivalent UTC
Tomorrow
LocalTime: 00 am (00:00) - 12. Feb 2017 // Begin of tomorrow
UTC: 08 am (08:00) - 12. Feb 2017 // Equivalent UTC
Do keep in mind a few other things:
Compare with half-open intervals: StartOfToday <= SomePointToday < StartOfTomorrow (inclusive start, exclusive end).
America/Los_Angeles is not always UTC-8. It switches to UTC-7 during daylight saving time.
Not every day starts at midnight. For example, America/Sao_Paulo on 2016-10-16 started at 01:00. The 00:00 hour was skipped for DST.
If you just care about some point on that day, rather than the entire day, compare at 12:00 noon, rather than at 00:00 midnight. It avoids the DST problem.
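Putting the local-day boundaries and the half-open comparison together, a minimal Objective-C sketch (assuming iOS 8+ / macOS 10.10+ calendar APIs, and that targetDate is the NSDate being tested):
NSCalendar *calendar = [NSCalendar currentCalendar];          // uses the user's current time zone
NSDate *startOfToday = [calendar startOfDayForDate:[NSDate date]];
NSDate *startOfTomorrow = [calendar dateByAddingUnit:NSCalendarUnitDay
                                               value:1
                                              toDate:startOfToday
                                             options:0];
// half-open interval: startOfToday <= targetDate < startOfTomorrow
BOOL isToday = [targetDate compare:startOfToday] != NSOrderedAscending &&
               [targetDate compare:startOfTomorrow] == NSOrderedAscending;
If all you need is "is it today?", [calendar isDateInToday:targetDate] performs the same local-day check in one call.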
select TO_TIMESTAMP_TZ('1965-08-01 00:00:00.0','Dy Mon DD HH:MM:ss TZD YYYY') from dual;
I have data in the wrong format in a varchar column, as shown below:
1973-12-12 00:00:00.0
2003-05-14 00:00:00.0
1950-05-01 00:00:00.0
Fri Jul 01 00:00:00 PDT 1977
Sun Feb 01 00:00:00 PST 2015
Wed May 14 00:00:00 PDT 2003
I want to keep all dates in the same format as below, but I'm not able to convert them:
Fri Jul 01 00:00:00 PDT 1977
Sun Feb 01 00:00:00 PST 2015
Wed May 14 00:00:00 PDT 2003
When I add the 'Dy Mon DD HH:MM:ss TZD YYYY' format, I get a "not a valid date format" exception due to TZD.
Can anyone help me convert these dates and keep them in the same format through an update query?
Storing dates or timestamps as strings is not a good idea. You should make your columns the correct datatype for the data they will hold. As well as allowing Oracle to be more efficient, it prevents you storing invalid data.
However, if you just want to convert strings like '1965-08-01 00:00:00.0' into the same string format as the others (e.g. 'Fri Jul 01 00:00:00 PDT 1977'), you can convert those values to timestamps with to_timestamp(), specify which time zone they represent with from_tz(), and then convert them back to strings in the format you want. With a demo table t built with your sample data:
update t set str = to_char(from_tz(to_timestamp(str, 'YYYY-MM-DD HH24:MI:SS.FF'),
'America/Los_Angeles'), 'Dy Mon DD HH24:MI:SS TZD YYYY')
where str not like '%PDT%' and str not like '%PST%';
3 rows updated.
select * from t;
STR
----------------------------
Wed Dec 12 00:00:00 PST 1973
Wed May 14 00:00:00 PDT 2003
Mon May 01 00:00:00 PDT 1950
Fri Jul 01 00:00:00 PDT 1977
Sun Feb 01 00:00:00 PST 2015
Wed May 14 00:00:00 PDT 2003
6 rows selected
You need to apply a filter so it ignores any rows that are already in the target format, since those would error. Here I've excluded any that contain PDT or PST. If you have other formats you haven't shown you could use regexp_like() to look for rows that exactly match a specific format instead.
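For example, instead of excluding rows containing PDT/PST, the update could target only rows that match the ISO-ish source format exactly (the pattern here is just an illustration):
-- only convert rows that look like 'YYYY-MM-DD HH24:MI:SS.F'
where regexp_like(str, '^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+$')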
More generally, to convert any of your strings to actual timestamps you can create a function that tries various formats and returns the first one that successfully converts, by trapping exceptions. A simple brute force approach might be:
create or replace function my_to_timestamp_tz(p_str varchar2)
return timestamp with time zone as
begin
  -- try each expected pattern in turn
  begin
    return to_timestamp_tz(p_str, 'Dy Mon DD HH24:MI:SS TZD YYYY');
  exception
    when others then
      null;
  end;
  begin
    -- unspecified TZ; this assumes same as server
    return to_timestamp_tz(p_str, 'YYYY-MM-DD HH24:MI:SS.FF');
  exception
    when others then
      null;
  end;
  -- maybe throw an exception if no conversions worked
  return null;
end;
/
PDT/PST isn't always recognised; I have two DBs that I thought were built the same, but one allows it and the other throws "ORA-01857: not a valid time zone". If you see that too, you can work around it by treating the TZD value as a fixed string and specifying which region it represents:
  -- if your DB doesn't recognise PDT/PST then force them to a region:
  begin
    return from_tz(to_timestamp_tz(p_str, 'Dy Mon DD HH24:MI:ss "PDT" YYYY'),
                   'America/Los_Angeles');
  exception
    when others then
      null;
  end;
  begin
    return from_tz(to_timestamp_tz(p_str, 'Dy Mon DD HH24:MI:ss "PST" YYYY'),
                   'America/Los_Angeles');
  exception
    when others then
      null;
  end;
  -- other time zone abbreviations and matching regions if you expect any
Using that function:
with t (str) as (
select '1973-12-12 00:00:00.0' from dual
union all select '2003-05-14 00:00:00.0' from dual
union all select '1950-05-01 00:00:00.0' from dual
union all select 'Fri Jul 01 00:00:00 PDT 1977' from dual
union all select 'Sun Feb 01 00:00:00 PST 2015' from dual
union all select 'Wed May 14 00:00:00 PDT 2003' from dual
)
select str, my_to_timestamp_tz(str) as converted
from t;
STR CONVERTED
---------------------------- ----------------------------------------------------
1973-12-12 00:00:00.0 1973-12-12 00:00:00 Europe/London
2003-05-14 00:00:00.0 2003-05-14 00:00:00 Europe/London
1950-05-01 00:00:00.0 1950-05-01 00:00:00 Europe/London
Fri Jul 01 00:00:00 PDT 1977 1977-07-01 00:00:00 America/Los_Angeles
Sun Feb 01 00:00:00 PST 2015 2015-02-01 00:00:00 America/Los_Angeles
Wed May 14 00:00:00 PDT 2003 2003-05-14 00:00:00 America/Los_Angeles
Notice that the first three assume a time zone and, because I'm running this in the UK, pick London. If that isn't going to give the right result for you and you know those strings always represent a specific time zone, you can specify that zone by changing the last block in the function to do:
begin
-- unspecified TZ; assume from a specific region
return from_tz(to_timestamp(p_str, 'YYYY-MM-DD HH24:MI:SS.FF'),
'America/Los_Angeles');
exception
...
which will then get:
STR CONVERTED
---------------------------- ----------------------------------------------------
1973-12-12 00:00:00.0 1973-12-12 00:00:00 America/Los_Angeles
2003-05-14 00:00:00.0 2003-05-14 00:00:00 America/Los_Angeles
1950-05-01 00:00:00.0 1950-05-01 00:00:00 America/Los_Angeles
Fri Jul 01 00:00:00 PDT 1977 1977-07-01 00:00:00 America/Los_Angeles
Sun Feb 01 00:00:00 PST 2015 2015-02-01 00:00:00 America/Los_Angeles
Wed May 14 00:00:00 PDT 2003 2003-05-14 00:00:00 America/Los_Angeles
If you really want to, you can convert the generated timestamps back to strings, but I'd really recommend you store them as the right data type.
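For instance, to render the converted values back into the Dy Mon DD display format shown above (just a sketch, reusing the demo table and function from this answer):
select str,
       to_char(my_to_timestamp_tz(str), 'Dy Mon DD HH24:MI:SS TZD YYYY') as formatted
from t;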
I have a question about fiscal date literals in the Force.com API (http://www.salesforce.com/us/developer/docs/api/Content/sforce_api_calls_soql_select_dateformats.htm):
For which time zone are date ranges calculated?
For example, suppose we execute the query:
SELECT Id FROM Opportunity WHERE CloseDate = THIS_FISCAL_QUARTER
where, according to our company's fiscal settings, THIS_FISCAL_QUARTER runs from Jan 1 to Mar 31.
Does the range for THIS_FISCAL_QUARTER use...
the user's time zone? For example, if the user's time zone is GMT-8, THIS_FISCAL_QUARTER = Jan 1 00:00 GMT-8 to Mar 31 23:59 GMT-8 (or Jan 1 08:00 UTC to Mar 31 07:59 UTC)
the company's default time zone (according to the company profile)? For example, if the company's default time zone is GMT-8, THIS_FISCAL_QUARTER = Jan 1 00:00 GMT-8 to Mar 31 23:59 GMT-8 (or Jan 1 08:00 UTC to Mar 31 07:59 UTC)
UTC? THIS_FISCAL_QUARTER = Jan 1 00:00 UTC to Mar 31 23:59 UTC
something else?
From my experience (fun with reports, not queries), the note all the way at the bottom of that page also applies to the date literals, so it uses the running user's time zone setting.
These values are offset by your timezone. For example, in the Pacific timezone, the earliest valid date is 1699-12-31T16:00:00, or 4:00 PM on December 31, 1699.
Maybe you can simply create a test record with a datetime field just slightly outside the fiscal quarter and query for it with "WHERE mydatetimefield__c > THIS_FISCAL_QUARTER"?
See also http://forums.sforce.com/t5/General-Development/SOQL-Date-literal-TODAY-is-evaluated-incorrectly/m-p/43607