I have a query using COALESCE(timestamp_type::date, character_varying) which fails due to the mismatched data types:
ERROR: COALESCE types date and character varying cannot be matched
It works if I cast the timestamp as text:
COALESCE(timestamp_type::text, character_varying)
However, now it returns the full timestamp (YYYY-MM-DD HH:MM:SS.000000+00) when I only want the date portion, YYYY-MM-DD.
How can I use COALESCE and return only the date portion of the timestamp?
You can use to_char to convert the timestamp using an appropriate format mask:
COALESCE(to_char(timestamp_type, 'YYYY-MM-DD'), varchar_col)
The correct casting would be
COALESCE(timestamp_type::date::text,char_var)
This should work as you expect ... if you have the ISO DateStyle. But it's MUCH better not to rely on DateStyle settings for converting date-times to/from text. Hence, @Gurwinder Singh's answer is the way to go.
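Putting the two answers together, a minimal sketch (assuming a hypothetical table t with a timestamp column ts and a varchar column note; both expressions return the date as YYYY-MM-DD text):
SELECT
    COALESCE(to_char(ts, 'YYYY-MM-DD'), note) AS via_to_char,     -- explicit format mask
    COALESCE(ts::date::text, note)            AS via_double_cast  -- relies on the ISO DateStyle
FROM t;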
I am trying to better understand the date_format function offered by Spark SQL. As per the official Databricks documentation (I am using Databricks), this function expects any date/string in a valid datetime format. Below is the link for the same.
I am finding it difficult to understand what exactly "valid" means here. I am trying to understand the functionality through two examples.
Input string in YYYY-MM-DD format (2021-07-09), for which I get the expected result.
Input string in DD-MM-YYYY format (20-07-2021), for which I get null.
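Roughly, the two calls look like the sketch below (the output pattern 'dd/MM/yyyy' is only an assumption for illustration; the behaviour depends on the input string, not the output pattern):
SELECT date_format('2021-07-09', 'dd/MM/yyyy');   -- returns '09/07/2021' as expected
SELECT date_format('20-07-2021', 'dd/MM/yyyy');   -- returns NULL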
Why is this happening? How did this function understand that the parameter that I am passing is indeed in YYYY-MM-DD format? It could also have been YYYY-DD-MM.
My requirement is that I implement a logic that could handle all kinds of valid date formats (MM-DD-YYYY, YYYY-MM-DD, DD-MM-YYYY) and format the dates accordingly.
The following are valid input and output formats for ANSI date/time data types:
ANSIDATE: yyyy-mm-dd (example: 2007-02-28)
TIME WITH TIME ZONE: hh:mm:ss.ffff... [+|-]th:tm
The valid range of the time zone offset is from -14:00 to +14:00. DATE complies with the ANSI SQL standard definition for the Gregorian calendar: "NOTE 85 - Datetime data types will allow dates in the Gregorian format to be stored in the date range 0001-01-01 CE through 9999-12-31 CE."
See Databricks SQL datetime patterns for details on valid formats. The function checks that the resulting dates are valid dates in the Proleptic Gregorian calendar; otherwise it returns NULL.
When you use "20-07-2021", it does not conform to "yyyy-MM-dd", so the result is NULL.
Alternately, you can use the make_date function, which creates a date from year, month, and day fields. Or better, use the to_date function:
select date_format(to_date('9/15/2021', 'MM/dd/yyyy'), 'yyyy/MM/dd')
See Datetime Patterns for Formatting and Parsing in Spark.
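If you really need to accept several input formats, one hedged sketch (the column and table names are made up, and it assumes ANSI mode is off so a failed parse yields NULL rather than an error) is to try each pattern and keep the first one that parses. Note that a value such as '01-02-2021' is ambiguous between dd-MM-yyyy and MM-dd-yyyy, so the order of the attempts matters:
SELECT coalesce(
         to_date(datecol, 'yyyy-MM-dd'),
         to_date(datecol, 'dd-MM-yyyy'),
         to_date(datecol, 'MM-dd-yyyy')
       ) AS parsed_date
FROM my_table;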
CASE WHEN ISDATE(LTRIM(RTRIM(rard.thevalue))) = 1
         THEN CONVERT(smalldatetime, LTRIM(RTRIM(rard.thevalue)))
     WHEN ISDATE(LTRIM(RTRIM(rard2.thevalue))) = 1
         THEN CONVERT(smalldatetime, LTRIM(RTRIM(rard2.thevalue)))
     ELSE CONVERT(smalldatetime, LTRIM(RTRIM(r.receiptdate)))
END
I have this expression in SQL Server which has to be converted to Oracle. The column "thevalue" has different formats in it, e.g. HH:MM, MM/DD/YYYY, HH:MM:SS, etc. The isdate() function checks whether the value matches a date format before the data is pulled. I need a similar function to check whether the column value matches a date/time format and then display it as a date.
The Oracle equivalent would be validate_conversion().
However, unlike SQL Server, Oracle won't recognize varying formats. You need to explicitly specify the format that you want (unless your dates already are in the format configured by nls_date_format). Basically, you could test each possible format one after the other, and stop whenever one is recognized.
Since your purpose is to actually convert the string to a date, it would be simpler to use to_date() directly, with the on conversion error clause.
Consider something like:
coalesce(
to_date(thevalue default null on conversion error, 'MM/DD/YYYY'),
to_date(thevalue default null on conversion error, 'YYYY-MM-DD HH24:MI:SS'),
...
)
Notes:
the function happily ignores leading and trailing spaces, so there is no need to trim() beforehand
this requires Oracle 12.2 or higher
isdate() is not really safe in SQL Server; better to use try_convert(), which basically behaves like Oracle's to_date() with default null on conversion error (see the sketch below)
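For reference, a sketch of the SQL Server side rewritten with try_convert() (column aliases are taken from the original snippet; this is not necessarily the final form):
COALESCE(
    TRY_CONVERT(smalldatetime, LTRIM(RTRIM(rard.thevalue))),
    TRY_CONVERT(smalldatetime, LTRIM(RTRIM(rard2.thevalue))),
    CONVERT(smalldatetime, LTRIM(RTRIM(r.receiptdate)))
)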
With 'dd-mm-yy' being the NLS_DATE_FORMAT, a string is implicitly converted to the DATE data type during comparisons and insertions, but why is it not converted during an arithmetic operation?
sysdate > '01-01-17'   -- valid
sysdate - '01-01-17'   -- invalid
At first I assumed the arithmetic operators (+, -, ...) were only for numeric data types. Later I learned that these operators are also used in date arithmetic, and that operands of the DATE data type are valid as well.
"During arithmetic operations on and comparisons between character and noncharacter datatypes, Oracle converts from any character datatype to a numeric, date, or rowid, as appropriate" -
doc
Using to_date solves the issue. I am looking for the reason why it is not implicitly converted.
Forget implicit conversion. Just express your dates using explicit date literals:
sysdate > date '2017-01-01'
sysdate - date '2017-01-01'
The code is clearer and less ambiguous as well.
As to why Oracle doesn't do implicit conversion in the second case: Oracle doesn't know what type to expect. The second operand could be either a date or a number, so it doesn't know how to convert the string. In the first case, the comparison has to be to a date.
Adding more detail on Gordon Linoff's answer with an example.
In sysdate > '010117', since you are comparing with a date, '010117' must be a date, so it is implicitly converted. The same goes for an insert.
But in sysdate - '010117', the system could convert the string to either a NUMBER or a DATE, and it chooses NUMBER. So the 'dd-mm-yy' string is converted (unsuccessfully) to a NUMBER in this context.
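To see the rule in action, a quick sketch (the exact error text is approximate, and the 'DD-MM-YY' format model is chosen to match the NLS_DATE_FORMAT from the question):
select sysdate - '01-01-17' from dual;                        -- fails: the string is treated as a number (ORA-01722)
select sysdate - to_date('01-01-17', 'DD-MM-YY') from dual;   -- works: returns the number of days between the two dates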
I have a question regarding SQL dates.
The table I am working with has a date field in the following format: "22-SEP-08". The field is a date column.
I am trying to figure out how to output records from 1/1/2000 to the present day.
The code below is not filtering the date field:
Select distinct entity.lt_date
from feed.entitytable entity
where entity.lt_date >= '2000-01-01'
Any help regarding this issue is much appreciated. Thanks!
Edit: I am using Oracle SQL Developer to write my code.
DATEs do not have "a format". Any format you see is applied by the application displaying the date value.
You can either change the configuration of SQL Developer to display dates in a different format, or you can use to_char() to format the date the way you want.
The reason your statement does not work is most probably the implicit data type conversion that you are relying on.
'2000-01-01' is a string value, not a date. And the string is converted using the NLS settings of your session. The fact that you see dates displayed as DD-MON-YY means that that is the format used by the evil implicit data type conversion. You should always supply date values as real date literals.
There are two ways of specifying a real date literal. The first is ANSI SQL and simply uses the keyword DATE in front of an ISO-formatted string:
where entity.lt_date >= DATE '2000-01-01'
Note the DATE keyword in front of the string, which makes it a real date literal, not a string expression.
The other option is to use to_date() to convert a character value into a date:
where entity.lt_date >= to_date('2000-01-01', 'yyyy-mm-dd');
More details about specifying date literals can be found in the manual:
Date literals
to_date function
My guess is that the data type isn't a DATE. Just in case it's a char type, try converting it using the Oracle TO_DATE() function. The Oracle documentation below should help you with the parameters.
http://docs.oracle.com/cd/B19306_01/server.102/b14200/functions183.htm
An implicit datatype conversion bites once again.
You're right. The predicate is not doing the comparison you are expecting.
Oracle is performing an implicit datatype conversion, from DATE to VARCHAR, so that it can do a comparison to the string literal.
If lt_date column is DATE datatype, then Oracle is seeing your where clause:
where entity.lt_date >= '2000-01-01'
Oracle is actually seeing it as if it's written like this:
where TO_CHAR(entity.lt_date) >= '2000-01-01'
And that's where the "format" problem comes in. The column itself does not have a "format". Because the second argument to the TO_CHAR function is not supplied, Oracle is using the value of the NLS_DATE_FORMAT parameter (from your session). And that's probably set to DD-MON-YY. Which is why that's the "format" you're seeing when you run a SELECT statement in SQL*Plus. Because the DATE value is (again) being run through a TO_CHAR function to get a string that can be displayed.
To get the "filtering" you want, don't do a comparison to a string literal. Instead, do the comparison to an expression that has DATE datatype.
You can use the Oracle TO_DATE function. And since you don't want to rely on the NLS_DATE_FORMAT setting, explicitly specify the format model as the second argument to the function. For example:
DO THIS
where entity.lt_date >= TO_DATE('2000-01-01','YYYY-MM-DD')
It's also possible to specify the format model as the second argument to the TO_CHAR function.
DON'T DO THIS
where TO_CHAR(entity.lt_date,'YYYY-MM-DD') >= '2000-01-01'
But you don't want to do that because that's going to force Oracle to evaluate that expression on the left side for every flipping row in the table, so it has a string value to do the comparison. (That's true unless someone created a function-based index for you.) If you do the comparison on the bare column, using the TO_DATE on the literal side, Oracle can make effective use of an appropriate index (with lt_date as the leading column) to satisfy the predicate.
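For completeness, a sketch of such a function-based index (the index name is made up; this is only to illustrate the parenthetical remark above, and the TO_DATE predicate against a plain index on lt_date remains the simpler choice):
CREATE INDEX entitytable_lt_date_fbi
    ON feed.entitytable (TO_CHAR(lt_date, 'YYYY-MM-DD'));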
For example, the following query works fine:
SELECT *
FROM quotes
WHERE expires_at <= '2010-10-15 10:00:00';
But this is obviously performing a 'string' comparison - I was wondering if there is a function built into MySQL that specifically does 'datetime' comparisons.
...this is obviously performing a 'string' comparison
No - if the string matches a supported date/time format, MySQL performs an implicit conversion to DATETIME, based on the column it is being compared to. The same thing happens with:
WHERE int_column = '1'
...where the string value of "1" is converted to an INTeger because int_column's data type is INT, not CHAR/VARCHAR/TEXT.
If you want to explicitly convert the string to a DATETIME, the STR_TO_DATE function would be the best choice:
WHERE expires_at <= STR_TO_DATE('2010-10-15 10:00:00', '%Y-%m-%d %H:%i:%s')
But this is obviously performing a 'string' comparison
No. The string will be automatically cast into a DATETIME value.
See 11.2. Type Conversion in Expression Evaluation.
When an operator is used with operands of different types, type conversion occurs to make the operands compatible. Some conversions occur implicitly. For example, MySQL automatically converts numbers to strings as necessary, and vice versa.
I know it's pretty old, but I just encountered the problem, and here is what I saw in the MySQL documentation:
[For best results when using BETWEEN with date or time values,] use CAST() to explicitly convert the values to the desired data type. Examples: If you compare a DATETIME to two DATE values, convert the DATE values to DATETIME values. If you use a string constant such as '2001-1-1' in a comparison to a DATE, cast the string to a DATE.
I assume it's better to use STR_TO_DATE, since they took the time to make a function just for that, and also because I found this in the BETWEEN documentation...
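For completeness, a sketch of the explicit-cast form the manual recommends, using the table and column from the question:
SELECT *
FROM quotes
WHERE expires_at <= CAST('2010-10-15 10:00:00' AS DATETIME);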