I have a requirement where an integer value should be converted to a date in Snowflake. Initially I used the following query to get the desired result:
SELECT TO_DATE(TO_varchar(19000000+1190709),'YYYYMMDD')
2019-07-09
Now when I use the same query for the input "20200034", I get the following error:
select TO_DATE(TO_varchar(19000000+1200034),'YYYYMMDD')
Can't parse '20200034' as date with format 'YYYYMMDD'
"20200034" actually comes from one of the columns in a Snowflake table. To resolve this issue I tried the "TRY_TO_DATE" function, but it gives an incorrect result. Please find the details below:
select TRY_TO_DATE(TO_varchar(19000000+1200034))
1970-08-22
As per the Snowflake documentation, the error-handling function does not support the optional format argument that TO_DATE / DATE accepts:
https://docs.snowflake.com/en/sql-reference/functions/try_to_date.html
You can set the DATE_INPUT_FORMAT for the session before calling the TRY_TO_DATE function.
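The 1970-08-22 result happens because, with no format available, an all-digit string like '20200034' is parsed as seconds since the Unix epoch. A minimal sketch (assuming AUTO parsing honors the session parameter):
ALTER SESSION SET DATE_INPUT_FORMAT = 'YYYYMMDD';
SELECT TRY_TO_DATE(TO_varchar(19000000+1200034)); -- NULL: there is no day 34 in any month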
I suggest you contact Snowflake support and ask them to enable TRY_TO_DATE with a format string - it's available but needs to be enabled manually.
However, be aware that TRY_TO_DATE on '20200034' will still resolve to NULL, because day 34 is not a valid day of any month.
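Once enabled, the two-argument form would look like this (a sketch, assuming the feature is active on your account):
SELECT TRY_TO_DATE(TO_varchar(19000000+1200034), 'YYYYMMDD'); -- returns NULL rather than raising an error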
I am having trouble working with the JSONB structure in PostgreSQL. Currently my data is saved as follows:
"{\"Hello\":\"World\",\"idx\":0}"
Which obviously is not correct 😀, so I am trying to "repair" this and get the actual JSON representation for querying with:
SELECT regexp_replace(trim('"' FROM json_data::text), '\\"', '"', 'g')::jsonb FROM My_table
However when trying this, I get the following error:
ERROR: invalid input syntax for type json
DETAIL: Token "Рыба" is invalid.
CONTEXT: JSON data, line 1: ...х, как : Люди X, Пароль \\"Рыба...
SQL state: 22P02
So I am thinking that this is due to the character encoding that is not being accepted by the JSONB standard.
My main question, though, is: how can I repair this kind of table so that I am still able to query it? I tried utilizing convert_from and convert_to but am unable to figure out how to fix this error... has anyone encountered this already?
Update: Found it! (thanks to Convert JSON string to JSONB), utilizing
SELECT (json_data#>>'{}')::jsonb FROM my_table
fixed it
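For context on why this works: the value was stored as a JSON string containing escaped JSON text, and #>> with the empty path '{}' extracts that top-level string as plain text, un-escaping it so it can be cast to jsonb. A quick illustration with the literal from the question:
SELECT ('"{\"Hello\":\"World\",\"idx\":0}"'::jsonb #>> '{}')::jsonb;
-- {"Hello": "World", "idx": 0}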
I have to make some changes to an existing Mule flow with little knowledge of Mule, and although I've spent some days reading online documentation and possible solutions, I cannot figure out why this query is failing, as I also have other dynamic queries in my flow with #[xxx] parameters. The query is as follows:
select times from user_request where
ip_address=SUBSTR(#message.inboundProperties.MULE_REMOTE_CLIENT_ADDRESS],2,INSTR(#[message.inboundProperties.MULE_REMOTE_CLIENT_ADDRESS], ':')-2)
and request_date=CAST(CURRENT_DATE as varchar2(8))
And the error I got is:
Message : Index: 0
(java.lang.IndexOutOfBoundsException). Payload :
{fecha_solicitud=2016-06-22, moneda=USD, client_id=RIVERA,
user_ip=127.0.0.1, request_times=0} Payload Type :
java.util.LinkedHashMap Element :
/OANDAFlow/processors/3 # oanda:oanda.xml:126 Element XML :
select times from user_requestwhere
ip_address=SUBSTR(#[message.inboundProperties.MULE_REMOTE_CLIENT_ADDRESS],2,INSTR(#[message.inboundProperties.MULE_REMOTE_CLIENT_ADDRESS],
':')-2)and request_date=CAST(CURRENT_DATE as
varchar2(8))>
Note: the date is cast to varchar because the request_date column is varchar.
I've tried this query directly in Oracle SQL Developer, replacing #[message.inboundProperties.MULE_REMOTE_CLIENT_ADDRESS]
with an example like /127.0.0.1:55406, and it worked fine, so why is it failing through Mule?
In the first expression, #message.inboundProperties.MULE_REMOTE_CLIENT_ADDRESS], you are missing the opening [.
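With the bracket restored, the query from the question reads:
select times from user_request where
ip_address=SUBSTR(#[message.inboundProperties.MULE_REMOTE_CLIENT_ADDRESS],2,INSTR(#[message.inboundProperties.MULE_REMOTE_CLIENT_ADDRESS], ':')-2)
and request_date=CAST(CURRENT_DATE as varchar2(8))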
One of the fields in your query expects a string value; try putting single quotes around it - it should work.
Try this
select times from user_request where
ip_address=SUBSTR('#[message.inboundProperties.MULE_REMOTE_CLIENT_ADDRESS]',2,INSTR('#[message.inboundProperties.MULE_REMOTE_CLIENT_ADDRESS]', ':')-2)
and request_date=CAST(CURRENT_DATE as varchar2(8))
I am trying to write a search index and then query it by date and time in a Cloudant NoSQL database.
When I pass only the date in the query string, it works fine
created_date:[2015-08-16 TO 2015-08-27]
This returns the correct results but when I include time in the parameter:
created_date:[2015-08-16 07:38:00 TO 2015-08-27 07:38:02]
I get an error:
Cannot parse 'created_date:[2015-08-16 07:38:00 TO 2015-08-27 07:38:02]': Encountered " "TO" "TO "" at line 1, column 50. Was expecting one of: "]" ... "}"
I have some more query parameters before this but the above is the gist of the error.
This is an Apache Lucene query string. What is causing this to happen?
According to the Lucene Java doc, the date format should look like this:
A date field shall be of the form 1995-12-31T23:59:59Z. The trailing "Z" designates UTC time and is mandatory.
This format was derived to be standards compliant (ISO 8601) and is a more restricted form of the canonical representation of dateTime from XML Schema Part 2. Examples:
1995-12-31T23:59:59Z
1995-12-31T23:59:59.9Z
1995-12-31T23:59:59.99Z
1995-12-31T23:59:59.999Z
So you are missing the 'T' between the date and the time.
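Applied to the range from the question, that would be (assuming the values were indexed as UTC dates):
created_date:[2015-08-16T07:38:00Z TO 2015-08-27T07:38:02Z]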
For more information: https://lucene.apache.org/solr/4_10_4/solr-core/org/apache/solr/schema/DateField.html
I did it the following way
created_date:["2015-08-16 07:38:00" TO "2015-08-27 07:38:02"]
and used the keyword analyzer in Cloudant.
This link explains it all
https://lucene.apache.org/core/2_9_4/queryparsersyntax.html
I'm trying to convert a timestamp into yyyymm format, and have found that this should do the trick:
select convert(nvarchar(6),getnow(),112);
I try to run this just as a POC that it will return in the proper format, but I get the following error:
ERROR: 42601: syntax error at or near ","
It works fine when I take out the style argument, but obviously does not return the desired format. I have no idea what could be happening and any help would be greatly appreciated.
Thanks!
I don't know if Redshift allows formatting the way SQL Server does.
So I suggest you try datepart. Something like this:
SELECT CONVERT(NVARCHAR(6), DATE_PART(y, getdate()) || DATE_PART(mon, getdate()))
In SQL Server, it's getdate()...
select convert(nvarchar(6),getdate(),112)
This is the equivalent in Redshift:
select to_char(sysdate,'YYYYMM')
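To sanity-check the format with a literal timestamp (illustrative):
select to_char(timestamp '2015-08-16 07:38:00', 'YYYYMM'); -- 201508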
We are using Hibernate to connect to an AS/400. We are having issues with a query on the AS/400 that uses a LIKE clause.
The following error is shown:
java.sql.SQLException: [SQL0131] Operands of LIKE not compatible or not valid
My query is auto-generated by Hibernate:
select tab_parame0_.C1IMCD as C1_560_, tab_parame0_.C1NINB as C2_560_,
tab_parame0_.C1JXCD as C3_560_, tab_parame0_.C1HLTX as C4_560_, tab_parame0_.C1HMTX as C5_560_,
tab_parame0_.C1HDST as C6_560_, tab_parame0_.C1NGNB as C7_560_, tab_parame0_.C1NJNB as C8_560_,
tab_parame0_.C1NFNB as C9_560_, tab_parame0_.C1NHNB as C10_560_, tab_parame0_.C1HCST as C11_560_
from RYC1REP tab_parame0_
where lower(tab_parame0_.C1HLTX) like lower(?)
order by tab_parame0_.C1IMCD asc
fetch first 10 rows only
SQL0131 indicates a type mismatch.
What datatype is tab_parame0_.C1HLTX? What datatype is your query parameter?
Please include your HQL/JPQL query source code for comparison.
You may have to set up an SQL trace to see exactly what the AS/400 is receiving.
See How do I obtain trace information from my Java program using the Toolbox?
I recommend you change LIKE LOWER(:parameter) to LIKE :parameter in your source query, call .toLowerCase() when you set the parameter, and see how that works.
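A sketch of that change in the HQL where clause (the alias t and property name here are hypothetical, since the question only shows the generated SQL):
where lower(t.c1hltx) like lower(:pattern) -- before: LOWER applied to the parameter in SQL
where lower(t.c1hltx) like :pattern        -- after: bind it pre-lowercased, e.g. query.setParameter("pattern", p.toLowerCase())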