I'm storing the logs of an API in a table in my application. In one of the columns, I store the raw JSON sent in the HTTP request.
I've been tasked with creating a page in my application dedicated to easily exploring all the logged entries, with filters, sorting, etc.
One of the required sorts is on the date indicated in the JSON body of the HTTP call. I've managed to do so using Oracle's JSON API:
SELECT *
FROM FUNDING_REQUEST f
ORDER BY TO_TIMESTAMP_TZ(JSON_VALUE(
    f.REQUEST_CONTENTS,
    '$.leasing_information.consumer_request_date_time'
  ), 'YYYY/MM/DD"T"HH24:MI:SS.FFTZH:TZM') ASC
This works whether $.leasing_information.consumer_request_date_time is present or not, but I have an issue when the value is wrongly formatted. In one of my tests, I sent this to my API:
{
  [...],
  "leasing_information": {
    "consumer_request_date_time": "2021-25-09T12:30:00.000+02:00",
    [...]
  }
}
There is no 25th month, and my SQL query now returns the following error:
ORA-01843: not a valid month.
I would like to handle this value as NULL rather than returning an error, but it seems like the TO_TIMESTAMP_TZ DEFAULT clause does not really work the way I want it to. Doing this also returns an error:
SELECT *
FROM FUNDING_REQUEST f
ORDER BY TO_TIMESTAMP_TZ(JSON_VALUE(
    f.REQUEST_CONTENTS,
    '$.leasing_information.consumer_request_date_time'
  ) DEFAULT NULL ON CONVERSION ERROR, 'YYYY/MM/DD"T"HH24:MI:SS.FFTZH:TZM') ASC
ORA-00932: inconsistent datatypes; expected: -; got: TIMESTAMP WITH TIME ZONE
I would also like to avoid using a PL/SQL function if possible. How can I prevent this query from returning an error?
You don't need to extract a string and then convert that to a timestamp; you can do that within the json_value(), and return null if that errors - at least from version 12c Release 2 onwards:
SELECT *
FROM FUNDING_REQUEST f
ORDER BY JSON_VALUE(
    f.REQUEST_CONTENTS,
    '$.leasing_information.consumer_request_date_time'
    RETURNING TIMESTAMP WITH TIME ZONE
    NULL ON ERROR
  ) ASC
db<>fiddle, including showing what the string and timestamp extractions evaluate to (i.e. a real timestamp value, or null).
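If you also want the rows with missing or unparseable dates to sort explicitly after the valid ones, rather than relying on the default null ordering, a minimal sketch along the same lines (assuming the same table and JSON path as above):
SELECT *
FROM FUNDING_REQUEST f
ORDER BY JSON_VALUE(
    f.REQUEST_CONTENTS,
    '$.leasing_information.consumer_request_date_time'
    RETURNING TIMESTAMP WITH TIME ZONE
    NULL ON ERROR      -- invalid or missing dates become NULL
  ) ASC NULLS LAST     -- NULLs are pushed to the end of the result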
Related
I am using a SQL script to parse JSON into a Snowflake table using dbt.
One of the columns contains this datetime value: '2022-02-09T20:28:59+0000'.
What's the correct way to define the data type for an ISO datetime in Snowflake?
I tried DATE, TIMESTAMP and TIMESTAMP_NTZ like this in my dbt SQL script:
JSON_DATA:",my_date"::TIMESTAMP_NTZ AS MY_DATE
but clearly these aren't correct, because later on when I test it in Snowflake with SELECT *, I get this error:
SQL Error [100040] [22007]: Date '2022-02-09T20:28:59+0000' is not recognized
or
SQL Error [100035] [22007]: Timestamp '2022-02-13T03:32:55+0100' is not recognized
so I need to know which Snowflake time/date data type suits this value best.
EDIT:
This is what I am trying now.
SELECT
JSON_DATA:"date_transmission" AS DATE_TRANSMISSION
, TO_TIMESTAMP(DATE_TRANSMISSION:text, 'YYYY-MM-DDTHH24:MI:SS.FFTZH:TZM') AS DATE_TRANSMISSION_TS_UTC
, JSON_DATA:"authorizerClientId"::text AS AUTHORIZER_CLIENT_ID
, JSON_DATA:"apiPath"::text API_PATH
, MASTERCLIENT_ID
, META_FILENAME
, META_LOAD_TS_UTC
, META_FILE_TS_UTC
FROM {{ source('INGEST_DATA', 'TABLENAME') }}
I get this error:
000939 (22023): SQL compilation error: error line 6 at position 4
10:21:46 too many arguments for function [TO_TIMESTAMP(GET(DATE_TRANSMISSION, 'text'), 'YYYY-MM-DDTHH24:MI:SS.FFTZH:TZM')] expected 1, g
However, if I comment out the first two lines (related to the timestamp types), the other two work perfectly fine. What's the correct syntax for parsing JSON with TO_TIMESTAMP?
Note that JSON_DATA:"apiPath"::text API_PATH gives the correct value for it in my Snowflake tables.
Did some testing and it seems you have two options.
You can either get rid of the +0000 at the end: left(column_date, len(column_date)-5)
or use try_to_timestamp with a format:
try_to_timestamp('2022-02-09T20:28:59+0000','YYYY-MM-DD"T"HH24:MI:SS+TZHTZM')
TZH and TZM are the time zone offset hours and minutes.
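Applied to the dbt model from the question, a sketch of the second option (assuming the same JSON_DATA:"date_transmission" field; the column alias is just illustrative):
SELECT
    TRY_TO_TIMESTAMP(
        JSON_DATA:"date_transmission"::text,     -- cast the VARIANT to text first
        'YYYY-MM-DD"T"HH24:MI:SS+TZHTZM'         -- format from the example above
    ) AS DATE_TRANSMISSION_TS
FROM {{ source('INGEST_DATA', 'TABLENAME') }}
Rows whose value doesn't match the format come back as NULL instead of raising an error.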
So there are two main points here.
When getting data out of JSON to pass to any of the timestamp functions that want a ::TEXT value, the values coming from JSON are still ::VARIANT, so they need to be cast. This is the cause of the error you quote:
(22023): SQL compilation error: error line 6 at position 4
10:21:46 too many arguments for function [TO_TIMESTAMP(GET(DATE_TRANSMISSION, 'text'), 'YYYY-MM-DDTHH24:MI:SS.FFTZH:TZM')] expected 1, g
Also, your SQL there is wrong; it should have been:
TO_TIMESTAMP(DATE_TRANSMISSION::text,
How you handle the time zone format. As others have noted (and as I asked in your last question), do you want to ignore the time zone values or read them? I forgot about the TZHTZM formatting. Given you have time zone data, you should use TO_TIMESTAMP_TZ / TRY_TO_TIMESTAMP_TZ to make sure the time zone data is kept, given your second example shows +0100.
Putting those together (assuming you didn't want an extra date_transmission as a variant in your data):
SELECT
TO_TIMESTAMP_TZ(JSON_DATA:"date_transmission"::text, 'YYYY-MM-DDTHH24:MI:SS+TZHTZM') AS DATE_TRANSMISSION_TS_UTC
, JSON_DATA:"authorizerClientId"::text AS AUTHORIZER_CLIENT_ID
, JSON_DATA:"apiPath"::text AS API_PATH
, MASTERCLIENT_ID
, META_FILENAME
, META_LOAD_TS_UTC
, META_FILE_TS_UTC
FROM {{ source('INGEST_DATA', 'TABLENAME') }}
You should use TIMESTAMP (not DATE, which does not store the time information), but the format you are using is probably not auto-detected. You can specify the input format as YYYY-MM-DD"T"HH24:MI:SSTZHTZM, as shown here. The auto-detected one has a : between TZH and TZM.
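A minimal sketch of that approach, assuming you declare the format through the TIMESTAMP_INPUT_FORMAT session parameter (in dbt this would typically go in a pre-hook; the column and source names are just illustrative):
ALTER SESSION SET TIMESTAMP_INPUT_FORMAT = 'YYYY-MM-DD"T"HH24:MI:SSTZHTZM';

SELECT
    JSON_DATA:"my_date"::timestamp AS MY_DATE   -- string is parsed with the session input format
FROM {{ source('INGEST_DATA', 'TABLENAME') }}
Casting to ::timestamp_tz instead would preserve the +0000 offset in the stored value.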
A proprietary third-party application stores JSON strings in its database like this one:
{"state":"complete","timestamp":1614776473000}
I need the timestamp and found out that DB2 offers JSON functions. Since it's stored as a string in the PROF_VALUE column, I guess that converting with SYSTOOLS.JSON2BSON is required before I can use JSON_VAL to fetch the timestamp:
SELECT SYSTOOLS.JSON_VAL(SYSTOOLS.JSON2BSON(PROF_VALUE), "timestamp", "f")
FROM EMPINST.PROFILE_EXTENSIONS ext
WHERE PROF_PROPERTY_ID = 'touchpointState'
This causes an error saying that timestamp is not valid in the context where it is used (SQLCODE=-206, SQLSTATE=42703, DRIVER=4.26.14). The same error is thrown when I remove the JSON2BSON call, like this:
SELECT SYSTOOLS.JSON_VAL(PROF_VALUE, "timestamp", "f")
These also don't work, with the same error (different data types):
SELECT SYSTOOLS.JSON_VAL(SYSTOOLS.JSON2BSON(PROF_VALUE), "state", "s:1000")
SELECT SYSTOOLS.JSON_VAL(PROF_VALUE, "state", "s:1000")
I don't understand this error. My syntax is like the documented JSON_VAL ( json-value , search-string , result-type ), and it is the same as in the examples, where they show how to fetch the name field of an object.
I also played around a bit with JSON_TABLE to use raw input data for testing (instead of the database data), but it doesn't seem suitable for that.
SELECT *
FROM TABLE(SYSTOOLS.JSON_TABLE( SYSTOOLS.JSON2BSON('{"state":"complete","timestamp":1614776473000}'), 'state','s:32')) DATA
This gave me a table with one row: Type = 2 and Value = complete.
I had two problems in my query: first, it seems that double quotes " are for object references. I wasn't aware that there is any difference, because in most databases I have used so far, single quotes ' and double quotes " are treated the same.
The second problem is that JSON_VAL needs to be called without the SYSTOOLS prefix, while the prefix is still needed for SYSTOOLS.JSON2BSON(PROF_VALUE).
With those changes, the following query worked:
SELECT JSON_VAL(SYSTOOLS.JSON2BSON(PROF_VALUE), 'timestamp', 'f')
FROM EMPINST.PROFILE_EXTENSIONS ext
WHERE PROF_PROPERTY_ID = 'touchpointState'
I'm using H2 to populate the database at run time and during testing. Everything was working fine until I started trying to make the date field current instead of hard-coded.
I thought it was related to the JDBC version I have and updated the Spring JDBC library to the latest version, but that didn't solve the problem.
This is the code I'm using to insert the data both at run time and in testing.
This code runs at run time, and it worked perfectly before I tried to make the date current:
ResourceDatabasePopulator resourceDatabasePopulator = new ResourceDatabasePopulator();
resourceDatabasePopulator.addScript(new ClassPathResource("schema.sql"));
resourceDatabasePopulator.addScript(new ClassPathResource("data.sql"));
DatabasePopulatorUtils.execute(resourceDatabasePopulator, dataSource); // This is what the DataSourceInitializer does.
For testing purposes, I'm using this code; as I mentioned, it was working perfectly before I tried to make the date current:
DataSource dataSource = new EmbeddedDatabaseBuilder().setType(EmbeddedDatabaseType.H2)
.addScript("classpath:schema.sql")
.addScript("classpath:data.sql")
.build();
data.sql file
INSERT INTO TABLE_X (
dayxx,
xxx
) VALUES
(CONVERT(char(50), DATEADD('DAY', -1,CURRENT_DATE()),126),'xxx')
Exception
Caused by: org.h2.jdbc.JdbcSQLNonTransientException: Unknown data type: "DATEADD"; SQL statement:
INSERT INTO TABLE_X ( dayxx, xxx ) VALUES (CONVERT(char(50), DATEADD('DAY', -1,CURRENT_DATE()),126),'XXX') [50004-200]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:505)
at org.h2.message.DbException.getJdbcSQLException(DbException.java:429)
at org.h2.message.DbException.get(DbException.java:205)
at org.h2.message.DbException.get(DbException.java:181)
at org.h2.command.Parser.parseColumnWithType(Parser.java:5971)
at org.h2.command.Parser.readFunctionParameters(Parser.java:3793)
at org.h2.command.Parser.readFunction(Parser.java:3772)
at org.h2.command.Parser.readTerm(Parser.java:4305)
at org.h2.command.Parser.readFactor(Parser.java:3343)
at org.h2.command.Parser.readSum(Parser.java:3330)
at org.h2.command.Parser.readConcat(Parser.java:3305)
at org.h2.command.Parser.readCondition(Parser.java:3108)
at org.h2.command.Parser.readExpression(Parser.java:3059)
at org.h2.command.Parser.parseValuesForCommand(Parser.java:1877)
at org.h2.command.Parser.parseInsertGivenTable(Parser.java:1817)
at org.h2.command.Parser.parseInsert(Parser.java:1749)
at org.h2.command.Parser.parsePrepared(Parser.java:954)
at org.h2.command.Parser.parse(Parser.java:843)
at org.h2.command.Parser.parse(Parser.java:815)
at org.h2.command.Parser.prepareCommand(Parser.java:738)
at org.h2.engine.Session.prepareLocal(Session.java:657)
at org.h2.engine.Session.prepareCommand(Session.java:595)
at org.h2.jdbc.JdbcConnection.prepareCommand(JdbcConnection.java:1235)
at org.h2.jdbc.JdbcStatement.executeInternal(JdbcStatement.java:212)
at org.h2.jdbc.JdbcStatement.execute(JdbcStatement.java:201)
at org.springframework.jdbc.datasource.init.ScriptUtils.executeSqlScript(ScriptUtils.java:472)
... 53 more
As reported in the docs, the CONVERT function looks like this:
CONVERT ( value , dataType )
Converts a value to another data type.
Example:
CONVERT(NAME, INT)
Since you're passing
CONVERT(char(50), DATEADD('DAY', -1,CURRENT_DATE()),126)
the error states that DATEADD isn't a valid data type, because what you put in that position is the value, not the type; H2 expects the value first and the data type second, so try using the documented syntax.
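A sketch of the corrected insert with the arguments in H2's order (value first, then data type); the SQL Server style argument 126 has no place in H2's CONVERT and is dropped here, and CURRENT_DATE is written without parentheses:
INSERT INTO TABLE_X (
    dayxx,
    xxx
) VALUES
    (CONVERT(DATEADD('DAY', -1, CURRENT_DATE), char(50)), 'xxx')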
I want to get the latest date from my database.
Here is my SQL statement:
select "RegDate"
from "Dev"
where "RegDate" = (
select max("RegDate") from "Dev")
It works in my database.
But how can I use it in Django?
I tried the following code, but it returns an error. The code is in views.py.
Version 1:
lastest_date = Dev.objects.filter(reg_date=max(reg_date))
Error:
'NoneType' object is not iterable
Version 2:
last_activation_date = Dev.objects.filter(regdate='reg_date').order_by('-regdate')[0]
Error:
"'reg_date' value has an invalid format. It must be in YYYY-MM-DD HH:MM[:ss[.uuuuuu]][TZ] format."
I've defined reg_date at the beginning of the class.
What should I do?
You are making things too complicated; you simply order by regdate, that's all:
last_activation_dev = Dev.objects.order_by('-regdate').first()
The .first() will return such a Dev object if it is available, or None if there are no Dev objects.
If you are only interested in the regdate column itself, you can use .values_list(..):
last_activation_date = Dev.objects.order_by('-regdate').values_list('regdate', flat=True).first()
By using .filter() you were actually filtering the Dev table for records where the regdate column had the literal value 'reg_date'; since 'reg_date' is not a valid datetime format, this produced an error.
The error that I found in the log is the one below.
'Illuminate\Database\QueryException' with message 'SQLSTATE[22007]:
[Microsoft][ODBC Driver 11 for SQL Server][SQL Server]Conversion
failed when converting date and/or time from character string. (SQL:
SELECT COUNT(*) AS aggregate
FROM [mytable]
WHERE [mytable].[deleted_at] IS NULL
AND [created_at] BETWEEN '2015-09-30T00:00:00' AND '2015-09-30T23:59:59'
AND ((SELECT COUNT(*) FROM [mytable_translation]
WHERE [mytable_translation].[item_id] = [mytable].[id]) >= 1)
)'
in
wwwroot\myproject\vendor\laravel\framework\src\Illuminate\Database\Connection.php:625
In the database, the data type is datetime and is NOT NULL.
Based on marc_s's answer, I tried to change the format that I'm sending to the database. So I tried without the T: [created_at] between '2015-09-30 00:00:00' and '2015-09-30 23:59:59'.
Locally I'm using MySQL, and the code works just fine. If I test the query above in the SQL Server client, both versions (with and without the T) work too.
How can I fix this problem without making any changes to the database itself?
The PHP/Laravel code:
$items = $items->whereBetween($key, ["'".$value_aux."T00:00:00'", "'".$value_aux."T23:59:59'"]);
With #lad2025's help, I got it to work.
Based on his comments on my question, I changed the format I was passing in the code (Laravel/PHP in this case). (In reality, I "removed" the formatting itself and just built the values in variables before passing them to the query. This way, I let the database decide the format it wants.)
Instead of
$itens = $itens->whereBetween($key, ["'".$value_aux."T00:00:00'", "'".$value_aux."T23:59:59'"]);
I changed the code to this:
$sta = $value_aux."T00:00:00";
$end = $value_aux."T23:59:59";
$itens = $itens->whereBetween($key, [$sta, $end]);