How can I send jsonb[]? - sql

I have one SQL function
FUNCTION create_user_events(_auid uuid, _name text[], _ts int[], _params jsonb[], _id int[])
And I am trying to use it with the following parameters:
SELECT create_user_events('db74c66d','{scr_home}','{1}','{{"key":"value"}::jsonb}','{123}');
And the output is:
ERROR: malformed array literal: "{{"key": "value"}::jsonb}"
How can I fix it? I have tried many variants with quotes, escaping, etc.
Thank you for your time.

Use the array[] notation:
create_user_events('db74c66d',
'{scr_home}',
'{1}',
array['{"key":"value"}']::jsonb[],
'{123}');
Alternatively you can just cast the first array element to jsonb: array['{"key":"value"}'::jsonb, '{"key2":"value2"}']
I prefer that in general over the "string syntax" for arrays:
create_user_events('db74c66d',
array['scr_home'],
array[1],
array['{"key":"value"}']::jsonb[],
array[123]);
This will however result in "invalid input syntax for type uuid: "db74c66d"" - but I assume that db74c66d is just a placeholder for a valid UUID.
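For completeness, a minimal sketch of the full call with a syntactically valid UUID; the UUID and all other values here are placeholders, not values from the question:

```sql
-- Hypothetical call: any valid 36-character UUID literal works here
SELECT create_user_events(
    'db74c66d-0000-0000-0000-000000000000'::uuid,
    array['scr_home'],
    array[1],
    array['{"key":"value"}']::jsonb[],
    array[123]
);
```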

Related

Postgres | V9.4 | Extract value from json

I have a table where one of the columns is of type TEXT and holds a JSON object.
I want to reach a key inside that JSON and check its value.
The column name is json_representation and the json looks like that:
{
"additionalInfo": {
"dbSources": [{
"user": "Mike"
}]
}
}
I want to get the value of "user" and check whether it equals "Mike".
I tried the following:
select
json_representation->'additionalInfo'->'dbSources'->>'user' as singleUser
from users
where singleUser = 'Mike';
I keep getting an error:
Query execution failed
Reason:
SQL Error [42883]: ERROR: operator does not exist: text -> unknown
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Position: 31
Please advise.
Thanks
The error message tells you what to do: you might need to add an explicit type cast.
And as you cannot reference a column alias in the WHERE clause, you need to wrap the query in a derived table:
select *
from (
select json_representation::json ->'additionalInfo'->'dbSources' -> 0 ->>'user' as single_user
from users
) as t
where t.single_user = 'Mike';
:: is Postgres' cast operator
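As a side note, the chain of -> operators can also be written as a single path with the #>> operator, which extracts the value at a path as text, so the derived table is no longer needed; a sketch using the same table and column names:

```sql
-- #>> follows a path of keys and array indexes and returns text,
-- so the comparison can go straight into the WHERE clause
select *
from users
where json_representation::json #>> '{additionalInfo,dbSources,0,user}' = 'Mike';
```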
But the better solution would be to change the column's data type to json permanently. And once you upgrade to a supported version of Postgres, you should use jsonb.
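A sketch of such a permanent change, assuming every existing row holds valid JSON (the conversion fails otherwise):

```sql
-- Convert the text column in place; the USING clause rewrites existing values
ALTER TABLE users
    ALTER COLUMN json_representation TYPE json
    USING json_representation::json;
```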

PreparedStatement for Xpath attribute

I have a query with xpath. The values in the xpath is filled dynamically.
Query:
SELECT app_prof.pk_szid, app_prof.xmldata
FROM tblappprofile AS app_prof
WHERE 'Self' =
CAST((xpath('/ApplicationProfile/ComponentIDs/ComponentID[@Family="Core"]/text()', xmldata))[1] AS TEXT)
For preparedStatement:
SELECT app_prof.pk_szid, app_prof.xmldata
FROM tblappprofile AS app_prof
WHERE ? =
CAST((xpath('/ApplicationProfile/ComponentIDs/ComponentID[@Family= ? ]/text()', xmldata))[1] AS TEXT)
When I use,
preparedStatement.setString(1, "Self");
preparedStatement.setString(2, "Core");
results in org.postgresql.util.PSQLException: The column index is out of range: 2, number of columns: 1
The 'Self' is filled correctly, but the ? inside the attribute is not recognized. How can I use a PreparedStatement for attributes in XPath?
Question marks inside string literals are not treated as parameter placeholders.
You need to pass the whole XPath expression as a parameter:
WHERE ? = CAST((xpath(?, xmldata))[1] AS TEXT)
Another option is to build the XPath string dynamically using the format() function:
where ? = CAST((xpath(format('/ApplicationProfile/ComponentIDs/ComponentID[@Family="%s"]/text()', ?), xmldata))[1] AS TEXT)
That way you can pass the value for @Family as a parameter and still keep the XPath inside the SQL if you want.
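To see what format() builds before it is handed to xpath(), the template can be tested on its own; a hypothetical check with the value 'Core':

```sql
-- format() substitutes the parameter into the XPath template
SELECT format(
    '/ApplicationProfile/ComponentIDs/ComponentID[@Family="%s"]/text()',
    'Core'
);
-- → /ApplicationProfile/ComponentIDs/ComponentID[@Family="Core"]/text()
```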

Handling Null DataType

I'm using the Over function from Piggybank to get the lag of a row:
res= foreach (group table by fieldA) {
Aord = order table by fieldB;
generate flatten(Stitch(Aord, Over(Aord.fieldB, 'lag'))) as (fieldA,fieldB,lag_fieldB) ;}
This works correctly, and when I do a dump I get the expected result. The problem is that when I try to use lag_fieldB for any comparison or transformation, I get datatype issues.
If I do a describe it returns fieldA: long,fieldB: chararray,lag_fieldB: NULL
I'm new to Pig, but I already tried casting to chararray and using ToString(), and I keep getting errors like these:
ERROR 1052: Cannot cast bytearray to chararray
ERROR 1051: Cannot cast to bytearray
Thanks for your help
OK, after some looking around in the code of the Over function, I found that you can instantiate the Over class with the return type. What worked for me was:
DEFINE ChOver org.apache.pig.piggybank.evaluation.Over('chararray');
res= foreach (group table by fieldA) {
Aord = order table by fieldB;
generate flatten(Stitch(Aord, ChOver(Aord.fieldB, 'lag'))) as (fieldA,fieldB,lag_fieldB) ;}
Now describe tells me:
fieldA: long,fieldB: chararray,lag_fieldB: chararray
And I'm able to use the columns as expected. Hope this saves someone else some time.

Returning a tuple column type from slick plain SQL query

In Slick 3 with Postgres, I'm trying to use a plain SQL query with a tuple column return type. My query is something like this:
sql"""
select (column1, column2) as tup from table group by tup;
""".as[((Int, String))]
But at compile time I get the following error:
could not find implicit value for parameter rconv: slick.jdbc.GetResult[((Int, String), String)]
How can I return a tuple column type with a plain sql query?
GetResult[T] is a wrapper for a function PositionedResult => T and is expected as an implicit value that uses PositionedResult methods such as nextInt and nextString to extract positional typed fields. The following implicit val should address your need:
implicit val getTableResult = GetResult(r => (r.nextInt, r.nextString))
More details can be found in this Slick doc.

To remove double quotes from date string in SQL

I am using JSON_EXTRACT() to retrieve the date from json.
I want to get the date without double quotes.
Here is the example of what I am doing :
JSON_EXTRACT(JSON_EXTRACT(events, "$.my_member"), "$.my_Number") as xyz
my_number holds date string as "2016-01-01 11:31:25", I want this without the double quotes.
I tried using timestamp as :
timestamp(JSON_EXTRACT(JSON_EXTRACT(events, "$.my_member"), "$.my_Number"))
but it is returning a null value to xyz.
Thanks.
Try
JSON_EXTRACT_SCALAR(JSON_EXTRACT(events, "$.my_member"), "$.my_Number")
Also, you should be able to further "optimize" your expression by building a proper JSON path and calling the JSON function only once. See the hint below:
SELECT
JSON_EXTRACT_SCALAR(
'{"my_member":{"my_Number":"2016-01-01 11:31:25"}}',
"$.my_member.my_Number"
)
See more details, including the difference between JSON_EXTRACT_SCALAR and JSON_EXTRACT, at JSON functions.
Or run REPLACE:
REPLACE(JSON_EXTRACT(JSON_EXTRACT(events, "$.my_member"), "$.my_Number"),"\"","") as xyz
I tried JSON_EXTRACT_SCALAR in MySQL Workbench but got Error Code: 1305. FUNCTION manifest.JSON_EXTRACT_SCALAR does not exist.
Instead I used JSON_UNQUOTE, and that did the trick.
I have a column called 'buffer_time' which contains:
'{"after": {"time": "00:01:00", "is_enabled": true}, "before": {"time": "00:04:00", "is_enabled": true}}'
JSON_UNQUOTE(JSON_EXTRACT(buffer_time, '$.after.time'))
gave me:
00:01:00
Hope that helps.