Postgres | V9.4 | Extract value from json - sql

I have a table where one of the columns is of type TEXT and holds a JSON object.
I want to reach a key inside that JSON and check its value.
The column name is json_representation and the JSON looks like this:
{
  "additionalInfo": {
    "dbSources": [{
      "user": "Mike"
    }]
  }
}
I want to get the value of "user" and check whether it equals 'Mike'.
I tried the following:
select
json_representation->'additionalInfo'->'dbSources'->>'user' as singleUser
from users
where singleUser = 'Mike';
I keep getting an error:
Query execution failed
Reason:
SQL Error [42883]: ERROR: operator does not exist: text -> unknown
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Position: 31
Please advise.
Thanks

The error message tells you what to do: you might need to add an explicit type cast.
And as you cannot reference a column alias in the WHERE clause, you need to wrap the query in a derived table:
select *
from (
    select json_representation::json -> 'additionalInfo' -> 'dbSources' -> 0 ->> 'user' as single_user
    from users
) as t
where t.single_user = 'Mike';
:: is Postgres' cast operator
But the better solution would be to change the column's data type to json permanently. And once you upgrade to a supported version of Postgres, you should use jsonb.
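A minimal sketch of that change, assuming the table and column names from the question; the USING clause converts the existing text values and will fail if any row holds invalid JSON (jsonb itself is available since 9.4):
ALTER TABLE users
    ALTER COLUMN json_representation TYPE jsonb
    USING json_representation::jsonb;
After that, the query needs no cast and no derived table:
select *
from users
where json_representation -> 'additionalInfo' -> 'dbSources' -> 0 ->> 'user' = 'Mike';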

Related

How to fetch all rows where an array contains any of the fields array elements

I have a table with a column video_ids of type bigint[]. I would like to find all rows whose array contains any of the elements of an array passed in a select statement. So, if I have a row whose video_ids field looks like this:
{9529387, 9548200, 9579636}
I would like to fetch it if the array I pass in contains any of these video_ids. I thought I would do that with any, but I am not sure how to write this in SQL. I have tried this:
select id, finished, failed, video_ids, invoiced_video_ids, failed_video_ids
from video_order_execution
where order_ids = any(
'{9548200, 11934626, 9579636, 11936321, 11509698, 11552728, 11592106, 11643565, 11707543, 11810386, 11846268}'
::bigint[]);
I get an error if I do that:
ERROR: operator does not exist: bigint[] = bigint
Hint: No operator matches the given name and argument types. You might need to add explicit type casts.
How can I write a statement that does what I need?
Use the array overlap operator &&, which returns true if the two operands have any elements in common:
select id, finished, failed, video_ids, invoiced_video_ids, failed_video_ids
from video_order_execution
where order_ids &&
'{9548200, 11934626, 9579636, 11936321, 11509698, 11552728, 11592106, 11643565, 11707543, 11810386, 11846268}'::bigint[];
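As a hedged aside: if this kind of query runs often, a GIN index on the array column (column and table names assumed from the query above, index name made up) lets Postgres evaluate the && overlap test without scanning every row:
create index video_order_execution_order_ids_gin
    on video_order_execution using gin (order_ids);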

How to resolve this sql error of schema_of_json

I need to find out the schema of a given JSON file. I see SQL has a schema_of_json function,
and something like this works flawlessly:
> SELECT schema_of_json('[{"col":0}]');
ARRAY<STRUCT<`col`: BIGINT>>
But if I run it against a column of my table, it gives me the following error:
>SELECT schema_of_json(Transaction) as json_data from table_name;
Error in SQL statement: AnalysisException: cannot resolve 'schemaofjson(`Transaction`)' due to data type mismatch: The input json should be a string literal and not null; however, got `Transaction`.; line 1 pos 7;
Transaction is one of the columns in my table, and after checking it manually I can attest that it is of string type (containing JSON).
How can I get the SQL statement to give me the schema of that JSON?
After looking further into the documentation, it is clear that foldable means a static value: the input has to be a string literal, so a JSON column read from a table won't work.
For a minimal reproducible example, here you go:
SELECT schema_of_json(CAST('{ "a": "b" }' AS STRING))
As soon as the cast is introduced in the statement above, schema_of_json fails. It needs a static JSON literal as its input.
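A hedged workaround sketch (this appears to be Spark SQL, judging by the AnalysisException; the table and column names are taken from the question, while the sample JSON and its key are made up for illustration): run schema_of_json once on one representative value pasted in as a literal, then apply the schema it prints to the real column with from_json.
-- Step 1: obtain the schema from a sample value supplied as a literal
SELECT schema_of_json('{"example_key": "example_value"}');
-- prints something like STRUCT<example_key: STRING>

-- Step 2: apply that schema (written as a DDL string) to the column with from_json
SELECT from_json(Transaction, 'example_key STRING') AS parsed
FROM table_name;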

Fetching attribute from JSON string with JSON_VAL causes "<attribute> is invalid in the used context" error

A proprietary third-party application stores JSON strings in its database, like this one:
{"state":"complete","timestamp":1614776473000}
I need the timestamp, and found out that DB2 offers JSON functions. Since the value is stored as a string in the PROF_VALUE column, I guess that converting with SYSTOOLS.JSON2BSON is required before I can use JSON_VAL to fetch the timestamp:
SELECT SYSTOOLS.JSON_VAL(SYSTOOLS.JSON2BSON(PROF_VALUE), "timestamp", "f")
FROM EMPINST.PROFILE_EXTENSIONS ext
WHERE PROF_PROPERTY_ID = 'touchpointState'
This causes an error that timestamp is invalid in the used context (SQLCODE=-206, SQLSTATE=42703, DRIVER=4.26.14). The same error is thrown when I remove the JSON2BSON call, like this:
SELECT SYSTOOLS.JSON_VAL(PROF_VALUE, "timestamp", "f")
Also not working, with the same error (different data types):
SELECT SYSTOOLS.JSON_VAL(SYSTOOLS.JSON2BSON(PROF_VALUE), "state", "s:1000")
SELECT SYSTOOLS.JSON_VAL(PROF_VALUE, "state", "s:1000")
I don't understand this error. My syntax matches the documented JSON_VAL(json-value, search-string, result-type), and it is the same as in the examples, which show how to fetch the name field of an object.
I also played around a bit with JSON_TABLE to use raw input data for testing (instead of the database data), but it does not seem suitable for that.
SELECT *
FROM TABLE(SYSTOOLS.JSON_TABLE( SYSTOOLS.JSON2BSON('{"state":"complete","timestamp":1614776473000}'), 'state','s:32')) DATA
This gave me a table with one row: Type = 2 and Value = complete.
I had two problems in my query. First, it seems that double quotes " are for object (identifier) references. I wasn't aware that there is any difference, because in most databases I have used so far, single quotes ' and double quotes " are interchangeable.
The second problem is that JSON_VAL needs to be called without the SYSTOOLS qualifier, while SYSTOOLS.JSON2BSON(PROF_VALUE) still needs it.
With those changes, the following query worked:
SELECT JSON_VAL(SYSTOOLS.JSON2BSON(PROF_VALUE), 'timestamp', 'f')
FROM EMPINST.PROFILE_EXTENSIONS ext
WHERE PROF_PROPERTY_ID = 'touchpointState'

PostgreSQL - Query nested json in text column

My situation is the following:
-> Table A has a column named informations whose type is text.
-> The informations column stores a JSON string (but it is still a string), like this:
{
  "key": "value",
  "meta": {
    "inner_key": "inner_value"
  }
}
I'm trying to query this table by searching on the informations.meta.inner_key value with the following query:
SELECT * FROM A WHERE (informations::json#>>'{meta, inner_key}' = 'inner_value')
But I'm getting the following error:
ERROR: invalid input syntax for type json
DETAIL: The input string ended unexpectedly.
CONTEXT: JSON data, line 1:
SQL state: 22P02
I've built the query following this guide: DevHints - PostgreSQL
Does anyone know how to properly build the query?
EDIT 1:
I solved it with this workaround, but I think there are better solutions to the problem:
WITH temporary_table AS (
    SELECT A.informations::json #>> '{meta, inner_key}' AS inner_key
    FROM A
)
SELECT * FROM temporary_table WHERE inner_key = 'inner_value'
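A hedged alternative sketch, assuming the cast error comes from rows whose informations column is empty (which the "input string ended unexpectedly" detail suggests): wrap the cast in a CASE expression, since PostgreSQL evaluates only the branch it actually needs.
SELECT *
FROM A
WHERE CASE
        WHEN informations IS NOT NULL AND informations <> ''
        THEN informations::json #>> '{meta, inner_key}'
      END = 'inner_value';
Rows with an empty informations value yield NULL from the CASE and drop out of the result; rows holding truncated or otherwise malformed JSON would still raise the error.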

Playframework SQL query

I'm trying to make a query that helps me get a user by username. What I want to do is check whether the given username is unique, so what I thought about is uppercasing the username in my select request and comparing it with the uppercase of the given username. This is what I tried:
SQL("select * from utilisateur where upper(pseudo) = {pseudo}").on(
'pseudo -> pseudo.toUpperCase
).as(Utilisateur.utilisateur.singleOpt)
but I get this error:
[RuntimeException: SqlMappingError(too many rows when expecting a single one)]
I tried this too:
SQL("select * from utilisateur where ucase(pseudo) = {pseudo}").on(
'pseudo -> pseudo.toUpperCase
).as(Utilisateur.utilisateur.singleOpt)
and I got this error:
[PSQLException: ERROR: function ucase(character varying) does not exist Indice : No function matches the given name and argument types. You might need to add explicit type casts. Position : 33]
What should I do?
PS: I'm using PostgreSQL 9.1
First, in Play!, only use singleOpt if you're expecting at most one row back; otherwise, it will throw an exception. Since you want just one row, add a LIMIT 1 to the end of your query.
The PostgreSQL function you want is upper(). Here's an example:
SQL("select * from utilisateur where upper(pseudo) = {pseudo} limit 1").on(
'pseudo -> pseudo.toUpperCase
).as(Utilisateur.utilisateur.singleOpt)
(You can play with it here: http://sqlfiddle.com/#!1/6226c/5)
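A hedged aside: since the underlying goal is to guarantee that usernames are unique regardless of case, a unique index on upper(pseudo) can enforce that at the database level (the index name is made up for illustration):
create unique index utilisateur_pseudo_upper_key on utilisateur (upper(pseudo));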