Error during query property from jsonb column with double quotation marks - sql

I have the PostgreSQL table named 'orders' with the following columns:
id | params (jsonb type) | valid
1 | {"value": "120", is_active: true} | true
2 | {"value": "92", is_active: false} | true
I'm trying to perform a SELECT query filtered by params.is_active = true.
The method that builds the filter receives a property like
filter.property = `(params ->>'is_active')::boolean`
The resulting query looks like:
select *
from "orders"
where "valid" = true
and "(params->>'is_active')::boolean" = true
limit 50
It gives me an error
ERROR: column "(params->>'is_active')::boolean" does not exist.
When I removed double quotes around jsonb column as:
select *
from "orders"
where "valid" = true
and (params->>'is_active')::boolean = true
limit 50
it worked fine.
My question: is it possible to provide the parameter to the filter method in a different format, instead of
(params ->>'is_active')::boolean
to avoid this error? I know I could strip the double quotes from the resulting query with a regex, or change the filter method's mechanism to bypass adding quotes, but perhaps there is an option to simply provide a different value.

Your jsonb seems wrong:
select '{"value": "92", is_active: false}'::jsonb;
will fail, because the key is_active is not quoted.
My question: is it possible to provide another (in different format)
parameter to filter method instead of
You can use @> and jsonb_path_exists. Running the following command in DBeaver should work; I am not sure about Node.js.
WITH cte (params) AS (
    SELECT '{"value": 120, "is_active": true}'::jsonb
    UNION ALL
    SELECT '{"value": 92, "is_active": false}'::jsonb
)
SELECT
    params,
    params @> '{"is_active": true}'::jsonb,
    jsonb_path_exists(params, '$.is_active ? (@ == true)'),
    jsonb_path_exists(params, '$.value ? (@ >= $min)', '{"min": 120}')
FROM cte;
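Applied to the orders table from the question (assuming the stored params hold valid JSON with quoted keys), the containment operator sidesteps the cast entirely, so there is no expression for the filter builder to wrap in double quotes. A sketch:

```sql
-- Sketch: filter by a jsonb flag using containment (@>),
-- assuming params holds valid JSON such as {"value": "120", "is_active": true}.
select *
from "orders"
where "valid" = true
  and params @> '{"is_active": true}'
limit 50;
```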

Related

Filter Nested JSON in PostgreSQL DB

{
  "List1": [
    {
      "f1": "b6ff",
      "f2": "day",
      "f3": "HO",
      "List2": [{"f1": 1.5, "f2": "RATE"}]
    }
  ]
}
This is nested JSON in which there's a list 'List2' inside another list 'List1'.
How do I filter on f1 = 1.5 inside List2? I have tried the @> (contains) operator, but I couldn't get it to work with nested JSON.
Assuming you are using an up-to-date Postgres version and you want to get the rows that fulfill the condition, you can use a JSON path expression:
select *
from the_table
where the_column @@ '$.List1[*].List2[*].f1 == 1.5'
Alternatively you can use the @> operator, but the parameter must match the array structure in the column:
where the_column @> '{"List1":[{"List2":[{"f1": 1.5}]}]}';
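A minimal, self-contained check of both variants against the sample document (assuming Postgres 12+ for the jsonpath operator); both columns should come out true:

```sql
-- Sketch: jsonpath predicate vs. containment on the nested sample document.
with t (col) as (
    select '{"List1": [{"f1": "b6ff", "f2": "day", "f3": "HO",
                        "List2": [{"f1": 1.5, "f2": "RATE"}]}]}'::jsonb
)
select
    col @@ '$.List1[*].List2[*].f1 == 1.5'        as jsonpath_match,
    col @> '{"List1":[{"List2":[{"f1": 1.5}]}]}'  as containment_match
from t;
```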
You can try this in the WHERE clause to fetch the whole record, and then use Python to get that element:
SELECT .... WHERE
dbfieldname #>> '{List1,0,List2,0,f1}' = '1.5'

SQL get the value of a nested key in a jsonb field

Let's suppose I have a table my_table with a field named data, of type jsonb, which thus contains a json data structure.
let's suppose that if I run
select id, data from my_table where id=10;
I get
id | data
------------------------------------------------------------------------------------------
10 | {
|"key_1": "value_1" ,
|"key_2": ["value_list_element_1", "value_list_element_2", "value_list_element_3" ],
|"key_3": {
| "key_3_1": "value_3_1",
| "key_3_2": {"key_3_2_1": "value_3_2_1", "key_3_2_2": "value_3_2_2"},
| "key_3_3": "value_3_3"
| }
| }
so in pretty formatting, the content of column data is
{
"key_1": "value_1",
"key_2": [
"value_list_element_1",
"value_list_element_2",
"value_list_element_3"
],
"key_3": {
"key_3_1": "value_3_1",
"key_3_2": {
"key_3_2_1": "value_3_2_1",
"key_3_2_2": "value_3_2_2"
},
"key_3_3": "value_3_3"
}
}
I know that if I want to get the value of a ("level 1") key of the json directly in a column, I can do it with the ->> operator.
For example, if I want to get the value of key_2, what I do is
select id, data->>'key_2' alias_for_key_2 from my_table where id=10;
which returns
id | alias_for_key_2
------------------------------------------------------------------------------------------
10 |["value_list_element_1", "value_list_element_2", "value_list_element_3" ]
Now let's suppose I want to get the value of key_3_2_1, that is value_3_2_1.
How can I do it?
I have tried
select id, data->>'key_3'->>'key_3_2'->>'key_3_2_1' alias_for_key_3_2_1 from my_table where id=10;
but I get
ERROR: operator does not exist: text ->> unknown
HINT: No operators found with name and argument types provided. Types may need to be converted explicitly.
what am I doing wrong?
The problem in the query
select id, data->>'key_3'->>'key_3_2'->>'key_3_2_1' alias_for_key_3_2_1 --this is wrong!
from my_table
where id=10;
was that the ->> operator turns json into text, so the next ->> was trying to look up the key key_3_2 inside a string, which makes no sense.
Thus one has to use the -> operator, which keeps the value as json, until one reaches the "final" key.
so the query I was looking for was
select id, data->'key_3'->'key_3_2'->>'key_3_2_1' alias_for_key_3_2_1 --final ->> : this gets the value of 'key_3_2_1' as string
from my_table
where id=10;
or either
select id, data->'key_3'->'key_3_2'->'key_3_2_1' alias_for_key_3_2_1 --final -> : this gets the value of 'key_3_2_1' as json / jsonb
from my_table
where id=10;
More info on JSON functions and operators can be found in the PostgreSQL documentation.
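The same nested value can also be reached in one step with the #>> path operator, which takes a text-array path and returns text:

```sql
-- Equivalent one-step extraction: returns 'value_3_2_1' as text.
select id, data #>> '{key_3,key_3_2,key_3_2_1}' as alias_for_key_3_2_1
from my_table
where id = 10;
```

Use #> instead of #>> if you want the result as jsonb rather than text.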

Using Postgres JSON Functions on table columns

I have searched extensively (in Postgres docs and on Google and SO) to find examples of JSON functions being used on actual JSON columns in a table.
Here's my problem: I am trying to extract key values from an array of JSON objects in a column, using jsonb_to_recordset(), but get syntax errors. When I pass the object literally to the function, it works fine:
Passing JSON literally:
select *
from jsonb_to_recordset('[
{ "id": 0, "name": "400MB-PDF.pdf", "extension": ".pdf",
"transferId": "ap31fcoqcajjuqml6rng"},
{ "id": 0, "name": "1000MB-PDF.pdf", "extension": ".pdf",
"transferId": "ap31fcoqcajjuqml6rng"}
]') as f(name text);
results in:
400MB-PDF.pdf
1000MB-PDF.pdf
It extracts the value of the key "name".
Here's the JSON in the column, being extracted using:
select journal.data::jsonb#>>'{context,data,files}'
from journal
where id = 'ap32bbofopvo7pjgo07g';
resulting in:
[ { "id": 0, "name": "400MB-PDF.pdf", "extension": ".pdf",
"transferId": "ap31fcoqcajjuqml6rng"},
{ "id": 0, "name": "1000MB-PDF.pdf", "extension": ".pdf",
"transferId": "ap31fcoqcajjuqml6rng"}
]
But when I try to pass jsonb#>>'{context,data,files}' to jsonb_to_recordset() like this:
select id,
journal.data::jsonb#>>::jsonb_to_recordset('{context,data,files}') as f(name text)
from journal
where id = 'ap32bbofopvo7pjgo07g';
I get a syntax error. I have tried different ways but each time it complains about a syntax error:
Version:
PostgreSQL 9.4.10 on x86_64-unknown-linux-gnu, compiled by gcc (Ubuntu 4.8.2-19ubuntu1) 4.8.2, 64-bit
The expressions after select must evaluate to a single value. Since jsonb_to_recordset returns a set of rows and columns, you can't use it there.
The solution is a cross join lateral, which allows you to expand one row into multiple rows using a function. That gives you single rows that select can act on. For example:
select *
from journal j
cross join lateral
jsonb_to_recordset(j.data#>'{context, data, files}') as d(id int, name text)
where j.id = 'ap32bbofopvo7pjgo07g'
Note that the #>> operator returns type text, and the #> operator returns type jsonb. As jsonb_to_recordset expects jsonb as its first parameter I'm using #>.
See it working at rextester.com
jsonb_to_recordset is a set-valued function and can only be invoked in specific places. The FROM clause is one such place, which is why your first example works, but the SELECT clause is not.
In order to turn your JSON array into a "table" that you can query, you need to use a lateral join. The effect is rather like a foreach loop on the source recordset, and that's where you apply the jsonb_to_recordset function. Here's a sample dataset:
create table jstuff (id int, val jsonb);
insert into jstuff
values
(1, '[{"outer": {"inner": "a"}}, {"outer": {"inner": "b"}}]'),
(2, '[{"outer": {"inner": "c"}}]');
A simple lateral join query:
select id, r.*
from jstuff
join lateral jsonb_to_recordset(val) as r("outer" jsonb) on true;
id | outer
----+----------------
1 | {"inner": "a"}
1 | {"inner": "b"}
2 | {"inner": "c"}
(3 rows)
That's the hard part. Note that you have to define what your new recordset looks like in the AS clause -- since each element in our val array is a JSON object with a single field named "outer", that's what we give it. If your array elements contain multiple fields you're interested in, you declare those in a similar manner. Be aware also that your JSON schema needs to be consistent: if an array element doesn't contain a key named "outer", the resulting value will be null.
From here, you just need to pull the specific value you need out of each JSON object using the traversal operator as you were. If I wanted only the "inner" value from the sample dataset, I would specify select id, r.outer->>'inner'. Since it's already JSONB, it doesn't require casting.
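Putting that together for the sample dataset, a sketch of the full query (note that "outer" is a reserved word, so it must stay quoted):

```sql
-- Expand the jsonb array into rows, then pull "inner" out of each element.
select id, r."outer" ->> 'inner' as inner_val
from jstuff
join lateral jsonb_to_recordset(val) as r("outer" jsonb) on true;
```

This should return three rows with inner_val values a, b, and c.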

Choosing row based on json value sql

I have a column which saves json values as text. I want to select columns in query based on a certain value contained in the text field.
To be clear, I have a column server_response which saves data as follows:
{
"Success": true,
"PasswordNotExpired": true,
"Exists": true,
"Status": "A",
"Err": null,
"Statuscode": 200,
"Message": "Login Denied"
}
How can I choose rows based on whether the message was, or contained, Login Denied in the WHERE clause?
This should do the trick, at least this is what I understood you want:
SELECT * FROM table WHERE server_response LIKE '%Login Denied%'
I think you need a query like this (since the column stores text, it must be cast to jsonb first, and the key name must be quoted):
SELECT *
FROM yourTable
WHERE server_response::jsonb ->> 'Message' LIKE '%Login Denied%'
The -> operator returns a JSON object.
The ->> operator returns TEXT.
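A quick illustration of the difference, against an inline value:

```sql
-- ->> yields text (suitable for LIKE); -> yields jsonb.
select
    '{"Message": "Login Denied"}'::jsonb ->> 'Message' as as_text,
    '{"Message": "Login Denied"}'::jsonb ->  'Message' as as_jsonb;
-- as_text: Login Denied    as_jsonb: "Login Denied"
```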

Rails 4 querying against postgresql column with array data type error

I am trying to query a table with a column with the postgresql array data type in Rails 4.
Here is the table schema:
create_table "db_of_exercises", force: true do |t|
t.text "preparation"
t.text "execution"
t.string "category"
t.datetime "created_at"
t.datetime "updated_at"
t.string "name"
t.string "body_part", default: [], array: true
t.hstore "muscle_groups"
t.string "equipment_type", default: [], array: true
end
The following query works:
SELECT * FROM db_of_exercises WHERE ('Arms') = ANY (body_part);
However, this query does not:
SELECT * FROM db_of_exercises WHERE ('Arms', 'Chest') = ANY (body_part);
It throws this error:
ERROR: operator does not exist: record = character varying
This does not work for me either:
SELECT * FROM "db_of_exercises" WHERE "body_part" IN ('Arms', 'Chest');
That throws this error:
ERROR: array value must start with "{" or dimension information
So, how do I query a column with an array data type in ActiveRecord?
What I have right now is:
#exercises = DbOfExercise.where(body_part: params[:body_parts])
I want to be able to query records that have more than one body_part associated with them, which was the whole point of using an array data type, so if someone could enlighten me on how to do this that would be awesome. I don't see it anywhere in the docs.
Final solution for posterity:
Using the overlap operator (&&):
SELECT * FROM db_of_exercises WHERE ARRAY['Arms', 'Chest'] && body_part;
I was getting this error:
ERROR: operator does not exist: text[] && character varying[]
so I typecasted ARRAY['Arms', 'Chest'] to varchar:
SELECT * FROM db_of_exercises WHERE ARRAY['Arms', 'Chest']::varchar[] && body_part;
and that worked.
I don't think this is related to Rails.
What if you do the following?
SELECT * FROM db_of_exercises WHERE 'Arms' = ANY (body_part) OR 'Chest' = ANY (body_part)
I know that Rails 4 supports the PostgreSQL ARRAY datatype, but I'm not sure whether ActiveRecord adds query methods for it. Maybe you can use array overlap (the && operator) and do something like:
WHERE ARRAY['Arms', 'Chest'] && body_part
or take a look at this gem: https://github.com/dockyard/postgres_ext/blob/master/docs/querying.md
And then do a query like:
DBOfExercise.where.overlap(:body_part => params[:body_parts])
#Aguardientico is absolutely right that what you want is the array overlaps operator &&. I'm following up with some more explanation, but would prefer you to accept that answer, not this one.
Anonymous rows (records)
The construct ('item1', 'item2', ...) is a row-constructor unless it appears in an IN (...) list. It creates an anonymous row, which PostgreSQL calls a "record". The error:
ERROR: operator does not exist: record = character varying
is because ('Arms', 'Chest') is being interpreted as if it were ROW('Arms', 'Chest'), which produces a single record value:
craig=> SELECT ('Arms', 'Chest'), ROW('Arms', 'Chest'), pg_typeof(('Arms', 'Chest'));
row | row | pg_typeof
--------------+--------------+-----------
(Arms,Chest) | (Arms,Chest) | record
(1 row)
and PostgreSQL has no idea how it's supposed to compare that to a string.
I don't really like this behaviour; I'd prefer it if PostgreSQL required you to explicitly use a ROW() constructor when you want an anonymous row. I expect that the behaviour shown here exists to support SET (col1,col2,col3) = (val1,val2,val3) and other similar operations where a ROW(...) constructor wouldn't make as much sense.
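For reference, that multi-column assignment form looks like this (a sketch, assuming a table t with those columns):

```sql
-- Multi-column UPDATE using a row constructor on the left-hand side.
update t
set (col1, col2, col3) = ('a', 'b', 'c')
where id = 1;
```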
But the same thing with a single item works?
The reason the single ('Arms') case works is because unless there's a comma it's just a single parenthesised value where the parentheses are redundant and may be ignored:
craig=> SELECT ('Arms'), ROW('Arms'), pg_typeof(('Arms')), pg_typeof(ROW('Arms'));
?column? | row | pg_typeof | pg_typeof
----------+--------+-----------+-----------
Arms | (Arms) | unknown | record
(1 row)
Don't be alarmed by the type unknown. It just means that it's a string literal that hasn't yet had a type applied:
craig=> SELECT pg_typeof('blah');
pg_typeof
-----------
unknown
(1 row)
Compare array to scalar
This:
SELECT * FROM "db_of_exercises" WHERE "body_part" IN ('Arms', 'Chest');
fails with:
ERROR: array value must start with "{" or dimension information
because of implicit casting. The type of the body_part column is text[] (or varchar[]; same thing in PostgreSQL). You're comparing it for equality with the values in the IN clause, which are unknown-typed literals. The only valid equality operator for an array is = another array of the same type, so PostgreSQL figures that the values in the IN clause must also be arrays of text[] and tries to parse them as arrays.
Since they aren't written as array literals, like {"FirstValue","SecondValue"}, this parsing fails. Observe:
craig=> SELECT 'Arms'::text[];
ERROR: array value must start with "{" or dimension information
LINE 1: SELECT 'Arms'::text[];
^
See?
It's easier to understand this once you see that IN is actually just a shorthand for = ANY. It's an equality comparison to each element in the IN list. That isn't what you want if you really want to find out if two arrays overlap.
So that's why you want to use the array overlaps operator &&.
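To see the two semantics side by side (a sketch; both columns should come out true):

```sql
-- = ANY checks whether a scalar equals some element of an array;
-- && checks whether two arrays share at least one element.
select
    'Arms' = any (array['Arms', 'Chest']::varchar[])                   as scalar_match,
    array['Arms', 'Legs']::varchar[] && array['Arms', 'Chest']::varchar[] as arrays_overlap;
```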