I have a table entries with a column details whose type is jsonb, with a default value of '{}'::jsonb, on PostgreSQL 10.5.
When I run
SELECT details->foo FROM entries
I get an
ERROR: column "foo" does not exist.
From https://www.postgresql.org/docs/10/functions-json.html I understood that I should get a NULL value when the key is not present in the JSON. Did I misunderstand? If so, how can I extract the field with a default value?
You need to supply the JSON key as a string constant:
SELECT details -> 'foo'
FROM entries;
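For the second part of the question: the arrow operator does return NULL for a missing key once the key is quoted, so a default can be layered on with coalesce(). A minimal sketch, assuming a JSON number 0 is the desired fallback:

```sql
-- Falls back to the JSON number 0 when the key 'foo' is absent.
SELECT coalesce(details -> 'foo', '0'::jsonb) AS foo
FROM entries;
```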
I am trying to create an active record query that considers a jsonb column in a model.
My query currently:
ExampleModel.
where("jsonb_column_name -> 'Example key 1' ?| array[:example_values]", example_values: [1,2,3])
This query looks at the jsonb column in ExampleModel, takes the value under the key "Example key 1", and checks whether its corresponding value is contained within example_values.
What I need is to change 'Example key 1' to a list of potential keys, check each against example_values, and return all instances of ExampleModel that match.
Are there any SQL/ActiveRecord experts out there that know the syntax of how to do this?
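In raw SQL, one way to sketch this (assuming the candidate keys are known in advance; the table and key names here are hypothetical) is to OR the same containment test over each key. Note that ?| matches text keys/elements, so numeric values would need to be compared as strings:

```sql
-- Hypothetical sketch: repeat the ?| test once per candidate key.
SELECT *
FROM example_models
WHERE jsonb_column_name -> 'Example key 1' ?| array['1', '2', '3']
   OR jsonb_column_name -> 'Example key 2' ?| array['1', '2', '3'];
```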
I have a table with one column of type text which contains a JSON string. I would like to run a query that selects a bunch of rows (around 50) and, for each of these rows, updates a single value in the JSON saved in the text field. So let's say it currently looks like this:
{"amount":"45","level":1}
I want to update the amount value for every one of these rows, to for example "level" * 5.
I can't figure out a way to do this in one query, since it does not seem possible to alter a single value inside a text field like this. Or am I missing something? Otherwise I will just have to alter it manually for every single row I need to change, which would be a pain.
You need to first cast the value to a proper jsonb value, then you can manipulate it using JSON functions.
update the_table
set the_column = jsonb_set(the_column::jsonb, '{amount}', to_jsonb(((the_column::jsonb ->> 'level')::int * 5)::text))::text
where ....
The expression (the_column::jsonb ->> 'level')::int * 5 extracts the current value of level, converts it to an integer, and multiplies it by 5. The to_jsonb() around it is necessary because jsonb_set() requires a jsonb value as its last parameter; the inner ::text cast keeps amount a JSON string, matching your example data.
The '{amount}' parameter tells jsonb_set() to put the new value (see above) into the (top-level) key amount.
Finally, the whole jsonb value is converted back to a text value.
If you really store JSON in that column, you should think about converting that column to the jsonb data type to avoid all that casting back and forth.
Or maybe think about a properly normalized model where this would be as simple as set level = level * 5
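The column conversion suggested above can be done in place; a sketch, assuming every existing value in the column is valid JSON:

```sql
-- One-time migration from text to jsonb; fails if any row holds invalid JSON.
ALTER TABLE the_table
    ALTER COLUMN the_column TYPE jsonb
    USING the_column::jsonb;
```

After that, the jsonb_set() call no longer needs the casts back and forth.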
I tried to create a generated column from another JSON-type column with
ALTER TABLE "Invoice" ADD COLUMN created Integer GENERATED ALWAYS AS (data ->> 'created') STORED;
When I execute this I get the error
ERROR: column "created" is of type integer but default expression is of type text HINT: You will need to rewrite or cast the expression. SQL state: 42804
I tried to cast it with the CAST function and the :: operator, but with no luck. Is there any way to do it? Or should I generate this column differently?
Thanks
How about converting the value to an int?
ALTER TABLE "Invoice" ADD COLUMN created Integer
GENERATED ALWAYS AS ( (data ->> 'created')::int ) STORED;
I am fetching one column from an Oracle 12c table; the column is of CLOB type, its name is dynamic_fields, and apparently it holds JSON.
The data in the column looks like this:
{
"App": 20187.7",
"CTList":
"[
{\"lineOfBusiness\":\"0005",
\"coverageId\":659376737,
\"premiumPercentage\":0,
\"lobInCt\":\"4CI5\"},
{\"lineOfBusiness\":\"0005\",
\"coverageId\":659376738,
\"premiumPercentage\":0,
\"lobInCt\":\"4CE5\"},
{\"lineOfBusiness\":\"0005\",
\"coverageId\":659376739,
\"premiumPercentage\":1,
\"lobInCt\":\"4CD5\"}]"
}
I want to use the json_value function to fetch the lineOfBusiness field of the first element.
json_value(dynamic_fields,'$.CTList[0].lineOfBusiness')
It returns null.
Did I do anything wrong? I do not want to use json_table to fetch the array value, since the query needs to be embedded into another query.
You need to fix the format of your dynamic_fields column. First, create your table with a check constraint to make sure the column holds valid JSON (adding such a check constraint is possible as of version 12c):
create table tab
(
dynamic_fields clob constraint chk_dyn_fld
check (dynamic_fields is json)
);
If you try to insert your current value into the dynamic_fields column, Oracle complains with an ORA-02290 error (check constraint (<yourCurSchema>.CHK_DYN_FLD) violated). Fix the format by adding the missing double-quote before "App"'s value (so it reads "20187.7"), removing the double-quotes before the opening and after the closing square bracket, and finally replacing the backslashes with empty strings ('') via the replace() function during insertion:
insert into tab
values(replace('{
"App": "20187.7",
"CTList":
[
{\"lineOfBusiness\":\"0005",
\"coverageId\":659376737,
\"premiumPercentage\":0,
\"lobInCt\":\"4CI5\"},
{\"lineOfBusiness\":\"0005\",
\"coverageId\":659376738,
\"premiumPercentage\":0,
\"lobInCt\":\"4CE5\"},
{\"lineOfBusiness\":\"0005\",
\"coverageId\":659376739,
\"premiumPercentage\":1,
\"lobInCt\":\"4CD5\"}]
}','\',''));
which doesn't raise any exception. And this time, you're able to get the desired value (0005) with your original query:
select json_value(dynamic_fields,'$.CTList[0].lineOfBusiness')
from tab;
I am using PostgreSQL to create a table based on JSON input given to my Java code, and I need validations on the JSON keys passed to the database, just like in Oracle. The problem is that the whole jsonb payload lives in a single column, let's say data. Consider that I get JSON in the format below:
{
"CountActual": 1234,
"CountActualCharacters": "thisreallyworks!",
"Date": "09-11-2001"
}
The intended datatypes for the above JSON: number(10), varchar(50), date.
Now to put validations on I'm using constraints
Query 1 -
ALTER TABLE public."Detail"
ADD CONSTRAINT "CountActual"
CHECK ((data ->> 'CountActual')::bigint >=0 AND length(data ->> 'CountActual') <= 10);
--Working fine.
But for Query 2-
ALTER TABLE public."Detail"
ADD CONSTRAINT "CountActualCharacters"
CHECK ((data ->> 'CountActualCharacters')::varchar >=0 AND length(data ->> 'CountActualCharacters') <= 50);
I'm getting the error below:
[ERROR: operator does not exist: character varying >= integer
HINT: No operator matches the given name and argument type(s).
You might need to add explicit type casts.]
I also tried another way:
ALTER TABLE public."Detail"
ADD CONSTRAINT CountActualCharacters CHECK (length(data ->> 'CountActualCharacters'::VARCHAR)<=50)
The above constraint is created successfully, but I don't think this is the right way, as my validation does not kick in when inserting data:
Insert into public."Detail" values ('{
"CountActual": 1234,
"CountActualCharacters": 789,
"Date": "11-11-2009"
}');
And it reports the insert as successful even when passing 789 for CountActualCharacters instead of a varchar like "the78isgood!".
So can anyone please suggest the proper PostgreSQL constraint for varchar, like the one for number that I wrote in Query 1?
And, if possible, for the date type with DD-MM-YYYY format as well.
I just started with PostgreSQL; forgive me if I sound silly, but I'm really stuck here.
You can use jsonb_typeof(data -> 'CountActualCharacters') = 'string'.
Note the single arrow: ->> would convert any value to text, defeating the type check.
You can read more about JSON functions in PostgreSQL here:
https://www.postgresql.org/docs/current/static/functions-json.html
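Put together with the length check from the question, a constraint along these lines should reject non-string values (a sketch reusing the table and key names from the question):

```sql
ALTER TABLE public."Detail"
ADD CONSTRAINT "CountActualCharacters"
CHECK (
    jsonb_typeof(data -> 'CountActualCharacters') = 'string'
    AND length(data ->> 'CountActualCharacters') <= 50
);
```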