I am using PostgreSQL to create a table based on JSON input given to my Java code, and I need validations on the JSON keys passed to the database, just like in Oracle. The problem is that the whole JSONB payload lives in a single column, let's say data. Consider that I receive JSON in the format below:
{
"CountActual": 1234,
"CountActualCharacters": "thisreallyworks!",
"Date": "09-11-2001"
}
The correct datatypes for the keys above would be number(10), varchar(50), and date.
To put these validations in place, I'm using check constraints.
Query 1 -
ALTER TABLE public."Detail"
ADD CONSTRAINT "CountActual"
CHECK ((data ->> 'CountActual')::bigint >=0 AND length(data ->> 'CountActual') <= 10);
--Working fine.
But for Query 2-
ALTER TABLE public."Detail"
ADD CONSTRAINT "CountActualCharacters"
CHECK ((data ->> 'CountActualCharacters')::varchar >=0 AND length(data ->> 'CountActualCharacters') <= 50);
I'm getting the error below:
[ERROR: operator does not exist: character varying >= integer
HINT: No operator matches the given name and argument type(s).
You might need to add explicit type casts.]
I also tried another way:
ALTER TABLE public."Detail"
ADD CONSTRAINT CountActualCharacters CHECK (length(data ->> 'CountActualCharacters'::VARCHAR)<=50)
The above constraint is created successfully, but I don't think this is the right way, as the validation does not kick in when inserting data:
Insert into public."Detail" values ('{
"CountActual": 1234,
"CountActualCharacters": 789,
"Date": "11-11-2009"
}');
The insert succeeds even when I pass 789 for CountActualCharacters instead of a varchar like "the78isgood!".
So can anyone please suggest the proper PostgreSQL constraint for varchar, like the one for numbers I wrote in Query 1?
If possible, one for the date type with the DD-MM-YYYY format as well.
I just started with PostgreSQL; forgive me if this sounds silly, but I'm really stuck here.
You can use jsonb_typeof(data -> 'CountActualCharacters') = 'string'.
Note the single arrow: ->> would convert any value to a string, so its result is always of type text.
You can read more about JSON functions in PostgreSQL here:
https://www.postgresql.org/docs/current/static/functions-json.html
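For example, constraints along those lines might look like the following (a sketch against the Detail table from the question; the constraint names are my own, and the regex only checks the DD-MM-YYYY shape, not that the value is a valid calendar date):

ALTER TABLE public."Detail"
ADD CONSTRAINT "CountActualCharacters_is_string"
CHECK (jsonb_typeof(data -> 'CountActualCharacters') = 'string'
       AND length(data ->> 'CountActualCharacters') <= 50);

ALTER TABLE public."Detail"
ADD CONSTRAINT "Date_shape"
CHECK ((data ->> 'Date') ~ '^\d{2}-\d{2}-\d{4}$');

With these in place, inserting 789 for CountActualCharacters is rejected, because jsonb_typeof reports 'number' rather than 'string'.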
I am trying to make a table called citizens in PostgreSQL using pgAdmin. Inside this table there is a column called age, and I want this age to be calculated and filled in whenever I insert a new row.
My code is as shown below:
create table citizens(
first_name VARCHAR(40),
last_name VARCHAR(45),
birth_date DATETIME DEFAULT NOW(),
ssid BIGINT PRIMARY KEY,
age INTERVAL AGE(TIMESTAMP birth_date),
);
I get this error message when I run this query:
ERROR: syntax error at or near "AGE"
LINE 6: age INTERVAL AGE(TIMESTAMP birth_date),
^
SQL state: 42601
Character: 156
What you are looking for is defining
age interval generated always as age(birth_date) virtual
Unfortunately, Postgres does not support generated ... virtual columns. Your best option then is to drop the column from the table and create a view which derives it. Something like:
create view citizens_vw as
select *, age(birth_date) as age
from citizens;
Or, even better, as the comment by @ChrisMaurer has it:
create view citizens_vw as
select *, extract (year from age(birth_date))::integer as age
from citizens;
A couple notes:
Postgres does not have a datatype DATETIME. You can use TIMESTAMP or just DATE.
NOW() seems like a poor choice of default for birth_date. This is a case where defining the column as NOT NULL, with no default, is the better option. Do not assume new citizens are just seconds or days old. Sometimes it is better to handle the exception rather than assume incorrect data (which will likely never be updated). Well, at least IMHO.
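Putting those notes together, the table plus view could end up like this (a sketch; column sizes kept from the question):

create table citizens(
    first_name VARCHAR(40),
    last_name VARCHAR(45),
    birth_date DATE NOT NULL, -- no default: callers must supply the actual birth date
    ssid BIGINT PRIMARY KEY   -- the age column is gone; it lives in the view instead
);

create view citizens_vw as
select *, extract (year from age(birth_date))::integer as age
from citizens;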
I tried to create a generated column from another JSON column with:
ALTER TABLE "Invoice" ADD COLUMN created Integer GENERATED ALWAYS AS (data ->> 'created') STORED;
When I execute this, I get the error:
ERROR: column "created" is of type integer but default expression is of type text
HINT: You will need to rewrite or cast the expression.
SQL state: 42804
I tried to cast it with the CAST function and the :: operator, but with no luck. Is there any way to do it? Or should I generate this column differently?
Thanks
How about converting the value to an int?
ALTER TABLE "Invoice" ADD COLUMN created Integer
GENERATED ALWAYS AS ( (data ->> 'created')::int ) STORED;
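A quick way to verify, assuming "Invoice" has a jsonb column named data and no other required columns (hypothetical data):

INSERT INTO "Invoice" (data) VALUES ('{"created": 1700000000}');
SELECT created FROM "Invoice"; -- returns 1700000000, now typed as integer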
I am trying to fetch one column from an Oracle 12c table. The column, dynamic_fields, is of CLOB type, and apparently it holds JSON.
The data looks like this in the column:
{
"App": 20187.7",
"CTList":
"[
{\"lineOfBusiness\":\"0005",
\"coverageId\":659376737,
\"premiumPercentage\":0,
\"lobInCt\":\"4CI5\"},
{\"lineOfBusiness\":\"0005\",
\"coverageId\":659376738,
\"premiumPercentage\":0,
\"lobInCt\":\"4CE5\"},
{\"lineOfBusiness\":\"0005\",
\"coverageId\":659376739,
\"premiumPercentage\":1,
\"lobInCt\":\"4CD5\"}]"
}
I want to use the json_value function to fetch the lineOfBusiness field of the first element:
json_value(dynamic_fields,'$.CTList[0].lineOfBusiness')
It returns null.
Is there anything wrong with what I did? I do not want to use json_table to fetch the array value, since the call needs to be embedded into another query.
You need to fix your dynamic_fields column's format. First, create your table with a check constraint to make sure the column conforms to JSON format (adding such a check constraint is supported as of version 12c):
create table tab
(
dynamic_fields clob constraint chk_dyn_fld
check (dynamic_fields is json)
);
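For instance, inserting a value that is clearly not JSON trips the constraint immediately:

insert into tab values('not json at all');
-- ORA-02290: check constraint (<yourCurSchema>.CHK_DYN_FLD) violated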
If you try to insert your current value for the dynamic_fields column, Oracle complains by raising the same ORA-02290 error (check constraint (<yourCurSchema>.CHK_DYN_FLD) violated). Fix the format by adding a double quote just before App's value ("20187.7"), removing the double quotes before the opening and after the closing square bracket, and finally replacing the backslashes with empty strings ('') via the replace() function during insertion:
insert into tab
values(replace('{
"App": "20187.7",
"CTList":
[
{\"lineOfBusiness\":\"0005",
\"coverageId\":659376737,
\"premiumPercentage\":0,
\"lobInCt\":\"4CI5\"},
{\"lineOfBusiness\":\"0005\",
\"coverageId\":659376738,
\"premiumPercentage\":0,
\"lobInCt\":\"4CE5\"},
{\"lineOfBusiness\":\"0005\",
\"coverageId\":659376739,
\"premiumPercentage\":1,
\"lobInCt\":\"4CD5\"}]
}','\',''));
which doesn't raise any exception. And this time you're able to get the desired value (0005) with your original query:
select json_value(dynamic_fields,'$.CTList[0].lineOfBusiness')
from tab;
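As an aside: when json_value unexpectedly returns null, Oracle's ON ERROR clause makes it raise the underlying parse error instead of silently returning null, which helps diagnose formatting problems like this one:

select json_value(dynamic_fields,'$.CTList[0].lineOfBusiness' ERROR ON ERROR)
from tab;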
Here I want to check "CheckListNo" (integer) against the values returned by a select on "Check_No" (character varying). When I execute this, I get the error 'operator does not exist: character varying = integer'. I want to do the check without changing the column datatypes.
For example, take SELECT "Check_No" FROM "Project_CheckList_Options": if "Check_No" were defined as an integer and I wanted the records as character varying, I could just write "Check_No"::character varying and see the values as character varying without affecting the integer datatype. In the same way, is it possible to cast the datatype when altering the table with a check constraint in PostgreSQL?
alter table "Project_Configuration"
add check("CheckListNo" in (SELECT "Check_No" FROM "Project_CheckList_Options"))
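For what it's worth, a cast can be written inside a constraint expression just as in a SELECT; here is a minimal sketch with made-up values. Note, though, that PostgreSQL does not allow subqueries inside CHECK constraints at all, so validating against another table requires a foreign key or a trigger instead:

alter table "Project_Configuration"
add check("CheckListNo"::character varying in ('1', '2', '3'))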
I am attempting to load a tab-delimited text file which contains a column of values that happen to look exactly like dates, but aren't. It appears that the CSVREAD command scans the row, converts the text value in the column to a java.sql.Date, and then, seeing that the target column is a VARCHAR, executes toString() to obtain the value, which is exactly NOT what I need. I actually need the raw, unconverted text with no date processing whatsoever.
So, is there some way to turn off "helpful date-like column conversion" in the CSVREAD command?
Here's the simplest case I can make to demonstrate the undesired behavior:
CREATE TABLE x
(
name VARCHAR NOT NULL,
value VARCHAR
) AS
SELECT * FROM CSVREAD('C:\myfile.tab', null, 'UTF-8', chr(9));
The file contains three rows, a header and two data records:
name\tvalue\n
x\t110313\n
y\t102911\n
Any assistance on how I can bypass the overhelpful part of CSVREAD would be greatly appreciated. Thank you.
(It seems you found this out yourself, but anyway):
For CSVREAD, all columns are strings. Neither the CSVREAD function nor the database tries to convert values to a date, or to detect the data type in any other way. The database only does what you ask it to, which in your case is to read the data as strings.
If you do want to convert a column to a date, you need to do that explicitly, for example:
CREATE TABLE x(name VARCHAR NOT NULL, value TIMESTAMP) AS
SELECT *
FROM CSVREAD('C:\myfile.tab', null, 'UTF-8', chr(9));
If non-default parsing is needed, you could use:
CREATE TABLE x(name VARCHAR NOT NULL, value TIMESTAMP) AS
SELECT "name", parsedatetime("value", "M/d/y") as v
FROM CSVREAD('C:\myfile.tab', null, 'UTF-8', chr(9));
For people who don't have headers in their CSV files, the example could look like this:
CREATE TABLE x(name VARCHAR NOT NULL, value TIMESTAMP) AS
SELECT "0", parsedatetime("1", 'd-M-yyyy') as v
FROM CSVREAD('C:\myfile.tab', '0|1', 'UTF-8', '|');
Beware of the single quotes around the date format. When I tried the example from Thomas, it gave me this error in H2:
Column "d-M-yyyy" not found; SQL statement:
My CSV file:
firstdate|13-11-2013\n
seconddate|14-11-2013
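The underlying rule is standard SQL quoting: single quotes delimit string literals, double quotes delimit identifiers, which is why H2 read "d-M-yyyy" as a column name. A minimal illustration:

SELECT PARSEDATETIME('13-11-2013', 'd-M-yyyy'); -- format in single quotes: a string literal, works
-- SELECT PARSEDATETIME('13-11-2013', "d-M-yyyy") would fail: "d-M-yyyy" is parsed as a column name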