Parameterization of an array of enums? - node-postgres

I have a table where one of the fields is an array of enums. For example, let's say it looks like this:
CREATE TYPE foobar AS ENUM (
  'FOO',
  'BAR'
);
CREATE TABLE my_table (
  id SERIAL PRIMARY KEY,
  foobarray foobar[] DEFAULT ARRAY['FOO']::foobar[]
);
When I try to use node-postgres to insert/update a row, it is not clear how to parameterize the array and get it cast to an array of enums.
When I try:
const foobarray = ["BAR"];
await pool.query("UPDATE my_table SET foobarray=$2::foobar[] WHERE id=$1", [id, foobarray]);
I get:
error: invalid input value for enum foobarray: "{"
Any ideas how to get this to work?

I figured out my issue...
I was actually pulling the value, before updating it, as follows:
SELECT foobarray FROM my_table WHERE id=$1;
This resulted in the following array being in the result:
["{", "}", "FOO"]
I didn't realize this, and I was modifying the array from the result before updating, which meant "{" and "}" (obviously not valid enum values) were being passed through.
I was able to solve this issue by keeping my original UPDATE query the same but modifying the SELECT query to:
SELECT foobarray::text[] FROM my_table WHERE id=$1;
This results in the following array being in the result:
["FOO"]
Tweaking it and updating now causes no problems.
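For context, the likely cause (an assumption on my part, not something the node-postgres docs confirm here) is that node-postgres picks result parsers by type OID, and a user-defined enum array has an OID it doesn't recognize, so the array literal comes back badly parsed. Casting on the way out makes the server return a plain text[], which node-postgres does know how to parse:
-- Round-trip sketch, using the table from the question:
SELECT foobarray::text[] FROM my_table WHERE id = $1;        -- rows[0].foobarray is ['FOO']
UPDATE my_table SET foobarray = $2::foobar[] WHERE id = $1;  -- pass e.g. ['FOO', 'BAR'] as $2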

Related

Need a query that works in both cases when column type is either text or text[]

The following are my two tables:
CREATE TABLE temptext (
  id text
);
CREATE TABLE temptext2 (
  id _text
);
My insert queries are:
INSERT INTO temptext (id) VALUES ('a');
INSERT INTO temptext2 (id) VALUES ('{c,d,e,a}');
I have two queries that run perfectly:
SELECT *
FROM temptext2
WHERE ( ('edfdas' = ANY ("id")))
and
SELECT *
FROM temptext
WHERE ( ("id" = ANY ('{"edfdas"}')))
If I swap temptext and temptext2 in either query, it fails with one of these errors:
ERROR: could not find array type for data type text[]
or
ANY/ALL (array) requires array on right side
I need a query that runs in both cases, since I don't know whether the column type in the database is text or _text.
PS: There can be multiple values along with 'edfdas'
One way to get around this is to make sure you always compare the value with an array, by turning the text column into a text[]. This can be achieved by appending the column to an empty array, which works for text and text[] alike:
SELECT *
FROM temptext
WHERE 'one' = any(array[]::text[]||id)
The query works for both tables. Concatenating a single value to an empty array yields an array with a single value. Concatenating an array to an empty array yields the original array.
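A quick way to see that behaviour is to run the concatenation on its own:
SELECT array[]::text[] || 'a'::text;          -- yields {a}
SELECT array[]::text[] || '{c,d,e}'::text[];  -- yields {c,d,e}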
However, this will prevent the use of any index. If you want an efficient solution (one that can make use of possible indexes), you will have to use two different queries.
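For example, the two type-specific queries might look like this (a sketch; the @> containment operator can use a GIN index on an array column):
SELECT * FROM temptext WHERE id = 'edfdas';                   -- plain text, can use a btree index
SELECT * FROM temptext2 WHERE id @> ARRAY['edfdas']::text[];  -- text[], can use a GIN index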
I find the fact that your application doesn't know how the tables are defined quite strange, to be honest. Fixing that seems more efficient and less error-prone in the long run.
The ANY operator requires an array on the right side; in the failing query you are giving it a plain text value.
For the plain text column you can use the IN operator instead of ANY:
SELECT *
FROM temptext
WHERE 'edfdas' IN ("id")
or plain equality:
SELECT *
FROM temptext
WHERE "id" = 'edfdas'
Note, however, that neither form works for the text[] column: IN and = compare against the whole array value rather than its elements, so the same queries against temptext2 fail just like the original. PostgreSQL does not implicitly convert between text and text[], which is why the column has to be normalized to an array (as in the answer above) before a single query can serve both tables.

Invalid token error when using jsonb_insert in postgresql

As a bit of background: I want to fill a column with jsonb values built from other columns. Initially, I used this query:
UPDATE myTable
SET column_name = row_to_json(rowset)
FROM (SELECT column1, column2 FROM myTable) rowset
However, this query seems to run for way too long (a few hours before I stopped it) on a dataset with 9 million records. So I went looking for a solution without the second FROM clause and found the jsonb_insert function. To test it, I first ran this sample query:
SELECT jsonb_insert('{}','{column1}','500000')
Which gives {"column1": 500000} as output. Perfect, so I tried to fill the value using the actual column:
SELECT jsonb_insert('{}','{column1}',column1) FROM myTable WHERE id = <test_id>
This gives an error with a suggestion to add explicit argument types, which led me to the following:
SELECT jsonb_insert('{}','{column1}','column1')
FROM myTable WHERE id = <test_id>
SELECT jsonb_insert('{}'::jsonb,'{column1}'::jsonb,column1::numeric(8,0))
FROM myTable WHERE id = <test_id>
Both of these queries give an invalid input syntax error: Token "column1" is invalid.
I really cannot seem to find the correct syntax for these queries in the documentation. Does anyone know what it would be?
This is because the jsonb_insert function needs a jsonb value for its new_value parameter:
jsonb_insert(target jsonb, path text[], new_value jsonb [, insert_after boolean])
If we want a JSON number, we can cast the column to text before casting to jsonb. If we want a JSON string, we can use the concat function to wrap the value in double quotes:
CREATE TABLE myTable (column1 varchar(50),column2 int);
INSERT INTO myTable VALUES('column1',50000);
SELECT jsonb_insert('{}','{column1}',concat('"',column1,'"')::jsonb) as JsonStringType,
jsonb_insert('{}','{column2}',coalesce(column2::TEXT,'null')::jsonb) as JsonNumberType
FROM myTable
Note: if the column value might be NULL, we can pass 'null' as the fallback for coalesce, as in coalesce(column2::TEXT,'null'), so the cast to jsonb yields a JSON null.
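As an aside, for the original goal of building a jsonb value from several columns at once, jsonb_build_object may be a simpler alternative (a sketch; it handles the per-column type conversion, including NULLs, by itself):
SELECT jsonb_build_object('column1', column1, 'column2', column2) AS json_row
FROM myTable;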

Searching an element in top-level json array oracle

I have an array of strings stored in an Oracle column as a JSON array, in the following format:
["abc", "xyz"]
["cde", "fgh"]
["xyz"]
I have to write a query to check whether a given string is present in any of the arrays in any of the rows. In the above example, I would like to check whether "xyz" is present. What should the JSON path be? I know I can use a LIKE clause, but I don't think that is a neat way to do it.
Also, why is the query SELECT JSON_QUERY(my_column, '$[*]') FROM my_table always returning null?
I did the following test; this may be what you are looking for:
create table t(json_v varchar2(40));
insert into t values('["abc", "xyz"]');
insert into t values('["cde", "fgh"]');
insert into t values('["xyz"]');
SELECT *
from t, json_table(t.json_v, '$[*]' columns (value PATH '$'))
WHERE value = 'xyz'
Output result:
JSON_V           VALUE
["abc", "xyz"]   xyz
["xyz"]          xyz
As for your second question, why the query always returns null: you have to wrap the values; see the JSON_QUERY syntax:
SELECT JSON_QUERY(json_v, '$[*]' WITH WRAPPER) AS value FROM myTable;
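Alternatively, JSON_EXISTS with a path filter can do the membership test without a join (a sketch, assuming Oracle 12c or later):
SELECT *
FROM t
WHERE JSON_EXISTS(json_v, '$[*]?(@ == "xyz")');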

postgresql using json sub-element in where clause

This might be a very basic question but I am not able to find anything on this online.
If I create a sample table :
create table dummy ( id int not null, data json );
Then, if I query the table using the following query:
select * from dummy where data->'x' = 10;
Now since there are no records in the table yet and there is no such property as 'x' in any record, it should return zero results.
But I get the following error:
postgres=# select * from dummy where data->'x' = 10;
ERROR: operator does not exist: json = integer
LINE 1: select * from dummy where data->'x' = 10;
However following query works:
select * from dummy where cast(data->>'x' as integer) = 10;
Am I missing something here, or is typecasting the only way I can get an integer value from a json field? If that's the case, doesn't it affect performance when the data becomes extremely large?
Am I missing something here, or is typecasting the only way I can get
an integer value from a json field?
You're correct: typecasting is the only way to read an integer value from a json field.
If that's the case, doesn't it affect performance when the data
becomes extremely large?
Postgres allows you to index expressions, including casts, so the index below will let you quickly retrieve all rows where data->>'x' has a given integer value:
CREATE INDEX dummy_x_idx ON dummy(cast("data"->>'x' AS int));
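A query whose predicate matches the indexed expression can then use it, for example:
SELECT * FROM dummy WHERE cast("data"->>'x' AS int) = 10;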
The JSON operator ->> means "get JSON array element (or object field) as text", so the type cast is necessary.
You could define your own JSON operator, but it would only simplify the code, with no consequences for performance.
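If you did want to try that, a custom operator is only a thin wrapper around a function. A minimal sketch (json_int and ->>> are hypothetical names, and the FUNCTION keyword in CREATE OPERATOR assumes PostgreSQL 11 or later):
CREATE FUNCTION json_int(json, text) RETURNS int AS
$$ SELECT ($1 ->> $2)::int $$ LANGUAGE sql IMMUTABLE;
CREATE OPERATOR ->>> (LEFTARG = json, RIGHTARG = text, FUNCTION = json_int);
-- usage: SELECT * FROM dummy WHERE data ->>> 'x' = 10;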

passing an array into oracle sql and using the array

I am running into the following problem: I am passing an array of strings into Oracle SQL, and I would like to retrieve all the data whose id is in the list.
Here's what I've tried:
OPEN O_default_values FOR
SELECT ID AS "Header",
VALUE AS "DisplayValue",
VALUE_DESC AS "DisplayText"
FROM TBL_VALUES
WHERE ID IN I_id;
I_id is an array declared as follows: TYPE gl_id IS TABLE OF VARCHAR2(15) INDEX BY PLS_INTEGER;
I've been getting the "expression is of wrong type" error.
The I_id array can sometimes be as large as 600 records.
My question is: is there a way to do what I just described, or do I need to create some sort of cursor and loop through the array?
What has been tried: building the SQL string dynamically, concatenating the values onto the end of it, and executing that. This works for a small amount of data, but the maximum size of the string is static, which caused other errors (like index out of range).
Have a look at this link: http://asktom.oracle.com/pls/asktom/f?p=100:11:620533477655526::::P11_QUESTION_ID:139812348065
Effectively, what you want is a variable in-list with bind variables.
Do note this:
THE is deprecated; there is no need for it today. TABLE is its replacement:
select * from TABLE( function );
Since you already have the type, all you need to do is something like this:
OPEN O_default_values FOR
SELECT ID AS "Header",
VALUE AS "DisplayValue",
VALUE_DESC AS "DisplayText"
FROM TBL_VALUES
WHERE ID IN (SELECT column_value FROM TABLE(I_id));
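One caveat worth adding: in releases before Oracle 12c, TABLE() only accepts collection types declared at the SQL level, so the associative array above would have to be replaced with a nested table type (a sketch; gl_id_tab is a hypothetical name):
CREATE TYPE gl_id_tab AS TABLE OF VARCHAR2(15);
Declared that way, the collection can be bound from the client and used directly in TABLE(I_id).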