Passing an array into Oracle SQL and using the array

I am running into the following problem: I am passing an array of strings into Oracle SQL, and I would like to retrieve all the data where the id is in the list ...
Here's what I've tried ...
OPEN O_default_values FOR
SELECT ID AS "Header",
VALUE AS "DisplayValue",
VALUE_DESC AS "DisplayText"
FROM TBL_VALUES
WHERE ID IN I_id;
I_id is an array described as follows - TYPE gl_id IS TABLE OF VARCHAR2(15) INDEX BY PLS_INTEGER;
I've been getting the "expression is of wrong type" error.
The I_id array can sometimes be as large as 600 records.
My question is: is there a way to do what I just described, or do I need to create some sort of cursor and loop through the array?
What has been tried: building the SQL string dynamically and concatenating the values onto the end of the string before executing it. This works for a small amount of data, but the string variable has a fixed size, which caused other errors (like index out of range).

Have a look at this link: http://asktom.oracle.com/pls/asktom/f?p=100:11:620533477655526::::P11_QUESTION_ID:139812348065
Effectively, what you want is a variable in-list with bind variables.
Do note this:
"the" is deprecated. No need for it today. TABLE is its replacement:
select * from TABLE( function );
Since you already have the type, all you need to do is something similar to the following:
OPEN O_default_values FOR
SELECT ID AS "Header",
VALUE AS "DisplayValue",
VALUE_DESC AS "DisplayText"
FROM TBL_VALUES
WHERE ID IN (SELECT column_value FROM TABLE(I_id));
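One caveat not spelled out above: for TABLE() to be usable from SQL, the collection generally needs to be a SQL-level nested table or VARRAY type rather than an associative array like gl_id (older Oracle releases in particular reject INDEX BY tables here). A minimal sketch, with a made-up type name t_id_list:
-- hypothetical SQL-level collection type, name chosen only for illustration
CREATE OR REPLACE TYPE t_id_list AS TABLE OF VARCHAR2(15);
/
With the I_id parameter declared as t_id_list, the OPEN ... FOR query above should work unchanged.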

Related

Oracle SQL: create a dummy table from a substitution variable

In Oracle SQL, I can create a dummy table thanks to the code shown below:
select crca.*
from my_real_table real_table,
table(ntde.ENCRYPT_ALL(:inputParam)) enc
where
...
I would like to be able to do the same thing without using ntde.ENCRYPT_ALL; I would like to do something like this:
select crca.*
from my_real_table real_table,
table(:inputParam) enc
where
...
It does not work and I get this error:
00000 - "cannot access rows from a non-nested table item"
*Cause: attempt to access rows of an item whose type is not known at
parse time or that is not of a nested table type
*Action: use CAST to cast the item to a nested table type
Do you know how to do that please?
As the exception says, use CAST:
SELECT c.*
FROM my_real_table r
CROSS JOIN TABLE( CAST(:inputParam AS your_collection_type) ) c
This assumes that:
the bind variable contains a collection (and not a string); and
the bind variable is being passed from an application (for example, from Java); and
you have created an SQL collection type to cast the bind variable to. For example:
CREATE TYPE your_collection_type AS TABLE OF NUMBER;
Or, instead of creating your own collection type, you could use a built-in collection (or VARRAY) type such as SYS.ODCIVARCHAR2LIST.
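For instance, with that built-in type the cast could look like this (only a sketch; it assumes the application really binds :inputParam as a SYS.ODCIVARCHAR2LIST):
SELECT c.*
FROM my_real_table r
CROSS JOIN TABLE( CAST(:inputParam AS SYS.ODCIVARCHAR2LIST) ) c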

Conditionally delete item inside an Array Field PostgreSQL

I'm building a kind of dictionary app and I have a table for storing words like below:
id | surface_form | examples
-----------------------------------------------------------------------
1 | sounds | {"It sounds as though you really do believe that",
| | "A different bell begins to sound midnight"}
Where surface_form is of type CHARACTER VARYING and examples is an array field of CHARACTER VARYING.
Since the examples are generated automatically from another API, they might not contain the exact surface_form. Now I want to keep in examples only the sentences that contain the exact surface_form. For instance, in the given example only the first sentence is kept, as it contains sounds; the second should be omitted, as it only contains sound.
The problem is that I'm stuck on how to write a query and/or a PL/pgSQL stored procedure to update the examples column so that it only holds the desired sentences.
This query skips unwanted array elements:
select id, array_agg(example) new_examples
from a_table, unnest(examples) example
where surface_form = any(string_to_array(example, ' '))
group by id;
id | new_examples
----+----------------------------------------------------
1 | {"It sounds as though you really do believe that"}
(1 row)
Use it in update:
with corrected as (
select id, array_agg(example) new_examples
from a_table, unnest(examples) example
where surface_form = any(string_to_array(example, ' '))
group by id
)
update a_table
set examples = new_examples
from corrected
where examples <> new_examples
and a_table.id = corrected.id;
Test it in rextester.
Maybe you have to change the table design. This is what PostgreSQL's documentation says about the use of arrays:
Arrays are not sets; searching for specific array elements can be a sign of database misdesign. Consider using a separate table with a row for each item that would be an array element. This will be easier to search, and is likely to scale better for a large number of elements.
Documentation:
https://www.postgresql.org/docs/current/static/arrays.html
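For completeness, a rough sketch of the separate-table design the documentation suggests (the table and column names here are made up):
-- one row per example sentence instead of an array column
create table word_example (
    word_id int references a_table(id),
    example text
);
Filtering then becomes an ordinary SELECT or DELETE with a WHERE clause on word_example.example instead of array surgery.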
The most compact solution (but not necessarily the fastest) is to write a function to which you pass an array and a regular expression, and which returns a new array containing only the items that match the regex.
create function get_matching(p_values text[], p_pattern text)
returns text[]
as
$$
declare
l_result text[] := '{}'; -- make sure it's not null
l_element text;
begin
foreach l_element in array p_values loop
-- adjust this condition to whatever you want
if l_element ~ p_pattern then
l_result := l_result || l_element;
end if;
end loop;
return l_result;
end;
$$
language plpgsql;
The if condition is only an example. You need to adjust it to whatever you actually store in the surface_form column. Maybe you need to test on word boundaries in the regex, or a simple position() check would do - your question is unclear about that.
Cleaning up the table then becomes as simple as:
update the_table
set examples = get_matching(examples, surface_form);
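If the word-boundary test mentioned above turns out to be necessary, one hedged variant is to build the pattern with Postgres' \m / \M word-boundary escapes when calling the function:
-- only keep examples containing surface_form as a whole word
update the_table
set examples = get_matching(examples, '\m' || surface_form || '\M');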
But the whole approach seems flawed to me. It would be a lot more efficient if you stored the examples in a properly normalized data model.
In SQL, you have to remember two things:
Tuple elements are immutable, but rows are mutable via updates.
SQL is declarative, not procedural.
So you cannot "conditionally" "delete" a value from an array. You have to think about the question differently: you have to create a new array following a specification. That specification can conditionally include values (using case statements). Then you can overwrite the tuple with the new array, as sketched below.
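A minimal sketch of that rebuild-and-overwrite idea, assuming the a_table layout from the earlier answers (a FILTER clause stands in here for the conditional CASE):
update a_table t
set examples = (
    -- rebuild the array from scratch, keeping only elements that contain surface_form
    select coalesce(array_agg(e) filter (where position(t.surface_form in e) > 0), '{}')
    from unnest(t.examples) as e
);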
Looks like one way could be to update the array with only the array elements that are valid, by doing a select using LIKE or some regular expression.
https://www.postgresql.org/docs/current/static/arrays.html
If you want to keep only the array elements that contain surface_form, you keep the entries for which substring(...., ...) is not null.
First you unnest the array, keep only the items that match, and then array_agg the kept items.
Here is a little query you can run to test without any table.
SELECT
id,
surface_form,
(SELECT array_agg(examples_matching)
FROM unnest(surfaces.examples) AS examples_matching
WHERE substring(examples_matching, surfaces.surface_form) IS NOT NULL)
FROM
(SELECT
1 AS id,
'example' :: TEXT AS surface_form,
ARRAY ['example form', 'test test','second example form'] :: TEXT [] AS examples
) surfaces;
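Pointed at a real table instead of the inline test data, the same correlated subquery might look like this (assuming the dictionary table is the a_table used in the earlier answers):
SELECT id, surface_form,
       (SELECT array_agg(e)
        FROM unnest(t.examples) AS e
        WHERE substring(e, t.surface_form) IS NOT NULL) AS new_examples
FROM a_table t;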
You can select the data into a temp table using unnest, then update the temp table rows with an update query (identifying rows by row number), merge the values back together with array_agg, and finally write that merged value back to the original table.
For example, suppose you create a temp table Temp (id int, element character varying). You then update the Temp table, re-nest it, and finally update the original table.
Here is a query you can try directly in an editor:
CREATE TEMP TABLE IF NOT EXISTS temp_element (
id bigint,
element character varying) WITH (OIDS);
TRUNCATE TABLE temp_element;
insert into temp_element select row_number() over (order by p),p from (
select unnest(ARRAY['It sounds as though you really do believe that',
'A different bell begins to sound midnight']) as P)t;
update temp_element set element = 'It sounds as though you really'
where element = 'It sounds as though you really do believe that';
-- merge the values back into a single array (to be written to the original table)
select array_agg(element) from temp_element;
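And finally, a sketch of writing that merged array back to the original table (assuming the dictionary table is the a_table named in the earlier answers and the id = 1 row from the question):
update a_table
set examples = (select array_agg(element) from temp_element)
where id = 1;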

postgresql using json sub-element in where clause

This might be a very basic question but I am not able to find anything on this online.
If I create a sample table:
create table dummy ( id int not null, data json );
Then, if I query the table using the following query:
select * from dummy where data->'x' = 10;
Now since there are no records in the table yet and there is no such property as 'x' in any record, it should return zero results.
But I get the following error:
postgres=# select * from dummy where data->'x' = 10;
ERROR: operator does not exist: json = integer
LINE 1: select * from dummy where data->'x' = 10;
However, the following query works:
select * from dummy where cast(data->>'x' as integer) = 10;
Am I missing something here, or is typecasting the only way I can get an integer value from a json field? If that's the case, does it not affect performance when the data becomes extremely large?
Am I missing something here, or is typecasting the only way I can get an integer value from a json field?
You're correct, typecasting is the only way to read an integer value from a json field.
If that's the case, does it not affect performance when the data becomes extremely large?
Postgres allows you to index expressions, including casts, so the index below will let you quickly retrieve all rows where data->>'x' holds some given integer value:
CREATE INDEX dummy_x_idx ON dummy (cast("data"->>'x' AS int));
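The index is only considered when the query's predicate is written with the same expression, i.e. the cast form already shown in the question:
select * from dummy where cast("data"->>'x' as integer) = 10;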
The JSON operator ->> means "Get JSON array element (or object field) as text", so a type cast is necessary.
You could define your own JSON operator, but it would only simplify the code, without consequences for performance.
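For what it's worth, a sketch of such an operator (the helper name json_int_eq is made up, and it assumes the stored value really is numeric; this only shortens the query and, as noted, brings no performance benefit and will not use the expression index):
CREATE FUNCTION json_int_eq(j json, i integer) RETURNS boolean AS $$
    -- extract the JSON scalar as text, then compare it as an integer
    SELECT (j #>> '{}')::integer = i;
$$ LANGUAGE sql IMMUTABLE;

CREATE OPERATOR = (LEFTARG = json, RIGHTARG = integer, PROCEDURE = json_int_eq);

-- after which the query from the question parses:
-- select * from dummy where data->'x' = 10;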

Match String to Array of Strings in Postgres Database Column

I have a Django app which uses a Postgres database. I am creating a temp table by doing the following:
cursor.execute("""CREATE TEMP TABLE temp_table (pub_id INTEGER, pub_title TEXT, pub_tags TEXT[])""")
Notice that the last column (pub_tags) of temp_table contains an array of strings.
For reference, my next line of code inserts data from existing tables into the temp table, and works fine.
cursor.execute("""INSERT INTO temp_table(pub_id, pub_title, pub_tags) SELECT...etc.
For the last step, I'd like to get the pub_titles from the temp_table where, in the pub_tags column, there is a match to a string that I am entering.
For example, I'd like to get all the pub_titles where the pub_tags array contains the string "men". I'd imagine the syntax would be something like:
cursor.execute("""SELECT pub_title FROM temp_table WHERE '%men%' IN (pub_tags)""")
Which is not correct and throws a syntax error, but hopefully describes what I am trying to do. I'm just not sure how to indicate that pub_tags is an array in this context.
I have been referred to some postgres docs, for example:
http://www.postgresql.org/docs/current/static/functions-array.html, and
http://www.postgresql.org/docs/current/interactive/functions-comparisons.html#AEN18030
but no matter what I try I can't get anything to work here.
From the Postgres documentation, it looks like the syntax might be:
SELECT pub_title FROM temp_table WHERE 'men' = ANY (pub_tags)
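If a partial, '%men%'-style match is wanted rather than exact equality, one option (my assumption, not part of the answer above) is to unnest the tags and use LIKE:
SELECT pub_title
FROM temp_table
WHERE EXISTS (
    SELECT 1
    FROM unnest(pub_tags) AS tag
    WHERE tag LIKE '%men%'
);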

Array parameter for TADOQuery in Delphi 2010

I need to execute a simple query:
SELECT * FROM MyTable WHERE Id IN (:ids)
Obviously, it returns the set of records which have their primary key 'Id' in the given list. How can I pass an array of integer IDs into ADOQuery.Parameters for parameter 'ids'? I have tried VarArray - it does not work. Parameter 'ids' has FieldType = ftInteger by default, if it matters.
There is no parameter type that can be used to pass a list of values to IN. Unfortunately, this is one of the shortcomings of parameterized SQL.
You'll have to build the query from code, to either generate the list of values or generate a list of parameters which can then be filled from code. That way, you can pass each value as a separate parameter, like this:
SELECT * FROM MyTable WHERE Id IN (:id1, :id2, :id3)
But since the list will probably have a variable size, you'll have to alter the SQL to add parameters anyway. In that case it is just as easy to generate the list of values, although parameterized queries may be cached better, depending on which DB you use.
The IN clause just takes a comma-separated list of values like (1,2,3,4,5), so I assume you could set the datatype to ftString, build the string yourself, and pass that...? Not tried it, but it's what I would try...