Oracle Reading from json respond - sql

I have the following problem. Given this query:
select apex_web_service.make_rest_request(
p_url => 'https://something.com/rest/Discount'
,p_http_method => 'GET'
,p_username => 'username'
,p_password => 'password'
) as custom
from dual;
That returns this:
{"hasMore":false,"items":[{"id":12,"Origin":"ALL","Part":"PO423S","Channel":"RC"},{"id":13,"Origin":"ALL","Part":"LO123D","Channel":"RC"},{"id":14,"Origin":"ALL","Part":"SD765S","Channel":"AP"}]}
I want to group by Channel to see how many Channels I have, in order to insert them into another table.
I tried this just to list the items:
select d.custom.items
from (
select apex_web_service.make_rest_request(
p_url => 'https://something.com/rest/Discount'
,p_http_method => 'GET'
,p_username => 'username'
,p_password => 'password'
) as custom
from dual) d;
but I get this error:
ORA-22806: not an object or REF
22806. 00000 - "not an object or REF"
*Cause: An attempt was made to extract an attribute from an item that is
neither an object nor a REF.
*Action: Use an object type or REF type item and retry the operation.
Error at line: 12, column: 8
I also test the next:
create table temp_json (
json_data blob not null
);
alter table temp_json
add constraint temp_data_json
check ( json_data is json );
insert into temp_json
select apex_web_service.make_rest_request(
p_url => 'https://something.com/rest/Discount'
,p_http_method => 'GET'
,p_username => 'username'
,p_password => 'password'
) as customDiscAplicability
from dual
;
select d.json_data.items
from temp_json d;
And the result is this:
ITEMS
-----
(null)
I followed this tutorial: LINK
Can somebody help me?
Regards

When you select apex_web_service.make_rest_request, it returns a string. The database doesn't know this is JSON data.
If you're on Oracle Database 18c or higher, you should be able to get around this by using treat ... as json:
select d.custom.items.id from (
select treat ( '{
"hasMore": false,
"items": [ {
"id": 12,
"Origin": "ALL",
"Part": "PO423S",
"Channel": "RC"
}, {
"id": 13,
"Origin": "ALL",
"Part": "LO123D",
"Channel": "RC"
}, {
"id": 14,
"Origin": "ALL",
"Part": "SD765S",
"Channel": "AP"
}
]
}' as json ) custom from dual
) d;
ITEMS
[12,13,14]
To understand why inserting the response into the table, then selecting it returns null, we'd need to see the exact JSON that's in there!
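For the Channel counts the question asked for, one option on 18c+ is json_table. This is only a sketch, assuming the REST response was successfully stored in the temp_json table created above (FORMAT JSON is needed because the column is a BLOB):

```sql
-- Count items per Channel from the stored response.
-- Assumes temp_json.json_data holds the JSON shown in the question.
select jt.channel,
       count(*) as channel_count
from temp_json t,
     json_table(
       t.json_data format json,  -- FORMAT JSON: the column is a BLOB
       '$.items[*]'
       columns (
         channel varchar2(10) path '$.Channel'
       )
     ) jt
group by jt.channel;
```

For the sample response this should yield two rows: RC with a count of 2 and AP with a count of 1.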


Parsing nested JSON fields in Snowflake

I am pretty new to Snowflake and I am trying to parse a JSON field and pull its attributes to return in the response.
I tried a few variations, but every time the attribute comes back null.
The attributes column in my table contains this JSON:
{
"Status": [
"ACTIVE"
],
"Coverence": [
{
"Sub": [
{
"EndDate": [
"2020-06-22"
],
"Source": [
"Test"
],
"Id": [
"CovId1"
],
"Type": [
"CovType1"
],
"StartDate": [
"2019-06-22"
],
"Status": [
"ACTIVE"
]
}
]
}
]
}
What I tried:
SELECT DISTINCT *
from
(
SELECT
TRIM(mt."attributes":Status, '[""]')::string as STATUS,
TRIM(r.value:"Sub"."Id", '[""]')::string as ID,
TRIM(r.value:"Sub"."Source", '[""]')::string as SOURCE
from "myTable" mt,
lateral flatten ( input => mt."attributes":"Coverence", outer => true) r
)
GROUP BY
STATUS,
ID,
SOURCE;
Later I tried:
SELECT DISTINCT *
from
(
SELECT
TRIM(mt."attributes":Status, '[""]')::string as STATUS,
TRIM(r.value:"Id", '[""]')::string as ID,
TRIM(r.value:"Source", '[""]')::string as SOURCE
from "myTable" mt,
lateral flatten ( input => mt."attributes":"Coverence":"Sub", outer => true) r
)
GROUP BY
STATUS,
ID,
SOURCE;
But nothing worked. STATUS populates as expected, but ID and SOURCE come back null.
Am I missing something, or have I done something dumb? Please shed some light.
Assuming that Coverence can contain multiple Sub elements, FLATTEN twice. At the lowest level, only the first element of each array is chosen (EndDate[0], Source[0], etc.):
SELECT
mt."attributes":Status[0]::TEXT AS Status
,r2.value:EndDate[0]::TEXT AS EndDate
,r2.value:Source[0]::TEXT AS Source
,r2.value:Id[0]::TEXT AS Id
FROM myTable AS mt,
LATERAL FLATTEN(input => mt."attributes",
path => 'Coverence',
outer => true) r1,
LATERAL FLATTEN(input => r1.value,
path => 'Sub',
outer => true) r2;
All your elements are arrays, and the overall JSON structure does not make much sense, so to access the individual element values you have to use [] index notation. You don't even need FLATTEN if you only have to access individual elements by index.
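For instance, a sketch of that index-based access, assuming every wrapper array holds a single element as in the sample document:

```sql
-- Direct access by index: every level is an array, so each step needs [n].
SELECT
    mt."attributes":Status[0]::TEXT                      AS status,
    mt."attributes":Coverence[0]:Sub[0]:Id[0]::TEXT      AS id,
    mt."attributes":Coverence[0]:Sub[0]:Source[0]::TEXT  AS source
FROM "myTable" mt;
```

For the sample JSON this returns ACTIVE, CovId1, and Test in a single row, without any FLATTEN.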

Shopware 6 API bulk update of product entities: a single error prevents the whole batch from updating

I need some help with updating products via the API.
https://shopware.stoplight.io/docs/admin-api/faf8f8e4e13a0-bulk-payloads#performance
We have products 1, 2, 3 ... up to 100 and we update their quantities in BULK from our ERP (single operation).
Suppose that for some reason we send a string instead of an integer for the 80th product, so the API response returns an error (which is OK).
But not only does the 80th product fail to update; the complete batch fails, so products 1 to 100 all fail and none of them are updated.
Please help us fix this, or correct us if we are wrong.
Using the reference
https://shopware.stoplight.io/docs/admin-api/faf8f8e4e13a0-bulk-payloads#performance
- We are updating "entity": "product" with "action": "upsert" and these headers:
// --header 'single-operation: 1'
// --header 'indexing-behavior: use-queue-indexing'
Everything works very well, but since we are sending the entities in bulk, if any one entity fails the complete bulk request returns an error and none of the products in the list get updated.
If we send products 1 to 100 and the problem is only with the 80th product, only that one should fail to update; the other 99 products should still get updated.
The behavior you describe is the expected behavior when you use single-operation: 1, as described in the reference you linked.
With single-operation: 1, all operations you send are executed inside a single database transaction, so when one of the operations fails, the others are also rolled back.
From what you describe, you want to use single-operation: 0, which means that each operation you send gets its own database transaction, so that when one operation fails the others still go through.
Reference URL
https://shopware.stoplight.io/docs/admin-api/faf8f8e4e13a0-bulk-payloads#examples
Update product stocks
{
"stock-updates": { // Name of the transaction, choose freely
"entity": "product", // Name of the entity you would like to update
"action": "upsert", // Available actions are upsert and delete,
"payload": [ // A list of objects, each representing a subset of the entity scheme referenced in `entity`. `id` is required for upsert operations.
{
"id": "c197170c60ab472b8dc1218acaba220e",
"stock": 41
},
{
"id": "a10c59fce8214fdaa552d74e0c6347ff",
"stock": 'XXXXX'
},
{
"id": "1b13176b6e0f4bb496c9a31b4fd7e97b",
"stock": 43
}
]
}
}
We tried the following to update the stock, but sent the string XXXXX instead of an integer for one product in the bulk, so that we could see how the error for one product affects the others (the complete bulk is not updated).
Below are scenario 1 (single-operation: 1) and scenario 2 (single-operation: 0); in both, our stock is not updated.
Since neither scenario updates the stock, we feel something is wrong.
Our problem:
single-operation: 0 (separate transactions) should ideally work and at least update the stock of c197170c60ab472b8dc1218acaba220e to 41 and of 1b13176b6e0f4bb496c9a31b4fd7e97b to 43, but the complete bulk update is ignored by the API.
==========
Scenario 1 = single-operation: 1
Response 1: Array
(
[error] => 1
[message] => Array
(
[errors] => Array
(
[0] => Array
(
[code] => ba785a8c-82cb-4283-967c-3cf342181b40
[status] => 400
[detail] => This value should be of type int.
[template] => This value should be of type {{ type }}.
[meta] => Array
(
[parameters] => Array
(
[{{ value }}] => "XXXXX"
[{{ type }}] => int
)
)
[source] => Array
(
[pointer] => /product/2/stock
)
)
)
)
)
==========
Scenario 2 = single-operation: 0
Response 2: Array
(
[error] => 1
[message] => Array
(
[success] =>
[data] => Array
(
[product] => Array
(
[result] => Array
(
[0] => Array
(
[entities] => Array
(
)
[errors] => Array
(
)
)
[1] => Array
(
[entities] => Array
(
)
[errors] => Array
(
[0] => Array
(
[code] => ba785a8c-82cb-4283-967c-3cf342181b40
[status] => 400
[detail] => This value should be of type int.
[template] => This value should be of type {{ type }}.
[meta] => Array
(
[parameters] => Array
(
[{{ value }}] => "XXXXX"
[{{ type }}] => int
)
)
[source] => Array
(
[pointer] => /2/stock
)
)
)
)
[2] => Array
(
[entities] => Array
(
)
)
)
[extensions] => Array
(
)
)
)
)
)

Select Unique value from a JSON Array - PostgreSQL JSON column

I have the following JSON document stored in a PostgreSQL JSON column:
{
"status": "Success",
"message": "",
"data": {
"serverIp": "XXXX",
"ruleId": 32321,
"results": [
{
"versionId": 555555,
"PriceID": "8abf35ec-3e0e-466b-a4e5-2af568e90eec",
"price": 350,
"Convert": 0.8,
"Cost": 15
"Customer_ID":1
},
{
"versionId": 4444,
"PriceID": "b5a1dbd5-17b4-4847-8b3c-da334f95276a",
"price": 550,
"Convert": 0.7,
"Cost": 10,
"Customer_ID":10
}
]
}
}
I am trying to retrieve the price for a specific Customer_ID.
I am using this query to get the price for Customer_ID = 1:
select json_array_elements(t.info -> 'data' -> 'results') ->> 'price'
from mytable t
where exists (
select
from json_array_elements(t.info -> 'data' -> 'results') x(elt)
where (x.elt ->> 'Customer_ID')::int = 1
)
The problem is that I am getting the same results for Customer_ID = 1 and for Customer_ID = 10.
I am basically getting both elements of the array instead of just one.
I don't know what I am doing wrong.
You can use a lateral join to unnest the array elements, and then filter on the customer with a where clause; this leaves you with just one row, from which you can extract the price:
select x.elt ->> 'price' price
from mytable t
cross join lateral json_array_elements(t.info -> 'data' -> 'results') x(elt)
where (x.elt ->> 'Customer_ID')::int = 1
Note that ->> returns text, so the customer id is cast to int for the comparison; alternatively, you can compare against the string literal '1'.
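On Postgres 12 or later, a jsonpath variant is a possible alternative; this sketch assumes the info column can be cast to jsonb:

```sql
-- Extract the price of the array element whose Customer_ID is 1.
select jsonb_path_query(
         t.info::jsonb,
         '$.data.results[*] ? (@.Customer_ID == 1).price'
       ) as price
from mytable t;
```

The filter runs inside the path expression, so only the matching element's price is returned (350 for the sample document).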

Postgres: How to alter jsonb value type for each element in an array?

I have a jsonb column named items in Postgres 10.12 like this:
{
"items": [
{
"itemQty": 2,
"itemName": "snake"
},
{
"itemQty": 1,
"itemName": "x kodiyum"
}
]
}
Now I want to convert itemQty type to string for every array element so that the new values are like this:
{
"items": [
{
"itemQty": "2",
"itemName": "snake"
},
{
"itemQty": "1",
"itemName": "x kodiyum"
}
]
}
How do I do this? I have gone through the Postgres jsonb documentation and couldn't figure it out.
On the server side, I am using Spring Boot and Hibernate with com.vladmihalcea.hibernate.type.json (Hibernate Types 52), if that helps.
Thanks
You could unnest the array, modify the elements, and then rebuild it. Assuming that the primary key of your table is id, that would be:
select jsonb_build_object(
'items', jsonb_agg(
jsonb_build_object(
'itemQty', (x.obj ->> 'itemQty')::text,
'itemName', x.obj ->> 'itemName'
)
)
)new_items
from mytable t
cross join lateral jsonb_array_elements(t.items -> 'items') as x(obj)
group by id
Note that the explicit cast to ::text is not really needed here, as ->> extract text values anyway: I kept it because it makes the intent clearer.
If you want an update statement:
update mytable t
set items = (
select jsonb_build_object(
'items', jsonb_agg(
jsonb_build_object(
'itemQty', (x.obj ->> 'itemQty')::text,
'itemName', x.obj ->> 'itemName'
)
)
)
from jsonb_array_elements(t.items -> 'items') as x(obj)
)
Demo on DB Fiddle

Postgresql search if exists in nested jsonb

I'm new to jsonb queries and I ran into a problem. Inside an 'Items' table, I have an 'id' column and a 'data' jsonb column. Here is what the data can look like:
[
{
"paramId": 3,
"value": "dog"
},
{
"paramId": 4,
"value": "cat"
},
{
"paramId": 5,
"value": "fish"
},
{
"paramId": 6,
"value": "",
"fields": [
{
"paramId": 3,
"value": "cat"
},
{
"paramId": 4,
"value": "dog"
}
]
},
{
"paramId": 6,
"value": "",
"fields": [
{
"paramId": 5,
"value": "cat"
},
{
"paramId": 3,
"value": "dog"
}
]
}
]
The value in data is always an array of objects, but sometimes an object can have a 'fields' key holding further objects. It is at most one level deep.
How can I select the id of the items which have, for example, an object containing "paramId": 3 and "value": "cat" and also an object with "paramId": 5 and "value" LIKE '%ish%'?
I have already found a way to do that when the objects are at level 0:
SELECT i.*
FROM items i
JOIN LATERAL jsonb_array_elements(i.data) obj3(val) ON obj3.val->>'paramId' = '3'
JOIN LATERAL jsonb_array_elements(i.data) obj5(val) ON obj5.val->>'paramId' = '5'
WHERE obj3.val->>'value' = 'cat'
AND obj5.val->>'value' LIKE '%ish%';
but I don't know how to search inside the fields array if fields exists.
Thank you in advance for your help.
EDIT:
It looks like my question was not clear, so I will try to state it better.
I want to find all the items whose 'data' column contains objects matching my search criteria, regardless of whether those objects are at the first level or inside the 'fields' key of an object.
Again, for example, this record should be selected if I search for:
'paramId': 3 AND 'value': 'cat'
'paramId': 4 AND 'value': LIKE '%og%'
The matching objects are inside the 'fields' key of the objects with 'paramId': 6, and I don't know how to handle that.
This can be expressed using a SQL/JSON path expression, without the need to unnest everything.
To search for paramId = 3 and value = 'cat'
select *
from items
where data @? '$[*] ? ( (@.paramId == 3 && @.value == "cat") || exists( @.fields[*] ? (@.paramId == 3 && @.value == "cat")) )'
The $[*] part iterates over all elements of the first-level array. To check the elements in the fields array, the exists() operator is used to nest the expression: @.fields[*] iterates over all elements in the fields array and applies the same condition again. I don't see a way to avoid repeating the values, though.
For a "like" condition, you can use like_regex:
select *
from items
where data @? '$[*] ? ( (@.paramId == 4 && @.value like_regex ".*og.*") || exists( @.fields[*] ? (@.paramId == 4 && @.value like_regex ".*og.*")) )'
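To apply both criteria from the question at once, one sketch is simply to AND two @? predicates, repeating the pattern once per criterion:

```sql
-- Items must match both criteria, each either at the top level
-- or inside the fields array of some object.
select id
from items
where data @? '$[*] ? ( (@.paramId == 3 && @.value == "cat")
                        || exists( @.fields[*] ? (@.paramId == 3 && @.value == "cat")) )'
  and data @? '$[*] ? ( (@.paramId == 4 && @.value like_regex ".*og.*")
                        || exists( @.fields[*] ? (@.paramId == 4 && @.value like_regex ".*og.*")) )';
```

Each @? predicate checks one criterion independently, so the row qualifies only when both are satisfied somewhere in the document.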
For now I have found a solution, but it is not really clean and I don't know how it will perform in production with 10M records.
SELECT i.id, i.data
FROM ( -- A;
select it.id, it.data, i as value
from items it,
jsonb_array_elements(it.data) i
union
select it.id, it.data, f as value
from items it,
jsonb_array_elements(it.data) i,
jsonb_array_elements(i -> 'fields') f
) as i
WHERE (i.value ->> 'paramId' = '5' -- B1;
AND i.value ->> 'value' LIKE '%ish%')
OR (i.value ->> 'paramId' = '3' -- B2;
AND i.value ->> 'value' = 'cat')
group by i.id, i.data
having COUNT(*) >= 2; -- C;
A: I "flatten" the first and second level (second level is in 'fields' key)
B1, B2: These are my search criteria
C: I make sure the fields have all the criteria matching. If 3 criteria --> COUNT(*) >=3
It really doesn't look clean to me. It is working for dev purpose but I think there is a better way to do it.
If somebody have an idea Big thanks to him/her!