Mule Anypoint Studio: efficiently insert a large number of items from a JSON array into a DB

I have a JSON payload like the one below as the body of a POST request.
{
  "summary": {
    "transactionId": "5003k00000zSuNaAAK",
    "transactionNumber": "T12345",
    "overall": 100,
    "date": "15/05/2020",
    "details": [
      {
        "transactionDetailId": "CC12345",
        "product_code": 223242234,
        "price": 1500,
        "amount": 1000
      },
      {
        "transactionDetailId": "DD12345",
        "product_code": 679685675,
        "price": 1100,
        "amount": 90
      },
      {
        "transactionDetailId": "SS12345",
        "product_code": 345346643,
        "price": 2000,
        "amount": 300
      },
      .......other 100 items
    ]
  }
}
In my Anypoint Studio project, using a For Each scope to loop over details[] and a Bulk insert operation, I'm able to execute an INSERT and write all the items of the details array into my Postgres DB.
So an INSERT is executed for each item.
Is there a more efficient way to perform this operation, considering arrays with more than 1,000 items?

A better way would be to extract details[] as the payload and then do the Bulk insert based on that array. No For Each is involved, and it works much faster. Streaming is also used in this case, so memory demand will be much lower.
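For illustration, a minimal sketch of what the Bulk insert operation could execute, assuming a hypothetical transaction_details table and column names: the flow first sets the payload to payload.summary.details (e.g. with a Transform Message step), and the Database connector then binds each map in the array as one set of named parameters, sending the whole array as a single batch:

-- hypothetical table and column names; run once as a single batch,
-- with one parameter set per element of the details[] array
INSERT INTO transaction_details (transaction_detail_id, product_code, price, amount)
VALUES (:transactionDetailId, :product_code, :price, :amount)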

Related

SQL: Unnesting a variable length JSON into multiple columns

I have a JSON array in a Redshift SQL column that will vary in its number of nested elements, and I need to unnest it and select the values such that they print in columns on the same row.
i.e. from:
Name | JSON
to:
Name | First Play Price | First Play Status | Second Play Price | Second Play Status ... etc.
The syntax is roughly
[
  {
    "price": "price1",
    "status": "status1"
  },
  {
    "price": "price2",
    "status": "status2"
  },
  {
    "price": "price3",
    "status": "status3"
  }
]
I'm familiar with JSON extraction, however I've got a bit stuck on this varying-number-of-elements ([{},{},{}]) nesting issue.
Any help or direction to resources would be greatly appreciated! Thank you
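A minimal sketch of one common approach, assuming the JSON lives in a plain VARCHAR column named json_col on a table named my_table (both names hypothetical): pull each array element by index with json_extract_array_element_text, then extract keys from it with json_extract_path_text. A truly variable number of plays still requires enumerating columns up to some expected maximum, or moving the data into a SUPER column and unnesting with PartiQL:

-- hypothetical table/column names; extracts the first two plays into columns
select
  name,
  json_extract_path_text(json_extract_array_element_text(json_col, 0), 'price')  as first_play_price,
  json_extract_path_text(json_extract_array_element_text(json_col, 0), 'status') as first_play_status,
  json_extract_path_text(json_extract_array_element_text(json_col, 1), 'price')  as second_play_price,
  json_extract_path_text(json_extract_array_element_text(json_col, 1), 'status') as second_play_status
from my_table;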

Querying Line Items of Order with JSON Functions in BigQuery

I have been banging my head here for the past 2 hours with all the available JSON_... functions in BigQuery. I've read quite a few questions here, but no matter what I try, I never succeed in extracting the "amounts" from my JSON below.
This is my JSON stored in a BQ column:
{
  "lines": [
    {
      "id": "70223039-83d6-463d-a482-7ce4d50bf0fc",
      "charges": [
        {
          "type": "price",
          "amount": 50.0
        },
        {
          "type": "discount",
          "amount": -40.00
        }
      ]
    },
    {
      "id": "70223039-83d6-463d-a482-7ce4d50bf0fc",
      "charges": [
        {
          "type": "price",
          "amount": 20.00
        },
        {
          "type": "discount",
          "amount": 0.00
        }
      ]
    }
  ]
}
Imagine the above being an order containing multiple items.
I am trying to get a sum of all amounts => 50-40+20+0. The result needs to be 30 = the total order price.
Is it possible to pull all the amount values and then have them summed up just via SQL without any custom JS functions? I guess the summing is the easy part - getting the amounts into an array is the challenge here.
Use below
select (
  select sum(cast(json_value(charge, '$.amount') as float64))
  from unnest(json_extract_array(order_as_json, '$.lines')) line,
       unnest(json_extract_array(line, '$.charges')) charge
) total
from your_table
If applied to the sample data in your question, the output is 30.
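As a quick self-contained check (your_table and order_as_json are placeholder names matching the query above), the sample order can be inlined in a CTE; this returns 30:

-- inlines the sample order from the question so the query runs without a table
with your_table as (
  select '{"lines":[{"charges":[{"type":"price","amount":50.0},{"type":"discount","amount":-40.0}]},{"charges":[{"type":"price","amount":20.0},{"type":"discount","amount":0.0}]}]}' as order_as_json
)
select (
  select sum(cast(json_value(charge, '$.amount') as float64))
  from unnest(json_extract_array(order_as_json, '$.lines')) line,
       unnest(json_extract_array(line, '$.charges')) charge
) as total
from your_table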

Azure Data Factory JSON syntax

In Azure Data Factory, I have a copy activity. The data source is the response body from a REST API POST request.
The sink is a SQL table. The problem is that, even though my JSON data contains multiple rows, only the first row is getting copied.
The source data looks like the following:
{
  "offset": 0,
  "limit": 1000,
  "total": 65,
  "loaded": 34,
  "unloaded": 31,
  "cubeCaches": [
    {
      "id": "MxMUVDN0Q1MzAk5MDg6RDkxREQxMUU5RDBDNzR2NMTk6YWNsZGxwMTJtc3QuY2952aXppZW50aW5==",
      "projectId": "15D91DD11E9D0C74B3319",
      "source": {
        "name": "12302021",
        "id": "07EF95111EC7F954158",
        "type": "cube"
      },
      "state": {
        "active": true,
        "dirty": false,
        "infoDirty": false,
        "persisted": true,
        "processing": false,
        "loadedState": "loaded"
      },
      "lastUpdateTime": "2022-01-24T14:22:30Z",
      "lastHitTime": "2022-02-14T20:02:02Z",
      "hitCount": 1,
      "size": 798720,
      "creatorId": "D4E8BFD56085",
      "lastUpdateJob": 18937,
      "openViewCount": 0,
      "creationTime": "2022-01-24T15:07:24Z",
      "historicHitCount": 22,
      "dataLanguages": [],
      "rowCount": 2726,
      "columnCount": 9
    },
    {
      "id": "UYwMTIxMUFNjkxMUU5RDBDMTRCNkMwMDgwRUYzNUQ0MUI6YWNsZjLmNvbQ==",
      "projectId": "120D0C1480EF35D41B",
      "source": {
        "name": "All Clients (YTD)",
        "id": "49E5B13466251CD0B54E8F",
        "type": "cube"
      },
      "state": {
        "active": true,
        "dirty": false,
        "infoDirty": false,
        "persisted": true,
        "processing": false,
        "loadedState": "loaded"
      },
      "lastUpdateTime": "2022-01-03T01:00:01Z",
      "hitCount": 0,
      "size": 82488152,
      "creatorId": "1E2AFB011E80EF35FF14",
      "lastUpdateJob": 364091,
      "openViewCount": 0,
      "creationTime": "2022-02-14T01:04:55Z",
      "historicHitCount": 0,
      "dataLanguages": [],
      "rowCount": 8146903,
      "columnCount": 13
    }
  ]
}
I want to add a row in the Sink table (SQL) for every "id" in the JSON. However, when I run the activity, only the first record gets copied. It's mapped correctly, but I want it to copy all rows in the JSON, not just 1.
In my Mapping tab in Azure Data Factory, each sink column is mapped from $.cubeCaches[0][...].
What am I doing wrong here? I'm thinking there is something wrong with my "Source" syntax for each of the columns...
In $.cubeCaches[0][...] you're explicitly mapping the first element of the array into columns, and that's why only one row lands in the Sink.
I don't know a way to achieve what you intend with the Copy activity alone. I would use a Mapping Data Flow here, and inside it flatten your data (Flatten transformation) to get the array of objects.
Then, from this flattened dataset, you could use a Derived Column to map the JSON fields into columns of your target, a Select to remove the unwanted original fields, and a Sink to write into your target location.

How can I insert JSON into Azure SQL DB in Node.js (REST API)?

I am trying to build a new REST API with a few GET/POST methods, and my questions are: how should I structure my table to support JSON, and what should I write to insert the specific JSON I get from the user into my DB?
DB.js:
require("dotenv").config();
const sql = require("mssql")
// Create connection to database
const config = {
userName: process.env.tedious_userName,
password: process.env.tedious_password,
server:process.env.tedious_server,
database:process.env.tedious_database
};
const connection = new Connection(config);
// Attempt to connect and execute queries if connection goes through
module.exports =connection.on("connect", err => {
if (err) {
console.error(err.message);
});
module.exports = connectDB
Now I want to create a POST method for receiving a new recipe from the user, so I get this JSON:
[
  {
    "username": "newuser",
    "id": 1,
    "name": "Hamburger",
    "img": "https://image.shutterstock.com/w-705104968.jpg",
    "time": 45,
    "likes": 17,
    "isGluten": false,
    "isVegaterian": false,
    "isWatched": false,
    "isSave": false,
    "ingredients": [
      {
        "amount": 5,
        "product": "pound beef short ribs"
      },
      {
        "amount": 2,
        "product": "teaspoon salt"
      },
      {
        "amount": 1.5,
        "product": "tablespoons all-purpose flour"
      },
      {
        "amount": 0.5,
        "product": "teaspoon ground black pepper"
      }
    ],
    "instructions": [
      {
        "Step": "Preheat oven to 350 degrees F (175 degrees C). Grease and flour a 9x9 inch pan or line a muffin pan with paper liners."
      },
      {
        "Step": "In a medium bowl, cream together the sugar and butter. Beat in the eggs, one at a time, then stir in the vanilla. Combine flour and baking powder, add to the creamed mixture and mix well. Finally stir in the milk until batter is smooth. Pour or spoon batter into the prepared pan."
      },
      {
        "Step": "Bake for 30 to 40 minutes in the preheated oven. For cupcakes, bake 20 to 25 minutes. Cake is done when it springs back to the touch."
      }
    ]
  }
]
I need some help defining a table that supports JSON and inserting the JSON above into that table.
1. You may create a table like below:
create table myTable
(
  Id int identity primary key,
  Data nvarchar(max)
)
The Data column is where your JSON data is stored.
2. Create a stored procedure like below:
create procedure InsertJSON(@json nvarchar(max))
as begin
  insert into myTable(Data)
  values(@json)
end
3. Execute the stored procedure, e.g.:
exec InsertJSON '{"Price":10455,"Color":"White","tags": ["toy","children","games"]}'
and check that the JSON data has been stored in myTable.
4. Try to query the JSON data with the built-in JSON_VALUE() and JSON_QUERY() functions:
select JSON_VALUE(Data, '$.Price') from myTable
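If you later need pieces of the stored recipe JSON as rows and columns rather than single values, OPENJSON (available in Azure SQL Database and SQL Server 2016+) can shred it. A minimal sketch against the myTable schema above, assuming the recipe array from the question is what was inserted:

-- shreds each recipe in the stored array, then its ingredients, into rows
select r.name, i.amount, i.product
from myTable
cross apply openjson(Data)
with (
  name nvarchar(100) '$.name',
  ingredients nvarchar(max) '$.ingredients' as json
) r
cross apply openjson(r.ingredients)
with (
  amount float '$.amount',
  product nvarchar(200) '$.product'
) i;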
Finally, you may check out this link.

How do I make custom reordering for products in a collection in Shopify?

How do I make custom reordering for products in a collection in Shopify? For example, I have a 'test' collection and I want to reorder its products using a product tag: I put a tag like 'firstshowup' on some products in the 'test' collection, so when a customer clicks the 'test' collection they see the products with the 'firstshowup' tag first and then the rest. So what I am trying to do here is reorder using a custom ordering, not a built-in one like best selling, alphabetical, or date created.
Thank you so much in advance!
Your collection will be made up of Collect objects, which have a position attribute. Assuming you're using a CustomCollection, you can modify the position of the Collects by updating the CustomCollection: http://api.shopify.com/customcollection.html#update
From the examples, to update a collection (note that sort_order must be "manual" for explicit collect positions to take effect), you can use:
PUT /admin/custom_collections/#{id}.json
With the following payload:
{
  "custom_collection": {
    "body_html": "<p>The best selling ipod ever</p>",
    "handle": "ipods",
    "id": 841564295,
    "published_at": "2008-02-01T19:00:00-05:00",
    "sort_order": "manual",
    "template_suffix": null,
    "title": "IPods",
    "updated_at": "2008-02-01T19:00:00-05:00",
    "image": {
      "created_at": "2012-12-11T12:01:29-05:00",
      "src": "http://cdn.shopify.com/s/files/1/0006/9093/3842/collections/ipod_nano_8gb.jpg?0"
    },
    "collects": [
      {
        "product_id": 921728736,
        "position": 1
      },
      {
        "id": 841564295,
        "position": 2
      }
    ]
  }
}