I want to merge data using lodash [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed last month.
[
  {
    "label": "google_da_desktop_1",
    "data": [
      0,
      22.596473199939744,
      32.932481810474904,
      41.05015952885718,
      47.99635538700579,
      54.183927487925395,
      59.82719604246855,
      65.0546652236101,
      69.95069924256464
    ],
    "week": 1,
    "group": "google_da_desktop",
    "groupNo": 0
  },
  {
    "label": "google_da_desktop_2",
    "data": [
      0,
      22.596473199939744,
      32.932481810474904,
      41.05015952885718,
      47.99635538700579,
      54.183927487925395,
      59.82719604246855,
      65.0546652236101,
      69.95069924256464
    ],
    "week": 1,
    "group": "google_da_desktop",
    "groupNo": 0
  }
]
I want to group these objects by their "group" key and sum their "data" arrays element-wise. Is there a way to do that?
I tried using lodash's groupBy, but it didn't work.
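No answer was posted. A minimal sketch of one approach, assuming all "data" arrays within a group have equal length; the lodash equivalents are noted in comments, but the sketch itself is dependency-free so it runs standalone:

```javascript
// Group objects by their `group` key and sum `data` arrays element-wise.
// With lodash you could use _.groupBy(items, 'group') and then
// _.zipWith(...arrays, (...vals) => _.sum(vals)) per group.
// Assumes every `data` array in a group has the same length.
function sumByGroup(items) {
  const groups = {};
  for (const item of items) {
    const key = item.group;
    if (!groups[key]) {
      // First item of this group: copy its data array.
      groups[key] = { group: key, data: item.data.slice() };
    } else {
      // Subsequent items: add element-wise.
      groups[key].data = groups[key].data.map((v, i) => v + item.data[i]);
    }
  }
  return Object.values(groups);
}

const merged = sumByGroup([
  { label: 'google_da_desktop_1', data: [0, 10, 20], group: 'google_da_desktop' },
  { label: 'google_da_desktop_2', data: [0, 10, 20], group: 'google_da_desktop' },
]);
console.log(merged); // one object per group, data summed element-wise: [0, 20, 40]
```

With your sample input this yields a single object for "google_da_desktop" whose data array is the element-wise sum of the two inputs.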

Related

Querying Line Items of Order with JSON Functions in BigQuery

I have been banging my head here for the past 2 hours with all the available JSON_... functions in BigQuery. I've read quite a few questions here, but no matter what I try, I never succeed in extracting the "amounts" from my JSON below.
This is my JSON stored in a BQ column:
{
  "lines": [
    {
      "id": "70223039-83d6-463d-a482-7ce4d50bf0fc",
      "charges": [
        { "type": "price", "amount": 50.0 },
        { "type": "discount", "amount": -40.00 }
      ]
    },
    {
      "id": "70223039-83d6-463d-a482-7ce4d50bf0fc",
      "charges": [
        { "type": "price", "amount": 20.00 },
        { "type": "discount", "amount": 0.00 }
      ]
    }
  ]
}
Imagine the above being an order containing multiple items.
I am trying to get a sum of all amounts => 50-40+20+0. The result needs to be 30 = the total order price.
Is it possible to pull all the amount values and then have them summed up just via SQL without any custom JS functions? I guess the summing is the easy part - getting the amounts into an array is the challenge here.
Use the query below:
select (
select sum(cast(json_value(charge, '$.amount') as float64))
from unnest(json_extract_array(order_as_json, '$.lines')) line,
unnest(json_extract_array(line, '$.charges')) charge
) total
from your_table
If applied to the sample data in your question, the output is 30.

I am trying to access data stored in a Snowflake table from Python using SQL.

Below is a data sample; I want to access the columns value and start. This data is dumped into one column (DN) of a table (stg):
{
  "ok": true,
  "metrics": [
    {
      "name": "t_in",
      "data": [{ "value": 0, "group": { "start": "00:00" } }]
    },
    {
      "name": "t_out",
      "data": [{ "value": 0, "group": { "start": "00:00" } }]
    }
  ]
}
(Consider many such JSON documents stored in the same column in different rows.)
The query below only fetched the data for name. I want to access the other column values as well. This query is part of a Python script.
select
replace(DN : metrics[0].name , '"' , '')as metrics_name, #able to get
replace(DN : metrics[2].data , '"' , '')as metrics_data_value,##suggestion needed
replace(DN : metrics.data.start, '"','') as metrics_start, ##suggestion needed
replace(DN : metrics.data.group.finish, '"','') as metrics_finish ##suggestion needed
from stg
Do I need to iterate over data and group? If yes, please suggest the code.
Here is an example of how to query that data.
Set up sample data:
create or replace transient table test_db.public.stg (DN variant);
insert overwrite into test_db.public.stg (DN)
select parse_json('{
"ok": true,
"metrics": [
{
"name": "t_in",
"data": [
{"value": 0, "group": {"start": "00:00"}}
]
},
{
"name": "t_out",
"data": [
{"value": 0,"group": {"start": "00:00"}}
]
}
]
}');
Select statement example:
select
DN:metrics[0].name::STRING,
DN:metrics[1].data,
DN:metrics[1].data[0].group.start::TIME,
DN:metrics[1].data[0].group.finish::TIME
from test_db.public.stg;
Instead of querying individual indexes of the JSON arrays, I think you'll want to use the FLATTEN table function, which is documented in the Snowflake reference. Here is how you do it with FLATTEN, which is what I am guessing you want:
select
mtr.value:name::string,
dta.value,
dta.value:group.start::string,
dta.value:group.finish::string
from test_db.public.stg stg,
lateral flatten(input => stg.DN:metrics) mtr,
lateral flatten(input => mtr.value:data) dta

Querying BigQuery Events data in PowerBI

Hi, I have analytics events data moved from Firebase to BigQuery and need to create a visualization in PowerBI using that BigQuery dataset. I'm able to access the dataset in PowerBI, but some fields are array-typed. I generally use UNNEST when querying in the console, but how do I run such a query inside PowerBI? Is there any other option available? Thanks.
[Screenshot: table in BigQuery]
What we did until the driver fully supports arrays is to flatten in a view: create a view in BigQuery with UNNEST() and query that view in PBI instead.
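A minimal sketch of such a flattening view. The dataset and table names (analytics.events, analytics.events_flat) are hypothetical placeholders; the event_params key/value structure follows the standard Firebase export schema, so adjust the names to your own export:

```sql
-- Hypothetical names: replace analytics.events with your exported table.
CREATE OR REPLACE VIEW analytics.events_flat AS
SELECT
  e.event_name,
  e.event_timestamp,
  p.key                AS param_key,
  p.value.string_value AS param_string_value,
  p.value.int_value    AS param_int_value
FROM analytics.events AS e,
     UNNEST(e.event_params) AS p;
```

Then point PowerBI at analytics.events_flat instead of the raw table, so the driver never sees an array column.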
You might need to Transform (parse JSON into columns/rows) your specific column, in your case event_params.
Here is a sample JSON as an example:
{
  "quiz": {
    "sport": {
      "q1": {
        "question": "Which one is correct team name in NBA?",
        "options": [
          "New York Bulls",
          "Los Angeles Kings",
          "Golden State Warriros",
          "Huston Rocket"
        ],
        "answer": "Huston Rocket"
      }
    },
    "maths": {
      "q1": {
        "question": "5 + 7 = ?",
        "options": ["10", "11", "12", "13"],
        "answer": "12"
      },
      "q2": {
        "question": "12 - 8 = ?",
        "options": ["1", "2", "3", "4"],
        "answer": "4"
      }
    }
  }
}
I added this JSON to my table; currently it has only 1 column.
Now go to Edit Queries and open the Transform tab; there you will find Parse. In my case I have JSON.
When you parse as JSON you will have an expandable column.
Click to expand it; sometimes it asks whether to expand to new rows.
Finally you will have such a table.

View and Table collective name/type? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 4 years ago.
I want to create a database table which has a column that stores the name of a table or view. Calling my column "ViewOrTableName" seems a bit clumsy. Is there a generic type/name that could mean "table" or "view"?
Specifically I'm doing this in MS SQL, if the vernacular varies.
The T-SQL documentation for FROM says
FROM (Transact-SQL)
Specifies the tables, views, derived tables, and joined tables used in
DELETE, SELECT, and UPDATE statements in SQL Server 2017.
Syntax
[ FROM { <table_source> } [ ,...n ] ]
<table_source> ::=
{
table_or_view_name [ [ AS ] table_alias ]
[ <tablesample_clause> ]
[ WITH ( < table_hint > [ [ , ]...n ] ) ]
| rowset_function [ [ AS ] table_alias ]
[ ( bulk_column_alias [ ,...n ] ) ]
| user_defined_function [ [ AS ] table_alias ]
| OPENXML <openxml_clause>
| derived_table [ [ AS ] table_alias ] [ ( column_alias [ ,...n ] ) ]
| <joined_table>
| <pivoted_table>
| <unpivoted_table>
| #variable [ [ AS ] table_alias ]
| #variable.function_call ( expression [ ,...n ] )
[ [ AS ] table_alias ] [ (column_alias [ ,...n ] ) ]
| FOR SYSTEM_TIME <system_time>
}
So maybe you'd like to use table_source, or table_or_view_name.

JSON input in pentaho data integration [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
I have JSON inside a JSON array coming from MongoDB.
{
  "_id": { "$oid": "54b76bce44ae90e9e919d6e1" },
  "_class": "com.argusoft.hkg.nosql.model.HkEventDocument",
  "featureName": "EVENT",
  "instanceId": 577,
  "fieldValue": {
    "reg_label_type": "types",
    "third_custom_field": "tttttttttttttttt",
    "people_capacity1": "20"
  },
  "sectionList": [
    {
      "sectionName": "REGISTRATION",
      "customFields": [
        { "_id": 577, "fieldValue": { "multiselect1": ["[]"], "datess": { "$date": "2015-01-16T18:30:00.000Z" } } }
      ]
    },
    { "sectionName": "INVITATIONCARD", "customFields": [] }
  ],
  "franchiseId": 2
}
And I want to access the fieldValue entries in the JSON Input step. How can I do that?
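No answer was posted. As a hedged sketch: Pentaho Data Integration's JSON Input step maps each output field to a JSONPath expression, so paths along these lines (written against the document above; field names are taken from it, but verify them against your actual rows) would reach the nested fieldValue entries:

```
$.fieldValue.reg_label_type
$.sectionList[*].customFields[*].fieldValue.multiselect1
$.sectionList[*].customFields[*].fieldValue.datess['$date']
```

Each path goes in the Path column of the step's Fields tab; the [*] wildcards emit one row per matching array element.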