Cannot extract data from a FaunaDB field whose value is a list - faunadb

I have a very simple collection in FaunaDB. A document looks like this:
data: {
  field1: { },
  field2: { },
  field3: { },
  field4: [
    { },
    { }
  ]
}
Fields 1, 2, and 3 can be extracted using an indexed field (i.e. any of fields 1, 2, or 3), but field 4 always returns null.
I use:
Paginate(Match(Index("index_name"), "searchfield"))
Is there something I'm missing?
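For reference, here is a minimal sketch of the same query issued through the Python driver (the secret, index name, and search term are placeholders). One common pattern for getting whole documents back, including list-valued fields such as field4, is to Map/Get over the paginated refs rather than relying on the index values alone:

from faunadb import query as q
from faunadb.client import FaunaClient

client = FaunaClient(secret="YOUR_FAUNA_SECRET")  # placeholder secret

# Paginate the index matches, then Get each ref so the full document
# (including the list-valued field4) is returned, not just the indexed values.
result = client.query(
    q.map_(
        q.lambda_("ref", q.get(q.var("ref"))),
        q.paginate(q.match(q.index("index_name"), "searchfield"))
    )
)
print(result)

Note that an index only returns refs (or whatever is configured in its values definition), so if field4 should come back directly from the index, its values configuration has to include that field.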

Flatten complex json using Databricks and ADF

I have the following JSON, which I have partially flattened using explode:
{
  "result": [
    {
      "employee": [
        {
          "employeeType": {
            "name": "[empName]",
            "displayName": "theName"
          },
          "groupValue": "value1"
        },
        {
          "employeeType": {
            "name": "#bossName#",
            "displayName": "theBoss"
          },
          "groupValue": [
            {
              "id": "1",
              "type": {
                "name": "firstBoss",
                "displayName": "CEO"
              },
              "name": "Martha"
            },
            {
              "id": "2",
              "type": {
                "name": "secondBoss",
                "displayName": "cto"
              },
              "name": "Alex"
            }
          ]
        }
      ]
    }
  ]
}
I need to get the following fields:
employeeType.name
groupValue
I am able to extract those fields and their values. But when the name value starts with #, as in "name":"#bossName#", I get groupValue back as a string, from which I need to extract id and name:
"groupValue":[
{
"id":"1",
"type":{
"name":"firstBoss",
"displayName":"CEO"
},
"name":"Martha"
},
{
"id":"2",
"type":{
"name":"secondBoss",
"displayName":"cto"
},
"name":"Alex"
}
]
How can I convert this string to JSON and get the values?
My code so far:
from pyspark.sql.functions import *

db_flat = (df.select(explode("result.employee").alias("emp"))
           .withColumn("emp_name", col("emp.employeeType.name"))
           .withColumn("emp_val", col("emp.groupValue"))
           .drop("emp"))
How can I extract groupValue from db_flat and get id and name from it? Maybe using the Python pandas library?
Since, as you can see, the values won't be dynamic, you can traverse through the JSON while mapping, as below. Just identify each record and array, and specify the index [i] as needed.
Example:
id --> $['employee'][1]['groupValue'][0]['id']
name --> $['employee'][1]['groupValue'][0]['type']['name']
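If you would rather do the parsing on the Spark side instead of in the mapping, one option (a rough sketch, assuming the db_flat DataFrame from the question and that emp_val holds the stringified groupValue) is to declare a schema for the array and parse it with from_json:

from pyspark.sql.functions import from_json, col, explode

# Schema of the stringified groupValue array (only the fields needed here).
group_schema = "array<struct<id:string, name:string, type:struct<name:string, displayName:string>>>"

parsed = (db_flat
          # Non-JSON values such as "value1" parse to null and are dropped by explode;
          # use explode_outer instead if those rows should be kept.
          .withColumn("gv", explode(from_json(col("emp_val"), group_schema)))
          .select("emp_name",
                  col("gv.id").alias("id"),
                  col("gv.name").alias("name")))

parsed.show()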

'match each' one element in the array [duplicate]

My question is about selective asserting with 'match each'.
Below is a sample JSON body:
* def data =
"""
{
  "companies": [
    {
      "companyDetails": {
        "name": "companyName",
        "location": {
          "address": "companyAddress",
          "street": "companyStreet"
        }
      }
    },
    {
      "companyDetails": {
        "name": "companyName",
        "location": {
          "address": "companyAddress",
          "street": "companyStreet"
        }
      }
    }
  ]
}
"""
How can we assert whether each 'companyDetails' object in the response contains a certain element, e.g. name only?
* match each data.companies contains { companyDetails: { name: '#notnull' } }
When I use the above step, it returns the error below:
$[0] | actual does not contain expected | all key-values did not match, expected has un-matched keys
So, is there any way to assert only one field inside this object? For example, assert that each companyDetails object contains name, but ignore the other elements such as location? Thanks
This will work:
* match each data.companies == { companyDetails: { name: '#notnull', location: '#notnull' } }
This will also work:
* match data.companies contains deep { companyDetails: { name: '#notnull' } }
Sometimes it is better to transform the JSON before a match:
* match each data..companyDetails contains { name: '#notnull' }

Extract all occurrences of the same field from a request body in Splunk

I have the same field multiple times in one request body and need to find the value for each occurrence, e.g. the subTypeCodeId field. The result should be:
subTypeCodeId = 2
subTypeCodeId = 3
{
  "Items": [
    {
      "emailId": "#stny.com",
      "item": {
        "subTypeCodeId": "2"
      }
    },
    {
      "emailId": "#comcast.com",
      "item": {
        "subTypeCodeId": "3"
      }
    }
  ]
}
Splunk query:
index="gcp_prod_ecomm_cx_wallet" "1570081534220" "API_NAME:wallet.addItemsToWalletBulk" | rex "subTypeCodeId\x5C\":\x5C\"(?<subTypeCodeId>.*)\""
Use the max_match option of rex. It will make subTypeCodeId a multi-value field containing all values.
index="gcp_prod_ecomm_cx_wallet" "1570081534220" "API_NAME:wallet.addItemsToWalletBulk"
| rex max_match=0 "subTypeCodeId\x5C\":\x5C\"(?<subTypeCodeId>.*)\""
You might also want to look into the spath command, which can parse JSON data.

How to create a function that will add a value to an array in an object

I want to create a function that will add a grade for a specific student and subject.
This is how my document looks:
"_id" : ObjectId("5b454b545b4545b"),
"name" : "Bob",
"last_name" : "Bob",
"nr_album" : "222",
"grades" ; {
"IT" : [
3,
3,
5,
4
]
}
This is what I came up with:
function addGrade(nr_album, grades, value) {
  db.studenci.update(
    { nr_album: nr_album },
    { $push: { [grades]: value } }
  );
}
addGrade("222","grades.IT",100)
It's working properly, but what I want to achieve is to pass only "IT" in the parameters instead of "grades.IT".
You can use template strings (ES2015).
Pass the arguments like this:
addGrade("222", "IT", 100)
Take the "IT" parameter and build the required dotted string dynamically:
function addGrade(nr_album, grades, value) {
  const string = `grades.${grades}`;
  db.studenci.update({
    nr_album: nr_album
  }, {
    $push: { [string]: value }
  });
}
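For comparison, a rough equivalent in Python with pymongo builds the dotted field name the same way (a sketch; the studenci collection and field names come from the question, while the connection URI and database name are placeholders):

from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder URI
db = client["school"]  # placeholder database name

def add_grade(nr_album, subject, value):
    # Build the dotted path dynamically, e.g. "grades.IT"
    field = f"grades.{subject}"
    db["studenci"].update_one(
        {"nr_album": nr_album},
        {"$push": {field: value}}
    )

add_grade("222", "IT", 100)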

dojo 1.6 DataGrid cannot display lists?

In dojo 1.7.2, if I create a data store containing array values, dojox.grid.DataGrid displays them with no problem, separating the items with commas.
However, in dojo 1.6, it takes only the first element of my array. I have a project where I have to use version 1.6. Is there any workaround for this in that version?
To illustrate the problem, here are two examples:
On dojo 1.6 : http://jsfiddle.net/psoares/HbFNY/
On dojo 1.7 : http://jsfiddle.net/psoares/QLm65/
Thanks!
Apparently the problem comes from ItemFileReadStore rather than from the grid.
I modified my code for 1.6 to use ObjectStore and MemoryStore instead, and it worked.
See http://jsfiddle.net/psoares/HbFNY/16/
This is a flaw, and yet it is not. The construct of your JSON is not quite right, since a value is not allowed to be an array unless it is one of the childrenAttrs. It is only due to the nature of [1,2,3].toString() that your attempts at setting values as arrays appear to work.
The get/set in an ItemFileReadStore works with its items as such:
store._arrayOfAllItems = {
  value1: { values: [ 'realvalue' ] },
  value2: { values: [ 'realvalue' ] }
};
The getter then effectively does:
store.get = function(itemById, val) { return itemById[val][0]; };
// note that only the first array slot is pulled from the store
In your JSON construct, what prevents you from setting the values as follows?
var data = {
  id: 'id',
  label: 'id',
  items: [
    {
      id: "value1",
      values: "a,b,c" // permissible string value
    },
    {
      id: "value2",
      values: "foo"
    }
  ]
};
If you want multiple values under the same key for one ID, then you must deliver the data as children and handle them as such, like:
data: {
  id: 'id',
  label: 'id',
  childrenAttrs: [ 'items', 'children' ], // << default behavior
  items: [ {
    id: "value1",
    children: [
      { id: "value1_1", values: 'a' },
      { id: "value1_2", values: 'b' },
      { id: "value1_3", values: 'c' }
    ]
  }, {
    id: "value2",
    values: "foo"
  } ]
}
However, only the dojox.grid.TreeGrid will allow using multi-level data stores.