Managing complex JSON - ruby-on-rails-3

I have a complex input JSON, something like this:
#json = {"project":"bla",
"analysis": {"id":"123","title":"Test"}, 
"data":{"axis": {"name":"column", "label":"demo","values": ["one", "two"]},
"series": [{"label":"text", "values":["1", "2", "3"]},
{"label":"text2", "values":["4", "5", "6"]}
]}}
(more complex and lengthy than this example)
Two questions:
1. Is there a RoR method that can delete something from the JSON on the fly? Something like #json.destroy['analysis'] that eliminates only the "analysis" key/value pair?
2. How can I navigate (for example) the values of the series? I can do it with a simple each if the JSON is flat, but here do I have to nest each calls?

You can convert the JSON to a hash, remove the key, and then convert back to JSON:
require "json"
hash = JSON.parse(#json)
hash.delete("analysis")
#json = hash.to_json
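For the second question, once the JSON is parsed into a hash you can walk the nested structure with nested each calls. A minimal sketch using the structure from the question:
require "json"
json = '{"project":"bla",
        "analysis":{"id":"123","title":"Test"},
        "data":{"axis":{"name":"column","label":"demo","values":["one","two"]},
        "series":[{"label":"text","values":["1","2","3"]},
                  {"label":"text2","values":["4","5","6"]}]}}'
hash = JSON.parse(json)
# iterate over every series, then over that series' values
hash["data"]["series"].each do |series|
  puts series["label"]
  series["values"].each do |value|
    puts "  #{value}"
  end
end
Each inner each gives you the values of one series; a deeper structure just means one more nested each.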

Related

Apply multiple format rules to a single rjsf field

I like the rjsf format API; it works great for me:
"format": "alphanumeric"
But I would like to assign multiple format rules to a single field and use the transformErrors API to display a different message for each, giving the user more precise feedback about what's wrong. Something along the lines of:
"format": ["alphanumeric", "mustBeginWithLetter"]
but this array notation doesn't work and breaks the formatting instead :)
Is there a clean way to achieve what I want?
"allOf": [ {"format": "alphanumeric"}, {"format": "mustBeginWithLetter"} ]

How to extract data from array in a JSON message using CloudWatch Logs Insights?

I log messages that are JSON objects. The JSON has an array that contains key/value pairs:
{
...
"arr": [{"key": "foo", "value": "bar"}, ...],
...
}
Now I want to filter results that contain a specific key and extract the values for that key in the array.
I've tried using a regex, something like parse @message /.*"key":"my_specific_key","value":(?<value>.*}).*/, which extracts the value but also returns the rest of the message. It also doesn't filter the results.
How can I filter results and extract the values for a specific key?
If the log entries in your CloudWatch log group are actually showing up as JSON, you can reference a key directly anywhere you would use a field
(you don't need the @; CloudWatch adds that prefix automatically to its default fields).
If you are using Python, you can use aws_lambda_powertools to do this as well, in a very slick way (and it's an actual AWS product).
If they are showing up in your log as a string, then it may be an escaped string and you'll have to match it exactly, including spaces and whatnot. When you parse, you will want to do something like this:
If this is the string of your log message: '{"AKey" : "AValue", "Key2" : "Value2"}'
parse @message "{\"*\" : \"*\", \"*\" : \"*\"}" as akey, akey_value, key2, key2_value
Then you can filter, count, or do anything else against those variables. parse is specifically a statement that matches a pattern and assigns each wildcard to a variable, one at a time, in order.
Though with a complex JSON, if your regex above works, then all you need is a filter statement:
fields @message
| parse @message ... your regex ... as value_var
| filter value_var like /some more regex/
If it's not a string in the log entry but actual JSON, you can just reference the key:
filter a_key = "some value" (or a regex with like)
See https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/CWL_AnalyzeLogData-discoverable-fields.html for more info.
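Putting that together for the array in the question, a rough sketch of a query (the key name my_specific_key comes from your parse attempt; the regex assumes the value is a quoted string with no embedded quotes):
fields @timestamp, @message
| parse @message /"key":"my_specific_key","value":"(?<extracted_value>[^"]+)"/
| filter ispresent(extracted_value)
| display @timestamp, extracted_value
The filter on ispresent(extracted_value) drops every log line where the parse didn't match, which covers the filtering half of the question.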

Need Pentaho JSON without array

I want to output JSON data not as an array object. I made the changes mentioned in the Pentaho documentation, but the output is always an array, even for a single set of values. I am using PDI 9.1 and I tested using the ktr from the link below:
https://wiki.pentaho.com/download/attachments/25043814/json_output.ktr?version=1&modificationDate=1389259055000&api=v2
The statement below is from https://wiki.pentaho.com/display/EAI/JSON+output:
Another special case is when 'Nr. rows in a block' = 1.
If used with empty json block name output will looks like:
{
"name" : "item",
"value" : 25
}
My output comes out like below:
{ "": [ {"name":"item","value":25} ] }
I resolved it myself. I added another JSON Input step with the path defined as below:
$.wellDesign[0] to get the array as a string object

Is it possible to prevent ORDS from escaping my GeoJSON?

I have a problem with Oracle ORDS escaping the quotes in my GeoJSON:
{
"id": 1,
"city": "New York",
"state_abrv": "NY",
"location": "{\"type\":\"Point\",\"coordinates\":[-73.943849, 40.6698]}"
}
In the Oracle DB it is stored correctly:
{"type":"Point","coordinates":[-73.943849, 40.6698]}
I need help figuring out why the escaped quotes are added and how to prevent this from happening.
Add this column alias to your RESTful service handler query for the JSON column:
SELECT id,
jsons "{}jsons" --this one
FROM table_with_json
Then when ORDS sees the data for the column, it won't escape it as a string, because the alias tells it the value already IS JSON.
You can use whatever name you want; in your case it should probably be:
"{}location"

How to extract this json into a table?

I have a SQL column filled with a JSON document, one per row:
[{
"ID":"TOT",
"type":"ABS",
"value":"32.0"
},
{
"ID":"T1",
"type":"ABS",
"value":"9.0"
},
{
"ID":"T2",
"type":"ABS",
"value":"8.0"
},
{
"ID":"T3",
"type":"ABS",
"value":"15.0"
}]
How is it possible to transform it into tabular form? I tried with the Redshift json_extract_path_text and JSON_EXTRACT_ARRAY_ELEMENT_TEXT functions, and I also tried with json_each and json_each_text (on Postgres), but didn't get what I expected... any suggestions?
The desired results should appear like this:
T1 T2 T3 TOT
9.0 8.0 15.0 32.0
I assume you printed 4 rows. In PostgreSQL,
SELECT this_column->'ID'
FROM that_table;
will return a column with JSON values. Use ->> if you want a text column. More info here: https://www.postgresql.org/docs/current/static/functions-json.html
In case you were using some old Postgresql (before 9.3), this gets harder : )
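If each row instead holds the whole array from the question, you can unnest it and pivot it into the desired shape. A sketch for PostgreSQL 9.4+, reusing the hypothetical that_table / this_column names from above:
SELECT max(elem->>'value') FILTER (WHERE elem->>'ID' = 'T1')  AS t1,
       max(elem->>'value') FILTER (WHERE elem->>'ID' = 'T2')  AS t2,
       max(elem->>'value') FILTER (WHERE elem->>'ID' = 'T3')  AS t3,
       max(elem->>'value') FILTER (WHERE elem->>'ID' = 'TOT') AS tot
FROM that_table
CROSS JOIN LATERAL jsonb_array_elements(this_column::jsonb) AS elem;
-- with more than one row in that_table, add its primary key to the SELECT list and GROUP BY it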
Your best option is to use COPY from JSON Format. This will load the JSON directly into a normal table format. You then query it as normal data.
However, I suspect that you will need to slightly modify the format of the file by removing the outer [...] square brackets and also the commas between records, eg:
{
"ID": "TOT",
"type": "ABS",
"value": "32.0"
}
{
"ID": "T1",
"type": "ABS",
"value": "9.0"
}
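With the file reshaped like that, the COPY itself would look roughly like this (table, bucket, and role names are placeholders):
COPY readings
FROM 's3://my-bucket/path/data.json'
IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
FORMAT AS JSON 'auto';
-- 'auto' maps each JSON key to the table column of the same name
If the key case doesn't match your column names, Redshift also accepts JSON 'auto ignorecase'.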
If, however, your data is already loaded and you cannot re-load the data, you could either extract the data into a new table, or add additional columns to the existing table and use an UPDATE command to extract each field into a new column.
Or, very worst case, you can use one of the JSON Functions to access the information in a JSON field, but this is very inefficient for large requests (eg in a WHERE clause).
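For that worst case, a small sketch with the Redshift JSON functions you already mentioned (again assuming the hypothetical that_table / this_column names, and that element 0 of the array is the TOT entry as in your sample):
SELECT json_extract_path_text(
         json_extract_array_element_text(this_column, 0),
         'value') AS tot_value
FROM that_table;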