Athena / Presto query - SQL

I have a JSON document saved in an Athena table like this:
{
"VALIDATION_TYPE": "ROW_BY_ROW",
"DATABASE": "erp",
"TABLES": {
"APPLICATION_STATUS_TYPE": {
"BATCH_VALIDATION": {
"BATCHES": [{
"0": {
"FAILED": "FALSE",
"FAILURE_MSG": ""
}
}, {
"1": {
"FAILED": "TRUE",
"FAILURE_MSG": "NULL POINTER EXCEPTION"
}
}]
}
},
"APPLICATION": {
"BATCH_VALIDATION": {
"BATCHES": [{
"0": {
"FAILED": "FALSE",
"FAILURE_MSG": ""
}
}, {
"1": {
"FAILED": "TRUE",
"FAILURE_MSG": "NULL POINTER EXCEPTION"
}
}]
}
}
}
}
I need to write a query in Athena that finds all the FAILED=TRUE records, like below.
output:
VALIDATION_TYPE,DATABASE,TABLE,ID,FAILED,FAILURE_MSG
----------------------------------------------------
ROW_BY_ROW,erp,APPLICATION_STATUS_TYPE,1,TRUE,NULL POINTER EXCEPTION
ROW_BY_ROW,erp,APPLICATION,1,TRUE,NULL POINTER EXCEPTION
I have tried various functions like TRANSFORM, UNNEST, JSON_EXTRACT, etc., but no luck yet. Please advise if there are any specific functions I can use.
Thanks in advance

One trick you can use is to cast parts of the JSON as map(varchar, ...) and/or arrays:
-- sample data
with dataset(json_val) as (
values (json '{
"VALIDATION_TYPE": "ROW_BY_ROW",
"DATABASE": "erp",
"TABLES": {
"APPLICATION_STATUS_TYPE": {
"BATCH_VALIDATION": {
"BATCHES": [{
"0": {
"FAILED": "FALSE",
"FAILURE_MSG": ""
}
}, {
"1": {
"FAILED": "TRUE",
"FAILURE_MSG": "NULL POINTER EXCEPTION"
}
}]
}
},
"APPLICATION": {
"BATCH_VALIDATION": {
"BATCHES": [{
"0": {
"FAILED": "FALSE",
"FAILURE_MSG": ""
}
}, {
"1": {
"FAILED": "TRUE",
"FAILURE_MSG": "NULL POINTER EXCEPTION"
}
}]
}
}
}
}')
)
-- query
select json_extract_scalar(json_val, '$.VALIDATION_TYPE') VALIDATION_TYPE,
json_extract_scalar(json_val, '$.DATABASE') DATABASE,
t1.k "TABLE",
t3.k ID,
t3.map_v['FAILED'] FAILED,
t3.map_v['FAILURE_MSG'] FAILURE_MSG
from dataset
-- level 1: turn the TABLES object into one row per table name (k) with its JSON value (v)
, unnest(cast(json_extract(json_val, '$.TABLES') as map(varchar, json))) as t1(k, v)
-- level 2: turn the BATCHES array into one row per batch entry (a map of id -> details)
, unnest(cast(json_extract(t1.v, '$.BATCH_VALIDATION.BATCHES') as array(map(varchar, map(varchar, json))))) as t2(m)
-- level 3: turn each batch map into one row per id (k) with its details map (map_v)
, unnest(t2.m) as t3(k, map_v);
Output:
VALIDATION_TYPE,DATABASE,TABLE,ID,FAILED,FAILURE_MSG
----------------------------------------------------
ROW_BY_ROW,erp,APPLICATION,0,FALSE,
ROW_BY_ROW,erp,APPLICATION,1,TRUE,NULL POINTER EXCEPTION
ROW_BY_ROW,erp,APPLICATION_STATUS_TYPE,0,FALSE,
ROW_BY_ROW,erp,APPLICATION_STATUS_TYPE,1,TRUE,NULL POINTER EXCEPTION
And then you can apply the filtering.
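For example, a minimal sketch of that filtering step, reusing the sample-data CTE from above (the casts to varchar are there because the map values come back as JSON):
-- same query as above, with the failed-only filter appended
select json_extract_scalar(json_val, '$.VALIDATION_TYPE') VALIDATION_TYPE,
json_extract_scalar(json_val, '$.DATABASE') DATABASE,
t1.k "TABLE",
t3.k ID,
cast(t3.map_v['FAILED'] as varchar) FAILED,
cast(t3.map_v['FAILURE_MSG'] as varchar) FAILURE_MSG
from dataset
, unnest(cast(json_extract(json_val, '$.TABLES') as map(varchar, json))) as t1(k, v)
, unnest(cast(json_extract(t1.v, '$.BATCH_VALIDATION.BATCHES') as array(map(varchar, map(varchar, json))))) as t2(m)
, unnest(t2.m) as t3(k, map_v)
where cast(t3.map_v['FAILED'] as varchar) = 'TRUE';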

Indexing Firebase Realtime Database

What I am trying to accomplish is accessing a single item in my database. I set up my indexing, but when I query it, I get back an empty object.
Accessing it like this returns an empty object:
https://yourapp.firebaseio.com/ticket.json?orderBy=%22TicketNumber%22&equalTo=2240
{
"-N6sEsiL25tQSzlNjvc4": {
"form": {
"Company": "Test",
"CompanyInvolved": "Test",
"DateCompleted": "2022-07-27T21:51:00.000Z",
"DateDepartShop": "2022-07-19T15:43:00.000Z",
"DateDepartSite": "2022-07-21T18:47:00.000Z",
"Details": "D2",
"ItemCode": "IC2",
"MoreDetails": [
[
{
"Details": "D1",
"ItemCode": "IC1",
"Quantity": "Q1"
}
]
],
"Quantity": "Q2",
"RecievedBy": "Test",
"Signature": "file:///Users/arod/Library/Developer/CoreSimulator/Devices/4498CCA0-C6F4-4E9F-BCAE-19ADB385E758/data/Containers/Data/Application/F0A9086E-A773-43C3-861F-91D3666AB627/Library/Caches/ExponentExperienceData/%2540arod1207%252FAOS/sign.png",
"ThirdParty": true,
"TicketNumber": 2240,
"TimeArriveShop": "2022-07-21T21:51:00.000Z",
"TimeArriveSite": "2022-07-19T18:47:00.000Z",
"TimeDepartShop": "2022-07-19T18:43:00.000Z",
"TimeDepartSite": "2022-07-21T18:51:00.000Z",
"TodaysDate": "2022-07-20T15:43:00.000Z"
}
},
My rules are as follows:
{
"rules": {
".read": "true",
".write": "true",
"ticket": {
".indexOn": "TicketNumber"
}
}
}
Since the value you want to index lives under form/TicketNumber of each direct child node of the ticket path, that is also what you must define the index for.
So:
{
"rules": {
".read": "true",
".write": "true",
"ticket": {
".indexOn": "form/TicketNumber"
}
}
}
The same applies to the query: form/TicketNumber is also the path you need to specify in the orderBy parameter.
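For example, a sketch reusing the URL from the question (only the quoted orderBy value changes to the deeper path):
https://yourapp.firebaseio.com/ticket.json?orderBy=%22form/TicketNumber%22&equalTo=2240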

How to check a particular value based on a condition in Karate

Goal: match that the check value is correct for the 123S and 123O entries in the API response.
First, check the value at x.details[0].user.school.name[0].codeable.text; if it is 123S, then check that the x.details[0].data.check value is abc.
Then, if the value at x.details[1].user.school.name[0].codeable.text is 123O, check that x.details[1].data.check is xyz.
The order of the array elements changes: it is not guaranteed that the first element is 123S; sometimes the API returns 123O as the first element.
Sample JSON:
{
"type": "1",
"array": 2,
"details": [
{
"path": "path",
"user": {
"school": {
"name": [
{
"value": "this is school",
"codeable": {
"details": [
{
"hello": "yty",
"condition": "check1"
}
],
"text": "123S"
}
}
]
},
"sample": "test1",
"id": "22222"
},
"data": {
"check": "abc"
}
},
{
"path": "path",
"user": {
"school": {
"name": [
{
"value": "this is school",
"codeable": {
"details": [
{
"hello": "def",
"condition": "check2"
}
],
"text": "123O"
}
}
]
},
"sample": "test",
"id": "11111"
},
"data": {
"check": "xyz"
}
}
]
}
This is how I did it in Postman, but how do I replicate the same in Karate?
var jsonData = pm.response.json();
pm.test("Body matches string", function () {
for(var i=0;i<jsonData.details.length;i++){
if(jsonData.details[i].user.school.name[0].codeable.text == '123S')
{
pm.expect(jsonData.details[i].data.check).to.equal('abc');
}
if(jsonData.details[i].user.school.name[0].codeable.text == '123O')
{
pm.expect(jsonData.details[i].data.check).to.equal('xyz');
}
}
});
2 lines. And this takes care of any number of combinations of lookup values :)
* def lookup = { '123S': 'abc', '123O': 'xyz' }
* match each response.details contains { data: { check: '#(lookup[_$.user.school.name[0].codeable.text])' } }
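For reference, inside a match each expression _$ refers to the current element of response.details, so the expected check value is looked up from that element's own codeable.text rather than from a fixed array index.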

PostgreSQL query to extract attributes in JSON

I have the below JSON in a particular DB column. I need a query to extract the fields stored within savings_rate (to and from).
{
"data": [
{
"data": {
"intro_visited": {
"portfolio_detail_investment_journey": true,
"dashboard_investments": true,
"portfolio_list_updates": true,
"portfolio_detail_invested": true,
"portfolio_list_offering": true,
"dashboard_more_bottom_bar": true
}
},
"type": "user_properties",
"schema_version": "1"
},
{
"data": {
"savings_info": {
"remind_at": 1583475493291,
"age": 100,
"savings_rate": {
"to": "20",
"from": "4"
},
"recommendation": {
"offering_name": "Emergency Fund",
"amount": "1,11,111",
"offering_status": "not_invested",
"ideal_amount": "1,11,111",
"offering_code": "liquid"
}
}
},
"type": "savings_info",
"schema_version": "1"
}
]
}
To get the "To"
$..data.savings_info.savings_rate.to
To get the "From"
$..data.savings_info.savings_rate.from
This query works (note that the savings_info element is the second item of the "data" array, i.e. index 1):
SELECT
<column> -> 'data' -> 1 -> 'data' -> 'savings_info' -> 'savings_rate' ->> 'to' AS to_rate
from <table>
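If the position of the savings_info element inside the "data" array is not fixed, a hedged alternative (assuming the column is of type jsonb; add a ::jsonb cast if it is stored as json or text) is to expand the array instead of indexing into it:
-- expand the "data" array, keep only the savings_info element,
-- then read both bounds of savings_rate
SELECT elem -> 'data' -> 'savings_info' -> 'savings_rate' ->> 'to'   AS to_rate,
       elem -> 'data' -> 'savings_info' -> 'savings_rate' ->> 'from' AS from_rate
FROM <table>,
     jsonb_array_elements(<column> -> 'data') AS elem
WHERE elem ->> 'type' = 'savings_info';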

JSON Schema - anyOf within conditional?

I am trying to define a JSON schema with conditionals. I built a minimal example which already doesn't work as I expect.
The object I want to validate is:
{
"keiner": false,
"abdominal": true,
"zervikal": false
}
The conditional rule is simple. When "keiner" is true, both other values have to be false. If "keiner" is false, at least one of the other two has to be true.
I wrote this schema:
{
"type": "object",
"properties": {
"keiner": { "type": "boolean" },
"abdominal": { "type": "boolean" }
},
"if": {
"properties": {
"keiner": { "const": true }
}
},
"then": {
"properties" : {
"abdominal": { "const": false },
"zervikal": {"const": false }
}
},
"else": {
"properties": {
"anyOf": [
{ "abdominal": { "const": true } },
{ "zervikal": { "const" : true } }
]
}
}
}
But the Newtonsoft online validator gives the error message
Unexpected token encountered when reading value for 'anyOf'. Expected StartObject, Boolean, got StartArray.
for the line on which 'anyOf' starts. This confuses me, since all the examples I can find show anyOf followed by an array of options.
So what am I doing wrong? Why can't I have a StartArray after anyOf, and how do I write the schema correctly?
I guess this is the schema you are looking for:
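The anyOf keyword has to sit directly under else (it is a schema keyword, not a property name), and each alternative is then a small schema with its own properties block. Something like this should express the intended rule:
{
  "type": "object",
  "properties": {
    "keiner": { "type": "boolean" },
    "abdominal": { "type": "boolean" },
    "zervikal": { "type": "boolean" }
  },
  "if": {
    "properties": {
      "keiner": { "const": true }
    }
  },
  "then": {
    "properties": {
      "abdominal": { "const": false },
      "zervikal": { "const": false }
    }
  },
  "else": {
    "anyOf": [
      { "properties": { "abdominal": { "const": true } } },
      { "properties": { "zervikal": { "const": true } } }
    ]
  }
}
In the original schema the validator hits the array right after "anyOf" while it is still parsing the contents of "properties", where every key must map to a schema object; that is what the StartArray error is complaining about.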

Max Response Limitation in OTA_AirLowFareSearchRQ

I'm working with the Sabre REST API. I have an issue with OTA_AirLowFareSearchRQ: I try to limit the number of responses using MaxResponses in the JSON structure, but it seems I am doing something wrong, because the response gives me 95 answers in the cert environment (https://api.cert.sabre.com/).
The JSON request that I use is:
{
"OTA_AirLowFareSearchRQ": {
"Target": "Production",
"PrimaryLangID": "ES",
"MaxResponses": "15",
"POS": {
"Source": [{
"RequestorID": {
"Type": "1",
"ID": "1",
"CompanyName": {}
}
}]
},
"OriginDestinationInformation": [{
"RPH": "1",
"DepartureDateTime": "2016-04-01T11:00:00",
"OriginLocation": {
"LocationCode": "BOG"
},
"DestinationLocation": {
"LocationCode": "CTG"
},
"TPA_Extensions": {
"SegmentType": {
"Code": "O"
}
}
}],
"TravelPreferences": {
"ValidInterlineTicket": true,
"CabinPref": [{
"Cabin": "Y",
"PreferLevel": "Preferred"
}],
"TPA_Extensions": {
"TripType": {
"Value": "Return"
},
"LongConnectTime": {
"Min": 780,
"Max": 1200,
"Enable": true
},
"ExcludeCallDirectCarriers": {
"Enabled": true
}
}
},
"TravelerInfoSummary": {
"SeatsRequested": [1],
"AirTravelerAvail": [{
"PassengerTypeQuantity": [{
"Code": "ADT",
"Quantity": 1
}]
}]
},
"TPA_Extensions": {
"IntelliSellTransaction": {
"RequestType": {
"Name": "10ITINS"
}
}
}
}
}
MaxResponses seems to be something for internal development; it is part of the schema but does not affect the response.
What you can modify is the IntelliSellTransaction. You used 10ITINS, but the values that will work are 50ITINS, 100ITINS, and 200ITINS.
EDIT2 (as Panagiotis Kanavos said):
RequestType values depend on the business agreement between your company and Sabre. You can't use 100 or 200 without modifying the agreement.
"TPA_Extensions": {
"IntelliSellTransaction": {
"RequestType": {
"Name": "50ITINS"
}
}
}
EDIT1:
I have searched a bit more and found:
OTA_AirLowFareSearchRQ.TravelPreferences.TPA_Extensions.NumTrips
Required: false
Type: object
Description: This element allows a user to specify the number of itineraries returned.
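For completeness, a hedged sketch of how NumTrips might be added to the request above; the attribute name "Number" is an assumption based on the OTA schema, so check the Sabre documentation for the exact shape:
"TravelPreferences": {
  "TPA_Extensions": {
    "NumTrips": {
      "Number": 15
    }
  }
}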