IBM Worklight - How to construct a JSON object in SQL adapter - ibm-mobilefirst

To construct a JSON object in a SQL adapter I have tried the following:
{
'PatientID':4,
'FName':'test',
'LName':'test',
'AGE':1,
'DOB':1988-09-01,
'GENDER':'m',
'BG':'A+'
}
However I get an error:
{
"errors": [
"Runtime: Method createSQLStatement was called inside a JavaScript function."
],
"info": [
],
"isSuccessful": false,
"warnings": [
]
}

First, in the "Invoke Procedure Data" window for your adapter, don't wrap the object in quotes. If you do, it will think that the entire thing is a string.
If you remove the beginning and ending quotes then you almost have it correct. The window accepts valid JSON objects, but only if all non-integer values are quoted as strings. Since 1988-09-01 is not a valid integer, it must be wrapped in quotes. You should be able to copy/paste this object into the wizard:
{
'PatientID':4,
'FName':'test',
'LName':'test',
'AGE':1,
'DOB':"1988-09-01",
'GENDER':'m',
'BG':'A+'
}

The createSQLStatement API should not be used inside your functions. Use it outside of functions, just as the tutorial shows (slide 10): http://public.dhe.ibm.com/software/mobile-solutions/worklight/docs/v600/04_03_SQL_adapter_-_Communicating_with_SQL_database.pdf
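To make that concrete, here is a minimal sketch of the adapter JavaScript (the table, column, and procedure names are made up for illustration): the statement is created once at the top level of the adapter file, and the procedure function only invokes it.

// Prepared once, outside of any function.
var getPatientStatement = WL.Server.createSQLStatement(
    "SELECT * FROM patients WHERE PatientID = ?");

// The adapter procedure only invokes the prepared statement.
function getPatient(patientId) {
    return WL.Server.invokeSQLStatement({
        preparedStatement : getPatientStatement,
        parameters : [patientId]
    });
}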

Related

Kafka Lenses SQL - How to WHERE filter based on objects nested in an array

I am working in Kafka Lenses v2.2.2. I need to filter based on the value of an object inside an array.
Sample message (redacted for simplicity):
{
"payload": {
"Data": {
"something" : "stuff"
},
"foo": {
"bar": [
{
"id": "8177BE12-F69B-4A51-B12E-976D2AE37487",
"info": "more_data"
},
{
"id": "06A846C5-2138-4107-A5B0-A2FC21B9F32D",
"info": "more_data"
}
]
}
}
}
In Lenses this actually appears as a nested object with integer properties: 0, 1, etc.
So I've tried this, but it is throwing an error: .0 appears out of place
SELECT *
FROM topic_name
WHERE payload.foo.bar.0.id = "8177BE12-F69B-4A51-B12E-976D2AE37487"
LIMIT 10
I tried wrapping the 0 in double/single quotes as well and that throws a 500 error.
I copied and pasted the UUID from the first message in the topic, so it's definitely there. I also copy and pasted the labels to rule out typos. I am thinking there is some special way to access arrays with nested objects like this, but I'm struggling to find any documentation or videos discussing it.
I can be confident the value is stored in the first array element, but methods that can search all objects would be awesome as well.
The syntax (if you know the array index - as in my initial question) is:
SELECT *
FROM topic_name
WHERE payload.foo.bar[0].id = "8177BE12-F69B-4A51-B12E-976D2AE37487"
LIMIT 10
Though I am still struggling to do this when the array index is unknown and every element needs to be checked. I'm assuming at this point it's not possible without a series of OR conditions in the WHERE clause, one per index.
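If the number of array elements is small and known, one fallback (simply enumerating the indexes with the bracket syntax above; I haven't found a dedicated array-search function documented for this version) would be:
SELECT *
FROM topic_name
WHERE payload.foo.bar[0].id = "8177BE12-F69B-4A51-B12E-976D2AE37487"
   OR payload.foo.bar[1].id = "8177BE12-F69B-4A51-B12E-976D2AE37487"
LIMIT 10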

VB.NET Processing Json from GitLab

I'm using the API of GitLab in VB.Net.
To request groups, I'm using GET /groups.
GitLab returns a JSON string like this:
[
{
"id":5,
"web_url":"https://XXXXX/groups/AAAA",
"name":"AAAA",
"path":"AAAA",
"description":"blabla",
"visibility":"private",
"share_with_group_lock":false,
"require_two_factor_authentication":false,
"two_factor_grace_period":48,
"project_creation_level":"developer",
},
{
"id":8,
"web_url":"https://XXXXX/groups/BBBBBB",
"name":"BBBBBB",
"path":"BBBBBB",
"description":"",
"visibility":"private",
"share_with_group_lock":false,
"require_two_factor_authentication":false,
"two_factor_grace_period":48,
"parent_id":null,
"ldap_cn":null,
"ldap_access":null
},
etc ...
]
It's quite complicated to parse it with Newtonsoft.Json, so I would first like to convert it to an array of Dictionary objects.
Then, I will be able to loop through the array and get myrow("id") for instance.
I couldn't find how to do this, could you help me please?
String (list of Dictionary) -> List (Dictionary)
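One approach, sketched here assuming the Newtonsoft.Json package is referenced (variable names are illustrative), is to deserialize directly into a List(Of Dictionary(Of String, Object)):
Imports System.Collections.Generic
Imports Newtonsoft.Json

' json holds the string returned by GET /groups
Dim groups As List(Of Dictionary(Of String, Object)) =
    JsonConvert.DeserializeObject(Of List(Of Dictionary(Of String, Object)))(json)

' Loop through the array and read fields by name.
For Each myrow As Dictionary(Of String, Object) In groups
    Console.WriteLine(myrow("id"))
Next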

Inability to call SPROC from Azure Logic Apps - can't find syntax for the parameters

Statement of intent:
I'm trying to automate a workflow, moving data periodically from a CSV in SharePoint into a table in Azure SQL Database. I've gotten as far as 1) formatting a JSON array, and 2) creating a SPROC that successfully takes the text of the JSON array and imports it into the appropriate table.
Array appears like:
JSON = [{"col1":"col1Data","col2":"col2Data", ...}, <600-some more iterations>]
Invocation of stored procedure in SQL Management Studio looks like:
EXECUTE SprocName @json=N'<text of JSON above>'
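(The SPROC body isn't shown above; procedures of this shape are typically built on OPENJSON, roughly as in this sketch with invented table and column names.)
CREATE PROCEDURE SprocName @json NVARCHAR(MAX)
AS
BEGIN
    -- Shred the JSON array into rows and insert them into the target table.
    INSERT INTO dbo.TargetTable (col1, col2)
    SELECT col1, col2
    FROM OPENJSON(@json)
    WITH (
        col1 NVARCHAR(100) '$.col1',
        col2 NVARCHAR(100) '$.col2'
    );
END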
===========================================
Problem:
I can't find documentation showing how to properly format the parameters of either of the following two SQL connectors so as to link these two statements together:
Both Execute a Query (V2) and Execute a Stored Procedure (V2) require that parameters or query text be provided, but there is no indication of how those parameters should be formatted.
For example, in terms of executing a stored procedure that takes a single parameter @json, the following text "looks" correct, but results in an error:
"body": "@json=N'+@string(outputs('Convert_Rows_To_Json').body)+'"
Error:
Failed to save logic app UpdateDomainCoverage. The template validation failed: 'The template action 'Execute_stored_procedure_(V2)' at line '1' and column '3148' is not valid: "The template language expression 'json=N'+@string(outputs('Convert_Rows_To_Json').body)+'' is not valid: the string character '=' at position '4' is not expected.".'.
I've tried a number of variations, for both the #json parameter on Execute Stored Procedure, or simply building the query from whole cloth in Execute SQL, to no avail. Suggestions?
Here is a sample from Code View of calling a stored procedure with a parameter 'from' that takes a datetime value. When you pick the sproc in the Designer, it should show all the parameters for you to populate.
"Get_jobs": {
"inputs": {
"body": {
"from": "#{convertFromUtc( variables('SelectTime'), variables('timeZone'), 'yyyy-MM-dd HH:mm:ss')}"
},
"host": {
"connection": {
"name": "#parameters('$connections')['sql_2']['connectionId']"
}
},
"method": "post",
"path": "/datasets/default/procedures/#{encodeURIComponent(encodeURIComponent('[dbo].[GetJobs]'))}"
},
"runAfter": {
"Refresh_data_for_BI": [
"Succeeded"
]
},
"type": "ApiConnection"
},
OK, I've been messing with this on-and-off in between other tasks today, and finally got tired of trying to get it done in the input of the "Execute query".
Brute Force Solution: I added another Javascript step, with the following code:
var input = workflowContext.actions.Convert_Rows_To_Json.outputs.body;
var sqlQuery = 'EXECUTE [ImportDomainCoverage] N\'' + input + '\'';
return sqlQuery;
It's not pretty (one more step), but it works.
Now to see if I can modify things sufficiently to parameterize the table name, rather than needing six steps for each table.
Finally figured out the syntax. Didn't find any documentation, just tried working from one error message to another.
"Pump_data_into_target_table": {
"inputs": {
"body": {
"json": "#{body('Pull_FeedbackItems_from_source').ResultSets['Table1']}"
},
"headers": {
"Content-Type": "application/json"
},
"host": {
"connection": {
"name": "#parameters('$connections')['sql_2']['connectionId']"
}
},
"method": "post",
"path": "/v2/datasets/#{encodeURIComponent(encodeURIComponent('servername.database.windows.net'))},#{encodeURIComponent(encodeURIComponent('dbname'))}/procedures/#{encodeURIComponent(encodeURIComponent('sprocname'))}"
},
"runAfter": {
"Pull_FeedbackItems_from_Source": [
"Succeeded"
]
},
"type": "ApiConnection"
}
The fundamental answer to my question was: provide the parameter/value pairs as a JSON object. See the value of the "body" element in the listing above. For this to work, though, one also has to include the "headers" element, which I didn't even see documented for this API call. I was led to that by an error message stating that the content type was plain text, when it was clearly JSON.

Using Junit assert on json array of elements fails on first element

I'm trying to use Katalon Studio for some web service automation. I have been following this guide on how to parse a returned JSON body using JsonSlurper:
https://docs.katalon.com/katalon-studio/tutorials/parse_json_responses.html
Everything is working fine as described in the guide. I wanted to see if I can use JUnit asserts, specifically assertEquals(), for better error text.
Given we have this
import groovy.json.JsonSlurper
import static org.junit.Assert.assertEquals

String jsonString = '''{"menu": {
"id": "file",
"tools": {
"actions": [
{"id": "new", "title": "New File"},
{"id": "open", "title": "Open File"},
{"id": "close", "title": "Close File"}
],
"errors": []
}}}'''
JsonSlurper slurper = new JsonSlurper()
Map parsedJson = slurper.parseText(jsonString)
def array1 = parsedJson.menu.tools.actions
String onlickValue1 = ""
for(def member : array1) {
assertEquals("Open File", member.title)
break
}
What I'm having trouble with is that my assert throws an error when comparing the very first title element it encounters (which is "New File").
What I intend is to loop through all the elements in the array and assert my expected value against all of them. If my expected value doesn't exist, then I'd fail.
I feel like I'm missing something, because we've done something similar in the past with Java, but I just can't see it here.
So I figured out the problem was my inexperience/ignorance. When looking for solutions online, I failed to understand with absolute certainty what the code I was trying to implement was doing. I was using a for-each loop to assert elements in the array against my expected value, which of course was failing, correctly, for every element that didn't match my expected value. So I made it work by adding an if statement as below:
String expectedValue = ''
for (def member : array1) {
    if (member.title == "Open File") {
        expectedValue = member.title
        break
    }
}
assertEquals("Open File", expectedValue)
Also, a simpler way I discovered is to use AssertJ in the following way:
assertThat(member).contains("Open File")
I understand there are better solutions to achieve what I'm trying to do. But for purposes of this question I considered it solved.
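(For reference, a shorter way to express the same check with JUnit, not from the original post: collect the titles first, then assert on the whole collection.)
import static org.junit.Assert.assertTrue

// Gather every title from the actions array, then assert the expected one is present.
def titles = array1.collect { it.title }
assertTrue("Expected 'Open File' among " + titles, titles.contains("Open File"))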

Working with dynamic/anonymous objects and JSON.NET

.NET 4.0; VS 2010.
We're consuming a web service that does not offer a WSDL. The data that is returned is not particularly complicated so we thought we would work with dynamic/anonymous types. Here is an example of the JSON returned from one of the service methods (this string has been verified with JSONLint):
[
{
"value": "AAA"
},
{
"value": "BBB"
},
{
"value": "CCC"
},
{
"value": "DDD"
},
{
"value": "EEE"
},
{
"value": "FFF"
}
]
Tried using:
dynamic respDyn = JsonConvert.DeserializeObject(jsonStringAbove);
In this case, no errors are thrown, but in trying to access the resp variable, the Visual Studio debugger reports "The name 'resp' does not exist in the current context".
Tried LINQ next:
var respLinq = JObject.Parse(jsonStringAbove);
Which results in a runtime error: Error reading JObject from JsonReader. Current JsonReader item is not an object: StartArray. Path '', line 1, position 1.
Found this article that recommended different parsing methods depending on the format of the JSON:
if (jsonStringAbove.StartsWith("["))
{
var arr = JArray.Parse(jsonStringAbove);
}
else
{
var obj = JObject.Parse(jsonStringAbove);
}
When var arr = JArray.Parse(jsonStringAbove); is hit, the debugger simply exits the method and returns to the calling procedure. No error is thrown. If the leading and trailing square brackets are removed, another runtime error similar to the one in the second example is encountered.
So. Not sure where to turn at this point. Seems like what we're trying to do is very straightforward, which makes me think I'm missing something blatantly obvious.
Not sure why, but the solution to this was to declare my variables as fields within the class. Variables that were local to the methods I was working with simply did not work. Once declared as class-wide variables, the code behaved as expected. Very odd. I suspect that this problem may be specific to my VS environment and/or solution configuration as it does not appear to be occurring with anyone else. Lucky me.
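For completeness, a minimal sketch of reading this array with JArray.Parse once the context issue is out of the way (jsonStringAbove is the string from the question):
using System;
using Newtonsoft.Json.Linq;

// Parse the JSON array and read each "value" property.
var arr = JArray.Parse(jsonStringAbove);
foreach (var item in arr)
{
    Console.WriteLine((string)item["value"]);
}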