Is there a specific scope in which I can use WL.Server.setActiveUser? - authentication

I am trying to create adapter-based authentication in Worklight. I have added my realm, security test, and login module to the authenticationConfig file. I have tried to follow along with the modules provided by IBM. I have copied the exact syntax and even hard-coded values for the WL.Server.setActiveUser method, but I continue to get an error. Is there a certain scope in which I can use this method? Does anyone see or know where my error is?
I continue to get the following error:
LOG: Request [login]
LOG: Request [/apps/services/api/GPC2/android/query]
LOG: response [/apps/services/api/GPC2/android/query] success: /*-secure-
{"responseID":"1","isSuccessful":true,"resultSet REMOVED LINE THAT CONTAINED DB RESULTS FOR SECURITY
[/apps/services/api/GPC2/android/query] exception.
SCRIPT5007: Unable to get value of the property 'setActiveUser': object is null or undefined
var lname = responseData.invocationResult.resultSet[0].somelastname;
var gpcid = responseData.invocationResult.resultSet[0].someid;
var fname = responseData.invocationResult.resultSet[0].somefname;
WL.Logger.debug("Login :: SUCCESS" + lname + " " + gpcid + " " + fname); // this line does write the values to the log
//WL.Client.login("NotificationsRealm");
WL.Server.setActiveUser("NotificationsRealm", {
    userId: gpcid,
    displayName: fname,
    attributes: {
        firstName: fname,
        lastName: lname,
        isUserAuthenticated: 1
    }
});

Looking at the API documentation for WL.Server.setActiveUser, it should be like this:
WL.Server.setActiveUser("ACMERealm", {
    userId: "38017840288",
    displayName: "John Doe",
    attributes: {
        "firstName": "John",
        "lastName": "Doe",
        "lastLogin": "2010-07-13 19:25:08.0 GMT"
    }
})
It looks like you are missing the double quotes around the attribute keys?

Google App Script Big Query - GoogleJsonResponseException: API call to bigquery.jobs.query failed with error: Query parameter 'X' not found

I have been struggling with this for a couple of days now and I felt like I should reach out. This might be very simple but I am not from a programming background and I haven't found any resources to solve this so far.
Basically, I want to parameterize a SQL query that runs against BigQuery within Google Apps Script. It takes a variable submitted by a user through a Google Form, and I wanted to ensure the query is not injectable by parameterizing it. However, I got the following error that I could not fix:
GoogleJsonResponseException: API call to bigquery.jobs.query failed with error: Query parameter 'account_name' not found at [1:90]
Here is how I run the query:
// Query
const sqlQuery = 'SELECT district FROM `table` WHERE account_name = @account_name AND ent_theatre=("X") LIMIT 1;';
const request = {
    query: sqlQuery,
    params: { account_name: queryvar },
    useLegacySql: false,
};
// Run query
var queryResult = BigQuery.Jobs.query(request, projectID);
I created the query based on Google's documentation.
The syntax of your request object is not correct. The right syntax for a BigQuery.Jobs.query request is as below:
const request = {
    query: sqlQuery,
    queryParameters: [
        {
            name: "account_name",
            parameterType: { type: "STRING" },
            parameterValue: { value: queryvar }
        }
    ],
    useLegacySql: false,
};
For more detail about the QueryRequest object, refer to this link.
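Putting it together, here is a minimal sketch of the corrected request. `queryvar` stands in for the value taken from the form submission, and `buildStringParam` is a hypothetical helper (not part of the BigQuery API) that shows the shape of one named parameter:

```javascript
// Hypothetical helper: builds one named STRING query parameter
// in the shape that BigQuery.Jobs.query expects.
function buildStringParam(name, value) {
  return {
    name: name,
    parameterType: { type: "STRING" },
    parameterValue: { value: value }
  };
}

var queryvar = "ACME Corp"; // e.g. the value submitted through the Google Form
const sqlQuery =
  'SELECT district FROM `table` WHERE account_name = @account_name AND ent_theatre=("X") LIMIT 1;';

const request = {
  query: sqlQuery,
  queryParameters: [buildStringParam("account_name", queryvar)],
  useLegacySql: false
};

// In Apps Script this request would then be passed to
// BigQuery.Jobs.query(request, projectID).
console.log(JSON.stringify(request.queryParameters, null, 2));
```

Note that the SQL text references the parameter as `@account_name`, while the `name` field in `queryParameters` carries it without the `@` prefix.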

The provided key element does not match the schema. GraphQL Mutation error

I am trying to test/run a mutation that creates a groupChat in my DynamoDB with id, groupChatName, messages, createdTime, createdUser, and users. I have two separate tables, UserTable and GroupChatTable. The problem is that I keep getting data as null and an error that says "The provided key element does not match the schema. ErrorCode: ValidationException, request ID." Resolvers are attached to my tables, so I am not sure why I am getting this error.
The weird thing is that when I check the GroupChatTable, my mutation is saved incorrectly as an input. This is what it looks like:
Ex: {"createdTime":{"S":"12:00"},"createdUser":{"S":"Me"},........
Below is the mutation, schema type, and resolver.
mutation {
  createGroupChat(input: {
    id: 4
    groupChatName: "newgroup"
    messages: "we love this group"
    createdTime: "12:00"
    createdUser: "Me"
    users: "we, me"
  }) {
    id
    groupChatName
    messages
    createdTime
    createdUser
    users
  }
}
type GroupChat {
  id: ID!
  groupChatName: String!
  messages: String
  createdTime: String!
  createdUser: String!
  users: String
}
{
  "version": "2017-02-28",
  "operation": "PutItem",
  "key": {
    "id": $util.dynamodb.toDynamoDBJson($util.autoId())
  },
  "attributeValues": $util.dynamodb.toMapValuesJson($ctx.args)
}
It looks like the way the data is being stored through the resolver is incorrect, so when it is returned it does not match the schema.
Instead of using $util.dynamodb.toMapValuesJson($ctx.args),
use: $util.dynamodb.toMapValuesJson($util.parseJson($util.toJson($ctx.args.input)))
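The difference matters because `$ctx.args` is `{ input: { ... } }`, so mapping it directly stores everything nested under a single `input` attribute. A rough JavaScript simulation of the effect (the `toDynamoDB` marshaller below is a toy stand-in for `$util.dynamodb`, not the real AppSync utility):

```javascript
// Toy stand-in for $util.dynamodb marshalling: wraps plain JS values
// in DynamoDB attribute-value notation ({S: ...}, {N: ...}, {M: ...}).
function toDynamoDB(value) {
  if (typeof value === "number") return { N: String(value) };
  if (typeof value === "string") return { S: value };
  const map = {};
  for (const key of Object.keys(value)) {
    map[key] = toDynamoDB(value[key]);
  }
  return { M: map };
}

// Marshals each top-level field of an object into its own attribute.
function toMapValues(obj) {
  const out = {};
  for (const key of Object.keys(obj)) {
    out[key] = toDynamoDB(obj[key]);
  }
  return out;
}

// What the resolver receives for the mutation:
const ctxArgs = {
  input: { id: 4, groupChatName: "newgroup", createdUser: "Me" }
};

// Mapping $ctx.args stores ONE attribute named "input" holding a map:
const wrong = toMapValues(ctxArgs);
// Mapping $ctx.args.input stores each field as its own attribute:
const right = toMapValues(ctxArgs.input);

console.log(Object.keys(wrong));  // [ 'input' ]
console.log(Object.keys(right));  // [ 'id', 'groupChatName', 'createdUser' ]
```

Only the second shape lines up with the fields declared on the GroupChat type, which is why the response resolver can then match the schema.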

Karate contains and all key-values did not match error

I am trying to learn Karate but have an issue that I can't resolve by myself.
My feature looks rather simple:
Feature: Alerting get the list of all alerts

Background:
  * url 'url'

Scenario: Retrieve all alerts
  Given path '5c348c553a892c000bb1f2dd'
  When method get
  Then status 200
  And match response contains {id: 5c348c553a892c000bb1f2dd}
The case here is to fetch a response and make sure that the given ID is on the list. As far as I understand the documentation, the contains keyword should look up only the given key-values, but I get an error: reason: all key-values did not match
This is my console output:
allAlertsGet.feature:10 - path: $, actual: {data={name=Baelish of Harrenhal, user=griffin, id=5c348c553a892c000bb1f2dd, tags=["Gared"], triggers={prometheus=[{"js_id":"Qarth","labels":["Harry Potter and the Sorcerer's Stone"],"operator":"==","query":"up","value":"1"}]}, trigger_interval=398s, receivers={slack=[{"holdoffTime":"0s","id":"Stalls","message":"Dark and difficult times lie ahead. Soon we must all face the choice between what is right and what is easy.","revokeMessage":"Every flight begins with a fall.","token":"Buckbeak"}]}, hold_cap=4, max_cap=16, factor=2, createDate=1546947669, triggered_date=1546948867, mute_until=0, muted=false, status=3}}, expected: {id=5c348c553a892c000bb1f2dd}, reason: all key-values did not match
What have I missed? I am using Karate 0.9.0.
Pay attention to the nested structure of your JSON. You can paste this snippet into a Scenario and try it; as a tip, you can experiment quickly like this without making HTTP requests:
* def response = { data: { name: 'Baelish of Harrenhal', user: 'griffin', id: '5c348c553a892c000bb1f2dd' } }
* match response.data contains { id: '5c348c553a892c000bb1f2dd' }
EDIT: just to show off a few other ways to do assertions:
* match response.data.id == '5c348c553a892c000bb1f2dd'
* match response..id contains '5c348c553a892c000bb1f2dd'
* def id = { id: '5c348c553a892c000bb1f2dd' }
* match response == { data: '#(^id)' }
* match response contains { data: '#(^id)' }

API: Defining an array

How can I define the array below in IntelliJ? I am sending this API request and getting an error:
sharedAccountDetails [
  Account details.
  SharedAccountItem {
    accountNumber (string): Account number of the user.
    accountName (string): Account name of the user.
    accountType (string): Account type, saving/current etc.
    branchCode (string): Branch code.
  }
]
This is my request below:
And request { 'channel': 'email'}
And request { 'accountNumber': '000000000'}
And request { 'accountName': 'Mr Bytes C'}
And request { 'accountType': 'Current Account'}
And request { 'branchCode': '000'}
It is asking for the array; how do I define it?
Thanks in advance.
Regards,
Tshilidzi
I'm not sure I understand, but please look carefully at the demo examples of Karate. It looks like you need to understand how to use JSON. See the example below:
And request
"""
{
  sharedAccountDetails: [
    { accountNumber: '000000000', accountName: 'Mr Bytes C', accountType: 'Current Account' }
  ]
}
"""

TableData.insertAll with templateSuffix - frequent 503 errors

We are using TableData.insertAll with a templateSuffix and are experiencing frequent 503 errors with our usage pattern.
We set the templateSuffix based on two pieces of information - the name of the event being inserted and the date of the event being inserted, e.g. 'NewPlayer20160712'. The table ID is set to 'events'.
In most cases this works as expected, but relatively often it will fail and return an error. Approximately 1 in every 200 inserts will fail, which seems way too often for expected behaviour.
The core of our event ingestion service looks like this:
// Handle all rows in rowsBySuffix
async.mapLimit(Object.keys(rowsBySuffix), 5, function(suffix, cb) {
    // Construct request for suffix
    var request = {
        projectId: "tactile-analytics",
        datasetId: "discoducksdev",
        tableId: "events",
        resource: {
            "kind": "bigquery#tableDataInsertAllRequest",
            "skipInvalidRows": true,
            "ignoreUnknownValues": true,
            "templateSuffix": suffix, // E.g. NewPlayer20160712
            "rows": rowsBySuffix[suffix]
        },
        auth: jwt // valid google.auth.JWT instance
    };
    // Insert all rows into BigQuery
    bigquery.tabledata.insertAll(request, function(err, result) {
        if (err) {
            console.log("Error insertAll. err=" + JSON.stringify(err) + ", request.resource=" + JSON.stringify(request.resource));
        }
        cb(err, result);
    });
}, arguments[arguments.length-1]);
A typical error would look like this:
{
   "code": 503,
   "errors": [
      {
         "domain": "global",
         "reason": "backendError",
         "message": "Error encountered during execution. Retrying may solve the problem."
      }
   ]
}
The resource part for the insertAll that fails looks like this:
{
   "kind": "bigquery#tableDataInsertAllRequest",
   "skipInvalidRows": true,
   "ignoreUnknownValues": true,
   "templateSuffix": "GameStarted20160618",
   "rows": [
      {
         "insertId": "1f4786eaccd1c16d7ce865fea4c7af89",
         "json": {
            "eventName": "gameStarted",
            "eventSchemaHash": "unique-schema-hash-value",
            "eventTimestamp": 1466264556,
            "userId": "f769dc78-3210-4fd5-a2b0-ca4c48447578",
            "sessionId": "821f8f40-ed08-49ff-b6ac-9a1b8194286b",
            "platform": "WEBPLAYER",
            "versionName": "1.0.0",
            "versionCode": 12345,
            "ts_param1": "2016-06-04 00:00",
            "ts_param2": "2014-01-01 00:00",
            "i_param0": 598,
            "i_param1": 491,
            "i_param2": 206,
            "i_param3": 412,
            "i_param4": 590,
            "i_param5": 842,
            "f_param0": 5945.442,
            "f_param1": 1623.4111,
            "f_param2": 147.04747,
            "f_param3": 6448.521,
            "b_param0": true,
            "b_param1": false,
            "b_param2": true,
            "b_param3": true,
            "s_param0": "Im guesior ti asorne usse siorst apedir eamighte rel kin.",
            "s_param1": "Whe autiorne awayst pon, lecurt mun.",
            "eventHash": "1f4786eaccd1c16d7ce865fea4c7af89",
            "collectTimestamp": "1468346812",
            "eventDate": "2016-06-18"
         }
      }
   ]
}
We have noticed that, if we avoid including the name of the event in the suffix (e.g. the NewPlayer part) and instead just have the date as the suffix, then we never experience these errors.
Is there any way that this can be made to work reliably?
Backend errors happen; we usually see about 5 in every 10,000 requests. We simply retry. Also, if you have a fairly constant rate and can provide a reproducible use case, put a ticket on the BigQuery issue tracker. That way, if there is something wrong with your project, it can be investigated.
https://code.google.com/p/google-bigquery/
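A simple way to absorb these transient 503s is a retry wrapper with exponential backoff around the insert call. A sketch, assuming any Node-style callback function such as the `bigquery.tabledata.insertAll` call above; `flakyInsert` below is a stub used only to demonstrate the behaviour:

```javascript
// Retries a Node-style async function when it fails with a retryable
// HTTP status (here 503), backing off exponentially between attempts.
function withRetries(fn, maxAttempts, baseDelayMs, done) {
  function attempt(n) {
    fn(function (err, result) {
      var retryable = err && err.code === 503 && n < maxAttempts;
      if (!retryable) return done(err, result);
      // Exponential backoff: baseDelayMs * 2^(n-1)
      setTimeout(function () { attempt(n + 1); }, baseDelayMs * Math.pow(2, n - 1));
    });
  }
  attempt(1);
}

// Stub standing in for the insertAll call: fails twice with a 503
// backendError, then succeeds.
var calls = 0;
function flakyInsert(cb) {
  calls += 1;
  if (calls < 3) return cb({ code: 503, message: "backendError" });
  cb(null, { kind: "bigquery#tableDataInsertAllResponse" });
}

withRetries(flakyInsert, 5, 10, function (err, result) {
  console.log("attempts:", calls, "err:", err, "result:", result && result.kind);
});
```

Since each row already carries a deterministic `insertId`, retried requests are deduplicated by BigQuery, so retrying a failed insertAll is safe.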