Is there a way to push the message to two backends in the async agent? - rabbitmq

(For context, I am using RabbitMQ as the message broker, integrated via KrakenD. The APIs are built with NestJS.)
I understand that the async agent in KrakenD can push the data consumed to multiple backends:
KrakenD contacts the defined backend(s) list passing the event data when a new message kicks in.
However, passing two different backends here results in the logger reporting a context deadline exceeded for both of the APIs. If I put just a single backend in the list, it returns what's expected.
Here's the code with the two backends:
"backend": [
{
"url_pattern": "/newOrder",
"method": "POST",
"host": [ "http://127.0.0.1:3300" ],
"disable_host_sanitize": false,
"extra_config": {
"modifier/martian": {
"header.Modifier": {
"scope": [
"request"
],
"name": "Content-Type",
"value": "application/json"
}
}
}
},
{
"url_pattern": "/newOrderNotification",
"method": "POST",
"host": [ "http://127.0.0.1:3200" ],
"disable_host_sanitize": false,
"extra_config": {
"modifier/martian": {
"header.Modifier": {
"scope": [
"request"
],
"name": "Content-Type",
"value": "application/json"
}
}
}
}
],
Hope I can get some advice on this. Thanks!

You can connect a single async agent to several backends, but KrakenD does not support distributed transactions, so no more than one non-safe backend request (as defined in RFC 2616, section 9) is allowed per pipe. From the documentation (https://www.krakend.io/docs/backends/):
Even though you can use several backends in one endpoint, KrakenD does not allow you to define multiple non-safe (write) backends. This is a (sometimes controversial) design decision to disable the gateway to handle transactions.
If you need to have a write method (POST, PUT, DELETE, PATCH) together with other GET methods, use the sequential proxy and place a maximum of 1 write method at the end of the sequence.
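For reference, a sequential pipe is declared through the proxy namespace in the endpoint's extra_config. A minimal sketch, assuming the endpoint from the question plus a purely illustrative /orderStatus read backend (only the final backend is a write):
{
  "endpoint": "/newOrder",
  "method": "POST",
  "extra_config": {
    "proxy": { "sequential": true }
  },
  "backend": [
    {
      "url_pattern": "/orderStatus",
      "method": "GET",
      "host": [ "http://127.0.0.1:3200" ]
    },
    {
      "url_pattern": "/newOrder",
      "method": "POST",
      "host": [ "http://127.0.0.1:3300" ]
    }
  ]
}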
If you want to send a secondary non-safe request, you can add a minimal Lua snippet using the http_response helper (https://www.krakend.io/docs/endpoints/lua/#making-additional-requests-http_response) just like this:
{
  "extra_config": {
    "modifier/lua-proxy": {
      "pre": "local r = request.load(); http_response.new('http://127.0.0.1:3200/newOrderNotification', 'POST', r:body())"
    }
  }
}

Related

Adding two jwt validators to api endpoint through krakenD

I have two authentication mechanisms that I need to enable through the proxy using krakenD. Each authentication method has its own jwk-url to validate the keys of the token. I am using krakenD community edition 1.3 and following the krakenD documentation for JWT validation here: https://www.krakend.io/docs/v1.3/authorization/jwt-validation/. Is there a way to add two JWT validators to the same endpoint?
{
  "endpoint": "/paas/hydra/test/ab/projects/{project}",
  "method": "GET",
  "output_encoding": "no-op",
  "headers_to_pass": ["Authorization"],
  "extra_config": {
    "github.com/devopsfaith/krakend-jose/validator": {
      "alg": "RS256",
      "jwk-url": "{{ .jwk.host }}/oauth2/v1/keys",
      "cache": true,
      "cache_duration": 1800,
      "disable_jwk_security": true
    },
    "github.com/devopsfaith/krakend-jose/validator": {
      "alg": "RS256",
      "jwk-url": "https://authbluetokens.aexp.com/v2/app2app/tokens/keys",
      "cache": true,
      "cache_duration": 1800,
      "disable_jwk_security": true
    }
  },
  "backend": [
    {
      "method": "GET",
      "encoding": "no-op",
      "host": ["http://localhost:8080"],
      "url_pattern": "/v6/workspaces/{project}",
      "extra_config": {
        "github.com/devopsfaith/krakend/transport/http/client/executor": {
          "name": "bomoktacustom",
          "audienceid": "0oasx1emolGCrwnht0x7,0oasx1nyh6ytqb96z0x7,0oaohi3lo9nr9lMXu0x7,0oarc7drtoyMIbAic0x7,0oaqnh8rpuaCUFhD70x7,0oaqctxj5rYvoUVuy0x7,0oappvfm4hre783rH0x7,0oawdujgrqi4DPyBz0x7,*.aexp.com"
        },
        "github.com/devopsfaith/krakend-ratelimit/juju/proxy": {
          "maxRate": 6,
          "capacity": 6
        },
        "github.com/devopsfaith/krakend-martian": {
          "header.Modifier": {
            "scope": ["request", "response"],
            "name": "Content-Type",
            "value": "application/json"
          }
        }
      }
    }
  ]
},
As shown in the code, I have tried adding two krakend-jose/validator entries to the same endpoint. The current behavior of this implementation is that krakenD ignores the first validator and only uses the second one: when using a token that requires the first validator, krakenD returns "Error #01: no Keys has been found", but a token for the second validator works. The behavior I need is for krakenD to validate both types of tokens. Any help would be much appreciated!
The functionality you are looking for works only in the Enterprise Edition: https://www.krakend.io/docs/enterprise/authentication/multiple-identity-providers/ (it was also available in EE v1.3)
The community version expects one identity provider per endpoint only. Duplicating the keys or any other approach is not going to have the desired outcome.

Ocelot - adding multiple AuthenticationProviderKeys for one Downstream

I have 2 authentication methods (2 different login pages) for my project that return the JWT token. Some microservices are supposed to accept only one of the two methods, but others should be accessible and authorized by both methods (either one or the other). Essentially my problem is in the AuthenticationOptions part of the Ocelot configuration:
{
  "DownstreamPathTemplate": "/FM",
  "DownstreamScheme": "http",
  "DownstreamHostAndPorts": [
    {
      "Host": "localhost",
      "Port": 5285
    }
  ],
  "AuthenticationOptions": {
    "AuthenticationProviderKey": "firstauthenticationmethod",
    "AllowedScopes": []
  },
  "UpstreamPathTemplate": "/API/GetFMData",
  "UpstreamHttpMethod": [ "Get" ]
},
Here I can provide only one AuthenticationProviderKey for the FM microservice. However, I would like this one to be authorized whether the user presents a JWT token from the first authentication method or from the second one. I can't supply an array of strings to this property, like this for example:
"AuthenticationOptions": {
"AuthenticationProviderKey": ["firstauthenticationmethod", "secondauthenticationmethod"],
"AllowedScopes": []
}
nor can I provide an array of AuthenticationOptions, like this for example:
"AuthenticationOptions": [{
"AuthenticationProviderKey": "firstauthenticationmethod",
"AllowedScopes": []
},
{
"AuthenticationProviderKey": "secondauthenticationmethod",
"AllowedScopes": []
}],
Neither of these is allowed in the Ocelot config file. Is there a way to configure this microservice in Ocelot to allow either of the authentication methods?

Is it possible to call lambda function from other lambda functions in AWS serverless API?

I am creating a serverless API using AWS SAM template and ASP.Net Core.
I want to know if it is possible to call a common Lambda function from multiple Lambda functions.
I have 2 APIs for user authentication.
/user/authenticate
/admin/authenticate
Now when the user calls these API endpoints I want to call a common lambda function which will look like following:
public AuthResponse Authenticate(AuthInfo info, int role);
I get a user role based on which API endpoint is called: for example, if /user/authenticate is called then role=1, otherwise role=0.
And then I want Authenticate() lambda to perform user authentication based on the AuthInfo + Role.
I want to do this because all my users are stored in the same table and I would like to cross verify if user has the correct role to access the feature.
I will also share a portion of the serverless.template used for the above APIs.
/admin/authenticate
"Handler": "Server::API.Admin::Authenticate",
"Description" : "Allows admin to authenticate",
"Runtime": "dotnetcore2.1",
"CodeUri": "",
"MemorySize": 256,
"Timeout" : 300,
"Role": {"Fn::GetAtt" : [ "LambdaExecutionRole", "Arn"]},
"FunctionName" : "AdminAuthenticate",
"Events":
{
"PutResource":
{
"Type": "Api",
"Properties":
{
"Path": "/v1/admin/authenticate",
"Method": "POST"
}
}
}
}
}
/user/authenticate
"Handler": "Server::API.User::Authenticate",
"Description" : "Allows user to authenticate",
"Runtime": "dotnetcore2.1",
"CodeUri": "",
"MemorySize": 256,
"Timeout" : 300,
"Role": {"Fn::GetAtt" : [ "LambdaExecutionRole", "Arn"]},
"FunctionName" : "UserAuthenticate",
"Events":
{
"PutResource":
{
"Type": "Api",
"Properties":
{
"Path": "/v1/user/authenticate",
"Method": "GET"
}
}
}
}
}
As you can see above, 2 Lambda functions are created, AdminAuthenticate and UserAuthenticate. I want these Lambda functions to share common code.
Does anyone have any idea how to do it?
Thanks and regards.
I can think of 2 options to achieve your goal. In the first option, you use multiple Lambda functions, one for each endpoint, both pointing to the same codebase. In the second option, you have a single Lambda function that handles all authentication needs.
Single codebase, multiple functions
In this case, you can define your template file with 2 functions but use the CodeUri property to point to the same codebase.
{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Transform": "AWS::Serverless-2016-10-31",
  "Resources": {
    "AdminFunction": {
      "Type": "AWS::Serverless::Function",
      "Properties": {
        "Handler": "Server::API.Admin::Authenticate",
        "Description": "Allows admin to authenticate",
        "Runtime": "dotnetcore2.1",
        "CodeUri": "./codebase_path/",
        "MemorySize": 256,
        "Timeout": 300,
        "FunctionName": "AdminAuthenticate",
        "Events": {
          "PutResource": {
            "Type": "Api",
            "Properties": {
              "Path": "/v1/admin/authenticate",
              "Method": "POST"
            }
          }
        }
      }
    },
    "UserFunction": {
      "Type": "AWS::Serverless::Function",
      "Properties": {
        "Handler": "Server::API.User::Authenticate",
        "Description": "Allows user to authenticate",
        "Runtime": "dotnetcore2.1",
        "CodeUri": "./codebase_path/",
        "MemorySize": 256,
        "Timeout": 300,
        "FunctionName": "UserAuthenticate",
        "Events": {
          "PutResource": {
            "Type": "Api",
            "Properties": {
              "Path": "/v1/user/authenticate",
              "Method": "POST"
            }
          }
        }
      }
    }
  }
}
Single codebase, single function
In this case, you will expose 2 endpoints on API Gateway, but they will be directed to the same handler on your function, so you will need to write some logic in your code to handle the login properly. The event object passed to your Lambda function carries the original URL in its path property (reference; even though this is for Lambda proxy, it still applies). A minimal handler sketch follows the template below.
The template file in this case would be similar to the following (note I replaced the Admin/User terms with "Any", since this will handle any form of authentication):
{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Transform": "AWS::Serverless-2016-10-31",
  "Resources": {
    "AnyFunction": {
      "Type": "AWS::Serverless::Function",
      "Properties": {
        "Handler": "Server::API.Any::Authenticate",
        "Description": "Allows any to authenticate",
        "Runtime": "dotnetcore2.1",
        "CodeUri": "./hello_world/",
        "MemorySize": 256,
        "Timeout": 300,
        "Events": {
          "UserEndpoint": {
            "Type": "Api",
            "Properties": {
              "Path": "/v1/user/authenticate",
              "Method": "POST"
            }
          },
          "AdminEndpoint": {
            "Type": "Api",
            "Properties": {
              "Path": "/v1/admin/authenticate",
              "Method": "POST"
            }
          }
        }
      }
    }
  }
}
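Inside the function, the handler branches on that path property. A minimal sketch of the idea, written in Node.js for brevity (the question uses ASP.NET Core, where the same check would be made on the request path; authenticate stands in for your hypothetical shared authentication logic):
// Route on the original URL that API Gateway exposes in event.path;
// `authenticate` is a hypothetical shared helper.
exports.handler = async (event) => {
  // role=0 for /v1/admin/authenticate, role=1 for /v1/user/authenticate
  const role = event.path === '/v1/admin/authenticate' ? 0 : 1;
  const info = JSON.parse(event.body || '{}');
  const result = authenticate(info, role);
  return { statusCode: 200, body: JSON.stringify(result) };
};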
You can invoke Lambda functions from any other Lambda function using the AWS SDK in your language of choice, which defines an invoke function.
For reference, here is the link to the boto3 invoke definition.
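As a sketch of what that looks like with the AWS SDK for JavaScript (v2), where the function name and payload shape are illustrative:
// Synchronously invoke another Lambda function;
// 'CommonAuthenticate' is a hypothetical shared function.
const AWS = require('aws-sdk');
const lambda = new AWS.Lambda();

lambda.invoke({
  FunctionName: 'CommonAuthenticate',
  InvocationType: 'RequestResponse', // wait for the result
  Payload: JSON.stringify({ info: { user: 'jdoe' }, role: 1 })
}, (err, data) => {
  if (err) return console.error(err);
  console.log(JSON.parse(data.Payload)); // response from the callee
});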
Also: the approach you are using for authentication with a common codebase is not the right one.
If you need a Lambda function to check or authenticate a particular request, you can set up a custom authorizer for that. In your terms, it shares the Lambda code: it is called before the Lambda you set up for the particular endpoint is invoked, with the possibility of passing custom data if you want.
A Lambda authorizer (also called a custom authorizer) is an API Gateway feature that uses a Lambda function to control access to your API.
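In a SAM template, the wiring looks roughly like the sketch below (AuthorizerFunction is a hypothetical Lambda resource holding your shared authentication code):
"MyApi": {
  "Type": "AWS::Serverless::Api",
  "Properties": {
    "StageName": "v1",
    "Auth": {
      "DefaultAuthorizer": "MyLambdaAuthorizer",
      "Authorizers": {
        "MyLambdaAuthorizer": {
          "FunctionArn": { "Fn::GetAtt": [ "AuthorizerFunction", "Arn" ] }
        }
      }
    }
  }
}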
If this still doesn't solve your problem and you want a common codebase, you can point as many API endpoints as you like to the same Lambda function.
Then you have to handle event['resources'] inside your codebase.
This is not the recommended way, but you can use it.
You can refer to the AWS samples to set up a custom authorizer, or the documentation is clear enough to resolve any doubts.

Log data from dataPower to splunk

The question might look easy, but I am quite stuck on this.
I have a requirement whereby I have to store data such as timestamp, latency, serviceName, etc. in a variable and then log it to Splunk.
However, I am unable to call Splunk through DataPower XSLT.
How can we call Splunk from DataPower using XSLT?
Thanks
Splunk has several interfaces, but XSLT is not one of them. Lucky for you, there's already a Splunk app that can collect data from Datapower and index it. See https://splunkbase.splunk.com/app/3517/.
I would consider using the Splunk HTTP Event Collector.
You can use XSLT or GatewayScript, in conjunction with the DataPower urlopen function (available in both languages), to make a simple HTTP call to the collector.
I found here (code under Apache license) that the call is as simple as a call to https://SPLUNK_SVR:8088/services/collector/event/1.0 with the following body:
{
  "source": "chicken coop",
  "sourcetype": "httpevent",
  "index": "main",
  "host": "farm.local",
  "event": {
    "message": {
      "chickenCount": 500,
      "msg": "Chicken coup looks stable.",
      "name": "my logger",
      "put": 98884,
      "temperature": "70F",
      "v": 0
    },
    "severity": "info"
  }
}
I think it would work better on the DataPower using GatewayScript; an example of such a call can be found here (look for the first example). You will find similar code below, in which I modified the data section:
// Could be added to a library
var urlopen = require('urlopen');

// Build the HEC event as an object and serialize it to JSON
// (a bare multi-line string literal is not valid JavaScript).
var jsonData = JSON.stringify({
  "source": "Datapower",
  "sourcetype": "SOMETHING DYNAMIC",
  "index": "main",
  "host": "GET_THIS_FROM_DP_VARIABLES",
  "event": {
    "message": {
      "SOMECOUNTER": 500,
      "msg": "SOME INTERESTING INFORMATION.",
      "name": "GET_THIS_FROM_DP_VARIABLES",
      "put": 3333,
      "yadayada": "foo",
      "bar": 0
    },
    "severity": "info"
  }
});
var options = {
  target: 'https://SPLUNK_SVR:8088/services/collector/event/1.0',
  method: 'POST',
  // Note: the Splunk HTTP Event Collector normally also requires an
  // 'Authorization: Splunk <HEC token>' header.
  headers: { },
  contentType: 'text/plain',
  timeout: 60,
  sslClientProfile: 'AN_EXISTING_SSL_PROFILE_ON_DATAPOWER',
  data: jsonData
};
urlopen.open(options, function(error, response) {
  if (error) {
    // an error occurred during the request sending or response header parsing
    console.error("Splunk Logging - urlopen error: " + JSON.stringify(error));
  } else {
    // get the response status code
    var responseStatusCode = response.statusCode;
    var responseReasonPhrase = response.reasonPhrase;
    console.log("Splunk Logging - status code: " + responseStatusCode);
    console.log("Splunk Logging - reason phrase: " + responseReasonPhrase);
    // no need to read response data - this is just logging
  }
});

Loopback Connector REST API

How do you call an external API from LoopBack?
I want to fetch data from an external API and use it in my LoopBack application, and also pass input from my LoopBack app to the external API and return the result or response.
Loopback has the concept of non-database connectors, including a REST connector. From the docs:
LoopBack supports a number of connectors to backend systems beyond databases. These types of connectors often implement specific methods depending on the underlying system. For example, the REST connector delegates calls to REST APIs while the Push connector integrates with iOS and Android push notification services.
If you post details on the API call(s) you want to make, I can add some more specific code samples for you. In the meantime, this is also from the documentation:
datasources.json
MyModel": {
"name": "MyModel",
"connector": "rest",
"debug": false,
"options": {
"headers": {
"accept": "application/json",
"content-type": "application/json"
},
"strictSSL": false
},
"operations": [
{
"template": {
"method": "GET",
"url": "http://maps.googleapis.com/maps/api/geocode/{format=json}",
"query": {
"address": "{street},{city},{zipcode}",
"sensor": "{sensor=false}"
},
"options": {
"strictSSL": true,
"useQuerystring": true
},
"responsePath": "$.results[0].geometry.location"
},
"functions": {
"geocode": ["street", "city", "zipcode"]
}
}
]
}
You could then call this API from code with:
app.dataSources.MyModel.geocode('107 S B St', 'San Mateo', '94401', processResponse);
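where processResponse is your callback; a minimal sketch (the shape of the result comes from the responsePath above):
// Hypothetical callback: the REST connector invokes it Node-style
// with (err, result).
function processResponse(err, location) {
  if (err) return console.error(err);
  console.log(location); // e.g. the lat/lng object selected by responsePath
}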
You are going to need the https module for calling the external API from inside LoopBack. Note that https is a built-in Node.js module, so there is nothing extra to install.
Suppose you want to use the external API within a model script file; let the model name be Customer.
common/models/customer.js
var https = require('https');

module.exports = function(Customer) {

  Customer.externalApiProcessing = function(number, callback) {
    var url = "https://rest.xyz.com/api/1";
    https.get(url, function(res) {
      var body = '';
      // collect the response chunks
      res.on('data', function(chunk) {
        body += chunk;
      });
      res.on('end', function() {
        // all done! Do some processing on the data here as you need to,
        // then return it; the return type should be an object
        // (assuming the external API returns JSON).
        callback(null, JSON.parse(body));
      });
    }).on('error', function(err) {
      // handle errors somehow
      console.log("Error getting data from the server.");
      callback(err, null);
    });
  };

  // Now registering the method
  Customer.remoteMethod(
    'externalApiProcessing',
    {
      accepts: { arg: 'number', type: 'string', required: true },
      returns: { arg: 'myResponse', type: 'object' },
      description: "A test for processing on an external API and then sending back the response to the /externalApiProcessing route"
    }
  );
};
common/models/customer.json
....
....
//Now add this entry to the ACL property:
"acls": [
  {
    "principalType": "ROLE",
    "principalId": "$everyone",
    "permission": "ALLOW",
    "property": "externalApiProcessing"
  }
]
Now explore the API at /api/Customers/externalApiProcessing (LoopBack exposes the plural model name by default). By default, it's a POST method.
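For example, a minimal sketch of calling it over HTTP (host, port, and the sample argument are illustrative; LoopBack serves the REST API under /api by default):
// POST to the generated remote-method route.
var http = require('http');

var body = JSON.stringify({ number: '42' });
var req = http.request({
  host: 'localhost',
  port: 3000,
  path: '/api/Customers/externalApiProcessing',
  method: 'POST',
  headers: { 'Content-Type': 'application/json' }
}, function(res) {
  var out = '';
  res.on('data', function(chunk) { out += chunk; });
  res.on('end', function() { console.log(out); });
});
req.write(body);
req.end();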
For more info, see LoopBack Remote Methods.