How to configure Stormpath as middleware in Sails.js - express

What is the best way to implement the following code in Sails.js v0.10.5? Should I be handling this with a policy, and if so, how? The init() function required by Stormpath takes the Express app as an argument. Currently, I am using the following code in config/http.js as custom middleware.
customMiddleware: function(app) {
  var stormpathMiddleware = require('express-stormpath').init(app, {
    apiKeyFile: '',
    application: '',
    secretKey: ''
  });
  app.use(stormpathMiddleware);
}

Yes, this is the preferred way of enabling custom Express middleware in Sails when it needs more than just a request handler (as in your case, where .init requires the app object). For simpler cases where the middleware is just a request handler, you can add the handler to sails.config.http.middleware and add its name to the sails.config.http.middleware.order array; see the commented-out defaults in config/http.js for an example using myRequestLogger.
Also note that the $custom key in the sails.config.http.middleware.order array indicates where the customMiddleware function is executed, so you can change the order if necessary.
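For reference, here is a rough sketch of how config/http.js could look with both styles combined. The Stormpath option values are placeholders, and the order array is based on the commented-out Sails v0.10 defaults, which may vary slightly by version:

module.exports.http = {

  // Runs once at lift with the raw Express app, so middleware that needs
  // `app` itself (like express-stormpath's init) can be wired up here.
  customMiddleware: function(app) {
    var stormpathMiddleware = require('express-stormpath').init(app, {
      apiKeyFile: '/path/to/apiKey.properties',                       // placeholder
      application: 'https://api.stormpath.com/v1/applications/xxxx',  // placeholder
      secretKey: 'some_long_random_string'                            // placeholder
    });
    app.use(stormpathMiddleware);
  },

  middleware: {

    // A plain request handler can simply live here...
    myRequestLogger: function(req, res, next) {
      console.log('Requested ::', req.method, req.url);
      return next();
    },

    // ...as long as its name also appears in the order array.
    // '$custom' marks where the customMiddleware function above is run.
    order: [
      'startRequestTimer',
      'cookieParser',
      'session',
      'myRequestLogger',
      'bodyParser',
      'handleBodyParserError',
      'compress',
      'methodOverride',
      'poweredBy',
      '$custom',
      'router',
      'www',
      'favicon',
      '404',
      '500'
    ]
  }
};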

Related

Blazor - Circular references - serialization and deserialization default options

In a Blazor WebAssembly app I have a single server-side method that returns results with circular references.
I found out that I can handle this situation on the server side by adding the following:
builder.Services.AddControllersWithViews()
    .AddJsonOptions(options =>
    {
        options.JsonSerializerOptions.ReferenceHandler = ReferenceHandler.Preserve;
    });
and on the client side:
var options = new JsonSerializerOptions() { ReferenceHandler = ReferenceHandler.Preserve };
var r = await _http.GetFromJsonAsync<MyObject>($"api/mycontroller/mymethod", options);
Unfortunately, this way reference handling is enabled for every method on the server, which introduces "$id" keys in almost all of my methods' results.
This would force me to change every client call to introduce the ReferenceHandler.Preserve option.
Is there a way to specify ReferenceHandler.Preserve for some methods only (server side) or alternatively an option to force ReferenceHandler.Preserve for every GetFromJsonAsync (client side)?
You can use custom middleware on your server. In the middleware, check the URL of the request coming from Blazor: if it matches one of the endpoints that needs reference handling, execute the method there (serializing with ReferenceHandler.Preserve); if not, just pass the request through and ignore it.
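As a rough illustration of that idea only, here is a sketch of such middleware, assuming a hypothetical endpoint path /api/mycontroller/mymethod and a hypothetical injectable IMyService that produces the object graph:

using Microsoft.Extensions.DependencyInjection;
using System.Text.Json;
using System.Text.Json.Serialization;

// In Program.cs, before MapControllers():
app.Use(async (context, next) =>
{
    // Only intercept the endpoint(s) that actually need reference handling.
    if (context.Request.Path.StartsWithSegments("/api/mycontroller/mymethod"))
    {
        var service = context.RequestServices.GetRequiredService<IMyService>(); // hypothetical service
        var result = await service.GetMyObjectAsync();                          // hypothetical method

        var options = new JsonSerializerOptions
        {
            ReferenceHandler = ReferenceHandler.Preserve
        };
        await context.Response.WriteAsJsonAsync(result, options);
        return; // short-circuit so MVC does not serialize it again with the defaults
    }

    await next(); // every other endpoint keeps the default serializer options
});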

How to pass request headers set using the headers option for Server-Sent Events (if platform is browser) in React.js

I have learned that the new npm package @microsoft/signalr provides options to pass custom headers to the HTTP client used to make SSE calls in JavaScript (using the headers option in withUrl).
But I found a difference in the code (git code) where the same custom header isn't forwarded if the request is from a browser or WebWorker; otherwise, it is forwarded (git code).
I would like to understand: is there a security reason for not forwarding the header? If yes, is there a way to get it working, i.e. set a custom header when making HTTP requests if the transport type is SSE (Server-Sent Events)?
The reason is that browsers do not support sending headers with EventSource:
https://developer.mozilla.org/en-US/docs/Web/API/EventSource/EventSource
Answering my own question for future readers.
I got this working for my requirement, where I need to pass custom headers to all the SignalR calls irrespective of transport type, starting from the negotiate call.
I was able to send the custom header using the headers option while creating the connection with hubConnectionBuilder.withUrl(url, options) (I have given a detailed answer here).
To the point:
For SSE, as mentioned by Brennan, we can't set custom headers with the native EventSource constructor. But I achieved it using an EventSource polyfill from this package (npm package).
Two points to note if you are using SignalR and want to achieve the same:
- By default SignalR uses the native EventSource, but there is an EventSource property we can set in the same options parameter of withUrl.
- Extend the polyfill constructor and add the custom headers.
import { EventSourcePolyfill } from 'event-source-polyfill';

function EventSourceWithCustomHeader(url, options) {
  return new EventSourcePolyfill(url, {
    ...options,
    headers: {
      ...options.headers,
      "custom-header-name": "value"
    }
  });
}

const conn = new signalR.HubConnectionBuilder()
  .withUrl("/chat", {
    headers: {
      "custom-header-name": "value"
    },
    EventSource: EventSourceWithCustomHeader,
  })
  .build();

Enabling binary media types breaks the OPTIONS call for POST (CORS) in AWS Lambda

New to AWS here.
We have a .NET Core microservice running serverless on AWS as Lambda functions.
Our controller looks like this:
[Route("api/[controller]")]
[ApiController]
public class SomeController : ControllerBase
{
[HttpGet()]
[Route("getsomedoc")]
public async Task<IActionResult> GetSomeDoc()
{
byte[] content;
//UI needs this to process the document
var contentDisposition = new System.Net.Http.Headers.ContentDispositionHeaderValue("attachment");
contentDisposition.FileName = "File Name";
Response.Headers[HeaderNames.ContentDisposition] = contentDisposition.ToString();
return File(content, "application/octet-stream");
}
[HttpPost()]
[Route("somepost")]
public async Task<IActionResult> SomePost()
{
return null;
}
}
URLs:
{{URL}}/getsomedoc
{{URL}}/somepost
We have enabled 'Binary Media Types' in the AWS package settings with */* for getsomedoc to work; otherwise it was returning the byte array instead of the file.
But this breaks our 'somepost' call when the UI accesses the API using
Method: OPTIONS & Access-Control-Request-Method as POST
When we remove the binary media type, 'somepost' starts working again.
Looking for suggestions as to why this might be happening, and what we can add/remove in the gateway to get this fixed.
Well, we ended up resolving this in a strange way.
We added two API Gateways for the Lambda:
- one with binary media types enabled
- one with them disabled
For:
getsomedoc - using the gateway where binary media types are enabled
somepost - using the other one
Wish there was a better way!
I have found this same behavior with my API. While looking everywhere for some help, I found a few things that address the issue:
Basically, this bug report says the problem is having CORS enabled while also using the generic Binary Media Type "*/*". Apparently the OPTIONS method gets confused by this. They discuss this in terms of using Serverless, but it should apply to using the console or other ways of interacting with AWS.
They link to a possible solution: you can modify the Integration Response of the OPTIONS method - change the Mapping Template's Content-Type to an actual binary media type, like image/jpeg. They say this allows you to leave the binary media type in Settings as "*/*". This is a little hacky, but at least it is something.
There was also this alternate suggestion in the issues section of this GitHub repo that is a little less hacky. You can set the content handling parameter of the OPTIONS Integration Request to "CONVERT_TO_TEXT"... but you can only do this via CloudFormation or the CLI (not via the console). This is also the solution recommended by some AWS technicians.
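If you go the CLI route, the call is roughly of this shape (the API and resource IDs are placeholders; please verify the exact parameter syntax against the current aws apigateway documentation):

aws apigateway update-integration \
    --rest-api-id abc123 \
    --resource-id def456 \
    --http-method OPTIONS \
    --patch-operations op=replace,path=/contentHandling,value=CONVERT_TO_TEXT

# Redeploy the stage afterwards so the change takes effect
aws apigateway create-deployment --rest-api-id abc123 --stage-name prod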
Another possible workaround is to set up a custom Lambda function to handle the OPTIONS request; this way the API Gateway can keep the "*/*" binary media type.
Create a new lambda function for handling OPTIONS requests:
exports.handler = async (event) => {
  const response = {
    statusCode: 200,
    headers: {
      'access-control-allow-origin': '*',
      'Access-Control-Allow-Headers': 'access-control-allow-origin, content-type, access-control-allow-methods',
      'Access-Control-Allow-Methods': "GET,POST,PUT,DELETE,OPTIONS"
    },
    body: JSON.stringify("OK")
  };
  return response;
};
In your API Gateway OPTIONS method, change the integration type from Mock to Lambda Function.
Make sure to check 'Use Lambda proxy integration'
Select the correct region and point to the created Lambda Function
This way any OPTIONS request made from the browser will trigger the Lambda function and return the custom response.
Be aware this solution might involve costs.

How to call an express.js handler from another handler

I'm building an isomorphic React application which uses Express.js on the server. The client app makes a number of AJAX requests to other Express handlers, which currently means the application makes multiple HTTP requests to itself.
As an optimisation I'd like to intercept requests I know the server handles and call them directly (thus avoiding the cost of leaving the application bounds). I've got as far as accessing the app's router to know which routes it handles; however, I'm struggling to find the best way to start a new request. So my question is:
How do I get Express to handle an HTTP request that comes from a programmatic source rather than the network?
I would suggest creating a common service and requiring it in both handlers. What I do is keep the business logic in services and create controllers which handle the request and call the specific services; this way you can use multiple services in the same controller, e.g.:
router.js
var clientController = require('../controllers/client-controller.js');

module.exports = function(router) {
  router.get('/clients', clientController.getAll);
};

client-controller.js
var clientService = require('../services/client-service.js');

function getAll(req, res) {
  clientService.getAll().then(function(data) {
    res.json(data);
  }, function(err) {
    res.json(err);
  });
}

module.exports.getAll = getAll;

client-service.js
function getAll() {
  // implementation
}

module.exports.getAll = getAll;
You can also use something like http://visionmedia.github.io/superagent/ to make HTTP calls from the controllers where needed.
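To tie this back to the original question: any other handler, for example the one doing the server-side render (the name and view used here are just illustrative), can require the same service and call it in-process instead of issuing an HTTP request back to the server:

render-controller.js
var clientService = require('../services/client-service.js');

function renderApp(req, res) {
  // Call the shared service directly instead of requesting /clients over HTTP.
  clientService.getAll().then(function(clients) {
    res.render('app', { initialData: { clients: clients } });
  }, function(err) {
    res.status(500).json(err);
  });
}

module.exports.renderApp = renderApp;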

How to use both express.io and passport.socketio authentication features globally

socket.io supports a single global authorization method with no middleware feature. Both express.io and passport.socketio depend on this feature as an injection point for their logic.
express.io attaches the Express session to the request, and passport.socketio attaches the inflated user object. How do I combine the two features elegantly?
The only way I found is grabbing the authorization callback of express.io from socket.io and wiring it to be passport.socketio's success callback:
app.io.configure(function() {
  var expressAuth = app.io.get('authorization');
  var sessionConfig = {
    ...
    success: expressAuth // callback on success
  };
  app.io.set('authorization', passportSocketIo.authorize(sessionConfig));
});
This works for me, but it's coupled to the order of the 'authorization' registrations. Any better ideas?