I'm relatively new to Strongloop's Loopback.
A project I'm working on requires HTTP Digest as the authentication scheme.
I have set up the ACL on the models (and endpoints). An SPA client uses REST to consume the services.
I'm stuck on how to use HTTP digest auth (username:realm:password) with a nonce instead of the plain username/password login.
I would still like to use token auth as well.
I'm currently looking at the following three projects:
loopback-component-auth
passport-http
loopback-component-passport
Any help would be appreciated! Thank you!
You can use Express Middleware to configure HTTP authentication:
Use this node module: http-auth
Create a digest-auth.js boot script in the server/boot folder:
var auth = require('http-auth');

var basic = auth.basic({
  realm: "<your authentication realm>",
  file: __dirname + "/<path to your .htpasswd file>"
});

module.exports = function (app) {
  app.use(auth.connect(basic));

  // Setup route.
  app.get("/", (req, res) => {
    res.send("Secured resource access granted!");
  });
};
You can check the other options available in the http-auth module to use username:realm:password (digest) authentication.
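For example, since the question asks for digest specifically, the same module exposes auth.digest, which reads an .htdigest file (lines of the form username:realm:md5(username:realm:password)) and handles the nonce exchange for you. A minimal sketch, with the file path as a placeholder:

var auth = require('http-auth');

// Digest variant: the module takes care of nonce generation/validation
var digest = auth.digest({
  realm: "<your authentication realm>",
  file: __dirname + "/users.htdigest" // placeholder path to your .htdigest file
});

module.exports = function (app) {
  app.use(auth.connect(digest));
};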
Hope this helps!
I'm looking for a better way to authenticate a Google Cloud Function with a service account. Right now I'm storing the credentials json file on the backend. This is the code for my app: https://github.com/ChristianOConnor/spheron-react-api-stack. This app could be deployed on any hosting platform, but at the moment it is built to deploy on a Web3 protocol called Spheron. TLDR, Spheron runs the backend express server on a web3-friendly content serving/hosting platform called Akash. This means that whoever is hosting my backend express server has access to my GCP service account's credentials. You can see all of the code at the link I provided, but for ease of access this is the server.js file that will be on Akash.
server.js
var express = require("express");
var app = express();
require("dotenv").config();
const GoogleAuth = require("google-auth-library").GoogleAuth;
const cors = require("cors");

app.use(
  cors({ origin: process.env.ORIGIN, credentials: process.env.CREDENTIALS })
);

app.get("/hello", async function (req, res) {
  // Parse the service account credentials out of the environment
  const keyInJsn = JSON.parse(process.env.CREDENTIALS_STR);
  const auth = new GoogleAuth({
    credentials: keyInJsn,
  });
  const url = process.env.RUN_APP_URL;
  // Create your client with an Identity token.
  const client = await auth.getIdTokenClient(url);
  const result = await client.request({ url });
  const resData = result.data;
  res.send(resData);
});

var server = app.listen(8081, function () {
  var port = server.address().port;
  console.log("Example app listening at http://localhost:" + port);
});
process.env.CREDENTIALS_STR contains the service account credentials, set up in this format:
CREDENTIALS_STR={"type": "service_account","project_id": "<PROJECT ID>","private_key_id": "<PRIVATE KEY ID>","private_key": "-----BEGIN PRIVATE KEY-----\n<PRIVATE KEY>\n-----END PRIVATE KEY-----\n","client_email": "<SERVICE ACCOUNT NAME>@<PROJECT NAME>.iam.gserviceaccount.com","client_id": "<CLIENT ID>","auth_uri": "https://accounts.google.com/o/oauth2/auth","token_uri": "https://oauth2.googleapis.com/token","auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs","client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/<SERVICE ACCOUNT NAME>.iam.gserviceaccount.com"}
The Akash provider can see this string. Is there a better way to do authentication for a GCP service account that doesn't expose the credentials to a hosting/server provider?
Also, don't be thrown off by the web3 stuff. This app essentially works the same as a traditional web2 app with a backend and a client. If it helps you to think about it differently, picture that I'm deploying on Netlify with a static client and a Netlify Function.
The compromise I came to was creating an API Gateway for the function. This allows the function to be called without any credentials and still run from a service account. It creates a separate quasi-vulnerability though, as anyone with the API Gateway link can also call the function unauthenticated.
First, I enabled Service Management APIs, API Gateway API, and Service Control API. Then I made an API Gateway with my service account that runs my referenced cloud function. I uploaded a file like this for the api spec:
swagger: '2.0'
info:
  title: api-gateway-cloud-function
  description: API Gateway Calling Cloud Function
  version: 1.0.0
schemes:
  - https
produces:
  - application/json
paths:
  /whateveryouwanttocallthispath:
    get:
      summary: My Cloud Function
      operationId: whatever
      x-google-backend:
        address: <CLOUD_RUN_URL>
      responses:
        '200':
          description: OK
You can test it by calling the function via a curl command in a bash terminal: curl https://{gatewayId}-{hash}.{region_code}.gateway.dev/v1/whateveryouwanttocallthispath. It works with no credential json file.
The problem is that you could achieve a similar result by just allowing the function to be called unauthenticated... Idk if this method has many benefits.
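One partial mitigation, if the open gateway URL is a concern: API Gateway can also require an API key, declared as a security definition in the same spec. This is a sketch based on Google's API key support (the key is sent as a ?key= query parameter); verify the exact fields against the current docs:

securityDefinitions:
  api_key:
    type: apiKey
    name: key
    in: query
security:
  - api_key: []

An API key only identifies the caller rather than strongly authenticating it, so this narrows the exposure instead of eliminating it.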
I need to secure a web page with a token stored in a cookie or URL param. All the examples I can find for using forwardAuth middleware seem to be for securing an API, since it's easy to supply headers in an API request. Sending custom headers isn't an option with a browser, so I need to use cookies.
I would like to have the auth token passed in through a query string arg, e.g. ?token=ABCDEFG, then stored in a cookie for future requests. Here's what the workflow looks like: the first request carries the token in the query string, the auth service validates it, sets it as a cookie, and redirects to the same URL with the token stripped; every request after that is authenticated by the cookie.
I've tried experimenting with forwardAuth to see how I can do this. The auth endpoint reads the Authorization header, but I need something that reads the cookie in the request and transforms that to an Authorization header.
Is there any way this can be done with Traefik?
It looks like the answer is yes. Originally I had thought Traefik wouldn't forward cookies, but it does in fact forward them.
I ended up creating a "sidecar" auth container on the same host as traefik so that auth requests would be faster.
The auth function looks like this (node/express):
// Assumes cookie-parser is registered (app.use(require('cookie-parser')()))
// so that req.cookies is populated; BASE_DOMAIN and logger are configured elsewhere.
app.get('/auth', (req, res) => {
  logger.info('CHECKING AUTH');
  // Rebuild the URL Traefik is forwarding auth for from the X-Forwarded-* headers
  const url = new URL(`${req.headers['x-forwarded-proto']}://` +
    `${req.headers['x-forwarded-host']}` +
    `${req.headers['x-forwarded-uri']}`);
  const urlAuthToken = url.searchParams.get('token');
  if (urlAuthToken) {
    // Token arrived on the query string: store it in a cookie and
    // redirect to the same URL with the token stripped
    url.searchParams.delete('token');
    const domain = BASE_DOMAIN;
    const sameSite = false;
    const secure = url.protocol === 'https:';
    return res
      .cookie('auth-token', urlAuthToken, {domain, sameSite, secure})
      .redirect(url.toString());
  }
  // Simulate credentials check
  if (req.cookies['auth-token'] === 'my-little-secret') {
    return res.status(200).send();
  }
  return res.status(401).send('<h1>401: Unauthorized</h1>');
});
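For reference, the Traefik side is a standard forwardAuth middleware pointed at the sidecar. A sketch of the v2 dynamic configuration, where auth-sidecar, the router rule, and my-app-svc are placeholders for your own setup:

http:
  middlewares:
    cookie-auth:
      forwardAuth:
        address: "http://auth-sidecar:3000/auth"
  routers:
    my-app:
      rule: "Host(`app.example.com`)"
      middlewares:
        - cookie-auth
      service: my-app-svc

Traefik forwards the client's Cookie header to the address above, which is what lets the /auth handler read req.cookies.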
I want to configure the Ambassador authservice to only require authentication on certain routes/URLs. If you install the basic HTTP auth service, it requires this auth for all services by default. So how can I configure Ambassador, or the auth service (a separate service with ExAuth), to only require auth on certain routes/URLs?
Ambassador version 0.51.2
kubernetes version 1.14
auth service I am using as base: https://github.com/datawire/ambassador-auth-httpbasic
If you look at the server.js example in https://github.com/datawire/ambassador-auth-httpbasic you'll see that it only authenticates /extauth/qotm/quote*. If you are using the same server.js as a starting point, you'll have to add another app.all section for whatever you want to authenticate. For example:
app.all('/extauth/myapp/myurl*', authenticate, function (req, res) {
  var session = req.headers['x-myapp-session']
  if (!session) {
    console.log(`creating x-myapp-session: ${req.id}`)
    session = req.id
    res.set('x-myapp-session', session)
  }
  console.log(`allowing MyApp request, session ${session}`)
  res.send('OK (authenticated)')
})
Or you can implement this server in a different language if you'd like to.
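For completeness, the authenticate argument in the snippet above is the HTTP Basic check middleware from that sample repo. A rough sketch of what such a middleware can look like, with hard-coded placeholder credentials (a real service would check a user store):

const basicAuth = require('basic-auth') // parses the Authorization: Basic header

function authenticate (req, res, next) {
  const creds = basicAuth(req)
  // Placeholder check; replace with a lookup against your user store
  if (!creds || creds.name !== 'username' || creds.pass !== 'password') {
    res.set('WWW-Authenticate', 'Basic realm="Authentication Required"')
    return res.status(401).send('Unauthorized')
  }
  next()
}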
I am building an application using hapi.js. The clients of this application are going to be either a web application, where authentication is via a JWT in a cookie, or OAuth2 clients, which are going to be sending the Bearer key header.
Is there some way that the framework allows using both schemes for the same route? I want authentication to fail if both schemes fail, but pass if either of them goes through.
Look at http://hapijs.com/api#route-options under auth.strategies. This will allow you to set multiple strategies for your route. You can define the behaviour with auth.mode.
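For instance, here is a route sketch assuming strategies named 'session' (the JWT cookie) and 'bearer' have already been registered; with multiple strategies, hapi tries them in order and the request passes as soon as one succeeds:

server.route({
  method: 'GET',
  path: '/private',
  config: {
    auth: {
      strategies: [ 'session', 'bearer' ], // tried in order; either passing is enough
      mode: 'required'
    },
    handler: function (request, reply) {
      reply('authenticated')
    }
  }
})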
hapi supports multiple authentication strategies for a route. Register the individual plugins for authentication and set the default auth scheme afterwards.
var Hapi = require('hapi')
var BasicAuth = require('hapi-auth-basic')
var CookieAuth = require('hapi-auth-cookie')

// create new server instance
var server = new Hapi.Server()

// register plugins to server instance
server.register([ BasicAuth, CookieAuth ], function (err) {
  if (err) {…}

  server.auth.strategy('simple', 'basic', { validateFunc: basicValidationFn })
  server.auth.strategy('session', 'cookie', { password: '…' })

  server.auth.default('simple')
})
Each authentication scheme may require dedicated configuration (like a cookie password, a validation function, etc.) that you need to provide.
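As an illustration, a hypothetical validateFunc for the basic strategy above; the users store and bcrypt password hashes are placeholders, and the exact signature depends on your hapi-auth-basic version:

var Bcrypt = require('bcrypt')

// 'users' stands in for your real user store
function basicValidationFn (request, username, password, callback) {
  var user = users[username]
  if (!user) {
    return callback(null, false)
  }
  Bcrypt.compare(password, user.passwordHash, function (err, isValid) {
    callback(err, isValid, { id: user.id, name: user.name })
  })
}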
I decided to use Fine Uploader for my current AngularJS project (which is connected to and hosted on Firebase) because it already has many of the core features I need built in, but I am having trouble understanding how to use Firebase's email & password authentication method to communicate with AWS (Amazon Web Services) to allow my users to use Fine Uploader S3 to upload content. Based on the Fine Uploader blog post Uploads without any server code, the workflow goes like this:
Authenticate your users with the help of an identity provider, such as Google
Use the temporary token from your ID provider to grab temporary access keys from AWS
Pass the keys on to Fine Uploader S3
Your users can now upload to your S3 bucket
The problem is that I won't be using OAuth 2.0 (which is used by Google, Facebook or Amazon to provide user identities) to allow my users to sign into my app and upload content. Instead I will be using Firebase's email & password authentication.
So how can I make Firebase's email & password authentication method create a temporary token to grab temporary access keys from AWS and pass those keys on to Fine Uploader S3 to allow my users to upload content to S3?
To connect AWS with an outside application, Cognito is going to be a good solution. It will let you generate an OpenID token using the AWS Node SDK and your secret keys on your backend, which you can then use with the AWS JavaScript SDK and WebIdentityCredentials in your client.
Note that I'm unfamiliar with your specific plugin/tool, but this much will at least get you the OpenID and in my work it does let me connect using WebIdentityCredentials, which I imagine is what they are using.
Configure Cognito on AWS
Setup on Cognito is fairly easy - it is more or less a walkthrough. It does involve configuring IAM rules on AWS, though. How to set this up is pretty project specific, so I think I need to point you to the official resources. They recently made some nice updates, but I am admittedly not up to speed on all the changes.
Through the configuration, you will want to set up a 'developer authenticated identity' and take note of the 'identity pool id' and the IAM role ARN set up by Cognito.
Set up a Node server that can handle incoming routes
There are a lot of materials out there on how to accomplish this, but you want to be sure to include and configure the AWS SDK. I also recommend using body-parser as it will make reading in your POST requests easier.
var express = require('express');
var bodyParser = require('body-parser');
var AWS = require('aws-sdk'); // reads credentials/region from your environment or config

var app = express();
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());
Create POST Function to talk with Cognito
Once you have your server set up, you then reach out to Cognito using getOpenIdTokenForDeveloperIdentity. In my setup, I use authenticated users because I expect them to come back, and I want to be able to continue the associations, which is why I send in a UserID in req.body.UserIDFromAngularApp.
This is my function using express.Router().
.post(function(req, res) {
  if (req.body.UserIDFromAngularApp) {
    var cognitoidentity = new AWS.CognitoIdentity();
    var params = {
      IdentityPoolId: 'your_cognito_identity_pool_id',
      Logins: {
        'your_developer_authenticated_identity_name': req.body.UserIDFromAngularApp
      }
    };
    cognitoidentity.getOpenIdTokenForDeveloperIdentity(params, function(err, data) {
      if (err) { console.log(err, err.stack); res.json({failure: 'Connection failure'}); }
      else {
        console.log(data); // so you can see your result server side
        res.json(data); // send it back
      }
    });
  }
  else { res.json({failure: 'Connection failure'}); }
});
If all goes well, that will return an OpenID Token back to you. You can then return that back to your Angular application.
POST from Angular, Collect from Promise
At the very least you need to post to your new node server and then collect the OpenID token out of the promise. Using this pattern, that will be found in data.Token.
It sounds like from there you may just need to pass that token on to your plugin/tool.
In case you need to handle authentication further, I have included code to handle the WebIdentityCredentials.
angular.module('yourApp').factory('AWSmaker', ['$http', function($http) {
  return {
    reachCognito: function(authData) {
      $http.post('http://localhost:8888/simpleapi/aws', {
        'UserIDFromAngularApp': authData.uid,
      })
      .success(function(data, status, headers, config) {
        if (!data.failure) {
          var params = {
            RoleArn: your_role_arn_setup_by_cognito,
            WebIdentityToken: data.Token
          };
          AWS.config.credentials = new AWS.WebIdentityCredentials(params, function(err) {
            console.log(err, err.stack);
          });
        }
      });
    }
  };
}]);
This should get you on your way. Let me know if I can help further.
Each OAuth provider handles things a little differently, so the attributes available on your Firebase authenticated token vary slightly by provider. For example, when utilizing Facebook, the Facebook auth token is stored at facebook.accessToken in the returned user object:
var ref = new Firebase(URL);
ref.authWithOAuthPopup("facebook", function(error, authData) {
  if (authData) {
    // the access token for Facebook
    console.log(authData.facebook.accessToken);
  }
}, {
  scope: "email" // the permissions requested
});
All of this is covered in the User Authentication section of the Web Guide.
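Since the question is about email & password authentication rather than an OAuth provider: the same legacy SDK exposes authWithPassword, and the Firebase JWT for the session lives at authData.token. A minimal sketch with placeholder credentials:

var ref = new Firebase(URL);
ref.authWithPassword({
  email: "user@example.com", // placeholder credentials
  password: "hunter2"
}, function(error, authData) {
  if (authData) {
    // the Firebase auth token (JWT) for this session
    console.log(authData.token);
  }
});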