Looking for a better way to authenticate a Google Cloud Function with a service account. Right now I'm storing the credentials JSON file on the backend

I'm looking for a better way to authenticate a Google Cloud Function with a service account. Right now I'm storing the credentials JSON file on the backend. This is the code for my app: https://github.com/ChristianOConnor/spheron-react-api-stack. This app could be deployed on any hosting platform, but at the moment it is built to deploy on a Web3 protocol called Spheron. TL;DR, Spheron runs the backend Express server on a Web3-friendly content-serving/hosting platform called Akash. This means that whoever is hosting my backend Express server has access to my GCP service account's credentials. You can see all of the code at the link I provided, but for ease of access, this is the server.js file that will run on Akash.
server.js
var express = require("express");
var app = express();
require("dotenv").config();
const GoogleAuth = require("google-auth-library").GoogleAuth;
const cors = require("cors");

app.use(
  cors({ origin: process.env.ORIGIN, credentials: process.env.CREDENTIALS })
);

app.get("/hello", async function (req, res) {
  // Parse the service account key out of the environment variable.
  const keyInJsn = JSON.parse(process.env.CREDENTIALS_STR);
  const auth = new GoogleAuth({
    credentials: keyInJsn,
  });
  const url = process.env.RUN_APP_URL;
  // Create your client with an Identity token.
  const client = await auth.getIdTokenClient(url);
  const result = await client.request({ url });
  const resData = result.data;
  res.send(resData);
});

var server = app.listen(8081, function () {
  var port = server.address().port;
  console.log("Example app listening at http://localhost:" + port);
});
process.env.CREDENTIALS_STR contains the service account credentials, set up in this format:
CREDENTIALS_STR={"type": "service_account","project_id": "<PROJECT ID>","private_key_id": "<PRIVATE KEY ID>","private_key": "-----BEGIN PRIVATE KEY-----\n<PRIVATE KEY>\n-----END PRIVATE KEY-----\n","client_email": "<SERVICE ACCOUNT NAME>@<PROJECT NAME>.iam.gserviceaccount.com","client_id": "<CLIENT ID>","auth_uri": "https://accounts.google.com/o/oauth2/auth","token_uri": "https://oauth2.googleapis.com/token","auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs","client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/<SERVICE ACCOUNT NAME>.iam.gserviceaccount.com"}
The Akash provider can see this string. Is there a better way to do authentication for a GCP service account that doesn't expose the credentials to a hosting/server provider?
Also, don't be thrown off by the Web3 stuff. This app essentially works the same as a traditional Web2 app with a backend and a client. If it helps to think about it differently, picture that I'm deploying on Netlify with a static client and a Netlify Function.

The compromise I came to was creating an API Gateway for the function. This allows the function to be called without any credentials while still running as a service account. It creates a separate quasi-vulnerability, though: anyone with the API Gateway link can also call the function unauthenticated.
First, I enabled the Service Management API, API Gateway API, and Service Control API. Then I made an API Gateway with my service account that runs the referenced cloud function. I uploaded a file like this for the API spec:
swagger: '2.0'
info:
  title: api-gateway-cloud-function
  description: API Gateway Calling Cloud Function
  version: 1.0.0
schemes:
  - https
produces:
  - application/json
paths:
  /whateveryouwanttocallthispath:
    get:
      summary: My Cloud Function
      operationId: whatever
      x-google-backend:
        address: <CLOUD_RUN_URL>
      responses:
        '200':
          description: OK
You can test it by calling the function via curl in a bash terminal: curl {gatewayId}-{hash}.{region_code}.gateway.dev/v1/whateveryouwanttocallthispath. It works with no credentials JSON file.
The problem is that you could achieve a similar result by just allowing the function to be called unauthenticated... I don't know if this method has many benefits.

Related

How to validate a token that is sent by socket.io using passport? I am using the passport-azure-ad strategy

I have an application that is using the passport-azure-ad strategy to authenticate users. When the client sends a POST or GET request, I have a middleware that checks if the request is valid or not with passport.authenticate('oauth-bearer', { session: false })(req, res, next), and this works perfectly fine.
But on this same application, I am also using socket.io for uploading images. When the client tries to establish a socket connection with the server, it sends a token on the handshake like this: `const socket = io('http://localhost:3000', { auth: { token: 'eyhadjhad...' } })`. I have access to this token on the server side like this: const token = socket.handshake.auth.token. Now I am having trouble authenticating this token.
Is there a way I can add a middleware on namespaces like the one I have for routes? For example, something like this:
io.of('/fileUpload')
  .use((socket, next) =>
    passport.authenticate(token)
  )
  .on('connection', (socket) => {
    console.log('user authenticated, allow upload')
  })
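One way this is commonly approached (an untested sketch, not from the original question: it assumes the passport-azure-ad bearer strategy is registered as 'oauth-bearer' and reads the token from the Authorization header) is to wrap the handshake in a minimal req-like object and invoke passport.authenticate with a custom callback inside io.use:

io.of('/fileUpload').use((socket, next) => {
  // Fake request object: the bearer strategy looks for the token in the
  // Authorization header, so place the handshake token there (assumption).
  const req = {
    headers: { authorization: `Bearer ${socket.handshake.auth.token}` },
  };
  passport.authenticate('oauth-bearer', { session: false }, (err, user) => {
    if (err || !user) return next(new Error('unauthorized'));
    socket.user = user; // stash the authenticated user on the socket
    next();
  })(req, {}, next);
});

With a custom callback and session: false, passport should never touch the response object, which is why an empty object can stand in for res here.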

Octokit - how to authenticate as an app (JWT)

So I'm building a GitHub App, and I want to be able to authenticate as the app so I can make GraphQL calls as that user. I can authenticate as the app and get the JWT, but I can't seem to use the JWT. The code looks like:
const { Octokit } = require("@octokit/core");
const { createAppAuth } = require("@octokit/auth-app");
const fs = require('fs')

const auth = createAppAuth({
  appId: process.env.APP_ID,
  privateKey: fs.readFileSync(__dirname + "/../" + process.env.PRIVATE_KEY_PATH, "utf-8"),
  clientId: process.env.CLIENT_ID,
  clientSecret: process.env.WEBHOOK_SECRET
})

// Send requests as GitHub App
async function main() {
  const { token } = await auth({ type: "app" })
  console.log(token);
  const appOctokit = new Octokit({
    baseUrl: 'https://github.<company>.com/api/v3',
    auth: `token ${token}`
  });
  const { slug } = await appOctokit.request("GET /user");
  console.log("authenticated as %s", slug);
}

main().then().catch(err => {
  console.log(err.message)
  console.log(err.stack)
  console.log("oops")
})
I end up getting an HttpError: Bad Credentials.
What am I missing?
The reason for the bad credentials error is that you are trying to authenticate as the app for the GET /user request. That is a user-specific request, which requires an OAuth token.
Try sending GET /app instead; it should work.
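For illustration, a minimal untested sketch of that change (it also passes the raw JWT so Octokit can choose the bearer prefix itself, rather than forcing the token prefix as in the question):

const { token } = await auth({ type: "app" }); // short-lived app JWT
const appOctokit = new Octokit({
  baseUrl: "https://github.<company>.com/api/v3",
  auth: token, // raw JWT; Octokit sends JWTs with a "bearer" prefix
});
const { data: app } = await appOctokit.request("GET /app");
console.log("authenticated as %s", app.slug);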
If you do want to authenticate as a user, then there are two ways to receive an OAuth token through a GitHub App (GitHub calls these user-to-server tokens, because the token is authorized by both the app and the user):
OAuth Web flow
OAuth Device flow
For the Web Flow, see https://github.com/octokit/auth-app.js/#user-authentication-web-flow. You will need a server that can receive the HTTP redirect from GitHub. You can use the @octokit/app SDK, which exports a Node middleware for that and other OAuth-related use cases, as well as webhooks: https://github.com/octokit/app.js/#middlewares
For the OAuth Device Flow, see https://github.com/octokit/auth-app.js/#user-authentication-device-flow.
If you want to authenticate using the OAuth Device Flow without exposing the OAuth Client Secret, you can use the dedicated OAuth Device Flow authentication strategy: https://github.com/octokit/auth-oauth-device.js
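As a rough illustration of the Device Flow strategy (an untested sketch based on the linked README; the client ID placeholder is an assumption you'd fill in):

const { createOAuthDeviceAuth } = require("@octokit/auth-oauth-device");

async function getUserToken() {
  const auth = createOAuthDeviceAuth({
    clientType: "github-app",
    clientId: "<your GitHub App's client ID>", // not a secret, safe to ship
    onVerification(verification) {
      // Show the one-time code so the user can authorize the app in a browser.
      console.log("Open %s and enter code %s",
        verification.verification_uri, verification.user_code);
    },
  });
  const { token } = await auth({ type: "oauth" }); // user-to-server token
  return token;
}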

Using ambassador authservice to only require basic auth on some routes/urls (or services)

I want to activate the Ambassador auth service to only require authentication on certain routes/URLs. If you install the basic HTTP auth service, it requires this auth for all services by default. So how can I configure Ambassador or the auth service (a separate service with ExtAuth) to only require auth on certain routes/URLs?
Ambassador version 0.51.2
kubernetes version 1.14
auth service I am using as base: https://github.com/datawire/ambassador-auth-httpbasic
If you look at the server.js example in https://github.com/datawire/ambassador-auth-httpbasic, you'll see that it only authenticates /extauth/qotm/quote*. If you are using the same server.js as a starting point, you'll have to add another app.all section for whatever you want to authenticate. For example:
app.all('/extauth/myapp/myurl*', authenticate, function (req, res) {
  var session = req.headers['x-myapp-session']
  if (!session) {
    console.log(`creating x-myapp-session: ${req.id}`)
    session = req.id
    res.set('x-myapp-session', session)
  }
  console.log(`allowing MyApp request, session ${session}`)
  res.send('OK (authenticated)')
})
Or you can implement this server in a different language if you'd like.
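Conversely, a path handled without the authenticate middleware is waved through, which is what makes per-route auth possible here. A hypothetical sketch (the /extauth/myapp/public* path is made up for illustration):

// Hypothetical public route: no authenticate middleware, so the auth
// service answers 200 and Ambassador forwards these requests untouched.
app.all('/extauth/myapp/public*', function (req, res) {
  res.send('OK (no authentication required)')
})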

How to integrate HTTP Digest Auth into Strongloop's Loopback?

I'm relatively new to Strongloop's Loopback.
A project I'm working on requires HTTP Digest as its authentication scheme.
I have set up the ACL on the models (and endpoints). The SPA client uses REST to consume services.
I'm stuck on how to use HTTP digest auth (username:realm:password) / nonce instead of the plain username/password login.
I would still like to use token auth as well.
I'm currently looking at the following 3 projects:
loopback-component-auth
passport-http
loopback-component-passport
Any help would be appreciated! Thank you!
You can use Express middleware to configure HTTP authentication:
Use this Node module: http-auth
Create a digest-auth.js boot script in the server/boot folder:
var auth = require('http-auth');
var basic = auth.basic({
  realm: "<your authentication realm>",
  file: __dirname + "<path to your .htpasswd file>"
});

module.exports = function (app) {
  app.use(auth.connect(basic));
  // Setup route.
  app.get("/", (req, res) => {
    res.send("Secured resource access granted!");
  });
};
You can check the other options available in the "http-auth" module to use "username:realm:password" for authentication.
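Since the question asks for digest specifically, here is an untested sketch using the same module's digest mode (assuming an .htdigest file created with: htdigest -c users.htdigest "<realm>" <username>):

var auth = require('http-auth');

// .htdigest entries are stored as username:realm:md5(username:realm:password)
var digest = auth.digest({
  realm: "<your authentication realm>",
  file: __dirname + "/users.htdigest"
});

module.exports = function (app) {
  app.use(auth.connect(digest));
};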
Hope this helps!

How to use Firebase's email & password authentication method to connect with AWS to make Fine Uploader S3 work?

I decided to use Fine Uploader for my current AngularJS project (which is hosted on Firebase) because it has many of the core features I need in an uploader already built in. But I am having trouble understanding how to use Firebase's email & password authentication method to communicate with AWS (Amazon Web Services) and allow my users to use Fine Uploader S3 to upload content. Based on the Fine Uploader blog post Uploads without any server code, the workflow goes like this:
Authenticate your users with the help of an identity provider, such as Google
Use the temporary token from your ID provider to grab temporary access keys from AWS
Pass the keys on to Fine Uploader S3
Your users can now upload to your S3 bucket
The problem is that I won't be using OAuth 2.0 (which is used by Google, Facebook, or Amazon to provide user identities) to allow my users to sign in to my app and upload content. Instead, I will be using Firebase's email & password authentication.
So how can I make Firebase's email & password authentication method create a temporary token to grab temporary access keys from AWS, and pass those keys on to Fine Uploader S3 so my users can upload content to S3?
To connect AWS with an outside application, Cognito is going to be a good solution. It will let you generate an OpenID token using the AWS Node SDK and your secret keys on your backend, which you can then use with the AWS JavaScript SDK and WebIdentityCredentials in your client.
Note that I'm unfamiliar with your specific plugin/tool, but this much will at least get you the OpenID token, and in my work it does let me connect using WebIdentityCredentials, which I imagine is what they are using.
Configure Cognito on AWS
Setup on Cognito is fairly easy; it is more or less a walkthrough. It does involve configuring IAM rules on AWS, though. How to set this up is pretty project-specific, so I think I need to point you to the official resources. They recently made some nice updates, but I am admittedly not up to speed on all the changes.
Through the configuration, you will want to set up a 'developer authenticated identity', and take note of the 'identity pool id' and the IAM role ARN set up by Cognito.
Setup a Node Server that can handle incoming routes
There are a lot of materials out there on how to accomplish this, but be sure to include and configure the AWS SDK. I also recommend using body-parser, as it will make reading in your POST requests easier.
var express = require('express');
var bodyParser = require('body-parser');
var AWS = require('aws-sdk');

var app = express();
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());
Create POST Function to talk with Cognito
Once you have your server set up, you then reach out to Cognito using getOpenIdTokenForDeveloperIdentity. In my setup, I use authenticated users because I expect them to come back, and I want to be able to continue the associations; that is why I send in a user ID in req.body.UserIDFromAngularApp.
This is my function, using express.Router():
.post(function(req, res) {
  if (req.body.UserIDFromAngularApp) {
    var cognitoidentity = new AWS.CognitoIdentity();
    var params = {
      IdentityPoolId: 'your_cognito_identity_pool_id',
      Logins: {
        'your_developer_authenticated_identity_name': req.body.UserIDFromAngularApp
      }
    };
    cognitoidentity.getOpenIdTokenForDeveloperIdentity(params, function(err, data) {
      if (err) {
        console.log(err, err.stack);
        res.json({ failure: 'Connection failure' });
      } else {
        console.log(data); // so you can see your result server side
        res.json(data);    // send it back
      }
    });
  } else {
    res.json({ failure: 'Connection failure' });
  }
});
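The .post above hangs off an Express router. A hypothetical sketch of the surrounding mounting code (the /simpleapi/aws path and port 8888 are assumptions taken from the Angular factory below):

var router = express.Router();
router.route('/simpleapi/aws')
  .post(function(req, res) {
    // ... handler shown above ...
  });
app.use(router);
app.listen(8888);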
If all goes well, that will return an OpenID Token back to you. You can then return that back to your Angular application.
POST from Angular, Collect from Promise
At the very least, you need to POST to your new Node server and then collect the OpenID token out of the promise. Using this pattern, it will be found in data.Token.
It sounds like from there you may just need to pass that token on to your plugin/tool.
In case you need to handle authentication further, I have included code to handle the WebIdentityCredentials.
angular.module('yourApp').factory('AWSmaker', ['$http', function($http) {
  return {
    reachCognito: function(authData) {
      $http.post('http://localhost:8888/simpleapi/aws', {
        'UserIDFromAngularApp': authData.uid
      })
      .success(function(data, status, headers, config) {
        if (!data.failure) {
          var params = {
            RoleArn: your_role_arn_setup_by_cognito,
            WebIdentityToken: data.Token
          };
          AWS.config.credentials = new AWS.WebIdentityCredentials(params, function(err) {
            console.log(err, err.stack);
          });
        }
      });
    }
  };
}]);
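For completeness, a hypothetical usage sketch (the controller name is made up; it assumes a Firebase authData object with a uid is already available after sign-in):

angular.module('yourApp').controller('UploadCtrl', ['AWSmaker',
  function(AWSmaker) {
    // Kick off the Cognito exchange once the user has signed in.
    this.getAwsCredentials = function(authData) {
      AWSmaker.reachCognito(authData);
    };
  }
]);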
This should get you on your way. Let me know if I can help further.
Each OAuth provider has a slightly different way of handling things, so the attributes available in your Firebase authenticated token vary slightly by provider. For example, when using Facebook, the Facebook auth token is stored at facebook.accessToken in the returned user object:
var ref = new Firebase(URL);
ref.authWithOAuthPopup("facebook", function(error, authData) {
  if (authData) {
    // the access token for Facebook
    console.log(authData.facebook.accessToken);
  }
}, {
  scope: "email" // the permissions requested
});
All of this is covered in the User Authentication section of the Web Guide.