How to add a behavior to an existing AWS CloudFront distribution for an API Gateway using AWS CDK (TypeScript preferred)?

I am trying to implement a CDK project that will deploy a static website in an S3 bucket along with a CloudFront distribution. I also have an API Gateway that I need to access via the same CloudFront URL. I am able to do this from the AWS Management Console, but when I try to implement it using CDK, I get circular dependency errors.
const cdn = new cloudfront.Distribution(this, "websitecdn", {
  defaultBehavior: { origin: new origins.S3Origin(s3_bucket) },
});

const api = new apigw.RestApi(this, 'someapi', { defaultCorsPreflightOptions: enableCors });
const loginApi = api.root.addResource('login', { defaultCorsPreflightOptions: enableCors });
loginApi.addMethod(
  'POST',
  new apigw.LambdaIntegration(loginLambda, {
    proxy: false,
    integrationResponses: [LambdaIntegrationResponses],
  }),
  {
    methodResponses: [LambdaMethodResponses],
  }
);

const apiOrigin = new origins.RestApiOrigin(api);
cdn.addBehavior("/prod/*", apiOrigin, {
  allowedMethods: cloudfront.AllowedMethods.ALLOW_ALL,
  cachePolicy: cloudfront.CachePolicy.CACHING_DISABLED,
  viewerProtocolPolicy: ViewerProtocolPolicy.HTTPS_ONLY,
});
Everything works fine until I try to add the behavior for the API Gateway to the CDN. As soon as I add it, deployment starts throwing circular dependency errors.
What I am trying to do using AWS CDK (TypeScript):
deploy a static S3 website
create a CloudFront distribution for this website -> let's call it cdn_x
deploy the backend API (Lambda functions with API Gateway)
add the API Gateway URL as a behavior to cdn_x so that I can use the same URL for API calls as well (I do not have a custom domain)
I was expecting the deployment to go through fine, as I was able to do it in the AWS Management Console (the web UI of AWS), but trying to do the same using AWS CDK throws circular dependency errors.

It is unclear from your example how the stacks and resources in your CDK project are created and related. I'm unable to use your code examples.
In the meantime, I created a TypeScript example using multiple behaviors in CloudFront, with Amazon API Gateway under the /api/* path and an S3 bucket as the default behavior serving static assets at /*.
The final CDK structure uses multiple stacks:
cloudfront-stack.ts
rest-api-stack.ts
s3-stack.ts
waf-stack.ts
And resources are passed as references in bin/infra.ts
const app = new cdk.App();

const s3Stack = new S3Stack(app, "S3Stack");
const restApiStack = new RestApiStack(app, "RestApiStack");
const wafStack = new WafStack(app, "WafStack", {
  restApi: restApiStack.restApi,
});
const cloudFrontStack = new CloudFrontStack(app, "CloudFrontStack", {
  bucketAssets: s3Stack.bucketAssets,
  restApi: restApiStack.restApi,
  wafCloudFrontAclArn: wafStack.wafCloudFrontAclArn,
  wafRestApiOriginVerifyHeader: wafStack.wafRestApiOriginVerifyHeader,
  wafRestApiOriginVerifyHeaderValue: wafStack.wafRestApiOriginVerifyHeaderValue,
});
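Because the RestApi is owned by a different stack and only passed in as a prop, CloudFrontStack can add the API behavior without creating a cycle. A minimal sketch of what that stack might look like (the props interface and the WAF header wiring are my assumptions about the repo's internals, not code copied from it):

import * as cdk from "aws-cdk-lib";
import { Construct } from "constructs";
import * as cloudfront from "aws-cdk-lib/aws-cloudfront";
import * as origins from "aws-cdk-lib/aws-cloudfront-origins";
import * as s3 from "aws-cdk-lib/aws-s3";
import * as apigw from "aws-cdk-lib/aws-apigateway";

interface CloudFrontStackProps extends cdk.StackProps {
  bucketAssets: s3.Bucket;
  restApi: apigw.RestApi;
  wafCloudFrontAclArn: string;
  wafRestApiOriginVerifyHeader: string;
  wafRestApiOriginVerifyHeaderValue: string;
}

export class CloudFrontStack extends cdk.Stack {
  constructor(scope: Construct, id: string, props: CloudFrontStackProps) {
    super(scope, id, props);

    new cloudfront.Distribution(this, "Distribution", {
      // WAF web ACL created in WafStack
      webAclId: props.wafCloudFrontAclArn,
      // Default behavior: serve static assets from the S3 bucket
      defaultBehavior: { origin: new origins.S3Origin(props.bucketAssets) },
      additionalBehaviors: {
        // /api/* is routed to API Gateway; the custom header lets the WAF
        // on the REST API verify that traffic really came through CloudFront
        "/api/*": {
          origin: new origins.RestApiOrigin(props.restApi, {
            customHeaders: {
              [props.wafRestApiOriginVerifyHeader]: props.wafRestApiOriginVerifyHeaderValue,
            },
          }),
          allowedMethods: cloudfront.AllowedMethods.ALLOW_ALL,
          cachePolicy: cloudfront.CachePolicy.CACHING_DISABLED,
          viewerProtocolPolicy: cloudfront.ViewerProtocolPolicy.HTTPS_ONLY,
        },
      },
    });
  }
}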
GitHub repository:
https://github.com/oieduardorabelo/cdk-cloudfront-behavior-api-gateway-waf-protection
I trust the example above will clarify some of your questions.

Related

Looking for a better way to authenticate Google Cloud Function with a service account. Right now I'm storing the credentials json file on the backend

I'm looking for a better way to authenticate a Google Cloud Function with a service account. Right now I'm storing the credentials JSON file on the backend. This is the code for my app: https://github.com/ChristianOConnor/spheron-react-api-stack. This app could be deployed on any hosting platform, but at the moment it is built to deploy on a Web3 protocol called Spheron. TL;DR: Spheron runs the backend Express server on a Web3-friendly content serving/hosting platform called Akash. This means that whoever is hosting my backend Express server has access to my GCP service account's credentials. You can see all of the code at the link I provided, but for ease of access this is the server.js file which will be on Akash.
server.js
var express = require("express");
var app = express();
require("dotenv").config();
const GoogleAuth = require("google-auth-library").GoogleAuth;
const cors = require("cors");

app.use(
  cors({ origin: process.env.ORIGIN, credentials: process.env.CREDENTIALS })
);

app.get("/hello", async function (req, res) {
  const keyInJsn = JSON.parse(process.env.CREDENTIALS_STR);
  const auth = new GoogleAuth({
    credentials: keyInJsn,
  });
  const url = process.env.RUN_APP_URL;
  // Create your client with an Identity token.
  const client = await auth.getIdTokenClient(url);
  const result = await client.request({ url });
  const resData = result.data;
  res.send(resData);
});

var server = app.listen(8081, function () {
  var port = server.address().port;
  console.log("Example app listening at http://localhost:" + port);
});
process.env.CREDENTIALS_STR is the service account credentials set up in this format:
CREDENTIALS_STR={"type": "service_account","project_id": "<PROJECT ID>","private_key_id": "<PRIVATE KEY ID>","private_key": "-----BEGIN PRIVATE KEY-----\n<PRIVATE KEY>\n-----END PRIVATE KEY-----\n","client_email": "<SERVICE ACCOUNT NAME>@<PROJECT NAME>.iam.gserviceaccount.com","client_id": "<CLIENT ID>","auth_uri": "https://accounts.google.com/o/oauth2/auth","token_uri": "https://oauth2.googleapis.com/token","auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs","client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/<SERVICE ACCOUNT NAME>.iam.gserviceaccount.com"}
The Akash provider can see this string. Is there a better way to do authentication for a GCP service account that doesn't expose the credentials to a hosting/server provider?
Also, don't be thrown off by the Web3 stuff. This app essentially works the same as a traditional Web2 app with a backend and a client. If it helps you to think about it differently, picture that I'm deploying on Netlify with a static client and a Netlify Function.
The compromise I came to was creating an API Gateway for the function. This allows the function to be called without any credentials and still run from a service account. It creates a separate quasi-vulnerability though, as anyone with the API Gateway link can also call the function unauthenticated.
First, I enabled the Service Management API, API Gateway API, and Service Control API. Then I made an API Gateway with my service account that runs my referenced Cloud Function. I uploaded a file like this for the API spec:
swagger: '2.0'
info:
  title: api-gateway-cloud-function
  description: API Gateway Calling Cloud Function
  version: 1.0.0
schemes:
  - https
produces:
  - application/json
paths:
  /whateveryouwanttocallthispath:
    get:
      summary: My Cloud Function
      operationId: whatever
      x-google-backend:
        address: <CLOUD_RUN_URL>
      responses:
        '200':
          description: OK
You can test it by calling the function via a curl command in a bash terminal: curl {gatewayId}-{hash}.{region_code}.gateway.dev/v1/whateveryouwanttocallthispath. It works with no credentials JSON file.
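The same unauthenticated call works from application code as well; a quick sketch (the gateway hostname placeholders are as above, and this assumes a runtime with a global fetch, e.g. Node 18+):

// Sketch: call the API Gateway endpoint without any credentials.
// The gateway invokes the Cloud Function using its own service account.
const res = await fetch(
  "https://{gatewayId}-{hash}.{region_code}.gateway.dev/v1/whateveryouwanttocallthispath"
);
console.log(await res.json());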
The problem is that you could achieve a similar result by just allowing the function to be called unauthenticated... I don't know if this method has many benefits.

Google Cloud Storage report download no access

I am running Node to download the sales report from Google Cloud Storage.
I got the credentials.json file. Now the problem is that every time I run my application I get "xxxxxxx@gmail.com does not have storage.objects.get access to the Google Cloud Storage object".
Yes, this email is nowhere registered on Google Cloud Storage or given rights there, but it should work with the credentials alone, no?
The credentials are directly from Google Cloud Storage and contain this information:
client_secret, project_id, redirect_uri, client_id...
My sample code:
// Imports the Google Cloud client library.
const {Storage} = require('@google-cloud/storage');

const projectId = 'xxxxxxxxxxxxxxxxxx'
const key = 'credentials.json'
const bucketName = 'pubsite.......'
const destFileName = './test'
const fileName = 'salesreport_2020.zip'

// Creates a client
const storage = new Storage({projectId, key});

async function downloadFile() {
  const options = {
    destination: destFileName,
  };
  // Downloads the file
  await storage.bucket(bucketName).file(fileName).download(options);
  console.log(
    `gs://${bucketName}/${fileName} downloaded to ${destFileName}.`
  );
}

downloadFile().catch(console.error);
You are using the wrong type of credentials file.
Your code is written to use a service account JSON key file. You mention that the credentials file contains client_secret. That means you are trying to use OAuth 2.0 Client IDs.
Look in the file credentials.json. It should contain "type": "service_account". If you see {"installed": or {"web": at the start of the file, then you have the wrong credentials.
Creating and managing service account keys
Also, you are specifying the parameters wrong in the line:
const storage = new Storage({projectId, key});
Replace with:
const storage = new Storage({projectId: projectId, keyFilename: key});
Because you are seeing the random Gmail address, the storage client is likely using Application Default Credentials instead of the ones you intend. There are two paths forward:
Embrace Application Default Credentials. Remove the options you are passing to the Storage constructor, and instead set the GOOGLE_APPLICATION_CREDENTIALS environment variable to your JSON service account file.
Fix the Storage constructor to pass in credentials properly. The issue may be something as simple as needing to pass the full path to the credentials file (i.e. /a/b/c/credentials.json). Possibly the storage options are not being processed right; try being explicit, like:
const storage = new Storage({projectId: 'your-project-id', keyFilename: '/path/to/keyfile.json'});
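For the first option, the client code shrinks to almost nothing; a minimal sketch (the key file path is a placeholder):

// Run with: GOOGLE_APPLICATION_CREDENTIALS=/path/to/keyfile.json node app.js
// With no constructor options, the client falls back to Application Default
// Credentials and reads the service account key file from the env var.
const {Storage} = require('@google-cloud/storage');
const storage = new Storage();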

CRUD operations using DynamoDB with expressjs (node js)

I am trying to create a route which will perform some CRUD operations on DynamoDB.
At a high level, it can be understood as:
The Node.js server application is running (i.e. the command 'node server.js' has been triggered).
The user uses Postman in the Chrome browser to make route requests.
The user does a GET request for 'http://localhost:8080/listtablesofdynamodb'.
The specific route connected with this URL gets hit, which should do the DynamoDB-specific activity (like connecting to DynamoDB, fetching table names and showing them in the callback method).
The reason I am asking this question is that I could not find any relevant tutorial on how to work with DynamoDB from Express.js in Node. All I could find are console applications on the AWS website, which do not seem useful to me.
Any kind of help is highly appreciated.
Access keys are required. All you need to do is make a DynamoDB object to connect with:
var ddb = require('dynamodb').ddb({
  accessKeyId: '<your_access_key_id>',
  secretAccessKey: '<your_secret_access_key>'
});
Put this under your require statements and turn on your server. Then you can just fill out the routes to do the CRUD operations you need.
To test it, use:
ddb.listTables({}, function(err, res) {
  console.log(res);
});
This will list all the tables in your db.
For the full source, check here.
Best of luck.
Fortunately, I managed to use aws-sdk in my route. The solution has two stages:
Run the code on an EC2 instance in your AWS account and attach an IAM role which allows the EC2 instance to talk to DynamoDB (this way you don't need to hard-code access keys in your code); see this article.
Take the code below as a reference for initial scaffolding.
var express = require('express');
var router = express.Router();
var AWS = require("aws-sdk");

AWS.config.update({
  region: "us-west-2",
  endpoint: "dynamodb endpoint specific to your aws account"
});

var dynamodb = new AWS.DynamoDB();

var params = {
  ExclusiveStartTableName: "stringvalue",
  Limit: 10
};

/* GET users listing. */
router.get('/', function (req, res) {
  console.log("entered into dynadb route");
  dynamodb.listTables(params, function (err, data) {
    if (err) console.log(err, err.stack); // an error occurred
    else {
      res.send(data);
    }
  });
});

module.exports = router;
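Since the question is about CRUD, here is a hedged sketch of what a create/read pair could look like on top of that scaffolding, using the DocumentClient from the same aws-sdk (the "Users" table, its "id" key, and the route paths are hypothetical):

// Sketch only: assumes a hypothetical "Users" table with a string key "id",
// and JSON body parsing middleware (e.g. app.use(express.json())) upstream.
var docClient = new AWS.DynamoDB.DocumentClient();

// Create: POST /users
router.post('/users', function (req, res) {
  var putParams = {
    TableName: "Users",
    Item: { id: req.body.id, name: req.body.name }
  };
  docClient.put(putParams, function (err) {
    if (err) res.status(500).send(err);
    else res.send({ created: true });
  });
});

// Read: GET /users/:id
router.get('/users/:id', function (req, res) {
  var getParams = {
    TableName: "Users",
    Key: { id: req.params.id }
  };
  docClient.get(getParams, function (err, data) {
    if (err) res.status(500).send(err);
    else res.send(data.Item);
  });
});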

What is the proper way to set up API endpoints for usage with Keystone?

It's not clear in the docs how one would use existing Keystone models to expose API endpoints that return JSON within a Keystone.js app. I would simply like to be able to expose REST API endpoints with Keystone and use the Keystone CMS capabilities to manage content via interacting with those endpoints. Thanks!
Now that they've standardized the admin API, I found that it's pretty trivial to use the same methods. For the read-only APIs that power my React app, I've put something like this in my routes/index.js:
router.get(
  '/api/:list/:format(export.csv|export.json)',
  middleware.initList,
  require('keystone/admin/server/api/list/download')
);
And I've made my own version of the admin initList middleware:
exports.initList = function(req, res, next) {
  console.log('req.keystone', req.keystone);
  req.keystone = keystone;
  req.list = keystone.list(req.params.list);
  if (!req.list) {
    if (req.headers.accept === 'application/json') {
      return res.status(404).json({ error: 'invalid list path' });
    }
    req.flash('error', 'List ' + req.params.list + ' could not be found.');
  }
  next();
};
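With that route mounted, an export can be fetched like any other JSON endpoint; a quick smoke test sketch (the 'posts' list name and port are hypothetical, and this assumes a runtime with a global fetch):

// Sketch: fetch a list export from the route above.
const res = await fetch('http://localhost:3000/api/posts/export.json', {
  headers: { accept: 'application/json' },
});
console.log(await res.json());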
You may consider using:
restful-keystone by @creynders, or
keystone-rest by @danielpquinn
I've never actually used either of these because I have my own implementation, which I will open source once Keystone implements its plugin architecture (see Keystone Issue #912: Proposed Keystone Package Architecture).
I suspect many other similar modules will start surfacing once Keystone is more "plugin friendly".

How to use Firebase's email & password authentication method to connect with AWS to make Fine Uploader S3 work?

I decided to use Fine Uploader for my current AngularJS project (which is connected to Firebase) because it already has many core features that I will need in an uploader built in. But I am having trouble understanding how to use Firebase's email & password authentication method to communicate with AWS (Amazon Web Services) to allow my users to use Fine Uploader S3 to upload content. Based on the Fine Uploader blog post Uploads without any server code, the workflow goes like this:
Authenticate your users with the help of an identity provider, such as Google
Use the temporary token from your ID provider to grab temporary access keys from AWS
Pass the keys on to Fine Uploader S3
Your users can now upload to your S3 bucket
The problem is that I won't be using OAuth 2.0 (which is used by Google, Facebook or Amazon to provide user identities) to allow my users to sign into my app and upload content. Instead I will be using Firebase's email & password authentication.
So how can I make Firebase's email & password authentication method create a temporary token to grab temporary access keys from AWS and pass those keys on to Fine Uploader S3 to allow my users to upload content to S3?
To connect AWS with an outside application, Cognito is going to be a good solution. It will let you generate an OpenID token using the AWS Node SDK and your secret keys in your backend, which you can then use with the AWS JavaScript SDK and WebIdentityCredentials in your client.
Note that I'm unfamiliar with your specific plugin/tool, but this much will at least get you the OpenID token, and in my work it does let me connect using WebIdentityCredentials, which I imagine is what they are using.
Configure Cognito on AWS
Setup on Cognito is fairly easy - it is more or less a walkthrough. It does involve configuring IAM rules on AWS, though. How to set this up is pretty project specific, so I think I need to point you to the official resources. They recently made some nice updates, but I am admittedly not up to speed on all the changes.
Through the configuration, you will want to set up a 'developer authenticated identity', and take note of the 'identity pool id' and the IAM role ARN set up by Cognito.
Setup a Node Server that can handle incoming routes
There are a lot of materials out there on how to accomplish this, but you want to be sure to include and configure the AWS SDK. I also recommend using body-parser as it will make reading in your POST requests easier.
var express = require('express');
var bodyParser = require('body-parser');
var AWS = require('aws-sdk');

var app = express();
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());
Create POST Function to talk with Cognito
Once you have your server setup, you then reach out to Cognito using getOpenIdTokenForDeveloperIdentity. In my setup, I use authenticated users because I expect them to come back and want to be able to continue the associations, so that is why I send in a UserID in req.body.UserIDFromAngularApp.
This is my function using express.router().
.post(function(req, res) {
  if (req.body.UserIDFromAngularApp) {
    var cognitoidentity = new AWS.CognitoIdentity();
    var params = {
      IdentityPoolId: 'your_cognito_identity_pool_id',
      Logins: {
        'your_developer_authenticated_identity_name': req.body.UserIDFromAngularApp
      }
    };
    cognitoidentity.getOpenIdTokenForDeveloperIdentity(params, function(err, data) {
      if (err) {
        console.log(err, err.stack);
        res.json({ failure: 'Connection failure' });
      }
      else {
        console.log(data); // so you can see your result server side
        res.json(data); // send it back
      }
    });
  }
  else {
    res.json({ failure: 'Connection failure' });
  }
});
If all goes well, that will return an OpenID Token back to you. You can then return that back to your Angular application.
POST from Angular, Collect from Promise
At the very least you need to POST to your new Node server and then collect the OpenID token out of the promise. Using this pattern, it will be found in data.Token.
It sounds like from there you may just need to pass that token on to your plugin/tool.
In case you need to handle authentication further, I have included code to handle the WebIdentityCredentials.
angular.module('yourApp').factory('AWSmaker', ['$http', function($http) {
  return {
    reachCognito: function(authData) {
      $http.post('http://localhost:8888/simpleapi/aws', {
        'UserIDFromAngularApp': authData.uid,
      })
      .success(function(data, status, headers, config) {
        if (!data.failure) {
          var params = {
            RoleArn: your_role_arn_setup_by_cognito,
            WebIdentityToken: data.Token
          };
          AWS.config.credentials = new AWS.WebIdentityCredentials(params, function(err) {
            console.log(err, err.stack);
          });
        }
      });
    }
  };
}]);
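A hedged sketch of consuming that factory from a controller (the controller name and the shape of authData are my assumptions):

// Sketch: authData comes from your Firebase auth callback; its uid is
// forwarded to the Node server, which exchanges it for a Cognito token.
angular.module('yourApp').controller('UploadCtrl', ['AWSmaker', function(AWSmaker) {
  AWSmaker.reachCognito({ uid: 'firebase-user-uid' });
}]);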
This should get you on your way. Let me know if I can help further.
Each OAuth provider has a slightly unique way of handling things, and so the attributes available in your Firebase authenticated token vary slightly based on provider. For example, when utilizing Facebook, the Facebook auth token is stored at facebook.accessToken in the returned user object:
var ref = new Firebase(URL);
ref.authWithOAuthPopup("facebook", function(error, authData) {
  if (authData) {
    // the access token for Facebook
    console.log(authData.facebook.accessToken);
  }
}, {
  scope: "email" // the permissions requested
});
All of this is covered in the User Authentication section of the Web Guide.
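For email & password users specifically, the legacy Firebase API exposes the Firebase JWT on the returned authData rather than a third-party access token; a minimal sketch (the URL and credentials are placeholders):

var ref = new Firebase(URL);
ref.authWithPassword({
  email: "user@example.com",
  password: "hunter2"
}, function(error, authData) {
  if (authData) {
    // authData.token is the Firebase JWT; there is no provider
    // accessToken for the password provider
    console.log(authData.token);
  }
});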