I am trying to create a route which will perform some CRUD operations on DynamoDB.
At a high level, it can be understood as:
The Node.js server application is running (i.e. the command 'node server.js' has been triggered).
The user uses Postman (e.g. the Chrome app) to make requests against the routes.
The user does a GET request for 'http://localhost:8080/listtablesofdynamodb'.
The route connected with this URL gets hit, which should do the DynamoDB-specific activity (like connecting to DynamoDB, fetching table names and returning them via the callback method).
The reason I am asking this question is that I could not find any relevant tutorial on how to do DynamoDB activity using Express in Node.js. All I could find are console applications on the AWS website, which do not seem useful for me.
Any kind of help is highly appreciated.
Access key required
All you need to do is create a DynamoDB object to connect to:
var ddb = require('dynamodb').ddb({ accessKeyId: '< your_access_key_id >', secretAccessKey: '< your_secret_access_key >' });
Put this under your require statements and start your server. Then you can just fill out the routes to do the CRUD operations you need.
To test it, use:
ddb.listTables({}, function(err, res) {console.log(res);});
This will list all the tables in your db.
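For example, a minimal Express route wrapping that call might look like this (a sketch; the route path and Express setup below are assumptions on my part, not part of the original answer):

// Hypothetical route reusing the ddb object defined above;
// the path matches the one from the question.
var express = require('express');
var app = express();

app.get('/listtablesofdynamodb', function (req, res) {
    ddb.listTables({}, function (err, tables) {
        if (err) return res.status(500).send(err);
        res.json(tables); // e.g. { TableNames: [...] }
    });
});

app.listen(8080);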
For the full source, check here.
Best of luck
Fortunately, I managed to use aws-sdk in my route. The solution has two stages:
Run the code on an EC2 instance in your AWS account and attach an IAM role that allows the EC2 instance to talk to DynamoDB. (This way you don't need to hard-code access keys in your code.) See this article.
You can take the code below as a reference for the initial scaffolding.
var express = require('express');
var router = express.Router();
var AWS = require("aws-sdk");

AWS.config.update({
    region: "us-west-2",
    endpoint: "dynamodb endpoint specific to your aws account"
});

var dynamodb = new AWS.DynamoDB();

var params = {
    ExclusiveStartTableName: "stringvalue",
    Limit: 10
};

/* GET users listing. */
router.get('/', function (req, res) {
    console.log("entered into dynadb route");
    dynamodb.listTables(params, function (err, data) {
        if (err) console.log(err, err.stack); // an error occurred
        else {
            res.send(data);
        }
    });
});

module.exports = router;
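The other CRUD operations follow the same pattern on the dynamodb object above. As a sketch (the table name, key attribute, and route path are illustrative assumptions, not part of the original answer), a create route using putItem could look like:

// Hypothetical POST route that creates an item. Assumes a table named
// "Users" with a string partition key "userId", and that a JSON body
// parser is applied so req.body is populated.
router.post('/users', function (req, res) {
    var putParams = {
        TableName: "Users",
        Item: {
            userId: { S: req.body.userId },
            name: { S: req.body.name }
        }
    };
    dynamodb.putItem(putParams, function (err, data) {
        if (err) res.status(500).send(err);
        else res.send(data);
    });
});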
I'm working on an application with RESTful API endpoints that needs proper authorization security using an RBAC system. So far, I've looked into Keycloak. It looked promising at first, but it doesn't support granular authorization control of an endpoint, which is a hard requirement. For example, if I have the endpoint /object/<object:id>, a list of object IDs [1,2,3,4] and a test user, there's no way to restrict the test user to only have access to object IDs [1,2] but not [3,4] for the same endpoint. It seems the user will have access to all the IDs or none. Perhaps this can be accomplished by customizing or extending the base Keycloak server, but there isn't enough documentation on the Keycloak website on how to do so.
I've done a search for other RBAC permissions systems but haven't been able to find much. Are there any authorization systems out there that can accomplish this?
but doesn't support granular authorization control of an endpoint
Check out Auth0's Fine Grained Authorization solution: https://docs.fga.dev/. (Disclaimer: I am employed by Auth0).
In your specific case you would need to create an authorization model like:
type object
  relations
    define reader as self
And then add the following tuples in the FGA store using the Write API:
(user:test, relation:reader, object:1)
(user:test, relation:reader, object:2)
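With the Node.js SDK, writing those tuples could look roughly like this (a sketch; it reuses the fgaClient instance configured in the code below, and the exact write shape may vary between SDK versions):

// Hypothetical sketch of adding the relationship tuples via the Write API.
await fgaClient.write({
    writes: {
        tuple_keys: [
            { user: 'user:test', relation: 'reader', object: 'object:1' },
            { user: 'user:test', relation: 'reader', object: 'object:2' }
        ]
    }
});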
Then, in your API, you would do something like this:
const { Auth0FgaApi } = require('@auth0/fga')
const express = require('express')

const app = express()

const fgaClient = new Auth0FgaApi({
    storeId: process.env.FGA_STORE_ID,           // Fill this in!
    clientId: process.env.FGA_CLIENT_ID,         // Fill this in!
    clientSecret: process.env.FGA_CLIENT_SECRET  // Fill this in!
});

app.get('/objects/:id', async (req, res) => {
    try {
        const { allowed } = await fgaClient.check({
            tuple_key: {
                user: req.query.user,
                relation: 'reader',
                object: "object:" + req.params.id
            }
        });

        if (!allowed) {
            res.status(403).send("Unauthorized!")
        } else {
            res.status(200).send("Authorized!")
        }
    } catch (error) {
        res.status(500).send(error)
    }
});

const port = 3000
app.listen(port, () => {
    console.log(`Example app listening on port ${port}`)
})
I have a local RxDB database and I want to connect it with CouchDB. Everything seems to work fine except for authentication. I have no idea how to add it differently than inserting the credentials in the database URL:
database.tasks.sync({
    remote: `http://${username}:${pass}@127.0.0.1:5984/tododb`,
});
I would like to use JWT auth but can't find how to add a token to sync request. I found only some solutions for PouchDB (pouchdb-authentication plugin) but can't get it working with RxDB.
RxDB is tightly coupled with PouchDB and uses its sync implementation under the hood. To my understanding, the only way to add custom headers to a remote PouchDB instance (which is what is created for you when you pass a URL as the remote argument in sync) is to intercept the HTTP request:
var db = new PouchDB('http://example.com/dbname', {
    fetch: function (url, opts) {
        opts.headers.set('X-Some-Special-Header', 'foo');
        return PouchDB.fetch(url, opts);
    }
});
PouchDB replication documentation (sync) also states that:
The remoteDB can either be a string or a PouchDB object. If you have a fetch override on a remote database, you will want to use PouchDB objects instead of strings, so that the options are used.
Luckily, RxDB's Rx.Collection.sync accepts not only a server URL as the remote argument, but also another RxCollection or a PouchDB instance.
RxDB even re-exports the internally used PouchDB module, so you do not have to install PouchDB as a direct dependency.
import { ..., PouchDB } from 'rxdb';

// ...

const remotePouch = new PouchDB('http://127.0.0.1:5984/tododb', {
    fetch: function (url, opts) {
        opts.headers.set('Authorization', `Bearer ${getYourJWTToken()}`)
        return PouchDB.fetch(url, opts);
    }
})

database.tasks.sync({
    remote: remotePouch,
});
I'm creating an API where I want to dynamically create multiple GraphQL endpoints according to the user's preferences, without restarting the server.
In general it is something very simple: the user will specify the GraphQL schema and the resolvers, and afterwards the user will push a button to generate a GraphQL endpoint, which will live under mydomain/randomString.
The only code that I can provide so far shows how I am creating a GraphQL endpoint in my Express.js app, which happens in one of the initial files that run when the Node.js server starts.
const fs = require('fs');
const {ApolloServer, gql} = require('apollo-server-express');

const typeDefs = gql(fs.readFileSync('./server/graphql/schema.graphql', {encoding: 'utf-8'}));
const resolvers = require('./server/graphql/resolvers');

const graphqlServer = new ApolloServer({
    typeDefs,
    resolvers,
    engine: {
        apiKey: "aRandomString"
    },
    introspection: true
    // context: ({req}) => ({user: req.user && db.users.get(req.user.sub)})
});
After the user generates the endpoint, he/she will be able to query against that GraphQL endpoint.
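For reference, Express allows routes to be registered while the server is running, so a generated endpoint could be mounted on demand. A minimal sketch, assuming an existing Express app instance and a hypothetical buildSchemaFromUserInput helper (neither appears in the original question, and this targets the apollo-server-express v2 API used above):

const crypto = require('crypto');

// Hypothetical route that generates a new GraphQL endpoint at runtime.
app.post('/generate-endpoint', (req, res) => {
    // Hypothetical helper that turns the user-supplied schema and
    // resolver definitions into typeDefs and resolvers.
    const { typeDefs, resolvers } = buildSchemaFromUserInput(req.body);

    const path = '/' + crypto.randomBytes(8).toString('hex');
    const server = new ApolloServer({ typeDefs, resolvers });

    // applyMiddleware registers the endpoint on the running app,
    // so no restart is needed.
    server.applyMiddleware({ app, path });

    res.json({ endpoint: path });
});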
As mentioned in another newbie question (Google Assistant - Account linking with Google Sign-In), I have an Express app which supports Google authentication and authorization via Passport, and now, with the help of @prisoner, my Google Action (which runs off the same Express app) supports Google login in this way: https://developers.google.com/actions/identity/google-sign-in.
My question now is: how can I use the various middleware that my Express app has as part of the Google Assistant intent fulfillments? A couple of examples:
1) I have an intent
// Handle the Dialogflow intent named 'ask_for_sign_in_confirmation'.
gapp.intent('Get Signin', (conv, params, signin) => {
    if (signin.status !== 'OK') {
        return conv.ask('You need to sign in before using the app.');
    }
    const payload = conv.user.profile.payload;
    console.log(payload);
    conv.ask(`I got your account details, ${payload.name}. What do you want to do next?`);
});
Now, just because the user is signed in to Google in my Action presumably doesn't mean that they have authenticated (via the Google Passport strategy) into my Express app generally? However, from the above I do have access to payload.email, which would enable me to use my site's Google login function
passportGoogle.authenticate('google', { scope: ['profile', 'email'] })
which essentially uses Mongoose to look for a user with the same details
User.findOne({ 'google.id': profile.id }, function(err, user) {
    if (err)
        return done(err);
    // if the user is found, then log them in
    if (user) {
        return done(null, user);
    ....
OK, I would need to modify it to check the value of payload.email against google.email in my DB. But how do I associate this functionality from the Express app with the intent fulfillment?
2) Given the above Get Signin intent, how could I execute an Express middleware just to console.log('hello world') for now? For example:
gapp.intent('Get Signin', (conv, params, signin) => {
    if (signin.status !== 'OK') {
        return conv.ask('You need to sign in before using the app.');
    }
    authController.assistantTest;
    const payload = conv.user.profile.payload;
    console.log(payload);
    conv.ask(`I got your account details, ${payload.name}. What do you want to do next?`);
});
Here authController.assistantTest; is
exports.assistantTest = (req, res) => {
    console.log('hello world');
};
Any help / links to docs really appreciated!
It looks like you're trying to add a piece of functionality that runs before your intent handler. In your case, it's comparing the user's email obtained via Sign In against what's stored in your database.
This is a good use case for the middleware feature of the Node.js client library (scroll down to the "Scaling with plugins and middleware" section). The middleware layer consists of a function you define that the client library automatically runs before the IntentHandler. Using a middleware layer lets you modify the Conversation instance and add additional functionality.
Applying this to your example gives:
gapp.middleware(conv => {
    // will print hello world before running the intent handler
    console.log('hello world');
});

gapp.intent('Get Signin', (conv, params, signin) => {
    if (signin.status !== 'OK') {
        return conv.ask('You need to sign in before using the app.');
    }
    const payload = conv.user.profile.payload;
    console.log(payload);
    conv.ask(`I got your account details, ${payload.name}. What do you want to do next?`);
});
You could perform the authentication logic in the middleware, and potentially utilize conv.data to keep track of whether the user's email matched a record in your database.
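A minimal sketch of that idea, assuming the Mongoose User model from the question, a google.email field in the user document, and that the client library accepts a promise-returning middleware (all assumptions on my part):

gapp.middleware(async (conv) => {
    // Skip the lookup if the user has not completed sign-in yet.
    if (!conv.user.profile.payload) return;

    // Look up the signed-in Google account in the site's database.
    const user = await User.findOne({ 'google.email': conv.user.profile.payload.email });

    // Stash the result in conv.data so intent handlers can use it.
    conv.data.knownUser = !!user;
});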
I decided to use Fine Uploader for my current AngularJS project (which is hosted on Firebase) because it already has many of the core features that I will need in an uploader built in. However, I am having trouble understanding how to use Firebase's email & password authentication method to communicate with AWS (Amazon Web Services) so that my users can use Fine Uploader S3 to upload content. Based on the Fine Uploader blog post Uploads without any server code, the workflow goes like this:
1. Authenticate your users with the help of an identity provider, such as Google
2. Use the temporary token from your ID provider to grab temporary access keys from AWS
3. Pass the keys on to Fine Uploader S3
4. Your users can now upload to your S3 bucket
The problem is that I won't be using OAuth 2.0 (which is used by Google, Facebook or Amazon to provide user identities) to allow my users to sign into my app and upload content. Instead I will be using Firebase's email & password authentication.
So how can I make Firebase's email & password authentication method create a temporary token to grab temporary access keys from AWS and pass those keys on to Fine Uploader S3 to allow my users to upload content to S3?
To connect AWS with an outside application, Cognito is going to be a good solution. It will let you generate an OpenID token using the AWS Node SDK and your secret keys in your backend, that you can then use with the AWS JavaScript SDK and WebIdentityCredentials in your client.
Note that I'm unfamiliar with your specific plugin/tool, but this much will at least get you the OpenID token, and in my work it does let me connect using WebIdentityCredentials, which I imagine is what they are using.
Configure Cognito on AWS
Setup on Cognito is fairly easy - it is more or less a walkthrough. It does involve configuring IAM rules on AWS, though. How to set this up is pretty project-specific, so I think I need to point you to the official resources. They recently made some nice updates, but I am admittedly not up to speed on all the changes.
Through the configuration, you will want to set up a 'developer authenticated identity', and take note of the 'identity pool id' and the IAM role ARN set up by Cognito.
Setup a Node Server that can handle incoming routes
There are a lot of materials out there on how to accomplish this, but you want to be sure to include and configure the AWS SDK. I also recommend using body-parser as it will make reading in your POST requests easier.
var express = require('express');
var bodyParser = require('body-parser');
var AWS = require('aws-sdk');

var app = express();
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());
Create POST Function to talk with Cognito
Once you have your server setup, you then reach out to Cognito using getOpenIdTokenForDeveloperIdentity. In my setup, I use authenticated users because I expect them to come back and want to be able to continue the associations, so that is why I send in a UserID in req.body.UserIDFromAngularApp.
This is my function using express.router().
.post(function(req, res) {
    if (req.body.UserIDFromAngularApp) {
        var cognitoidentity = new AWS.CognitoIdentity();
        var params = {
            IdentityPoolId: 'your_cognito_identity_pool_id',
            Logins: {
                'your_developer_authenticated_identity_name': req.body.UserIDFromAngularApp
            }
        };
        cognitoidentity.getOpenIdTokenForDeveloperIdentity(params, function(err, data) {
            if (err) { console.log(err, err.stack); res.json({failure: 'Connection failure'}); }
            else {
                console.log(data); // so you can see your result server side
                res.json(data);    // send it back
            }
        });
    }
    else { res.json({failure: 'Connection failure'}); }
});
If all goes well, that will return an OpenID Token back to you. You can then return that back to your Angular application.
POST from Angular, Collect from Promise
At the very least you need to post to your new node server and then collect the OpenID token out of the promise. Using this pattern, that will be found in data.Token.
It sounds like from there you may just need to pass that token on to your plugin/tool.
In case you need to handle authentication further, I have included code to handle the WebIdentityCredentials.
angular.module('yourApp').factory('AWSmaker', ['$http', function($http) {
    return {
        reachCognito: function(authData) {
            $http.post('http://localhost:8888/simpleapi/aws', {
                'UserIDFromAngularApp': authData.uid
            })
            .success(function(data, status, headers, config) {
                if (!data.failure) {
                    var params = {
                        RoleArn: your_role_arn_setup_by_cognito,
                        WebIdentityToken: data.Token
                    };
                    AWS.config.credentials = new AWS.WebIdentityCredentials(params, function(err) {
                        console.log(err, err.stack);
                    });
                }
            });
        }
    };
}]);
This should get you on your way. Let me know if I can help further.
Each OAuth provider has a slightly unique way of handling things, and so the attributes available in your Firebase authenticated token vary slightly based on provider. For example, when utilizing Facebook, the Facebook auth token is stored at facebook.accessToken in the returned user object:
var ref = new Firebase(URL);
ref.authWithOAuthPopup("facebook", function(error, authData) {
    if (authData) {
        // the access token for Facebook
        console.log(authData.facebook.accessToken);
    }
}, {
    scope: "email" // the permissions requested
});
All of this is covered in the User Authentication section of the Web Guide.