Authentication of API Gateway methods using Cognito? - amazon-cognito

I have created an API in API Gateway named “Test” which has 2 methods – add and delete.
Criteria:
Create 2 users in Cognito, with the 1st user having access to both methods and the 2nd user having access only to “add”.
Can anyone help me with this? Thanks in advance.

Check out the docs on using groups with API Gateway:
You can use groups to create a collection of users in a user pool, which is often done to set the permissions for those users. For example, you can create separate groups for users who are readers, contributors, and editors of your website and app. Using the IAM role associated with a group, you can also set different permissions for those different groups so that only contributors can put content into Amazon S3 and only editors can publish content through an API in Amazon API Gateway.
You should be able to set permissions on the groups such that one has access to both API endpoints and the other only to the “add” endpoint.
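A minimal sketch of that setup with boto3, assuming the “Test” API's methods use AWS_IAM authorization and an identity pool maps each group's IAM role to credentials; the pool ID, account ID, API ID, role ARNs, and usernames below are all placeholders.

import boto3

cognito = boto3.client("cognito-idp")
USER_POOL_ID = "us-east-1_EXAMPLE"  # hypothetical user pool

# Group whose IAM role may invoke both methods.
cognito.create_group(
    GroupName="full-access",
    UserPoolId=USER_POOL_ID,
    RoleArn="arn:aws:iam::111111111111:role/TestApiFullAccess",
)

# Group whose IAM role may invoke only the "add" method.
cognito.create_group(
    GroupName="add-only",
    UserPoolId=USER_POOL_ID,
    RoleArn="arn:aws:iam::111111111111:role/TestApiAddOnly",
)

# user1 gets both methods, user2 gets only "add".
cognito.admin_add_user_to_group(UserPoolId=USER_POOL_ID, Username="user1", GroupName="full-access")
cognito.admin_add_user_to_group(UserPoolId=USER_POOL_ID, Username="user2", GroupName="add-only")

# The "add-only" role would carry a policy roughly like this, limiting
# execute-api:Invoke to the "add" method ARN (adjust the HTTP verb and path to your setup).
ADD_ONLY_POLICY = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "execute-api:Invoke",
        "Resource": "arn:aws:execute-api:us-east-1:111111111111:abc123/*/POST/add",
    }],
}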

Related

Has anyone set up an API for a group of users in BeyondTrust,

Has anyone set up an API for a group of users in BeyondTrust, and if so, will the members of the group be able to see and access each other's passwords, or do I need to create separate API keys for each user?
I have not tried this yet; I do not fully understand the results of attempting an API for a group of users within BeyondTrust Password Management systems, and I am afraid that if I set up an API for a group of users, they will be able to view and utilize passwords that do not belong to them.
Is an API for a group even a thing, or do I have to set up API keys per user?
Password Safe users can use the API to do most of the functions available through the UI. They operate under the same security model regardless of access method, with the caveat that accounts need to be indicated as accessible via the API. This means that you might not be able to access, via the API, every account that you have access to through the UI; but it does mean you cannot access any account through the API that you don't have access to through the UI.

implementation of object level raw permissions inside microservice architecture

I have a bunch of microservices running. I have an API gateway that connects consumers to all these services. Now on some services I need to give permission to certain users (users are stored in a separate Users service). For example, if I have a blog service, I need to give view permissions on blog 1, blog 2 and blog 3 only to user 1 and not to user 2 (the scenario is ACL rather than RBAC or ABAC, I think; correct me if I am wrong). Now how should I implement the permission system?
For example, if I store the permission on each entity object inside each microservice as suggested here, then each of my services also has to know about users in order to grant them permissions. This scenario will compel me to synchronize the user data across all microservices (on user delete, update, ...).
Another solution is to create a separate, generic authorization service to manage permissions for all services, but this solution will require me to store each microservice's schema (and synchronize that schema on every change).
Or is there another solution? Please help. How do I implement ACLs (authorization)?
One solution would be to introduce userID into the downstream path from the gateway. So the path
GET /blogs/{blogId}
is exposed on the gateway, but this becomes
GET /blogs/{userId}/{blogId}
on the blogs microservice. The gateway handles the user's bearer token and injects the user's ID into the downstream call. The blogs microservice would then return the blog for a "valid" path, or a 404 if the path was not valid.
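Roughly, the gateway-side rewrite might look like this sketch, assuming PyJWT for token handling, an HS256-signed bearer token carrying the user ID in its sub claim, and a blogs service at a made-up internal URL.

import jwt        # PyJWT
import requests

BLOGS_BASE_URL = "http://blogs-service:8000"  # hypothetical internal address
JWT_SECRET = "change-me"                      # hypothetical shared signing key

def forward_get_blog(bearer_token: str, blog_id: str) -> requests.Response:
    """Resolve the caller's user ID from the bearer token and inject it into
    the downstream path: GET /blogs/{blogId} becomes GET /blogs/{userId}/{blogId}."""
    claims = jwt.decode(bearer_token, JWT_SECRET, algorithms=["HS256"])
    user_id = claims["sub"]
    return requests.get(f"{BLOGS_BASE_URL}/blogs/{user_id}/{blog_id}")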
"the backend service maintains which users have access to which blogs?"
The blog microservice stores and manages blogs. A blog can have an access list associated with it. That list may contain only the user who is the blog owner, or more than one user. The point is that the ACL is not centralised, but distributed with each blog's data containing it.
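As a rough sketch of what that looks like inside the blogs service (the record shape and in-memory store are illustrative stand-ins for real storage):

from dataclasses import dataclass, field

@dataclass
class Blog:
    blog_id: str
    owner_id: str
    body: str
    acl: set = field(default_factory=set)  # user IDs allowed to view this blog

BLOGS = {}  # blog_id -> Blog, stand-in for the service's own datastore

def get_blog(user_id: str, blog_id: str):
    """Return (200, blog) for a valid /blogs/{userId}/{blogId} path, else (404, None),
    so an unauthorized caller cannot tell a missing blog apart from a forbidden one."""
    blog = BLOGS.get(blog_id)
    if blog is None or (user_id != blog.owner_id and user_id not in blog.acl):
        return 404, None
    return 200, blog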
"if a user is deleted?"
Then you have two choices. If the user is deleted you publish a UserDeleted event, to which the blogs service is subscribed; then you can find all the blogs which have the user in their ACL and remove the user from them. Or, you can do nothing. I would personally choose the latter; one of the features of having a microservice architecture is that some data will not be consistent. If you require absolute consistency then you can have a "caretaker" process which removes deleted users from blog ACLs. Or don't use microservices.
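If you do publish the event, the subscriber inside the blogs service might look like this sketch; the event shape and the ACL store here are assumptions, not an established contract.

# blog_id -> set of user IDs allowed to view that blog (stand-in for real storage)
blog_acls = {}

def on_user_deleted(event: dict) -> None:
    """Handle a UserDeleted event published by the Users service by stripping
    the deleted user from every blog ACL this service owns."""
    deleted_id = event["user_id"]
    for acl in blog_acls.values():
        acl.discard(deleted_id)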

AWS Cognito combined with other Rest apis except api gateway

Cognito is a powerful tool, yet it's so hard to utilize.
Trying to implement my business logic, I ended up at some dead ends, and I need some help.
My client app is an admin panel where a new admin can register, providing a company.
Now there are 2 applications (Contests, Reviews) based on these companies.
So from the admin panel the admin can create events for both apps through 2 different REST APIs.
The concept is that 1 admin is related to 1 or more companies. I have to map that relationship somehow with Cognito,
because Cognito handles authentication pretty well, but not authorization.
In every request, not only do I have to validate the user by the access token, but I also have to check whether the user is authorized to perform the action based on the company.
For example, if a user wants to create a Contest event for his company,
I will make a request to the Contest API and have to verify that the admin is related to this company.
Company entities are used by both APIs, so I have to expose them in a new API called Companies. (If any API wants information about companies, it should call GET company/{id}.)
My thinking is that in order to authorize a user in my APIs I have to (see the sketch below):
1) validate the access_token,
2) communicate with Cognito to get user information,
3) call the Companies API to check if the user is authorized to execute actions for this company.
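Something like this sketch of those three steps, assuming boto3 and a hypothetical Companies API (its URL, endpoint, and response shape are made up here); get_user both validates the access token and returns the caller's details, so steps 1 and 2 collapse into one call.

import boto3
import requests

COMPANIES_API_URL = "https://api.example.com/companies"  # hypothetical

cognito = boto3.client("cognito-idp")

def can_act_on_company(access_token: str, company_id: str) -> bool:
    # Steps 1 & 2: raises NotAuthorizedException on an invalid or expired token,
    # otherwise returns the caller's username and attributes.
    user = cognito.get_user(AccessToken=access_token)
    username = user["Username"]

    # Step 3: ask the Companies API whether this admin is related to the company.
    resp = requests.get(f"{COMPANIES_API_URL}/{company_id}")
    resp.raise_for_status()
    return username in resp.json().get("admin_usernames", [])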
So I kind of feel that this becomes too complex, and I need 2 services to authenticate and authorize each request (Cognito + Companies API).
Is there any other way to implement Cognito authorization logic without having to use a second API?
P.S. I have already checked Cognito triggers, but they don't cover my needs. For example, the pre token generation trigger can add claims, but it adds them to the identity_token, not the access_token. Also, my claims have to be an array of the company_ids that an admin is related to, but claims support only strings and numbers.

The Best Solution for an AWS Mobile App, DynamoDB, & S3 Scenario

I am planning a game app for mobile devices. Users will log into the game using their existing social media account to streamline data capture. Company B would like to directly save player data and scoring information from the mobile app to a DynamoDB table named Score Data. When a user saves their game, the progress data will be stored in the Game State S3 bucket.
What is the best approach for storing data to DynamoDB and S3?
Option 1: Use an EC2 instance that is launched with an EC2 role providing access to the Score Data DynamoDB table and the Game State S3 bucket, and that communicates with the mobile app via web services.
Option 2: Use temporary security credentials that assume a role providing access to the Score Data DynamoDB table and the Game State S3 bucket using web identity federation.
Many architects I talked to say Option 1 is the right one. But according to the AWS documentation, it appears Option 2 can be valid too. Any input would be appreciated!
I would strongly consider Option #2 using Amazon Cognito to provide temporary credentials to your users that enable them to directly and specifically access DynamoDB and S3.
Generally speaking, you need to:
Create a new Cognito Identity Pool and set up 2 IAM roles -- one for authenticated users and one for unauthenticated users (optional). https://docs.aws.amazon.com/cognito/devguide/getting-started/?platform=ios
Authenticate a user via your own authentication provider or via external providers like Facebook, Twitter, etc., and then use Cognito to create temporary credentials for them. https://docs.aws.amazon.com/cognito/devguide/identity/external-providers/
Use the credentials to access DynamoDB and/or S3. Your AWS resources will be protected as long as you set up your IAM roles appropriately. For example, you can give fine grained access to your DynamoDB table so that users cannot access rows that don't belong to them. See the following link for more details: https://docs.aws.amazon.com/cognito/devguide/identity/concepts/iam-roles/
The Cognito developer guide is here: https://docs.aws.amazon.com/cognito/devguide/.
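A rough sketch of that flow with boto3 (the identity pool ID, provider token, and table name are placeholders; a mobile app would normally use the AWS mobile SDK for its platform instead):

import boto3

IDENTITY_POOL_ID = "us-east-1:00000000-0000-0000-0000-000000000000"  # placeholder
facebook_token = "..."  # token returned by the social login on the device

cognito = boto3.client("cognito-identity", region_name="us-east-1")
identity = cognito.get_id(
    IdentityPoolId=IDENTITY_POOL_ID,
    Logins={"graph.facebook.com": facebook_token},
)
creds = cognito.get_credentials_for_identity(
    IdentityId=identity["IdentityId"],
    Logins={"graph.facebook.com": facebook_token},
)["Credentials"]

# The temporary credentials assume the identity pool's authenticated role; that
# role's policy (e.g. a dynamodb:LeadingKeys condition on
# ${cognito-identity.amazonaws.com:sub}) is what limits each user to their own rows.
dynamodb = boto3.resource(
    "dynamodb",
    region_name="us-east-1",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretKey"],
    aws_session_token=creds["SessionToken"],
)
dynamodb.Table("ScoreData").put_item(Item={"UserId": identity["IdentityId"], "Score": 100})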

Service account -- limiting access to only big query

Is there a way to create a service account in the context of Google's cloud services that can only access BigQuery and not any other service (GCE, App Engine, &c)? Or is it necessary to create a new "project" and put the account in that project?
There are two ways to scope access:
ACLs and group membership allow control over what the service account has access to.
OAuth credentials can be scoped to individual services / apis.
Either option could work for you, depending on what your ultimate goal is.
How to use ACLs to limit access to only BigQuery
A service account is an identity, just like an email address is an identity.
Identity access is controlled through ACLs, either on the project or on the individual datasets you want to manage. BigQuery's access control is described here: https://cloud.google.com/bigquery/access-control. Other services and apis offer their own ACL controls. Together, these options give you fine grained control over access.
For example, if you put the service account in the project owners ACL, then that service account will have access to everything a project owner would have: BigQuery, Google Storage, etc.
Alternatively, if you put that service account only on a single BigQuery Dataset, then it would only have access to that dataset. (If you also want that service account to be able to run BigQuery jobs, then it would need to be a member of some project since jobs run in the context of a project. If you have a requirement that the project you run BigQuery jobs in cannot be the same project that you store Google Storage data in, then you will need multiple projects.)
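For example, with the google-cloud-bigquery client library you can grant a service account access to a single dataset only; a sketch, with placeholder project, dataset, and service-account names:

from google.cloud import bigquery
from google.cloud.bigquery import AccessEntry

client = bigquery.Client(project="my-project")          # placeholder project
dataset = client.get_dataset("my-project.my_dataset")   # placeholder dataset

# Grant the service account read access to just this dataset, rather than
# putting it on a project-level ACL.
entries = list(dataset.access_entries)
entries.append(AccessEntry(role="READER",
                           entity_type="userByEmail",
                           entity_id="sa-name@my-project.iam.gserviceaccount.com"))
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])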
How to use OAuth Scopes to limit access to only BigQuery
When you create the OAuth credentials for your service account, you can specify the Scopes that the credentials are valid for. Each api documents the scopes required in order to call the api. BigQuery's scopes are documented here: https://cloud.google.com/bigquery/authorization.
For example, if you only provide BigQuery scopes, then your code will only be able to make BigQuery api calls. Attempting to call a Google Storage API with credentials bound to BigQuery won't work.
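A sketch of creating BigQuery-only scoped credentials with google-auth (the key-file path and project name are placeholders):

from google.oauth2 import service_account
from google.cloud import bigquery

credentials = service_account.Credentials.from_service_account_file(
    "sa-key.json",
    scopes=["https://www.googleapis.com/auth/bigquery"],  # BigQuery-only scope
)
client = bigquery.Client(credentials=credentials, project="my-project")
# Requests made through this client can reach BigQuery; the same credentials,
# bound only to the BigQuery scope, would be rejected by the Cloud Storage API.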