How to convert the AWS S3 principalId to a user name

"userIdentity":{
"principalId":"Amazon-customer-ID-of-the-user-who-caused-the-event"
},
How can I convert the principal ID, which is available in the S3 event notification, into the name of the user responsible for the action?

I couldn't find a way to do this either. I ended up sending the sub from the user attributes as part of the object key. Then you have access to the sub in the event notification.

It's mind-boggling that the IAM console UI neither shows this nor allows searching for it.
You have to use the AWS CLI to pull this information out:
aws iam list-users
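If the caller is an IAM user, the principalId in the event matches the UserId field returned by list-users, so you can join on that. A minimal sketch (function name and sample values are mine; in practice you would fetch the response with boto3's iam list_users paginator rather than hard-code it):

```python
# Match an S3 event's principalId against IAM users, assuming the
# principal is an IAM user (their principalId equals the IAM UserId).
# The response shape mirrors `aws iam list-users` output.
def user_name_for_principal(principal_id, list_users_response):
    for user in list_users_response["Users"]:
        if user["UserId"] == principal_id:
            return user["UserName"]
    return None  # not an IAM user (e.g. a role session or the root account)

# Hypothetical list-users response fragment:
response = {"Users": [{"UserName": "alice", "UserId": "AIDAEXAMPLEUSERID"}]}
print(user_name_for_principal("AIDAEXAMPLEUSERID", response))  # alice
```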

Related

I need to create a new IAM user programmatically which limits access to an S3 object to only them

Using Node.js, I need a function which, when called, creates a new IAM user and gives them access to the specific S3 object that has been created for them; no one other than them should have access to the object. I have Cognito users and need to give each of them their own IAM user, which can then access only their S3 object.
At the moment I keep receiving an InvalidClientTokenId error, but I'm unsure what this token is and how I can set it programmatically. Is this the Cognito token ID returned from logging in? If so, how does that relate to trying to create a new S3 object and IAM user? Any help at all would be greatly appreciated!
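On the object-scoping part of the question, a minimal sketch of the per-user policy document (bucket and key names are hypothetical; in a real flow you would pass this to IAM's put_user_policy after create_user). As for InvalidClientTokenId: that error generally means the SDK's AWS credentials are missing or invalid, not anything to do with the Cognito token.

```python
# Sketch: an inline IAM policy that allows reading exactly one S3 object.
# Bucket name and key below are placeholders, not real resources.
import json

def object_scoped_policy(bucket, key):
    # Allows s3:GetObject on one object and nothing else.
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket}/{key}",
        }],
    }

policy = json.dumps(object_scoped_policy("my-bucket", "users/alice/data.json"))
print(policy)
```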

Create an AWS Lambda function for buckets added to S3

We have a requirement to create an item in a DynamoDB table when a bucket is created, using AWS Lambda. Any help would be appreciated.
If you are creating the S3 bucket from a Lambda function, then that same Lambda can add an item to DynamoDB. Your Lambda role should have permission to add items to DynamoDB.
To achieve this you can use Cloud Custodian with Lambda.
Cloud Custodian has a way to automatically add a username tag, enable encryption, etc. while creating an S3 bucket. With its help you can add your own custom logic: instead of adding a username tag, use boto3 to write the bucket name to DynamoDB.
As per amazon: "Enabling notifications is a bucket-level operation; that is, you store notification configuration information in the notification subresource associated with a bucket"
The previous statement quotes the AWS S3 documentation, Configuring Amazon S3 Event Notifications. It implies that the bucket has to exist, and that it is on a specific bucket that you create and configure the notification event. Basically: you can't produce a notification event on bucket creation.
I would suggest an alternative solution: a scheduled process that monitors the existing list of S3 buckets and writes a record to DynamoDB when a new bucket is created (a new bucket shows up in the list). Creating, Listing, and Deleting Amazon S3 Buckets illustrates some examples of what can be done using the Java SDK, but the same can be found for other languages as well (see: List Buckets).
Note: the tutorial Schedule AWS Lambda Functions Using CloudWatch Events shows one possible way to run a Lambda on a schedule/interval.
Accessing the Bucket object's properties can give you extra information such as the bucket creation date, the bucket owner, and more (details of the Bucket class can be found here).
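The core of that scheduled check can be sketched as a pure diff (function name and sample data are mine; the listing shape mirrors what an S3 ListBuckets call returns, and each new name would then be written to DynamoDB):

```python
# Sketch: given the bucket names seen on the previous run and the
# current listing (shaped like an S3 ListBuckets response), return
# the newly created buckets that should be recorded in DynamoDB.
def new_buckets(previously_seen, current_listing):
    current = {b["Name"] for b in current_listing["Buckets"]}
    return sorted(current - set(previously_seen))

listing = {"Buckets": [{"Name": "logs"}, {"Name": "reports"}]}
print(new_buckets(["logs"], listing))  # ['reports']
```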

AWS S3 event notification on object permission change

Can someone guide me on how to set up an event notification for an object-level permission change? Currently notifications are available for read, write, delete, etc.
But I am looking to set up an email trigger if someone changes the access permissions on an S3 object inside a bucket.
There are two ways to deal with this kind of concern:
Proactive: write IAM policies that prevent users from putting objects with public access.
Reactive: use CloudWatch Events to detect issues and respond to them (see blog post).
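As a sketch of the reactive approach, a CloudWatch Events (EventBridge) event pattern that matches ACL-change API calls recorded by CloudTrail. This assumes object-level (data event) logging is enabled in CloudTrail for the bucket; the rule's target could then be an SNS topic that emails you.

```json
{
  "source": ["aws.s3"],
  "detail-type": ["AWS API Call via CloudTrail"],
  "detail": {
    "eventSource": ["s3.amazonaws.com"],
    "eventName": ["PutObjectAcl", "PutBucketAcl"]
  }
}
```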

Amazon S3 - does root user have access to buckets?

I am testing S3 calls using DHC REST client in Chrome. In these tests, the Authorization is all based on my root user credentials.
I can do a GET with //mybucket.s3.amazonaws.com, and a list of the items in mybucket is returned.
If I add an item to retrieve (//mybucket.s3.amazonaws.com/myitem), I always get 403 Forbidden.
I thought that the root user had automatic access to the objects, but am I wrong about that?
I took screen prints of both tests, which I'll supply if needed.
After some further monkeying around, I found my answer. Yes, the AWS root user can access individual items, but the Authorization header string changes. When you retrieve an object, that object's key participates in the calculation of the auth string. Thus, the string used to retrieve the bucket list does not work when retrieving an object.
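This behaviour can be illustrated with the legacy Signature Version 2 scheme: the canonicalized resource, including the object key, is part of the string that gets signed, so a signature computed for the bucket listing is invalid for an object GET. A rough sketch with made-up credentials:

```python
# Sketch: legacy AWS Signature Version 2 signing, to show why the same
# Authorization header cannot be reused for different requests. The
# canonicalized resource (bucket vs. bucket + object key) is part of
# the string that gets signed.
import base64, hashlib, hmac

def sigv2_authorization(access_key, secret_key, method, date, resource):
    # StringToSign = Method \n Content-MD5 \n Content-Type \n Date
    #              \n CanonicalizedResource  (MD5 and type empty for GET)
    string_to_sign = f"{method}\n\n\n{date}\n{resource}"
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    return f"AWS {access_key}:{base64.b64encode(digest).decode()}"

date = "Tue, 27 Mar 2007 19:36:42 +0000"
list_auth = sigv2_authorization("AKID", "SECRET", "GET", date, "/mybucket")
get_auth = sigv2_authorization("AKID", "SECRET", "GET", date, "/mybucket/myitem")
print(list_auth != get_auth)  # True: the two signatures differ
```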

How to restrict Amazon S3 API access?

Is there a way to create a different identity (access key / secret key) to access Amazon S3 buckets via the REST API, where I can restrict access (read-only, for example)?
The recommended way is to use IAM to create a new user, then apply a policy to that user.
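For example, a policy document granting read-only access to a single bucket (the bucket name is a placeholder), which you could attach to the new user; AWS also ships a managed AmazonS3ReadOnlyAccess policy that covers all buckets:

```json
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": ["s3:ListBucket", "s3:GetObject"],
    "Resource": ["arn:aws:s3:::mybucket", "arn:aws:s3:::mybucket/*"]
  }]
}
```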
Yes, you can. The S3 API documentation describes the Authentication and Access Control services available to you. You can set up a bucket so that another Amazon S3 account can read but not modify items in the bucket.
Check out the details at http://docs.amazonwebservices.com/AmazonS3/2006-03-01/dev/index.html?UsingAuthAccess.html (follow the link to "Using Query String Authentication"); this is a sub-document of the one Greg posted, and it describes how to generate access URLs on the fly.
This uses a hashed form of the private key and allows expiration, so you can give brief access to files in a bucket without allowing unfettered access to the rest of the S3 store.
Constructing the REST URL is quite difficult (it took me about 3 hours of coding to get it right), but this is a very powerful access technique.
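The query-string scheme that document describes can be sketched in a few lines (legacy SigV2, with made-up credentials and a hypothetical object). Modern SDKs construct such URLs for you via presigned-URL helpers, so hand-rolling this is rarely needed today:

```python
# Sketch: legacy (SigV2) query-string authentication. The signature
# covers the Expires timestamp, so the URL stops working after that
# moment. Credentials and object below are placeholders.
import base64, hashlib, hmac, urllib.parse

def presigned_url(access_key, secret_key, bucket, key, expires):
    string_to_sign = f"GET\n\n\n{expires}\n/{bucket}/{key}"
    sig = base64.b64encode(
        hmac.new(secret_key.encode(), string_to_sign.encode(),
                 hashlib.sha1).digest()
    ).decode()
    query = urllib.parse.urlencode(
        {"AWSAccessKeyId": access_key, "Expires": expires, "Signature": sig}
    )
    return f"https://{bucket}.s3.amazonaws.com/{key}?{query}"

url = presigned_url("AKID", "SECRET", "mybucket", "report.pdf", 1735689600)
print(url)
```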