Update Amplify DataStore from an AWS Lambda function

In my application, when we upload files to S3, a related AWS Lambda trigger is invoked. When the Lambda function is triggered, I want to store file-related data in the AWS Amplify DataStore. Is it possible to access the Amplify DataStore from a Lambda function? Any references?
s3 ---> Lambda function trigger ---> Amplify Datastore
Note: I found that we can add data to DynamoDB from a Lambda function, but my app is using the AWS Amplify DataStore.

Perform a GraphQL mutation using code like this: https://docs.amplify.aws/lib/datastore/how-it-works/q/platform/js/#writing-data-from-the-appsync-console
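For instance, the Lambda handler can post the mutation straight to the AppSync endpoint that backs DataStore. A minimal sketch, assuming an API-key-authorized API; the createFileRecord mutation, its input field, and the environment variable names are placeholders for your own schema and configuration:

import json
import os
import urllib.request

APPSYNC_URL = os.environ['APPSYNC_API_URL']   # your AppSync GraphQL endpoint
API_KEY = os.environ['APPSYNC_API_KEY']       # or sign the request with IAM instead

# Placeholder mutation -- use the mutation Amplify generated for your model
MUTATION = '''
mutation CreateFileRecord($input: CreateFileRecordInput!) {
  createFileRecord(input: $input) { id }
}
'''

def handler(event, context):
    # The S3 trigger delivers one record per uploaded object
    for record in event['Records']:
        key = record['s3']['object']['key']
        body = json.dumps({'query': MUTATION,
                           'variables': {'input': {'key': key}}}).encode()
        req = urllib.request.Request(
            APPSYNC_URL, data=body,
            headers={'Content-Type': 'application/json', 'x-api-key': API_KEY})
        with urllib.request.urlopen(req) as resp:
            print(resp.read().decode())

Because the write goes through AppSync, it syncs down to DataStore clients like any other mutation.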

You can use the API feature of Amplify to enable access to features like DataStore. Take a look at this reference.

There is no difference for Amplify: DataStore is backed by DynamoDB, so you would need to attach an IAM policy to your Lambda function granting it access to the table. Read this reference.
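If you go that route, the permission can be granted with a small boto3 script (or, more typically, via the Amplify CLI or your infrastructure-as-code). A sketch, where the role name, region, account id, and table name are all placeholders; note that items written straight to DynamoDB bypass AppSync and won't carry the sync metadata DataStore clients rely on, which is why the mutation approach above is usually preferred:

import json
import boto3

iam = boto3.client('iam')

# Placeholder names -- substitute your function's execution role and the
# ARN of the Amplify-generated table (visible in the DynamoDB console)
policy = {
    'Version': '2012-10-17',
    'Statement': [{
        'Effect': 'Allow',
        'Action': ['dynamodb:GetItem', 'dynamodb:PutItem', 'dynamodb:UpdateItem'],
        'Resource': 'arn:aws:dynamodb:us-east-1:123456789012:table/FileRecord-*',
    }],
}

iam.put_role_policy(
    RoleName='my-lambda-execution-role',
    PolicyName='AmplifyTableAccess',
    PolicyDocument=json.dumps(policy),
)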

Related

Amplify s3 trigger for storage

I have created an Amplify React app with a Storage category where I save my PDF files. I created a trigger from the AWS interface, but when I upload a file, my app doesn't trigger the Lambda function.
I upload every file into the public folder of my Storage bucket, and in my bucket's properties I can see the event notification configured. When I test the function manually, I see the event in CloudWatch, but when I put a document into my S3 bucket, nothing happens. Where is the problem? What am I doing wrong?
This is my trigger, and this is my Lambda function code.
Thanks for the help.
I am trying to retrieve a PDF file when it is uploaded to an S3 bucket.
You created the trigger "from the AWS interface". I don't believe "Amplify Studio" supports that yet, and you should never make changes to Amplify-generated resources via the "AWS Console" web interface.
You should probably undo whatever you set up, and then run amplify push from within your project to make sure it still deploys.
If your S3 bucket (Storage) is managed by Amplify, and your Lambda (Function) is managed by Amplify, then you can easily generate a trigger that will activate the Lambda when changes occur in the S3 bucket.
From the Amplify trigger documentation, you add the trigger from the S3 bucket:
amplify update storage
Then
? Do you want to add a Lambda Trigger for your S3 Bucket? Yes
? Select from the following options
❯ Choose an existing function from the project
Create a new function
After those steps, deploy your changes:
amplify push
Then go ahead and drop a test file into your S3 bucket.
Your lambda will receive all S3 events for the bucket. Your code probably only wants to process the s3:ObjectCreated events.
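A minimal handler that does that filtering might look like this (the actual PDF processing is left as a placeholder):

import urllib.parse

def handler(event, context):
    for record in event['Records']:
        # Skip anything that isn't an object-creation event (e.g. deletes)
        if not record['eventName'].startswith('ObjectCreated'):
            continue
        bucket = record['s3']['bucket']['name']
        # Keys arrive URL-encoded (spaces become '+')
        key = urllib.parse.unquote_plus(record['s3']['object']['key'])
        print(f'New object: s3://{bucket}/{key}')
        # ... retrieve and process the PDF here ...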

Can we use AWS Lambda to check that a mandatory policy is not missing from newly created roles

How can we make sure that a particular policy (say, an S3 bucket access restriction policy) is attached to all newly created roles?
Can we write a Lambda that gets triggered only when a new role is created, checks it, and attaches the required policies if they are missing?
The AttachRolePolicy API can be used to attach a policy to a role. Are there any examples of getting this done in AWS Lambda?
Does Terraform provide any such modules, readily available, that could be used in this context?
Yes, this is possible. You can configure a lambda function that's triggered by EventBridge via CloudTrail when a specific AWS API is called. Take a look at the doc here. Since this is pretty simple, I don't think there is a specific module created for this. You can write your own directly based on the resources in the AWS provider.
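On the Lambda side, the handler can be quite small: the CloudTrail-based EventBridge event carries the new role's name under detail.requestParameters, and the function attaches the mandatory policy if it is missing. A sketch, with the policy ARN as a placeholder (pagination of attached policies is ignored for brevity):

import boto3

iam = boto3.client('iam')
# Placeholder -- the mandatory policy you want on every new role
REQUIRED_POLICY_ARN = 'arn:aws:iam::aws:policy/ReadOnlyAccess'

def handler(event, context):
    # EventBridge delivers the CloudTrail record under 'detail'
    role_name = event['detail']['requestParameters']['roleName']
    attached = iam.list_attached_role_policies(RoleName=role_name)
    arns = {p['PolicyArn'] for p in attached['AttachedPolicies']}
    if REQUIRED_POLICY_ARN not in arns:
        iam.attach_role_policy(RoleName=role_name,
                               PolicyArn=REQUIRED_POLICY_ARN)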
Boto3, the Python SDK for AWS, has multiple APIs that can be used for things like adding a role or a policy. It is very simple and the documentation is excellent.
https://boto3.amazonaws.com/v1/documentation/api/latest/guide/quickstart.html
An example of attaching a policy to an existing role:
import boto3

# attach_role_policy is on the low-level IAM client
# (boto3.resource('iam') does not expose it)
client = boto3.client('iam')
response = client.attach_role_policy(
    PolicyArn='arn:aws:iam::aws:policy/ReadOnlyAccess',
    RoleName='ReadOnlyRole',
)
Similarly, clients can be created for other AWS services.

Allow API users to run AWS Lambda using execution role from Cognito identity pool

I'm using AWS amplify to create an app, where users can upload images using either private or public file access levels, as described in the documentation. Besides this, I've implemented a lambda function which upon request through API gateway modifies an image and returns a link to the modified image.
What I want is that a given user should be able to call the API and modify only his own images, but not that of other users; i.e. allow the AWS lambda function to use the execution role from the cognito user. If I allow the lambda function to access all data in the S3 bucket then it works fine - but I don't want users to be able to access other users images.
I've been at it for a while now, trying different things to no avail.
Now I've integrated the API with the user pool as described here:
https://docs.aws.amazon.com/apigateway/latest/developerguide/apigateway-enable-cognito-user-pool.html
And then I've tried to follow this guide:
https://aws.amazon.com/premiumsupport/knowledge-center/cognito-user-pool-group/
Which does not work, since "cognito:roles" is not present in the event variable of the lambda_handler (presumably because there are no user pool groups?).
What would the right way be to go about this in an AWS Amplify app?
Use API Gateway request mapping and check permissions in Lambda itself:
Use API Gateway request mapping to pass $context.identity.cognitoIdentityId to the Lambda. It has to be a Lambda integration with a mapping template (not a proxy integration). Another limitation is that the API request should be a POST; GET is also possible if you map cognitoIdentityId to the query string.
Lambda has access to all files in S3
Implement the access control check in Lambda itself. The Lambda can read a file's metadata in S3 and then check whether its owner is the calling Cognito user; a sketch follows.
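With Amplify's private access level, ownership is already encoded in the object key (private/{identityId}/...), so the check can be a simple prefix comparison. A minimal sketch, assuming the mapping template passes $context.identity.cognitoIdentityId as an identityId field and the target key as key (the bucket and field names are illustrative):

import boto3

s3 = boto3.client('s3')
BUCKET = 'my-amplify-bucket'  # placeholder

def handler(event, context):
    # Populated by the API Gateway mapping template from
    # $context.identity.cognitoIdentityId
    identity_id = event['identityId']
    key = event['key']  # the image the caller wants to modify

    # Amplify stores private files under private/{identityId}/...
    if not key.startswith(f'private/{identity_id}/'):
        return {'statusCode': 403, 'body': 'Not your file'}

    obj = s3.get_object(Bucket=BUCKET, Key=key)
    # ... modify the image and return a link to the result ...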

How to use S3 with aws lambda to send and retrieve text file data

So I'm completely new to Lambda and S3, and I need to be able to let a client send and retrieve image URI data from S3.
Anyone know where I should start?
If you want your clients to send and retrieve images and metadata about those images in S3, you don't even need Lambda; you can use any of the AWS SDKs, available for a variety of programming languages, to interact with S3 directly.
For example, here is a link to a photo-gallery-type application built with S3 and the AWS SDK for JavaScript. You can do the same with any programming language for which an SDK is available:
https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/s3-example-photo-album.html
Your client can directly interact with AWS S3 using the AWS SDK for JavaScript; with it, the client can upload and retrieve data straight from the browser.
If you are planning to use S3 with Lambda: attach an API Gateway endpoint to the Lambda, so your client can make REST calls to it.
Write a Lambda function to handle the GET and POST requests that retrieve and send data. One Lambda function has only one entry point, so keep that in mind while writing the code; see the sketch after this answer.
Use this Lambda function to connect to S3. As you are new, remember to give the Lambda's execution role permission to access S3.
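A minimal sketch of such a function, assuming a Lambda proxy integration and a placeholder bucket and key:

import json
import boto3

s3 = boto3.client('s3')
BUCKET = 'my-image-uri-bucket'  # placeholder
KEY = 'uris.txt'                # placeholder

def handler(event, context):
    method = event['httpMethod']  # set by the API Gateway proxy integration
    if method == 'GET':
        # Return the stored text file to the client
        obj = s3.get_object(Bucket=BUCKET, Key=KEY)
        return {'statusCode': 200, 'body': obj['Body'].read().decode()}
    if method == 'POST':
        # Store the request body as the new file contents
        s3.put_object(Bucket=BUCKET, Key=KEY, Body=event['body'].encode())
        return {'statusCode': 200, 'body': json.dumps({'ok': True})}
    return {'statusCode': 405, 'body': 'Method not allowed'}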

AWS Lambda working with S3

I want to create a Python Lambda function to take uploaded s3 images and create a thumbnail version of them.
I have permission problems: I cannot get access to my bucket. I understand that I need to create a bucket policy, but I don't understand how to make a policy that works for a Lambda request performing the thumbnail process.
It sounds like you want to do the following:
Fire the Lambda whenever something is uploaded to your bucket
Read a file from the bucket
Write a (thumbnail) file back to the bucket
You'll need 3 different permissions to do that:
The S3 service will need permission to invoke your lambda function (this is done for you when you add an S3 event source via the AWS Lambda console).
The Lambda execution role (the one selected on the Configuration tab of the Lambda console) will need read/write access to S3. You can generate a policy for this with the policy generator by selecting IAM Policy from the drop-down and then selecting the S3 permissions you need.
For added security, you can set a bucket policy on S3 that only allows the Lambda function to access it. You can generate this from the policy generator as well by selecting S3 Bucket Policy. You would then enter the Lambda function's execution role ARN as the Principal.
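To make the read and write steps concrete, here is a sketch of the thumbnail function itself. It assumes the Pillow library is bundled in the deployment package or a layer, and it writes thumbnails under a thumbnails/ prefix so the function does not re-trigger itself on its own output:

import io
import urllib.parse
import boto3
from PIL import Image  # Pillow must be bundled with the function

s3 = boto3.client('s3')

def handler(event, context):
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = urllib.parse.unquote_plus(record['s3']['object']['key'])
        if key.startswith('thumbnails/'):
            continue  # don't thumbnail our own output

        # Read the uploaded image into memory
        body = s3.get_object(Bucket=bucket, Key=key)['Body'].read()
        image = Image.open(io.BytesIO(body))
        fmt = image.format or 'JPEG'

        # Resize in place, preserving aspect ratio
        image.thumbnail((128, 128))
        buf = io.BytesIO()
        image.save(buf, format=fmt)

        # Write the thumbnail back to the same bucket under a separate prefix
        s3.put_object(Bucket=bucket, Key=f'thumbnails/{key}',
                      Body=buf.getvalue())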