Amplify s3 trigger for storage - amazon-s3

I have created an Amplify React app with Storage where I save my PDF files. I created a trigger from the AWS interface, but when I upload a file my app doesn't trigger the Lambda function.
I upload every file into the public folder of my storage, and if I go into my storage properties I can see the event configured.
When I test the function manually I see the event in CloudWatch, but when I insert a document into my S3 bucket I don't. Where is the problem? What am I doing wrong?
This is my trigger and this is my Lambda function code.
Thanks for the help.
I am trying to retrieve the PDF file when it is uploaded to the S3 bucket.

You created the trigger "from the AWS interface". I don't believe Amplify Studio supports that yet, and you should never make changes to Amplify-generated resources via the AWS Console web interface.
You should probably undo whatever you set up, and then run an amplify push from within your project to make sure it still deploys.
If your S3 bucket (Storage) is managed by Amplify, and your Lambda (Function) is managed by Amplify, then you can easily generate a trigger that will activate the Lambda when changes occur in the S3 bucket.
Per the Amplify trigger documentation, you add the trigger from the S3 bucket side:
amplify update storage
Then
? Do you want to add a Lambda Trigger for your S3 Bucket? Yes
? Select from the following options
❯ Choose an existing function from the project
Create a new function
After those steps, deploy your changes:
amplify push
Then go ahead and drop a test file into your S3 bucket.
Your lambda will receive all S3 events for the bucket. Your code probably only wants to process the s3:ObjectCreated events.
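A minimal handler sketch of that filtering (Python; the event shape is the standard S3 notification payload, and the bucket/key values below are illustrative):

```python
import urllib.parse

def handler(event, context=None):
    """Collect (bucket, key) pairs for ObjectCreated events only."""
    processed = []
    for record in event.get("Records", []):
        # eventName looks like "ObjectCreated:Put", "ObjectRemoved:Delete", ...
        if not record.get("eventName", "").startswith("ObjectCreated"):
            continue
        bucket = record["s3"]["bucket"]["name"]
        # S3 URL-encodes object keys in the event payload (spaces arrive as '+')
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        processed.append((bucket, key))
    return processed
```

Note the unquoting step: keys with spaces or special characters arrive URL-encoded, which is a common source of "NoSuchKey" errors in triggers.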

Related

Setting up an s3 event notification for an existing bucket to SQS using cdk is trying to create an unknown lambda function

I am trying to set up an S3 event notification for an existing S3 bucket using the AWS CDK.
Below is the code.
bucket = s3.Bucket.from_bucket_name(
    self, "S3Bucket", f"some-{stack_settings.aws_account_id}"
)
bucket.add_event_notification(
    s3.EventType.OBJECT_CREATED,
    s3n.SqsDestination(queue),
    s3.NotificationKeyFilter(prefix="uploads/"),
)
The stack creation fails and I see the error below in the CloudFormation console.
User: arn:aws:sts::<account>:assumed-role/some-cicd/i-8989898989xyz
is not authorized to perform: lambda:InvokeFunction on resource:
arn:aws:lambda:us-east-1:<account_number>:function:<some name>-a-BucketNotificationsHandl-b2kDmawsGjpL
because no identity-based policy allows the lambda:InvokeFunction action (Service: AWSLambda;
Status Code: 403; Error Code: AccessDeniedException; Request ID: c2d91744-416c-454d-a510-ff4cce061b80;
Proxy: null)
I am not sure what this Lambda is. I am not trying to create any such Lambda in my CDK app.
Does anyone know what is going on here and whether there is anything wrong with my code?
The ability to add notifications to an existing bucket is implemented with a custom resource - that is, a lambda that uses the AWS SDK to modify the bucket's settings.
CloudFormation invokes this lambda when creating this custom resource (also on update/delete).
If you would like details, here's the relevant GitHub issue; you can see the commit that added the feature.
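Based on the error message above, the CI/CD deploy role lacks permission to invoke that CloudFormation-managed handler. One way to resolve it (a sketch; the resource ARN pattern is an assumption and must match the name of the handler generated in your account) is to add a statement like this to the deploy role's identity policy:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:us-east-1:<account>:function:*BucketNotificationsHandler*"
    }
  ]
}
```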

Update Amplify datastore from AWS lambda function

In my application, when we upload files to S3, the related AWS Lambda trigger is invoked. When the Lambda function is triggered, I want to store file-related data in the AWS Amplify DataStore. Is it possible to access the Amplify DataStore from a Lambda function? Any references?
s3 ---> Lambda function trigger ---> Amplify Datastore
Note: I found that we can add data from a Lambda function to DynamoDB, but my app is using the AWS Amplify DataStore.
Perform a GraphQL mutation using code like this: https://docs.amplify.aws/lib/datastore/how-it-works/q/platform/js/#writing-data-from-the-appsync-console
You can use the API feature of Amplify to enable access to features like DataStore. Take a look at this reference.
There is no difference for Amplify: Amplify DataStore uses DynamoDB under the hood. So you would need to grant an IAM policy to your Lambda function to have access to the table. Read this reference.
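A sketch of the GraphQL-mutation route from a Python Lambda, using only the standard library. The API_ENDPOINT/API_KEY environment variables and the FileRecord model are hypothetical; substitute your Amplify-generated AppSync API, auth mode, and types:

```python
import json
import os
import urllib.request

# Hypothetical mutation for an Amplify-generated @model named FileRecord.
CREATE_FILE = """
mutation CreateFileRecord($input: CreateFileRecordInput!) {
  createFileRecord(input: $input) { id name }
}
"""

def build_request(endpoint, api_key, query, variables):
    """Build an API-key-authenticated POST request for an AppSync endpoint."""
    body = json.dumps({"query": query, "variables": variables}).encode("utf-8")
    headers = {"Content-Type": "application/json", "x-api-key": api_key}
    return urllib.request.Request(endpoint, data=body, headers=headers, method="POST")

def handler(event, context=None):
    req = build_request(
        os.environ["API_ENDPOINT"], os.environ["API_KEY"],
        CREATE_FILE, {"input": {"name": event["key"]}},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Writing through the AppSync API (rather than straight to the DynamoDB table) keeps DataStore's sync metadata consistent, which is why the linked documentation recommends it.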

How to use S3 with aws lambda to send and retrieve text file data

So I'm completely new to lambda and S3 and I need to be able to let a client send and retrieve image uri data from S3
Anyone know where I should start?
If you want your clients to send and retrieve images and metadata about the images to and from S3, you don't even need Lambda to do so; you can use any of the AWS SDKs, available for a variety of programming languages, to interact with S3 directly.
For example, here is a link to an example that builds a photo-gallery application using S3 and the AWS SDK for JavaScript. You can do the same with any programming language for which an SDK is available:
https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/s3-example-photo-album.html
Your client can interact directly with AWS S3 using the AWS SDK for JavaScript; with it, the client can upload and retrieve data straight from the browser.
If you are planning to use S3 with Lambda, attach an API Gateway endpoint to your Lambda function; using this endpoint, your client can make REST calls to your Lambda.
Write a Lambda function to handle GET and POST requests to retrieve and send data. One Lambda function can have only one entry point; keep this in mind while writing the code.
Use this Lambda function to connect to S3. As you are new, keep in mind to give the Lambda role permission to access S3.
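A sketch of such a single-entry-point handler (Python). The bucket name is hypothetical, and the S3 client is injectable so the routing can be exercised without AWS; a real deployment would let it default to boto3:

```python
import base64
import json

BUCKET = "my-app-uploads"  # hypothetical bucket name

def handler(event, context=None, s3=None):
    """Single entry point: route API Gateway GET/POST requests to S3."""
    if s3 is None:
        import boto3  # assumed available in the Lambda runtime
        s3 = boto3.client("s3")
    method = event.get("httpMethod")
    key = (event.get("queryStringParameters") or {}).get("key")
    if method == "POST":
        # API Gateway delivers binary bodies base64-encoded
        s3.put_object(Bucket=BUCKET, Key=key, Body=base64.b64decode(event["body"]))
        return {"statusCode": 201, "body": json.dumps({"key": key})}
    if method == "GET":
        data = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
        return {"statusCode": 200, "body": base64.b64encode(data).decode(),
                "isBase64Encoded": True}
    return {"statusCode": 405, "body": "method not allowed"}
```

The one-entry-point constraint mentioned above is handled by branching on httpMethod inside the single handler.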

Uploading an image through Amazon API gateway and lambda

I have a REST API with API gateway and Lambda.
I want to create an endpoint for uploading a profile picture that passes the file to a Lambda function, where it is resized and registered in the database, and the URL path of the new image is returned.
Is there any way to do so with those services?
Couldn't find anything online (the only suggestion I found is uploading directly to S3, which requires IAM permissions, and having an event trigger a Lambda function that resizes the picture).
Thanks
UPDATE
AWS updated API Gateway and now you can send binaries through an endpoint.
Thanks to #blue and #Manzo for pointing it out.
Uploading a file directly to S3 doesn't necessarily require IAM permissions. You would create an API endpoint that returns a pre-signed S3 URL, which could then be used to upload the file directly to S3. The Lambda function behind the API endpoint would be the only thing that needed the correct IAM permissions for the S3 bucket.
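A sketch of that endpoint's core (Python; the bucket name and function names are hypothetical, and the client is passed in so the logic is testable — in the real Lambda it would be boto3.client("s3")):

```python
def make_upload_url(s3, bucket, key, expires=300):
    """Return a time-limited URL the browser can PUT the file to directly."""
    return s3.generate_presigned_url(
        "put_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires,
    )

def handler(event, context=None, s3=None):
    if s3 is None:
        import boto3  # assumed available in the Lambda runtime
        s3 = boto3.client("s3")
    key = event["queryStringParameters"]["key"]
    return {"statusCode": 200,
            "body": make_upload_url(s3, "my-profile-pics", key)}
```

The client never receives IAM credentials, only a short-lived signed URL; the S3 permissions live entirely in this function's execution role.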
Since API Gateway and Lambda currently don't support binary payloads natively, you can base64-encode the picture and pass it to API Gateway, then on to the Lambda function. Your Lambda function can base64-decode it, then resize it, register it in the database, and return the URL path of the new image.

AWS Lambda working with S3

I want to create a Python Lambda function to take uploaded s3 images and create a thumbnail version of them.
I have permission problems: I cannot get access to my bucket. I understand that I need to create a bucket policy, but I don't understand how to make a policy that works for a Lambda request performing the thumbnail process.
It sounds like you want to do the following:
Fire the lambda whenever something is uploaded to your bucket
Read a file from the bucket
Write a (thumbnail) file back to the bucket
You'll need 3 different permissions to do that:
The S3 service will need permission to invoke your lambda function (this is done for you when you add an S3 event source via the AWS Lambda console).
The lambda execution role (the one selected on the Configuration tab of the Lambda Console) will need read/write access to call S3. You can generate a policy for this on the policy generator by selecting IAM Policy from the drop down and then selecting the S3 permissions you need.
For added security, you can set a bucket policy on S3 to only allow the lambda function to access it. You can generate this from the policy generator as well by selecting S3 policy. You would then enter lambda.amazonaws.com as the Principal.
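For step 2, the execution-role policy might look like this (a sketch; the bucket name is hypothetical, and the actions are trimmed to what a read-source/write-thumbnail function needs):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::my-photo-bucket/*"
    }
  ]
}
```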