How to make an upload method for a large file to Yandex Cloud so a Serverless Function can be called on it?

So I want to have no personal server infrastructure. I want an HTTP API route that a user can upload a file into (2 GB+) so that:
The file would be stored in Object Storage for 3 days
A serverless function would be called on it
So how do I make an upload method for a large file to Yandex Cloud, with a Serverless Function called on it?
So I need something similar to this AWS sample, but for Yandex Cloud.

There is a limit on the request size that a Yandex Cloud Serverless Function can handle: 3.5 MB. So you won't be able to upload 2 GB through the function itself.
There is a workaround: upload the file directly to Object Storage using a pre-signed link. To generate the link, you'll need AWS-style credentials from Yandex Cloud (a static access key).
Passing them to the client side is not safe, so it is better to generate the link on the server (or in a Serverless Function) and return it to the client.
Here is the tutorial covering the topic.
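A minimal sketch of the link-generating function, assuming Yandex Object Storage's S3-compatible endpoint and a static access key kept in the function's environment; the bucket name and object key are placeholders:

import os

import boto3

def handler(event, context):
    # Yandex Object Storage speaks the S3 protocol, so boto3 works
    # against its endpoint with a static access key.
    s3 = boto3.client(
        "s3",
        endpoint_url="https://storage.yandexcloud.net",
        aws_access_key_id=os.environ["ACCESS_KEY_ID"],
        aws_secret_access_key=os.environ["SECRET_ACCESS_KEY"],
    )
    # The client PUTs the file body directly to this URL within an hour;
    # the request never passes through the function, so the 3.5 MB limit
    # does not apply.
    url = s3.generate_presigned_url(
        "put_object",
        Params={"Bucket": "uploads-bucket", "Key": "incoming/user-file.bin"},
        ExpiresIn=3600,
    )
    return {"statusCode": 200, "body": url}

The three-day retention from the question maps to a bucket lifecycle rule, and Yandex Cloud also supports Object Storage triggers that can invoke the processing function once the object lands.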

Related

Is it possible to use Azure Blob Storage on a website that has no authentication?

I need to create a way for anyone who visits my website to upload an image to an Azure Blob Container. The website will have input validations on the file.
I've considered using an Azure Function to write the validated file to the Blob Container, but I can't seem to find a way to do this without exposing the Function URL to the world (similar to this question).
I would use a System-Assigned Managed Identity (SAMI) to authenticate the Function to the Storage account, but because of this, anyone could take the Function URL and bypass the validations and upload.
How is this done in the real world?
If I understand correctly, the user uploads a file via an HTTP POST call to your server, which validates it. You would like to use an Azure Function to then upload the validated file to the Blob Storage.
In this case, you can restrict access to the Azure Function so that it can only be called from your server's IP. This way the users cannot reach that Function. This can be done via the networking settings, and is available on all Azure Function plans.
You could also consider implementing the validation logic within the Azure Function.
Finally (perhaps I should have started with this), if you are only considering writing an Azure Function to upload data to a Storage Account, you should perhaps first consider using the Blob Service REST API, specifically the PUT Blob endpoint. There are also official Storage Account SDKs for different languages/ecosystems that you could use to do this.
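As a sketch of that last option, here is a server-side upload using the official Python SDK (azure-storage-blob); the account URL and container name are placeholders, and the credential is assumed to be a managed identity available to the server:

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    account_url="https://<account>.blob.core.windows.net",  # placeholder
    credential=DefaultAzureCredential(),  # e.g. the server's managed identity
)
container = service.get_container_client("user-images")  # placeholder

def upload_validated_image(filename: str, data: bytes) -> str:
    # Assumes the file has already passed the website's input validations.
    blob = container.upload_blob(name=filename, data=data, overwrite=False)
    return blob.url

Because this runs on your server (or on a locked-down Function), the storage credentials never reach the browser.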
• Since you are using the default generic Azure Function URL on your website for uploading blobs with no authentication, I would suggest creating an 'A' host record for your function app. Since you have a website, you presumably have a custom domain, and that domain's DNS records must be hosted on a public DNS server. On the same public DNS server, create an 'A' record for the function app and assign it the public IP address shown for it in Azure. This gives your public DNS server an active resolver for the function app globally. Then create a 'CNAME' record with the default generic Azure Function URL as the alias and the 'A' record as its value.
This way, whenever an anonymous visitor to your website tries to upload an image, they will see the function app URL as 'abc.xyz.com' rather than the generic Azure Function URL, achieving your objective.
• Once the above is done, publish the new 'CNAME' record created in the public DNS server as your function app URL. This masks the generic Azure Function URL and keeps the connection secure, since you will upload an SSL/TLS certificate for HTTPS in the function app's custom domains settings.
For more information, refer to the documentation:
https://learn.microsoft.com/en-us/azure/dns/dns-custom-domain

Allow API users to run AWS Lambda using execution role from Cognito identity pool

I'm using AWS Amplify to create an app where users can upload images using either private or public file access levels, as described in the documentation. Besides this, I've implemented a Lambda function which, upon request through API Gateway, modifies an image and returns a link to the modified image.
What I want is that a given user should be able to call the API and modify only his own images, but not those of other users; i.e. the Lambda function should use the execution role from the Cognito user. If I allow the Lambda function to access all data in the S3 bucket then it works fine, but I don't want users to be able to access other users' images.
I've been at it for a while now, trying different things to no avail.
Now I've integrated the API with the user pool as described here:
https://docs.aws.amazon.com/apigateway/latest/developerguide/apigateway-enable-cognito-user-pool.html
And then I've tried to follow this guide:
https://aws.amazon.com/premiumsupport/knowledge-center/cognito-user-pool-group/
This does not work, since "cognito:roles" is not present in the event variable of the lambda_handler (presumably because there are no user pool groups?).
What would the right way be to go about this in an AWS Amplify app?
Use API Gateway request mapping and check permissions in Lambda itself:
Use API Gateway request mapping to pass context.identity.cognitoIdentityId to Lambda. It has to be a Lambda integration with a mapping template (not a proxy integration). Another limitation is that the API request should be POST; GET is also possible if you map cognitoIdentityId to the query string.
Lambda has access to all files in S3.
Implement the access-control check in Lambda itself: Lambda can determine who owns a file in S3 (with Amplify, private objects live under the caller's identity-ID prefix) and reject requests for other users' objects.
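A minimal sketch of that check, assuming the mapping template passes context.identity.cognitoIdentityId as "identityId" and that files follow Amplify's private/{identityId}/... key layout; the bucket name is a placeholder:

import boto3

s3 = boto3.client("s3")
BUCKET = "my-amplify-bucket"  # placeholder

def lambda_handler(event, context):
    identity_id = event["identityId"]   # injected by the mapping template
    requested_key = event["key"]        # e.g. "private/<identityId>/photo.jpg"

    # Amplify stores private files under private/{identityId}/..., so a
    # simple prefix check establishes ownership.
    if not requested_key.startswith(f"private/{identity_id}/"):
        return {"statusCode": 403, "body": "Forbidden"}

    # Hand back a short-lived pre-signed URL for the caller's own object.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": BUCKET, "Key": requested_key},
        ExpiresIn=300,
    )
    return {"statusCode": 200, "body": url}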

How to use S3 with aws lambda to send and retrieve text file data

So I'm completely new to Lambda and S3, and I need to be able to let a client send and retrieve image URI data from S3.
Anyone know where I should start?
If you want your clients to send and retrieve images and metadata about the images to and from S3, you don't even need Lambda to do so; you can use any of the AWS SDKs, available for a variety of programming languages, to interact with S3 directly.
For example, here is a link to an example that builds a photo-gallery application using S3 and the JavaScript SDK for AWS. You can do the same with any programming language for which an SDK is available:
https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/s3-example-photo-album.html
Your client can interact with AWS S3 directly using the AWS SDK for JavaScript; with it, the client can upload and retrieve data straight from the browser.
If you are planning to use S3 with Lambda, attach an API Gateway endpoint to the Lambda; through this endpoint your client can make REST calls to it.
Write a Lambda function to handle GET and POST requests to retrieve and send data. One Lambda function can have only one entry point; keep this in mind while writing the code (see the sketch below).
Use this Lambda function to connect to S3. As you are new, keep in mind to give the Lambda's role permission to access S3.
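A minimal sketch of such a single-entry-point handler behind an API Gateway proxy integration; the bucket name and the "key" query parameter are placeholders, and the execution role is assumed to have s3:GetObject and s3:PutObject on the bucket:

import json

import boto3

s3 = boto3.client("s3")
BUCKET = "image-uri-store"  # placeholder

def lambda_handler(event, context):
    # One entry point: branch on the HTTP method supplied by API Gateway.
    method = event["httpMethod"]
    key = event["queryStringParameters"]["key"]

    if method == "POST":
        # Store the request body as a small text object.
        s3.put_object(Bucket=BUCKET, Key=key, Body=event["body"].encode())
        return {"statusCode": 200, "body": json.dumps({"stored": key})}

    if method == "GET":
        obj = s3.get_object(Bucket=BUCKET, Key=key)
        return {"statusCode": 200, "body": obj["Body"].read().decode()}

    return {"statusCode": 405, "body": "Method not allowed"}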

Drone Deploy Export to Amazon S3

Good afternoon, my apologies if this simple question/answer is in the DroneDeploy GitBooks but I am in the initial stages of developing a geoprocessing app for your platform and was wondering, what is the recommended way to auto-export either a processed orthomosaic or DSM from DroneDeploy to an Amazon S3 bucket? In the Exporter examples it seems like the default behavior is to generate a link then send that link via a defined email address - however is there a method to directly send it to Amazon? Is this the "webhook" function as outlined in the documentation?
webhook: {
  url: 'http://www.url-to-ping-on-complete.com/any-params-here' // receives the export document when it's complete
}
Thank you,
Matt
At a high level, here is how you can accomplish this:
When you use the DroneDeploy embedded API to start your export, you pass the URL of your server in the webhook field.
When the export finishes, DroneDeploy will POST the export document to the webhook you specified.
On your server, you read exportDocument.download_path and download the file.
Once the file is downloaded, you can then upload it to AWS.
Please note some of the exports can be rather large, and it will take considerable time to download and upload them. If another export completes before you've finished handling the last one, your server might be blocked from accepting the next request. It might be a good idea to use a serverless service such as Firebase Cloud Functions or AWS Lambda (see the sketch below).
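A minimal sketch of such a receiver as an AWS Lambda behind API Gateway, assuming the export document arrives as the POST body with a download_path field; the bucket and destination key are placeholders:

import json
import urllib.request

import boto3

s3 = boto3.client("s3")
BUCKET = "drone-exports"  # placeholder

def lambda_handler(event, context):
    export_doc = json.loads(event["body"])
    download_url = export_doc["download_path"]

    # Stream the (potentially large) export straight into S3 instead of
    # buffering the whole file in memory.
    with urllib.request.urlopen(download_url) as resp:
        s3.upload_fileobj(resp, BUCKET, "exports/orthomosaic.tif")  # placeholder key

    return {"statusCode": 200, "body": "ok"}

Keep Lambda's execution-time limit in mind for very large exports; a long-running worker may be a better fit if transfers exceed it.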

Uploading an image through Amazon API gateway and lambda

I have a REST API with API Gateway and Lambda.
I want to create an endpoint for uploading a profile picture that passes the file to a Lambda function, where it is resized, registered in the database, and the URL path of the new image is returned.
Is there any way to do this with those services?
Couldn't find anything online (the only suggestion I found is uploading directly to S3, which requires IAM permissions, and having an event trigger a Lambda function that resizes the picture).
Thanks
UPDATE
AWS updated API Gateway and now you can send binaries through an endpoint.
Thanks to #blue and #Manzo for pointing it out.
Uploading a file directly to S3 doesn't necessarily require IAM permissions. You would create an API endpoint that returns a pre-signed S3 URL, which could then be used to upload the file directly to S3. The Lambda function behind the API endpoint would be the only thing that needed the correct IAM permissions for the S3 bucket.
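As a sketch of the client side of that flow: fetch the pre-signed URL from your API endpoint (the URL below is a placeholder), then PUT the file straight to S3 with no IAM credentials on the client:

import requests

# Hypothetical endpoint that returns a pre-signed S3 PUT URL.
resp = requests.get("https://api.example.com/upload-url")
resp.raise_for_status()
presigned_url = resp.text

# Upload the file body directly to S3 via the pre-signed URL.
with open("profile.jpg", "rb") as f:
    upload = requests.put(presigned_url, data=f)
upload.raise_for_status()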
Since API Gateway and Lambda didn't support binary payloads natively at the time, you can base64-encode the picture and pass it to API Gateway, which then passes it to the Lambda function. Your Lambda function can base64-decode it, resize it, register it in the database, and return the URL path of the new image.
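A minimal sketch of that base64 flow, assuming Pillow is available to the function (e.g. via a layer); the bucket name and key scheme are placeholders, and the database step is left as a comment:

import base64
import io
import json
import uuid

import boto3
from PIL import Image

s3 = boto3.client("s3")
BUCKET = "profile-pictures"  # placeholder

def lambda_handler(event, context):
    # The client sends the image base64-encoded in the request body.
    raw = base64.b64decode(event["body"])

    # Resize to a 256x256 thumbnail; convert to RGB so JPEG saving works
    # even for PNG sources with an alpha channel.
    img = Image.open(io.BytesIO(raw)).convert("RGB")
    img.thumbnail((256, 256))
    buf = io.BytesIO()
    img.save(buf, format="JPEG")
    buf.seek(0)

    key = f"profiles/{uuid.uuid4()}.jpg"
    s3.upload_fileobj(buf, BUCKET, key, ExtraArgs={"ContentType": "image/jpeg"})

    # A real implementation would also register the key in the database here.
    url = f"https://{BUCKET}.s3.amazonaws.com/{key}"
    return {"statusCode": 200, "body": json.dumps({"url": url})}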