Upload file to Amazon S3 and assign callback for percentage uploaded - amazon-s3

Is there an easy way to assign a callback function for the percentage uploaded to Amazon S3 in PHP?
Something similar to this
File Download
question, but for upload?

The AWS SDK for PHP 1.2.6 includes a runnable sample in _samples/cli-s3_progress_bar.php, which shows how to track upload/download progress.
Download it here:
http://aws.amazon.com/releasenotes/PHP/1553377899765189

Related

Directly download from a link and upload file to GCS

Is there a way to download an MP4 file and store it directly in a Google Cloud Storage bucket? We have a use case where we are given a file URL and need to download the file and upload it to a cloud bucket. However, since the file size can be more than 1 GB, it is not feasible to download it to local storage first and then upload it to the cloud bucket. We are specifically looking at Google Cloud Storage for the uploads, so the solution should be specific to it.
Here are some reference docs we found, but they do not look like a feasible solution, since they upload files from local storage rather than directly from a link:
https://googleapis.dev/ruby/google-cloud-storage/latest/Google/Cloud/Storage.html
https://www.mydatahack.com/uploading-and-downloading-files-in-s3-with-ruby/
Google Cloud Storage does not offer compute features. That means you cannot have Cloud Storage load an object directly from a URL; you must fetch the object yourself and then upload it into Cloud Storage. You can, however, stream the download straight into the upload so the data never has to be written to local disk.
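As a rough illustration, here is a minimal Node.js sketch of that fetch-and-stream-upload flow using the @google-cloud/storage client; the bucket name, object name, and source URL are placeholders.

// Stream a remote file into a Cloud Storage object without touching local disk.
// The bucket name, object name, and source URL below are placeholders.
const https = require('https');
const { Storage } = require('@google-cloud/storage');

const storage = new Storage();
const destination = storage.bucket('my-bucket').file('video.mp4');

https.get('https://example.com/video.mp4', (res) => {
  res.pipe(destination.createWriteStream({ resumable: true }))
    .on('error', (err) => console.error('Upload failed:', err))
    .on('finish', () => console.log('Upload complete'));
});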

Uploading smaller files with multipart file upload with only one part using AWS CLI

I have configured AWS S3 and a Lambda function that triggers when a file is uploaded to S3. The Lambda is configured to trigger on the s3:ObjectCreated:CompleteMultipartUpload event. When I tested with large files through the AWS CLI it worked, but when I upload a smaller file (less than 5 MB) the event does not trigger the Lambda. How can I make this work for small files uploaded in only one part?
The AWS CLI does not use multipart upload for files below its multipart threshold; small files are uploaded with a single PutObject request, which produces an s3:ObjectCreated:Put event instead. Add the s3:ObjectCreated:Put event (or simply use s3:ObjectCreated:*) to your notification configuration so your Lambda gets notified for those uploads too.
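For example, a hedged sketch of registering both event types on the bucket with the AWS CLI; the bucket name and Lambda function ARN are placeholders.

aws s3api put-bucket-notification-configuration \
  --bucket my-bucket \
  --notification-configuration '{
    "LambdaFunctionConfigurations": [
      {
        "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:my-function",
        "Events": [
          "s3:ObjectCreated:Put",
          "s3:ObjectCreated:CompleteMultipartUpload"
        ]
      }
    ]
  }'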

Is there a way to upload files to the Amazon S3 from SFTP

My idea is this: I have an SFTP host with data on it, and I want to create a file in S3 from that data. To save network resources, I don't want to download all of the data to another system first just to upload it again. So my question is: is it possible to transfer the data directly to S3 without downloading it first? (Preferably with the Amazon S3 Java SDK.)

Event-driven Elastic Transcoder?

Is there a way to set up a transcoding pipeline on AWS such that it automatically transcodes any new files uploaded to a particular S3 bucket and places them in another bucket?
I know there is a REST API, and that in theory the uploader could also issue a REST request to the transcoder after it has uploaded the file, but for a variety of reasons, this isn't really an option.
This can now be accomplished using AWS Lambda.
Lambda basically allows you to trigger/run scripts based on events. You could easily create a Lambda script that runs as soon as a new file is uploaded to a designated S3 bucket. The script would then start a transcoding job for that newly uploaded video file.
This is literally one of the example use cases provided in the Lambda documentation.
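A minimal Node.js sketch of what such a Lambda handler might look like, assuming an Elastic Transcoder pipeline and preset already exist; the pipeline ID, preset ID, and output naming are placeholders.

// Start an Elastic Transcoder job for each newly uploaded S3 object.
// PIPELINE_ID and PRESET_ID are placeholders for your own pipeline and preset.
var AWS = require('aws-sdk');
var transcoder = new AWS.ElasticTranscoder();

exports.handler = function (event, context, callback) {
  // Object key from the S3 event that triggered this function
  var key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));

  transcoder.createJob({
    PipelineId: 'PIPELINE_ID',   // the pipeline defines the input and output buckets
    Input: { Key: key },
    Output: {
      Key: key.replace(/\.[^.]+$/, '') + '-transcoded.mp4',
      PresetId: 'PRESET_ID'      // e.g. one of the system presets such as Generic 720p
    }
  }, callback);
};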

How to receive an uploaded file using node.js formidable library and save it to Amazon S3 using knox?

I would like to upload a form from a web page and directly save the file to S3 without first saving it to disk. This node.js app will be deployed to Heroku, where there is no local disk to save the file to.
The node-formidable library provides a great way to upload files and save them to disk, but I am not sure how to stop formidable (or connect-form) from saving the file to disk first. The Knox library, on the other hand, provides a way to read a file from disk and save it to Amazon S3.
1) Is there a way to hook into formidable's events ('data') and pipe the stream to Knox, so that I can save the uploaded file directly to my Amazon S3 bucket?
2) Are there any libraries or code snippets that can allow me to take the uploaded file and save it directly to Amazon S3 using node.js?
There is a similar question here but the answers there do not address NOT saving the file to disk.
It looks like there is no good way to do it. One reason might be that the node-formidable library saves the uploaded file to disk; I could not find any option to do otherwise. The knox library then takes the saved file from disk and uploads it to Amazon S3 using your AWS credentials.
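For reference, a rough sketch of that disk-based flow in node.js; the credentials, bucket name, and the 'upload' form field name are placeholders/assumptions.

var formidable = require('formidable');
var knox = require('knox');

// Credentials and bucket name are placeholders.
var client = knox.createClient({
  key: 'AWS_ACCESS_KEY',
  secret: 'AWS_SECRET_KEY',
  bucket: 'my-bucket'
});

// Let formidable write the upload to a temporary file on disk,
// then hand that file to knox (assumes the form field is named 'upload').
function handleUpload(req, res) {
  var form = new formidable.IncomingForm();
  form.parse(req, function (err, fields, files) {
    if (err) return res.end('parse error');
    client.putFile(files.upload.path, '/uploads/' + files.upload.name, function (err2, s3res) {
      if (err2) return res.end('upload error');
      res.end('Uploaded with status ' + s3res.statusCode);
    });
  });
}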
Since I cannot save files locally on Heroku, I ended up using the Transloadit service. Their authentication docs have a bit of a learning curve, but I found the service useful.
For those who want to use Transloadit from node.js, the following code sample may help (the Transloadit page had only Ruby and PHP examples):
// Sign the Transloadit request: an HMAC-SHA1 of the request params,
// keyed with your Transloadit auth secret and hex-encoded.
var crypto = require('crypto');

var signature = crypto.createHmac('sha1', 'auth secret')  // your auth secret
  .update('some string')                                  // the params string to sign
  .digest('hex');

console.log(signature);
This is Andy, creator of AwsSum:
https://github.com/appsattic/node-awssum/
I just released v0.2.0 of this library. It uploads the files that were created by Express' bodyParser(), though as you say, this won't work on Heroku:
https://github.com/appsattic/connect-stream-s3
However, I shall be looking at adding the ability to stream from formidable directly to S3 in the next (v0.3.0) version. For the moment though, take a look and see if it can help. :)