Multipart upload to S3 using ng-file-upload

Can somebody point me to an example of doing multipart upload using ng-file-upload? I would also like to know whether the upload can be paused and resumed (for example, when the internet connection goes down or the user puts the machine to sleep).
Thanks.

I switched to evaporate.js for multipart upload.
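For anyone who lands here, this is roughly what that route looks like. It is a sketch against evaporate.js's 2.x promise API, assuming a hypothetical /sign_auth endpoint on your server that signs each request with your AWS secret; the key, bucket, and region values are placeholders, and the MD5/SHA-256 helpers assume the AWS SDK for JavaScript is loaded:

var Evaporate = require('evaporate');

Evaporate.create({
  signerUrl: '/sign_auth',            // hypothetical server endpoint that signs each request
  aws_key: 'YOUR_AWS_KEY',            // placeholder
  bucket: 'my-bucket',                // placeholder
  awsRegion: 'us-east-1',             // placeholder
  computeContentMd5: true,
  cryptoMd5Method: function (d) { return AWS.util.crypto.md5(d, 'base64'); },
  cryptoHexEncodedHash256: function (d) { return AWS.util.crypto.sha256(d, 'hex'); }
}).then(function (evaporate) {
  evaporate.add({
    name: file.name,                  // `file` is a File object, e.g. from ng-file-upload
    file: file,
    progress: function (p) { console.log('progress', p); }
  });

  // Pause/resume, e.g. when connectivity drops or the machine wakes from sleep;
  // a resumed upload picks up from the last completed part:
  // evaporate.pause();
  // evaporate.resume();
});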

Related

Using Postman gRPC, I need to upload a file as a request (via Postman) to my gRPC server

I have a gRPC file-upload server (C#) running, and I also have a gRPC client through which I was uploading files to the server. Now I want to upload my file using Postman (yes, the new Postman version supports gRPC), but I cannot figure out how to attach/send the file via Postman gRPC to my gRPC server. For example, in REST Postman you are able to attach the file in the body using the form-data tab. Can anyone give me a solution to this?
Thank you

Uploading smaller files with multipart upload (only one part) using the AWS CLI

I have configured AWS S3 and a Lambda function that triggers when a file is inserted into S3, using the s3:ObjectCreated:CompleteMultipartUpload event. When I tested through the AWS CLI with large files it worked, but when I upload a smaller file (less than 5 MB), the event does not trigger the Lambda. How can I make this work for small files uploaded in a single part?
Can anyone please help?
Small files never arrive as multipart uploads: the AWS CLI only switches to multipart upload above its multipart_threshold (8 MB by default), and uploads below it use a single PutObject request, so CompleteMultipartUpload never fires. Add the s3:ObjectCreated:Put event so your Lambda is notified for those uploads too, or subscribe to s3:ObjectCreated:* to cover every object-created event.
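If you manage the notification configuration programmatically, subscribing to both event types looks roughly like this (a sketch using the AWS SDK for JavaScript; the bucket name and function ARN are placeholders):

var AWS = require('aws-sdk');
var s3 = new AWS.S3();

s3.putBucketNotificationConfiguration({
  Bucket: 'my-bucket',  // placeholder
  NotificationConfiguration: {
    LambdaFunctionConfigurations: [{
      LambdaFunctionArn: 'arn:aws:lambda:us-east-1:123456789012:function:my-fn', // placeholder
      Events: [
        's3:ObjectCreated:Put',                    // small, single-request uploads
        's3:ObjectCreated:CompleteMultipartUpload' // large, multipart uploads
      ]
    }]
  }
}, function (err) {
  if (err) console.error('could not update notification config', err);
});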

After saving a file in S3 via boto3, it is downloaded instead of being viewed

I am saving some HTML content to Amazon S3 from my Flask API, using the boto3 module with the code
s3.Object(BUCKET_NAME, PREFIX + file_name + '.html').put(Body=html_content)
The file is stored in S3, but when I go to view it, it just gets downloaded instead of being displayed in the browser. How can I fix this from boto3? Kindly help me.
Go to the S3 bucket and browse to the file > Properties > Metadata. There is a key called Content-Type that tells clients what kind of content the object is; it is probably set to a binary type (such as binary/octet-stream), which is why the browser only downloads it at the moment.
If you change this value to text/html, the browser will render the page instead of downloading it (text/plain would show the raw source). You can also set it at upload time directly from boto3 by passing ContentType to put:
s3.Object(BUCKET_NAME, PREFIX + file_name + '.html').put(Body=html_content, ContentType='text/html')

How to receive an uploaded file using node.js formidable library and save it to Amazon S3 using knox?

I would like to upload a form from a web page and directly save the file to S3 without first saving it to disk. This node.js app will be deployed to Heroku, where there is no local disk to save the file to.
The node-formidable library provides a great way to upload files and save them to disk, but I am not sure how to stop formidable (or connect-form) from saving the file first. The knox library, on the other hand, provides a way to read a file from disk and save it to Amazon S3.
1) Is there a way to hook into formidable's events (e.g. the part's 'data' events) to send the stream to knox, so that I can directly save the uploaded file in my Amazon S3 bucket?
2) Are there any libraries or code snippets that would allow me to take the uploaded file and save it directly to Amazon S3 using node.js?
There is a similar question here but the answers there do not address NOT saving the file to disk.
It looks like there is no good way to do it. One reason might be that the node-formidable library saves the uploaded file to disk; I could not find any options to do otherwise. The knox library then takes the file saved on disk and, using your Amazon S3 credentials, uploads it to Amazon.
Since I cannot save files locally on Heroku, I ended up using the Transloadit service. Although their authentication docs have a bit of a learning curve, I found the service useful.
For those who want to use Transloadit from node.js, the following code sample may help (the Transloadit page had only Ruby and PHP examples):
var crypto = require('crypto');

// Transloadit request signature: HMAC-SHA1 of the request params,
// keyed with your auth secret, hex-encoded.
var signature = crypto.createHmac('sha1', 'auth secret')
  .update('some string')
  .digest('hex');

console.log(signature);
This is Andy, creator of AwsSum:
https://github.com/appsattic/node-awssum/
I just released v0.2.0 of this library. It uploads the files that were created by Express's bodyParser(), though as you say this won't work on Heroku:
https://github.com/appsattic/connect-stream-s3
However, I shall be looking at adding the ability to stream from formidable directly to S3 in the next (v0.3.0) version. For the moment though, take a look and see if it can help. :)
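For anyone who wants to experiment in the meantime, the rough shape of that streaming hookup might look like this. This is only a sketch against the old formidable and knox APIs, and it assumes the client sends the file's size in a hypothetical x-file-size header, since knox's putStream needs a Content-Length up front:

var formidable = require('formidable');
var knox = require('knox');

var client = knox.createClient({
  key: 'YOUR_AWS_KEY',      // placeholder credentials
  secret: 'YOUR_AWS_SECRET',
  bucket: 'my-bucket'       // placeholder bucket
});

function handleUpload(req, res) {
  var form = new formidable.IncomingForm();

  // Override onPart so formidable never writes the file part to disk.
  form.onPart = function (part) {
    if (!part.filename) {
      form.handlePart(part);  // ordinary form fields: let formidable handle them
      return;
    }

    // Stream the part straight to S3.
    client.putStream(part, '/' + part.filename, {
      'Content-Length': req.headers['x-file-size'], // assumed custom header with the file size
      'Content-Type': part.mime
    }, function (err, s3res) {
      if (err) return res.end('upload failed');
      res.end('uploaded, S3 replied ' + s3res.statusCode);
    });
  };

  form.parse(req);
}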

How do I make Plupload upload directly to Amazon S3?

How do I configure Plupload properly so that it will upload files directly to Amazon S3?
In addition to conditions for bucket, key, and acl, the policy document must contain rules for name, Filename, and success_action_status. For instance:
["starts-with", "$name", ""],
["starts-with", "$Filename", ""],
["starts-with", "$success_action_status", ""],
Filename is a field that the Flash backend sends, but the HTML5 backend does not.
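Put together, a complete policy document might look like the following sketch (the bucket name and expiration date are placeholders). Base64-encode the JSON and sign it with your AWS secret key to obtain the policy and signature values:

{
  "expiration": "2014-01-01T00:00:00Z",
  "conditions": [
    {"bucket": "my-bucket"},
    ["starts-with", "$key", ""],
    {"acl": "private"},
    ["starts-with", "$name", ""],
    ["starts-with", "$Filename", ""],
    ["starts-with", "$success_action_status", ""]
  ]
}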
The multipart setting must be true, but that is the default these days.
The multipart_params setting must be an object with the following fields (they are put together in the sketch further below):
key
AWSAccessKeyId
acl = 'private'
policy
signature
success_action_status = '201'
Setting success_action_status to 201 causes S3 to return an XML document with HTTP status code 201. This is necessary to make the Flash backend work. (The Flash upload stalls when the response is empty and the status code is 200 or 204, and it results in an I/O error if the response is a redirect.)
S3 does not understand chunks, so remove the chunk_size config option.
unique_names can be either true or false; both work.
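Putting those settings together, a minimal configuration might look like this sketch, where the access key, policy, and signature are placeholders you would generate server-side:

var uploader = new plupload.Uploader({
  runtimes: 'html5,flash',
  browse_button: 'pickfiles',                  // hypothetical element id
  url: 'https://my-bucket.s3.amazonaws.com/',  // placeholder bucket URL
  flash_swf_url: '/js/plupload.flash.swf',     // path for the Flash runtime (adjust to your setup)
  multipart: true,
  // no chunk_size: S3 does not understand chunks
  multipart_params: {
    key: '${filename}',                        // S3 substitutes the uploaded file's name
    AWSAccessKeyId: 'YOUR_ACCESS_KEY',         // placeholder
    acl: 'private',
    policy: 'YOUR_BASE64_POLICY',              // base64-encoded policy document from your server
    signature: 'YOUR_HEX_SIGNATURE',           // HMAC-SHA1 of the policy, from your server
    success_action_status: '201'               // make S3 return a 201 with an XML body
  }
});
uploader.init();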
The latest Plupload release includes an illustrative example that shows how one might use Plupload to upload files to Amazon S3 using the Flash and Silverlight runtimes.
Here is the fresh write-up: Upload to Amazon S3
The official Plupload tutorial, much more detailed than the answers here: https://github.com/moxiecode/plupload/wiki/Upload-to-Amazon-S3
If you are using Rails 3, please check out my sample projects:
Sample project using Rails 3, Flash and MooTools-based FancyUploader to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-FancyUploader
Sample project using Rails 3, Flash/Silverlight/GoogleGears/BrowserPlus and jQuery-based Plupload to upload directly to S3: https://github.com/iwasrobbed/Rails3-S3-Uploader-Plupload
One more note: don't forget to upload crossdomain.xml to your S3 host, and if you use a success_action_redirect URL, you need a crossdomain.xml file on that domain too. I spent a day fighting with that problem before finally finding what was wrong; remember that Flash enforces these cross-domain rules internally.
Hope this saves someone some time.