Unable to pause S3 file upload - amazon-s3

I am following the code sample mentioned here for S3 file upload using S3TransferManager. From the samples it appears that a FileUpload reference is required to be able to pause an ongoing upload.
What I am unable to understand is how to get a reference to the FileUpload if the upload has to be paused from a different request/thread. Also, transferManager.uploadFile appears to be a synchronous operation; is this correct? Please advise.
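On the second point, uploadFile in the AWS SDK for Java 2.x Transfer Manager is asynchronous: it returns a FileUpload handle immediately, and completion is signalled through upload.completionFuture(). One common way to pause from a different request/thread is to keep that handle in a shared registry. A minimal sketch; the registry class, the id scheme, and the bucket/key/path parameters are illustrative assumptions, not part of the AWS samples:

import java.nio.file.Paths;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

import software.amazon.awssdk.transfer.s3.S3TransferManager;
import software.amazon.awssdk.transfer.s3.model.FileUpload;
import software.amazon.awssdk.transfer.s3.model.ResumableFileUpload;
import software.amazon.awssdk.transfer.s3.model.UploadFileRequest;

// Hypothetical registry shared between request handlers.
public class UploadRegistry {
    private final S3TransferManager transferManager = S3TransferManager.create();
    private final Map<String, FileUpload> activeUploads = new ConcurrentHashMap<>();

    // Starts an upload and returns an id the client can use in a later "pause" request.
    public String startUpload(String bucket, String key, String localPath) {
        FileUpload upload = transferManager.uploadFile(UploadFileRequest.builder()
                .putObjectRequest(req -> req.bucket(bucket).key(key))
                .source(Paths.get(localPath))
                .build());
        // uploadFile() returns immediately; upload.completionFuture() completes later.
        String uploadId = UUID.randomUUID().toString();
        activeUploads.put(uploadId, upload);
        return uploadId;
    }

    // Called from a different request/thread than the one that started the upload.
    public ResumableFileUpload pauseUpload(String uploadId) {
        FileUpload upload = activeUploads.remove(uploadId);
        return (upload != null) ? upload.pause() : null;
    }
}

The ResumableFileUpload returned by pause() holds the state needed to continue later via transferManager.resumeUploadFile(...).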

Related

Uploading smaller files with multipart file upload with only one part using AWS CLI

I have configured AWS S3 and a Lambda function that triggers when a file is inserted into S3. I configured the s3:ObjectCreated:CompleteMultipartUpload event to trigger the Lambda. When I tested through the AWS CLI with large files it worked, but when I upload a smaller file, less than 5 MB in size, the event does not trigger the Lambda. How can I make this work for small files with only one part?
Can anyone please help?
Files smaller than 5 MB are not uploaded using multipart upload, so the CompleteMultipartUpload event never fires for them. Add the s3:ObjectCreated:Put event so that your Lambda gets notified for single-part uploads too.
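For reference, a minimal sketch of registering both event types on the bucket, using the AWS SDK for Java 2.x; the bucket name and Lambda ARN are placeholders:

import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.Event;
import software.amazon.awssdk.services.s3.model.LambdaFunctionConfiguration;
import software.amazon.awssdk.services.s3.model.NotificationConfiguration;
import software.amazon.awssdk.services.s3.model.PutBucketNotificationConfigurationRequest;

public class ConfigureNotifications {
    public static void main(String[] args) {
        try (S3Client s3 = S3Client.create()) {
            // Notify the Lambda for both single-part PUTs and multipart completions.
            LambdaFunctionConfiguration lambdaConfig = LambdaFunctionConfiguration.builder()
                    .lambdaFunctionArn("arn:aws:lambda:us-east-1:123456789012:function:my-handler") // placeholder
                    .events(Event.S3_OBJECT_CREATED_PUT,
                            Event.S3_OBJECT_CREATED_COMPLETE_MULTIPART_UPLOAD)
                    .build();

            s3.putBucketNotificationConfiguration(PutBucketNotificationConfigurationRequest.builder()
                    .bucket("my-bucket") // placeholder
                    .notificationConfiguration(NotificationConfiguration.builder()
                            .lambdaFunctionConfigurations(lambdaConfig)
                            .build())
                    .build());
        }
    }
}

If you would rather catch every creation path (PUT, POST, COPY, and multipart) in one go, the wildcard event s3:ObjectCreated:* covers them all.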

CKAN resource file upload by API?

How do I upload a file into a resource (either create or update) in CKAN using the APIs?
I followed the documentation and it seems to work, but I am getting an error on the datapusher page.
Click here for an image of the error.
This error indicates a time-out when the datapusher service tries to download the resource from CKAN.
@florianm is dead right here.
Another thing to check is the resource URL. The exception suggests you've forgotten to put http:// on the front of the URL, or something similar.
Also note that you can upload files directly to CKAN if they are not already online somewhere else.
@Vithal: as noted above, this error indicates a time-out when the datapusher service tries to download the resource from CKAN.
Some pointers:
What do your CKAN and datapusher error logs say?
How big is your resource file?
Does this error occur when you upload a tiny CSV file?
Does this error occur when you upload your original resource file to demo.ckan.org?
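Coming back to the original question of how to do the upload via the API: a minimal sketch of a resource_create call against the CKAN action API, assuming Apache HttpClient's mime module; the endpoint URL, API key, dataset id, and file name are placeholders:

import java.io.File;

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.mime.MultipartEntityBuilder;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

public class CkanUpload {
    public static void main(String[] args) throws Exception {
        // Placeholder CKAN instance and API key.
        HttpPost post = new HttpPost("https://ckan.example.org/api/3/action/resource_create");
        post.setHeader("Authorization", "YOUR-API-KEY");
        // resource_create accepts multipart form data; the file goes in the "upload" field.
        post.setEntity(MultipartEntityBuilder.create()
                .addTextBody("package_id", "my-dataset")
                .addTextBody("name", "My resource")
                .addBinaryBody("upload", new File("data.csv"))
                .build());
        try (CloseableHttpClient client = HttpClients.createDefault();
             CloseableHttpResponse response = client.execute(post)) {
            System.out.println(response.getStatusLine());
        }
    }
}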

How to pause/resume a procedure while/after uploading a file with ChannelSftp

I am trying to upload a file from local disk to a remote server with com.jcraft.jsch.ChannelSftp.put(String src, String dst, SftpProgressMonitor monitor).
I find that even when the upload is not yet complete, the program goes ahead as long as no exception occurred; for example, it sends a message to the client that the file has already been uploaded to the server. But the file is still being uploaded, and if the client tries to get it immediately, it receives a blank or inconsistent file.
How can I pause the main program while the file is uploading, and resume it once the upload is complete?
JSch's ChannelSftp does the actual file transfer in a background thread if you use that put method. This is done so your program can do other things in the meantime, or even use the channel to upload or download other files.
The progress monitor object you passed in will be informed of the progress and of the end of the upload or download.
Use this to know when the transfer is done and when you can safely do other things with the file.
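Since the monitor's end() callback is the completion signal, one way to block the main program until the upload finishes (and to get a crude pause) is a monitor built around a latch. A sketch; the class name and the polling-based pause are my own additions, not part of JSch:

import java.util.concurrent.CountDownLatch;

import com.jcraft.jsch.SftpProgressMonitor;

// Monitor that lets the caller block until JSch reports the transfer finished.
public class BlockingMonitor implements SftpProgressMonitor {
    private final CountDownLatch done = new CountDownLatch(1);
    private volatile boolean paused;

    @Override
    public void init(int op, String src, String dest, long max) {
        // Called once when the transfer starts.
    }

    @Override
    public boolean count(long bytes) {
        // Called repeatedly as bytes are transferred; returning false cancels the transfer.
        while (paused) {
            try {
                Thread.sleep(100); // crude pause: stall the transfer loop
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return true;
    }

    @Override
    public void end() {
        done.countDown(); // transfer finished (or was cancelled)
    }

    public void pause() { paused = true; }
    public void resume() { paused = false; }

    public void awaitCompletion() throws InterruptedException {
        done.await();
    }
}

With this, calling channel.put(src, dst, monitor) and then monitor.awaitCompletion() ensures you only tell the client the file is ready after end() has fired.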

Red5 with S3 (I want to customize the path for streaming videos)

I am using Red5 for streaming videos in my project, and I am able to play videos from the local system that are saved in the default "streams" folder.
Now I want to customize the path and fetch the videos from S3 instead. How do I configure Red5 to work with S3, and is this good practice?
I've got code using IStreamFilenameGenerator that works with S3. I'll warn you now that it may not work with the latest jets3t library, but you'll get the point of how it works by looking through the source. One problem/issue you must understand when using S3 is that you cannot "record" to the bucket on-the-fly; your FLV files can only be transferred to S3 once the file is finalized (there is an example upload call in the Application class). "Play" from S3, on the other hand, works as expected.
I added the S3 code to the red5-examples repo: https://github.com/Red5/red5-examples
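For the playback side, a sketch of what such a generator can look like; the class name and bucket URL are placeholders, and note that the IScope import moved between packages across Red5 versions:

import org.red5.server.api.scope.IScope; // org.red5.server.api.IScope in older Red5 versions
import org.red5.server.api.stream.IStreamFilenameGenerator;

// Resolves playback names to S3 URLs; recordings still go to local disk first.
public class S3FilenameGenerator implements IStreamFilenameGenerator {

    private static final String BUCKET_URL = "https://my-bucket.s3.amazonaws.com/"; // placeholder

    public String generateFilename(IScope scope, String name, GenerationType type) {
        return generateFilename(scope, name, null, type);
    }

    public String generateFilename(IScope scope, String name, String extension, GenerationType type) {
        String filename = (extension != null) ? name + extension : name;
        if (type == GenerationType.PLAYBACK) {
            return BUCKET_URL + filename; // play straight from the bucket
        }
        // Recording to S3 on-the-fly is not possible; keep recordings local until finalized.
        return "streams/" + filename;
    }

    public boolean resolvesToAbsolutePath() {
        return true;
    }
}

Red5 picks the generator up when it is registered in the application context under the bean name defined by IStreamFilenameGenerator.BEAN_NAME ("streamFilenameGenerator").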
Search for:
https://stackoverflow.com/search?q=IStreamFilenameGenerator
or https://www.google.com/search?q=IStreamFilenameGenerator+example
and you will find some examples of how to modify the path(s).
Alternatively, you could of course simply mount a drive into the streams folder, or I guess a symbolic link would even work. But that is not as flexible as IStreamFilenameGenerator, which lets you generate exactly the string you want.
Sebastian

How to receive an uploaded file using node.js formidable library and save it to Amazon S3 using knox?

I would like to upload a form from a web page and directly save the file to S3 without first saving it to disk. This node.js app will be deployed to Heroku, where there is no local disk to save the file to.
The node-formidable library provides a great way to upload files and save them to disk. I am not sure how to stop formidable (or connect-form) from saving the file first. The Knox library, on the other hand, provides a way to read a file from disk and save it to Amazon S3.
1) Is there a way to hook into formidable's events (on data) to send the stream to Knox's events, so that I can directly save the uploaded file in my Amazon S3 bucket?
2) Are there any libraries or code snippets that would let me take the uploaded file and save it directly to Amazon S3 using node.js?
There is a similar question here but the answers there do not address NOT saving the file to disk.
It looks like there is no good way to do it. One reason might be that the node-formidable library saves the uploaded file to disk; I could not find any option to do otherwise. The knox library then takes the saved file on disk and uploads it to Amazon using your Amazon S3 credentials.
Since I cannot save files locally on Heroku, I ended up using the Transloadit service. Their authentication docs have some learning curve, but I found the service useful.
For those who want to use Transloadit with node.js, the following code sample may help (the Transloadit page had only Ruby and PHP examples):
// Sign the request: HMAC-SHA1 of the string to sign, keyed with your Transloadit auth secret.
var crypto = require('crypto');
var signature = crypto.createHmac('sha1', 'auth secret')
    .update('some string')
    .digest('hex');
console.log(signature);
This is Andy, creator of AwsSum:
https://github.com/appsattic/node-awssum/
I just released v0.2.0 of this library. It uploads the files that were created by Express's bodyParser(), though as you say, this won't work on Heroku:
https://github.com/appsattic/connect-stream-s3
However, I shall be looking at adding the ability to stream from formidable directly to S3 in the next version (v0.3.0). For the moment, take a look and see if it can help. :)