How can I remotely upload files to Amazon S3?

I am looking for a way to transfer files from a server to an Amazon S3 bucket without first downloading them to my computer. All of the files I plan to transfer can be accessed publicly (e.g. http://something.com/file.ext). Everything I have tried only lets me upload files directly from my Mac to S3.
P.S. Although I have access to Windows, a Mac app that can do this would be great... or maybe a browser-based solution :)

You can check out this PHP class (there is a Net Tuts tutorial on it); it works well, and I've been using it for a while now. It supports bucket creation and deletion, adding files, and more. You can easily add files remotely, either from another server or from the same server you're running it on.
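If you'd rather script it than run a PHP class, the same server-to-S3 copy is only a few lines in Python with boto3 and requests. A minimal sketch, assuming boto3 credentials are already configured and using placeholder names for the bucket, key, and source URL; run it on a server or EC2 instance, since the bytes stream through whatever machine runs the script (just never your Mac's disk):

    import boto3
    import requests

    # Placeholder names: substitute your own bucket, key, and source URL.
    SOURCE_URL = "http://something.com/file.ext"
    BUCKET = "my-bucket"
    KEY = "file.ext"

    s3 = boto3.client("s3")

    # Stream the HTTP response body straight into S3 so the file is
    # never written to the local disk.
    with requests.get(SOURCE_URL, stream=True) as resp:
        resp.raise_for_status()
        resp.raw.decode_content = True  # transparently undo gzip, if any
        s3.upload_fileobj(resp.raw, BUCKET, KEY)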

Related

AWS download files from S3 in web browser

I am a newbie to AWS, and one of my tasks is to figure out how to let users download MSIs and ISOs stored in S3 through a web browser. I read that I could use the CLI behind the scenes, so that if a customer clicks a download, the app would make a request to S3 using one of the CLI commands and the file would download through, let's say, Google Chrome or IE (please correct me if I'm wrong about how the CLI is used).
Also, if the download stops for some reason, such as an internet failure, is there a way to resume it? How do I get a download done through a client?
Thanks in advance for the help. Unfortunately the AWS links gave me very little information, so I'm seeking help here!
Files stored in Amazon S3 can be directly accessed via web browser, just like clicking a link on any website.
If the files are marked as publicly accessible, anyone with the link can download the file.
If you wish to limit access to the files, your application can generate a pre-signed URL that works for a limited time period that you specify (e.g. 5 minutes). Users can click that link to download the file within that time period.
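Generating such a link is a one-call affair in most SDKs. A sketch in Python with boto3, assuming configured credentials and placeholder bucket/key names:

    import boto3

    s3 = boto3.client("s3")

    # Placeholder bucket and key; ExpiresIn is in seconds (here 5 minutes).
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": "my-bucket", "Key": "downloads/install.iso"},
        ExpiresIn=300,
    )
    print(url)  # give this link to the user; it stops working after 5 minutes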
You can also download files using the AWS Command-Line Interface (CLI), which has copy (cp) and sync commands. This would, however, require installing the CLI on the user's computer. It is a good fit if they download files regularly, or if you wish to automate the download (e.g. every hour, or daily).
If you wish to explore AWS, sign up for an account and make use of the Free Usage Tier, which lets you try some services at no charge.

TableTools not working when SWF hosted on AWS S3

I'm trying to use jQuery DataTables and TableTools in conjunction with my Django app, which uses Django-Storages (Boto) to manage my static files on S3. Although I can successfully point TableTools to the SWF file on S3, I've noticed that none of the Copy, CSV, etc. buttons work (except Print) when using S3. However, it all works perfectly once I point to a public CDN.
I can use the CDN but am wondering if anyone knows why it doesn't work on S3. I'm guessing it may be a permissions issue?
I am facing the same problem with an SWF on S3. I solved it the blunt way, by moving the SWF file back to my server instead of loading it from S3.
Hope this helps.
My theory below, not tested:
I suspect it is a cross-domain issue: when loading from S3, the file path changes and so does the domain name. This could happen if the ActionScript does not check whether a crossdomain policy is specified.
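If that theory is right, the usual fix is to host a crossdomain.xml policy at the root of the domain the SWF loads data from (for S3, upload it to the bucket root and make it public). The fully permissive form looks like this; untested here, so treat it as a starting point and tighten the domain for production:

    <?xml version="1.0"?>
    <!DOCTYPE cross-domain-policy SYSTEM "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
    <cross-domain-policy>
        <allow-access-from domain="*" />
    </cross-domain-policy>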

TCL Amazon S3 Interaction

I have a Tcl/Tk Windows application that creates a small executable that I distribute to my customers. Because it is an exe file, I cannot email it. Instead, I upload it to an Amazon S3 bucket, create a URL link, and email the link to them. They download the file from the link and run the exe.
What I would like to do is add the ability to upload to an Amazon bucket from within the application, so I can upload the file and create a URL that I can copy and email to the customer. I have seen Amazon S3 APIs for other languages, Python and Java for example, but not Tcl. Has anyone done this? How hard is it? Can you point me to a tutorial?
Actually, I do not have to use an S3 bucket; if there is another suggestion for how to distribute small files to customers from within Tcl programs, I am open to it. Besides what has been laid out above, the only other requirements are that multiple people must be able to upload to the same location, the Tcl program runs on Windows, and I would prefer not to use a third-party program. Security is not a major concern, nor is privacy; these things are handled in other ways.
Actually, Tcl does provide an S3 package, but since I don't have an Amazon S3 account, I cannot test it out.
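For what it's worth, the flow itself (upload, then build a link to email) is small in any language with an S3 SDK, and since the asker is open to alternatives, here is a sketch of the same flow in Python with boto3, which a Tcl program could run via exec. The bucket name is a placeholder and boto3 credentials are assumed to be configured:

    import boto3

    BUCKET = "my-distribution-bucket"  # placeholder bucket name
    FILENAME = "installer.exe"         # the file built by the Tcl app

    s3 = boto3.client("s3")

    # Upload the exe as publicly readable, then print the plain URL
    # to paste into the customer email.
    s3.upload_file(FILENAME, BUCKET, FILENAME,
                   ExtraArgs={"ACL": "public-read"})
    print(f"https://{BUCKET}.s3.amazonaws.com/{FILENAME}")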

Where to save the uploaded files?

I am developing a web application that uploads .mp3 files and needs to play them. I can successfully upload the files and am saving them in the C:/uploads folder. I understand that, as it's a web application, the files should be saved on the Apache web server itself, but I am not sure where to save them.
Thanks,
Serenity.
You can use a content repository to store uploaded data; I think this is a common approach. For instance, take a look at Apache JackRabbit: with it you won't be rummaging around the hard drive for uploaded files, and you get a web interface plus other tools that can connect to the repository and show you the files in it.
As an alternative to JackRabbit, you can try the Alfresco CMS; both implement JCR. Other implementations are listed here (you will find them at the bottom of that page).
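If a full content repository is heavier than you need, the common lighter pattern is to keep uploads in a configurable directory outside the web server's document root and hand them back through the application rather than as static files. A small illustration of the idea in Python (the paths and names are hypothetical; the same shape carries over to a Java servlet):

    import os
    from pathlib import Path

    # Hypothetical location: configurable and outside the document root,
    # so the server never exposes the raw directory by accident.
    UPLOAD_DIR = Path(os.environ.get("UPLOAD_DIR", "/var/data/uploads"))
    UPLOAD_DIR.mkdir(parents=True, exist_ok=True)

    def save_upload(filename: str, data: bytes) -> Path:
        """Store an uploaded file, stripping any directory components."""
        target = UPLOAD_DIR / os.path.basename(filename)
        target.write_bytes(data)
        return target

    def load_upload(filename: str) -> bytes:
        """Read a stored file back so the app can stream it to the client."""
        return (UPLOAD_DIR / os.path.basename(filename)).read_bytes()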

Updating permissions on Amazon S3 files that were uploaded via JungleDisk

I am starting to use JungleDisk to upload files to an Amazon S3 bucket that backs a CloudFront distribution, i.e. I can access the files via an http:// URL, and I am using Amazon as a CDN.
The problem I am facing is that JungleDisk doesn't set 'read' permissions on the files, so when I go to the corresponding URL in a browser I get an Amazon 'AccessDenied' error. If I use a tool like BucketExplorer to set the ACL, the URL then returns a 200.
I really, really like the simplicity of dragging files to a network drive. JungleDisk is the best program I've found that does this reliably, without tripping over itself and getting confused. However, it doesn't seem to have an option to make the files readable.
I really don't want to switch to a different tool just to change permissions (especially if I have to buy it), and that approach seems really slow anyway, because such tools generally traverse the whole directory structure.
JungleDisk provides some kind of 'web access' - but this is a paid feature and I'm not sure if it will work or not.
S3 doesn't appear to propagate permissions down which is a real pain.
I'm considering writing a manual tool to traverse my tree and set everything to 'read' but I'd rather not do this if this is a problem someone else has already solved.
If you are on Windows, you can use the CloudBerry Explorer for Amazon S3 client. It supports most of the Amazon S3 and CloudFront features, and it is freeware. (Disclaimer: I am the developer of this tool, but I think it may answer your question.)
I use the Transmit Mac app to modify permissions on files I've already uploaded with JungleDisk. If you're looking for a more cross-platform solution, the S3Fox browser plugin for Firefox claims to be able to modify permissions on S3 files as well.
If you need a web-based tool, you can use S3fm, a free online Amazon S3 file manager.
It's a pure Ajax app that runs in your browser and doesn't require sharing your credentials with a third-party web site.
If you need a reliable cross-platform tool to handle permissions, you can have a look at CrossFTP Pro. It supports most of the Amazon S3 and CloudFront features as well.
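And if none of these tools fit and you do end up writing the traverse-and-set-permissions tool yourself, it is shorter than it sounds. A sketch in Python with boto3 (the bucket name is a placeholder, credentials are assumed to be configured, and note that it issues one ACL request per object, so large buckets take a while):

    import boto3

    BUCKET = "my-cloudfront-bucket"  # placeholder bucket name
    s3 = boto3.client("s3")

    # Walk every object in the bucket and mark it world-readable.
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=BUCKET):
        for obj in page.get("Contents", []):
            s3.put_object_acl(Bucket=BUCKET, Key=obj["Key"], ACL="public-read")
            print("made public:", obj["Key"])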