Amazon S3/CloudFront Image Overwriting Issue - amazon-s3

I'm currently serving static images for my site via Amazon CloudFront, and for some reason my images won't update when I try to overwrite them with an updated image. The old image continues to display.
I've even tried deleting the entire images folder and uploading the newest ones without success. The only thing that works is renaming the image.
Anyone else experience this?

Amazon recently announced a new CloudFront feature called object invalidation. It allows you to invalidate files with a single API call; check the CloudFront API reference for details.
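The invalidation call described above can be sketched with boto3. The distribution ID and paths below are placeholders; this assumes you have AWS credentials configured:

```python
import time


def build_invalidation_batch(paths):
    """Build the InvalidationBatch payload CloudFront expects.

    CallerReference must be unique per request; a timestamp works.
    """
    return {
        "Paths": {"Quantity": len(paths), "Items": list(paths)},
        "CallerReference": str(time.time()),
    }


def invalidate(distribution_id, paths):
    """Ask CloudFront to drop its cached copies of the given paths."""
    import boto3  # imported lazily so the helper above has no AWS dependency
    client = boto3.client("cloudfront")
    return client.create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch=build_invalidation_batch(paths),
    )


# Example (requires real credentials and a real distribution ID):
# invalidate("E1ABCDEF123456", ["/images/logo.png"])
```

Note that invalidations take a few minutes to propagate, so the old image may linger briefly even after the call succeeds.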

Related

host reveal.js slideshow from an amazon S3 bucket

I would like to host reveal.js slideshows from an Amazon S3 bucket. However, when I try uploading the files and navigating to the links, the pages do not render as expected. This remains the case even if I also add the contents of the reveal.js GitHub repo to the bucket. I am using the boto Python library to interface with the bucket and upload files programmatically.
I'm sure it should be possible to host a reveal.js slideshow in this way. Is anyone able to spell out the process for me?
Thanks.
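One frequent gotcha with programmatic uploads like the one described above is the Content-Type metadata: S3 serves files with whatever Content-Type was stored at upload time, and boto3's `upload_file` does not guess one, so HTML/CSS/JS can end up served as `binary/octet-stream` and fail to render. A sketch of an upload that sets it explicitly (bucket and directory names are placeholders):

```python
import mimetypes
from pathlib import Path


def guess_content_type(filename):
    """Map a filename to a MIME type, falling back to a safe default."""
    ctype, _ = mimetypes.guess_type(filename)
    return ctype or "binary/octet-stream"


def upload_site(local_dir, bucket_name):
    """Upload every file under local_dir, with an explicit Content-Type."""
    import boto3  # lazy import: only needed for the actual upload
    s3 = boto3.client("s3")
    for path in Path(local_dir).rglob("*"):
        if path.is_file():
            key = str(path.relative_to(local_dir))
            s3.upload_file(
                str(path), bucket_name, key,
                ExtraArgs={"ContentType": guess_content_type(path.name)},
            )


# upload_site("my-slides", "my-reveal-bucket")
```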

Uploading image to "buffer"

I'm developing an application that uses (lots) of image processing.
The general overview of the system is:
User uploads photos to the server (raw photo, at full resolution)
Server fetches new photos and applies image processing to them
Server resizes the image and serves the result (delete the full one?)
My current situation is that I have almost no expertise in image hosting nor large data uploading and managing.
What I plan to do is:
User uploads directly from Browser to Amazon S3 (Full Image)
User notifies my server, which adds the uploaded file to a queue for my workers
When a worker receives a job, it downloads the full image (from Amazon) and processes it. It updates the database and then re-uploads the image to Cloudinary (resize on the server?)
Use the hosted image on Cloudinary from now on.
My doubts are about processing time. I don't want users to upload directly to my server, because that would require a lot of traffic and create a bottleneck; using Amazon S3 would reduce that. But hosting images on Amazon alone would not be ideal, since they don't provide image-specific APIs the way Cloudinary does.
Is it reasonable to use a separate service for uploads and only notify my server once the browser finishes? Does using Cloudinary to host the images also make sense? Should direct uploads to my own server be avoided in favor of Amazon?
(This is more a guidance/design question)
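The "browser uploads straight to S3" step in the plan above is usually implemented with a presigned POST: the server hands the browser a short-lived, size-limited upload form, so the file bytes never touch your server. A sketch with boto3 (bucket name, size limit, and expiry are illustrative):

```python
import uuid


def make_object_key(filename):
    """Prefix each upload with a UUID so users cannot overwrite each other."""
    return f"uploads/{uuid.uuid4().hex}/{filename}"


def presign_upload(bucket_name, filename, max_bytes=20 * 1024 * 1024):
    """Return the URL and form fields the browser POSTs the file to."""
    import boto3  # lazy import keeps make_object_key testable without AWS
    s3 = boto3.client("s3")
    return s3.generate_presigned_post(
        Bucket=bucket_name,
        Key=make_object_key(filename),
        Conditions=[["content-length-range", 1, max_bytes]],
        ExpiresIn=3600,  # the signed form is valid for one hour
    )


# form = presign_upload("my-raw-photos", "photo.jpg")
# The browser then POSTs the file to form["url"] with form["fields"].
```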
Why wouldn't you prefer uploading directly to Cloudinary?
The image can be uploaded directly from the browser to your Cloudinary account, without any further servers involved. Cloudinary then notifies you about the uploaded image and its details, then you can perform all the image processing in the cloud via Cloudinary. You can either manipulate the image while keeping the original, or you may choose to replace the original with the manipulated one.
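For the signed variant of the direct browser upload described above, the browser needs a signature computed server-side from your API secret. A sketch of Cloudinary's documented signing scheme (SHA-1 over the alphabetically sorted parameters with the secret appended); the public ID and secret here are made up:

```python
import hashlib
import time


def cloudinary_signature(params, api_secret):
    """Sign upload params: sort keys, join as key=value pairs with '&',
    append the API secret, and return the SHA-1 hex digest."""
    to_sign = "&".join(f"{k}={params[k]}" for k in sorted(params))
    return hashlib.sha1((to_sign + api_secret).encode("utf-8")).hexdigest()


# Typical signed-upload params (public_id and secret are placeholders):
params = {"timestamp": str(int(time.time())), "public_id": "raw_photo_123"}
signature = cloudinary_signature(params, "my_api_secret")
```

The server returns `params`, the signature, and your (public) API key to the browser, which includes them in its upload POST to Cloudinary.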

TableTools not working when SWF hosted on AWS S3

I'm trying to use jQuery DataTables and TableTools in conjunction with my Django app, which uses Django-Storages (boto) to manage my static files on S3. Although I can successfully point my app at the SWF file on S3, I've noticed that none of the Copy, CSV, etc. buttons work (except Print) when using S3. However, it all works perfectly once I point to a public CDN.
I can use the CDN but am wondering if anyone knows why it doesn't work on S3. I'm guessing it may be a permissions issue?
I faced the same problem with the SWF on S3. I solved it the crude way, by moving the SWF file back onto my server instead of loading it from S3.
Hope this helps.
My theory below, not tested:
I suspect it is a cross-domain issue: when you load from S3, the domain name changes along with the file path. This can break things if the ActionScript does not account for the cross-domain policy.
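If the cross-domain theory above is right, the usual fix is a `crossdomain.xml` policy file at the root of the S3 bucket, which tells Flash Player which domains may interact with content loaded from that bucket. A minimal, deliberately permissive example (tighten `domain` to your own hosts in production):

```xml
<?xml version="1.0"?>
<!DOCTYPE cross-domain-policy SYSTEM
  "http://www.adobe.com/xml/dtds/cross-domain-policy.dtd">
<cross-domain-policy>
  <!-- Allow any site to load content from this bucket -->
  <allow-access-from domain="*" secure="false" />
</cross-domain-policy>
```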

ImageResizer, Amazon S3 and caching

I am building a photo sharing site, and using amazon s3 for my storage. Everything is working great, except that the pages render slowly.
When I have over 100 images on the page, and requests that look like mysite/s3/bucket/image.jpg?w=200, does this mean that every image is first downloaded, and then resized? If so, how do I configure caching of thumbnails? I can't seem to find that info in the documentation.
You need the DiskCache (and possibly SourceDiskCache) plugins installed. DiskCache will cache the resized images to disk, while SourceDiskCache will cache the S3 images to disk.
If you only generate a couple of resized versions of each S3 image, output (disk) caching alone is sufficient, but some form of caching is definitely needed.
It's also important to think about the bandwidth requirements between the ImageResizer server and S3. If you're using EC2, make sure you're in the same region as the S3 bucket. If you're using a VM, make sure that you have a big pipe.
The bottleneck is always I/O.
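Enabling the plugins mentioned above is mostly a Web.config change. A sketch of what the relevant section can look like (the cache directory is illustrative; check the ImageResizer documentation for your version, as element names and defaults vary):

```xml
<configuration>
  <resizer>
    <plugins>
      <add name="DiskCache" />
      <add name="SourceDiskCache" />
    </plugins>
    <!-- Directory (relative to the app root) where resized images are cached -->
    <diskcache dir="~/imagecache" autoClean="true" />
  </resizer>
</configuration>
```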

Updating permissions on Amazon S3 files that were uploaded via JungleDisk

I am starting to use Jungle Disk to upload files to an Amazon S3 bucket which corresponds to a Cloudfront distribution. i.e. I can access it via an http:// URL and I am using Amazon as a CDN.
The problem I am facing is that Jungle Disk doesn't set 'read' permissions on the files so when I go to the corresponding URL in a browser I get an Amazon 'AccessDenied' error. If I use a tool like BucketExplorer to set the ACL then that URL now returns a 200.
I really really like the simplicity of dragging files to a network drive. JungleDisk is the best program I've found to do this reliably without tripping over itself and getting confused. However it doesn't seem to have an option to make the files read-able.
I really don't want to have to go to a different tool (especially if I have to buy it) just to change the permissions - and this seems really slow anyway, because such tools generally traverse the whole directory structure.
JungleDisk provides some kind of 'web access' - but this is a paid feature and I'm not sure if it will work or not.
S3 doesn't appear to propagate permissions down which is a real pain.
I'm considering writing a manual tool to traverse my tree and set everything to 'read' but I'd rather not do this if this is a problem someone else has already solved.
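For reference, the manual tool described above is only a few lines with boto3: page through every key under a prefix and apply a public-read canned ACL. The bucket and prefix are placeholders, and this assumes the bucket policy does not block public ACLs:

```python
def keys_under(pages):
    """Flatten list_objects_v2 response pages into key names (pure, testable)."""
    return [obj["Key"] for page in pages for obj in page.get("Contents", [])]


def make_tree_public(bucket_name, prefix=""):
    """Set a public-read ACL on every object under the given prefix."""
    import boto3  # lazy import so keys_under has no AWS dependency
    s3 = boto3.client("s3")
    paginator = s3.get_paginator("list_objects_v2")
    pages = paginator.paginate(Bucket=bucket_name, Prefix=prefix)
    for key in keys_under(pages):
        s3.put_object_acl(Bucket=bucket_name, Key=key, ACL="public-read")


# make_tree_public("my-cloudfront-bucket", "images/")
```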
Disclaimer: I am the developer of this tool, but I think it may answer your question.
If you are on Windows, you can use the CloudBerry Explorer Amazon S3 client. It supports most of the Amazon S3 and CloudFront features, and it is freeware.
I use the Transmit Mac app to modify permissions on files I've already uploaded with JungleDisk. If you're looking for a more cross-platform solution, the S3Fox browser plugin for Firefox claims to be able to modify permissions on S3 files as well.
If you need a web based tool, you can use S3fm, free online Amazon S3 file manager.
It's a pure Ajax app that runs in your browser and doesn't require sharing your credentials with a 3rd party web site.
If you need a reliable cross-platform tool to handle permissions, you can have a look at CrossFTP Pro. It supports most of the Amazon S3 and CloudFront features as well.