What is the best architecture for storing images for a blog and retrieving them? I have a use case where I have to design an image storage/retrieval system for articles. Where and how should I store these images, and how should I retrieve/access them while displaying the article's contents with minimum latency?
It would be great if you could provide any references for this. Thanks.
If you want minimum latency for image retrieval, you need to use a CDN (Content Delivery Network).
Check out this article for more details.
For example, AWS offers CloudFront, which is very simple to use: store the images in an S3 bucket, then use dedicated CloudFront URLs in your client-side code to fetch the images.
There are other CDN providers out there; you can find them right away with a Google search.
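As a minimal sketch of that setup (TypeScript with the Node aws-sdk v2; the bucket name and CloudFront domain below are placeholders, not real values):

```typescript
import S3 from "aws-sdk/clients/s3";

const s3 = new S3();
const BUCKET = "my-blog-images";                   // placeholder bucket
const CDN_DOMAIN = "dxxxxxxxxxxxx.cloudfront.net"; // your distribution's domain

// Upload an article image to S3, then return the CloudFront URL to embed
// in the article HTML. CloudFront caches the object at its edge locations,
// which is where the latency win comes from.
async function uploadArticleImage(articleId: string, name: string, body: Buffer) {
  const key = `articles/${articleId}/${name}`;
  await s3
    .putObject({
      Bucket: BUCKET,
      Key: key,
      Body: body,
      ContentType: "image/jpeg",
      CacheControl: "public, max-age=31536000", // let the CDN cache aggressively
    })
    .promise();
  return `https://${CDN_DOMAIN}/${key}`;
}
```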
As one of my personal projects develops further, I wonder how I should organize the files (images, videos, audio files) uploaded by users onto AWS's S3 / GCE Cloud Storage. I'm used to seeing the kinds of URLs below:
Facebook fbcdn-sphotos-g-a.akamaihd.net/hphotos-ak-xft1/v/t1.0-9/11873531_1015...750483_5263546700711467249_n.jpg?oh=b3f06f7e...b7ebf7&oe=56392950&__gda__=1446569890_628...c7765669456
Tumblr 36.media.tumblr.com/686b47...e93fa09c2478/tumblr_nt7lnyP3ld1rqbl96o1_500.png
Twitter pbs.twimg.com/media/CMimixsV...AcZeM.jpg
Do these random characters carry some kind of meaning, or are they just "UUIDs"? Is there a performance/organization issue in using, for instance, this kind of URL below?
content.socialnetworkX.com/userY/post/customName_dinosaurs.jpg
EDIT: Let me be clear that I'm considering millions of files.
For S3, see the Performance Considerations page, where it talks about object naming. Specifically, if you plan to upload objects at a high rate, you should avoid sequentially named keys: keys that share an ordered common prefix land on the same index partition and can become a bottleneck.
Google Cloud Storage does not have this performance bottleneck. See this answer.
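To illustrate the non-sequential naming idea, here's a small TypeScript sketch (the key layout and helper name are my own, purely for illustration) that puts a short hash prefix in front of each key so high-rate uploads spread across the key space:

```typescript
import { createHash } from "crypto";

// Prefix each key with a few hex characters derived from the file ID, so
// consecutive uploads never share an ordered prefix.
function makeObjectKey(userId: string, fileId: string, fileName: string): string {
  const prefix = createHash("md5").update(fileId).digest("hex").slice(0, 8);
  return `${prefix}/${userId}/${fileId}/${fileName}`;
}

// e.g. "3f2a9c1b/userY/8d4e-1f22/dinosaurs.jpg", nothing sequential up front
console.log(makeObjectKey("userY", "8d4e-1f22", "dinosaurs.jpg"));
```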
Can uploadcare-widget be used without using the Uploadcare service?
The goal:
Use the widget (specifically to allow users to upload files from their google drive/dropbox accounts).
Instead of using Uploadcare's backend, use your own backend, e.g. Node.js / AWS S3.
Yes, it can. It's open source!
Although you will have to either replicate or get rid of functionality that relies on Uploadcare infrastructure:
uploads (this is the easiest part)
fetching files from social networks and cloud storage services
image preview and cropping that relies on Uploadcare CDN
So unless you're moving enormous amounts of files, the most cost-efficient way is to use Uploadcare as it is. BTW, you can use your own S3 storage and even upload directly to your own S3 buckets.
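If you do go the own-backend route, one common pattern is to have your backend hand the browser a presigned URL so files go straight into your bucket. A rough sketch (TypeScript, aws-sdk v2; the bucket name, region, and expiry are assumptions):

```typescript
import S3 from "aws-sdk/clients/s3";

const s3 = new S3({ region: "us-east-1" }); // assumed region

// Return a short-lived URL the client can PUT the file to directly,
// so the upload bytes bypass your server entirely.
function getUploadUrl(key: string, contentType: string): string {
  return s3.getSignedUrl("putObject", {
    Bucket: "my-upload-bucket", // hypothetical bucket
    Key: key,
    ContentType: contentType,
    Expires: 300, // seconds the URL stays valid
  });
}
```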
Is it possible to retrieve data from a TPK? That is, is there a way to embed some information in a TPK, like address, region, etc., and retrieve that information by querying?
No, this wouldn't be possible.
Firstly, according to the ESRI documentation, tile packages are solely for storing raster tiles. When displayed, these tiles show the user a map image, but they cannot be queried interactively to identify or search for addresses/regions.
Additionally, tile packages would not be practically accessible to web applications built with the ArcGIS JavaScript API. Tile packages are zipped file systems containing a large number of images; the usual way of making these tiles available to an application is through a map service.
For this type of functionality, I would recommend viewing the examples on querying map services as a demonstration of what is available with the API.
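For a feel of what querying a map service looks like, here's a sketch against the standard ArcGIS REST query endpoint; the service URL and attribute field name below are hypothetical:

```typescript
// A hosted map service layer exposes a /query endpoint, which a .tpk never does.
const serviceUrl =
  "https://example.com/arcgis/rest/services/Regions/MapServer/0/query";

async function findRegion(name: string) {
  const params = new URLSearchParams({
    where: `REGION_NAME = '${name}'`, // field name is an assumption
    outFields: "*",
    f: "json",
  });
  const res = await fetch(`${serviceUrl}?${params}`);
  return res.json(); // matching features with their attributes and geometry
}
```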
I can't seem to find any documentation or reference on upload and sharing images on Google+.
Is this action currently supported in Google+?
Their moment sharing seems to accept a thumbnail URL, but I don't want to keep the image hosted on my site once it has been created and shared by a visitor.
You have a few different options, but I'm not sure any of them are really what you're looking for.
Google+ doesn't really allow outside apps to upload and share something automatically.
As you've observed, the closest you can get is generating a Moment for them to share. And while there are similarities to Instant Upload, it isn't identical. You could probably use a data URL to encode and store the image as part of the moment, but I haven't tested this.
Another alternative is to use the Google Drive API to store the image in their Drive space, permit the image to be read publicly, get a link for it, and use this link as the thumbnail URL. Similarly, you might be able to use the Picasa Web Albums Data API to store the image. Both have good, but different, integration with Google+. The former is more modern, while the latter has more features that are tailored for images.
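To sketch the Drive option (googleapis Node client; the helper name and MIME type are assumptions, and I haven't tested this end to end): upload the image, open its permissions, and hand the resulting link to the moment as its thumbnail.

```typescript
import { google } from "googleapis";
import type { OAuth2Client } from "google-auth-library";

// Store the image in the user's Drive, make it publicly readable, and
// return a link usable as the moment's thumbnail URL. Assumes `auth` has
// already been authorized with a Drive scope.
async function uploadPublicImage(
  auth: OAuth2Client,
  name: string,
  body: NodeJS.ReadableStream
) {
  const drive = google.drive({ version: "v3", auth });
  const file = await drive.files.create({
    requestBody: { name },
    media: { mimeType: "image/jpeg", body },
    fields: "id, webContentLink",
  });
  await drive.permissions.create({
    fileId: file.data.id!,
    requestBody: { role: "reader", type: "anyone" },
  });
  return file.data.webContentLink;
}
```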
I'm not sure how to word the question but here is what I am looking to do.
I have a site that uses custom map tile overlays on a google map.
The javascript calls a php file on my server that checks to see if an existing map tile exists for the given x, y, and zoom level.
If it exists, it displays that image using file_get_contents.
If it doesn't exist, it creates the new tile then displays it.
I would like to utilize Amazon S3 to store and serve the images, since there could end up being a lot of them and my server is slow. If I have my script check whether the image exists on Amazon and then display it, I'm guessing I'm not getting the speed benefits of Amazon's CDN. Is there a way to do this?
Or is there a way to try to pull the file from Amazon first, then set something up on Amazon to redirect to my script if the file's not there?
Maybe host the script on another of Amazon's services? The tile generation is also quite slow in some cases.
Thanks
Ideas:
1 - Use CloudFront, but point it at a cluster of tile-generation machines. This way, you can generate the tiles on demand, and any future requests are served right from CloudFront.
2 - Use CloudFront, but back it with an S3 store of generated tiles. Turn on logging for the S3 bucket so you can detect failed requests. Consume those logs on a schedule and generate the missing tiles. This is a cheaper way of generating tiles, but it means that when a tile is missing, the user gets nothing.
3 - Just pre-generate all the tiles. Throw tasks into an SQS queue, then spin up a collection of EC2 instances to generate the tiles. This costs the most up front, but all users get a fast experience.
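To make idea 1 concrete, here's a rough TypeScript/Express sketch of an origin server you could put behind CloudFront: it serves a tile from an S3 cache when one exists and generates-then-caches it on a miss. The bucket name is a placeholder, and renderTile stands in for your existing PHP generation logic.

```typescript
import express from "express";
import S3 from "aws-sdk/clients/s3";

const app = express();
const s3 = new S3();
const BUCKET = "my-tile-cache"; // placeholder bucket

// Plug your existing tile-generation logic in here.
async function renderTile(x: number, y: number, z: number): Promise<Buffer> {
  throw new Error("not implemented in this sketch");
}

app.get("/tiles/:z/:x/:y.png", async (req, res) => {
  const { z, x, y } = req.params;
  const key = `${z}/${x}/${y}.png`;
  try {
    // Fast path: the tile was generated before and is cached in S3.
    const obj = await s3.getObject({ Bucket: BUCKET, Key: key }).promise();
    res.type("png").send(obj.Body as Buffer);
  } catch {
    // Miss: generate, cache, and return. Because CloudFront caches the
    // response at the edge, this slow path runs roughly once per tile.
    const png = await renderTile(Number(x), Number(y), Number(z));
    await s3
      .putObject({ Bucket: BUCKET, Key: key, Body: png, ContentType: "image/png" })
      .promise();
    res.type("png").send(png);
  }
});

app.listen(3000);
```

With this in place, CloudFront treats the server as a custom origin, so repeat requests for the same tile never reach your script at all.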
I've written a blog post with a strategy for dealing with this. It's designed to make intelligent and thrifty use of CloudFront, maximize caching and deal with new versions of existing images. You may find the technique described there helpful. The example code shows how to handle different dimensions (i.e. thumbnails) of images. You could modify it to handle different zoom levels.
I need to update that post to support CloudFront custom origins, and I think that for your application you might be better off skipping S3 and using a custom origin. The advantage of a custom origin is simply that it's probably going to be easier to manage all of your images on your local filesystem compared to managing them on S3.