Uploading image to "buffer" - file-upload

I'm developing an application that does a lot of image processing.
The general overview of the system is:
The user uploads photos to the server (raw photos, at full resolution)
The server fetches new photos and applies image processing to them
The server resizes the images and serves the resized versions (delete the full-resolution one?)
My current situation is that I have almost no expertise in image hosting or in uploading and managing large files.
What I plan to do is:
The user uploads directly from the browser to Amazon S3 (full image)
The user's browser notifies my server, which adds the uploaded file to a queue for my workers
When a worker receives a job, it downloads the full image from Amazon, processes it, updates the database, and then re-uploads the image to Cloudinary (resize on the server?)
Use the image hosted on Cloudinary from then on.
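To make step 1 concrete, here is a minimal server-side sketch of handing the browser a presigned POST so it can upload straight to S3 (Python with boto3; the bucket name, key prefix, size limit and expiry are placeholder assumptions, not decisions I've made):
# Sketch only: the browser POSTs the file to post["url"] with post["fields"],
# then notifies my server, which enqueues the key for a worker (step 2).
import uuid
import boto3
s3 = boto3.client("s3")
def presign_raw_upload(filename):
    key = "raw/%s-%s" % (uuid.uuid4(), filename)
    return s3.generate_presigned_post(
        Bucket="raw-uploads",                                         # assumed bucket name
        Key=key,
        Conditions=[["content-length-range", 0, 50 * 1024 * 1024]],  # cap at 50 MB
        ExpiresIn=600,                                                # form valid for 10 minutes
    )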
My doubts are about processing time. I don't want to upload directly to my server, because that would consume a lot of traffic and create a bottleneck, so uploading to Amazon S3 would reduce the load. Hosting the images on Amazon alone would not be ideal either, since S3 doesn't provide image-specific APIs the way Cloudinary does.
Is it OK to use a separate service for uploads and only notify my server once the browser has finished uploading? Does using Cloudinary to host the images make sense? Should uploading directly to my own server be avoided in favor of sending files to Amazon?
(This is more a guidance/design question)

Why wouldn't you prefer uploading directly to Cloudinary?
The image can be uploaded directly from the browser to your Cloudinary account, without any other servers involved. Cloudinary then notifies your server about the uploaded image and its details, and you can perform all the image processing in the cloud via Cloudinary. You can either manipulate the image while keeping the original, or replace the original with the manipulated version.
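As a rough illustration of that flow (Cloudinary's Python SDK; the transformation, webhook URL and credentials are placeholder assumptions), an upload that keeps the original, requests a resized derivative, and calls your server back could look like this:
# Sketch only: the same eager/notification parameters can be included in a
# signed direct-from-browser upload; every value below is a placeholder.
import cloudinary
import cloudinary.uploader
cloudinary.config(cloud_name="my-cloud", api_key="...", api_secret="...")
result = cloudinary.uploader.upload(
    "photo.jpg",
    eager=[{"width": 1024, "height": 1024, "crop": "limit"}],   # resized derivative
    eager_async=True,                                           # derive in the background
    notification_url="https://example.com/cloudinary/webhook",  # hypothetical endpoint
)
print(result["public_id"], result.get("eager"))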

Related

Can I host images uploaded by user on HostGator?

I have a website where users have the option to upload their profile images. Currently, I'm using Cloudinary to host those images. My client has asked whether I can host those images on HostGator instead, since they already have a paid shared hosting account there. My questions are:
Can I even do that? I tried something similar on Heroku, which warns that files stored on its dynos are deleted within 24 hours when the dynos restart, and recommends Amazon S3 instead.
If yes, then I will definitely need some kind of API to work with, since all of this is handled by my server and there must be a way to upload and delete images programmatically. It would be great if you could point me toward specific resources.
If no, then what are the industry standards for my particular use case?

How to upload multiple images from an FTP media server to Cloudinary?

I'm looking for a way to upload multiple images from an FTP media server to Cloudinary. I searched the web and found these links:
How can I bulk upload my images?
Bulk upload large images to cloudinary
Data upload options:
If your images are already publicly available online, you can specify their remote HTTP or HTTPS URLs instead of uploading the actual data. In this case, Cloudinary will fetch the image from its remote URL for you. This option allows for a much faster migration of your existing images
There is no information about uploading images from an FTP media server or anything similar. All the available solutions use a script and upload images one by one. In my case, my server holds many folders of images, each containing many sub-folders, with about 10,000 images in total. How can I do this?
You can upload to Cloudinary using an FTP source like this (in PHP):
\Cloudinary\Uploader::upload('ftp://username:password@ftp.mydomain.com/my_image.jpg');
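If you can run a script on the machine that holds the folders, the same one-file-at-a-time upload call can be driven by a recursive directory walk. A rough Python sketch (the root path, the extension filter, and reusing the relative path as the public ID are assumptions):
# Sketch only: walk a local directory tree and upload every image it contains.
import os
import cloudinary
import cloudinary.uploader
cloudinary.config(cloud_name="my-cloud", api_key="...", api_secret="...")
ROOT = "/var/media/images"  # hypothetical root folder on the media server
for dirpath, _dirs, filenames in os.walk(ROOT):
    for name in filenames:
        if not name.lower().endswith((".jpg", ".jpeg", ".png")):
            continue
        path = os.path.join(dirpath, name)
        public_id = os.path.relpath(path, ROOT).rsplit(".", 1)[0]  # keep the folder structure
        cloudinary.uploader.upload(path, public_id=public_id)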

Process for capturing image, storing it locally, showing it in table view, uploading to server and downloading on other devices?

I want to know the process for managing images in an iOS app that deals with large images. I have done all the steps separately and everything works properly when performed individually. The flow is given below:
Capture an image on the iPhone (full-sized images of 2-5 MB)
Store the image locally on the iPhone (where should it be stored: NSUserDefaults, a temp folder, or Core Data?)
Show image locally
Upload the image to server when internet connection is available
Download the image from server (I use SDWebImage) to show on all devices
I can do each of these steps properly on its own, but when they are combined they cause problems in the app.
The app needs to capture images and show them in a table view from local storage until they are uploaded to the server.
Do I need to upload an image to the server and download it again before even showing it in the local table view? If not, how should it be managed within the app? Ideally, the image would be saved in local storage and only uploaded, never downloaded again as long as it is available locally.

Programmatically add meta data for MP4

I have a server from which a single consumer may download MP4 files. I would like to add the username to the metadata of the file at the time the user clicks "download". Amazon does something like this for MP3 files.
Now, a slight variation on this: how would I do the same thing if the files are on Amazon CloudFront?
Thanks!
You would have to route your request through your web server.
Logged-in user clicks the download link.
Web server downloads the MP4 file from S3 to its file system.
Web server uses an MP4 editor to add the correct MP4 metadata to the file.
Web server serves the MP4 file back to the customer as a download.
S3 is dumb file storage, so you can't do any on-the-fly editing or processing. Any such work must occur on a machine with a CPU.
As such, what you describe can't be accomplished in any meaningful way using CloudFront alone, since the traffic needs to route back through your server for post-processing anyway.
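A rough sketch of the two middle steps (Python, using boto3 to fetch the file and the mutagen library to edit the tags; the bucket, key, and the choice of the iTunes comment atom are illustrative assumptions):
# Sketch only: fetch the MP4 from S3, stamp the buyer's username into its
# metadata, and return the local path for the web framework to stream back.
import boto3
from mutagen.mp4 import MP4
def personalize_mp4(bucket, key, username):
    local_path = "/tmp/%s" % key.replace("/", "_")
    boto3.client("s3").download_file(bucket, key, local_path)
    tags = MP4(local_path)
    tags["\xa9cmt"] = ["Licensed to %s" % username]  # iTunes comment atom
    tags.save()
    return local_path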

Bulk upload large images to cloudinary

Is there a way to bulk upload images to my cloudinary account?
I am looking to import 100 images of 3MB each at a time.
Thank you.
You can use Cloudinary's upload API to upload images one by one. Here is sample upload code in Python.
If your images are already in a public location, you can specify the remote HTTP URL as the file parameter instead of sending the actual image's data. This allows much faster uploading. If your images are in an Amazon S3 bucket, images can be fetched by Cloudinary directly from S3 for reaching even higher upload performance.
You can also run your upload code using multiple processes in parallel to upload many files simultaneously. Our Ruby on Rails client library also includes a migration tool.
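For the parallel-upload suggestion, a minimal sketch (Python, using a thread pool rather than separate processes, since uploads are I/O-bound; the glob pattern and worker count are assumptions):
# Sketch only: upload a local batch of files concurrently.
import glob
from concurrent.futures import ThreadPoolExecutor
import cloudinary
import cloudinary.uploader
cloudinary.config(cloud_name="my-cloud", api_key="...", api_secret="...")
files = glob.glob("images/*.jpg")
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(cloudinary.uploader.upload, files))
print("uploaded", len(results), "images")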
Currently Cloudinary has no dedicated API method for bulk uploading images.
The easiest way is to use remote upload: just pass the URL of the image and Cloudinary will fetch it and download it into your account.
http://cloudinary.com/documentation/upload_images#remote_upload