I have a website where users can upload profile images. Currently I'm using Cloudinary to host those images, but my client has asked whether I can host them on HostGator instead, since they already have a paid shared hosting account there. My question is:
Can I even do that? I tried this on Heroku, which warns that files stored on a dyno's ephemeral filesystem are lost whenever the dyno restarts (at least once every 24 hours), and recommends Amazon S3 instead.
If yes, then I will definitely need some kind of API to work with, since all of this is handled by my server and there must be a way to upload and delete images programmatically. It would be great if you could point me towards specific resources.
If no, then what are the industry standards for my particular use case?
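For what it's worth, ordinary shared hosting usually has no image API at all: if your app runs on the same account, it simply writes files into the web root; from an external server, FTP/SFTP is the usual programmatic route. A minimal sketch, assuming the `basic-ftp` npm package; the host, credentials, and paths are placeholders:

```ts
// Sketch: uploading/deleting profile images on shared hosting over FTP.
// Assumes the `basic-ftp` npm package; host/user/paths are placeholders.
import { Client } from "basic-ftp";

async function uploadProfileImage(localPath: string, fileName: string): Promise<void> {
  const client = new Client();
  try {
    await client.access({
      host: "ftp.example-hostgator-account.com", // placeholder
      user: "ftp-user",                          // placeholder
      password: process.env.FTP_PASSWORD ?? "",
      secure: true, // FTPS, if the host supports it
    });
    // ensureDir creates the directory if needed and changes into it
    await client.ensureDir("/public_html/uploads/profiles");
    await client.uploadFrom(localPath, fileName);
    // The image would then be reachable at
    // https://example.com/uploads/profiles/<fileName>
  } finally {
    client.close();
  }
}

async function deleteProfileImage(fileName: string): Promise<void> {
  const client = new Client();
  try {
    await client.access({
      host: "ftp.example-hostgator-account.com", // placeholder
      user: "ftp-user",                          // placeholder
      password: process.env.FTP_PASSWORD ?? "",
      secure: true,
    });
    await client.remove(`/public_html/uploads/profiles/${fileName}`);
  } finally {
    client.close();
  }
}
```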
Related
I recently deployed my website on a live shared hosting server in a Linux environment. The website is a property listing application that lets real estate agents list their properties with 18 high-definition (HD) images per listing.
My web hosting provider configured php.ini with upload_max_filesize set to 20MB. I tried various ways to raise the upload limit, both through the PHP settings in cPanel and programmatically, but I got a connection timeout with no error message.
I spoke to the technical support team, who increased upload_max_filesize to 50MB. That allowed me to upload only 16 images of about 3MB each; trying to upload all 18 still failed.
I noticed that the cPanel master settings override all of these changes. I considered upgrading to a VPS package, but its features don't match the server settings I have in mind.
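For reference, the effective upload ceiling is the smallest of several php.ini directives that need to move together: with 18 images at roughly 3MB each, a single request exceeds 54MB, so even a 50MB limit would fail for a full listing. Example values only; on shared hosting, cPanel's master settings may still cap these, which matches the behavior described above:

```ini
; Example values only - the effective limit is the smallest applicable one.
upload_max_filesize = 64M   ; largest single file
post_max_size       = 64M   ; the whole request body: all 18 files together
max_file_uploads    = 20    ; files allowed per request
max_execution_time  = 300   ; slow uploads otherwise hit the connection timeout
memory_limit        = 256M  ; headroom for any image processing after upload
```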
So I decided to consider self-hosting with a control panel. I investigated some possibilities and did some research online on how to accomplish this, and found a few tips that showed options, each with drawbacks.
I would like to know whether self-hosting is feasible. What would the server requirements be: number of CPUs, RAID level, RAM and HDD sizes, etc.? Which server manufacturers are the most trusted (Cisco, Dell, etc.)? For the software, between Windows and Linux, which is the best fit and why? And what is a good control panel to consider?
I fixed my upload issues. The problem was not upload_max_filesize but the image resolution handled in my script, which needed a slight modification. Once I reduced the image resolution, the image uploads worked fine.
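The fix above was made in the server-side script, but the same idea - capping the resolution so each file stays well under the limits - can also be applied before the bytes ever leave the browser. A sketch of that variant, not the poster's actual fix; the names and limits are illustrative:

```ts
// Sketch: downscale an image in the browser before upload so each file
// stays well under the server's upload limits. Limits are illustrative.
async function downscale(file: File, maxDim = 1920, quality = 0.85): Promise<Blob> {
  const bitmap = await createImageBitmap(file);
  // Scale so the longest edge is at most maxDim; never upscale.
  const scale = Math.min(1, maxDim / Math.max(bitmap.width, bitmap.height));
  const canvas = document.createElement("canvas");
  canvas.width = Math.round(bitmap.width * scale);
  canvas.height = Math.round(bitmap.height * scale);
  canvas.getContext("2d")!.drawImage(bitmap, 0, 0, canvas.width, canvas.height);
  // Re-encode as JPEG at the given quality.
  return new Promise((resolve, reject) =>
    canvas.toBlob(
      (blob) => (blob ? resolve(blob) : reject(new Error("encode failed"))),
      "image/jpeg",
      quality
    )
  );
}
```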
For the control panel, I found cPanel (cpanel.net).
My question is essentially: how can I get around the storage quota limits enforced on a PWA? A little background...
I am hoping to create an offline-ready, line-of-business progressive web app that would ideally push about 2GB of image and video resources onto my users' phones or tablets - well beyond the current storage quota for the Cache API and IndexedDB. What I'd like is to have my users (we all work at the same company) do a one-time download of a zip file or directory and store it on their phone or tablet's file system in a well-known directory.

Since the online version of the app treats these files as URLs, the Fetch API would seem ideal: I could serve from the network when connected, or from the service-worker-managed cache when offline. But the quota limits have me stumped. None of the files is larger than 15MB, but there's no way to know which files will be needed before a user goes offline.

Can I use something like an HTML input type=file tag to load files into the cache at runtime and then treat them as URLs? Of course I would remove other files to make room. But since these files wouldn't be coming from "the origin" with its secure https address (a PWA requirement, I think), but rather from the local file system, I'm not sure this will work. If it is workable, would my users be forced to browse to the files manually?
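For what it's worth, the last idea is mechanically possible: Cache Storage entries are keyed by same-origin URL, but the response body can come from anywhere, including a File the user picked, so you can cache local files under the same URLs the online app fetches. It doesn't raise the quota, and yes, the user does have to pick the files manually through the file input. A minimal sketch, assuming a cache named media-cache-v1 and a /media/ URL scheme:

```ts
// Sketch: seed Cache Storage from user-picked files so the service worker
// can later answer fetches for the app's normal URLs while offline.
// "media-cache-v1" and the /media/ URL scheme are assumptions.
const input = document.querySelector<HTMLInputElement>("#media-files")!;

input.addEventListener("change", async () => {
  const cache = await caches.open("media-cache-v1");
  for (const file of Array.from(input.files ?? [])) {
    // Key the entry by the same-origin URL the app would normally fetch;
    // the cache doesn't care that the bytes came from a local file.
    const response = new Response(file, {
      headers: { "Content-Type": file.type },
    });
    await cache.put(`/media/${file.name}`, response);
  }
});

// In the service worker, serve cache-first so these entries satisfy
// requests when offline:
//   self.addEventListener("fetch", (event) => {
//     event.respondWith(
//       caches.match(event.request).then((hit) => hit ?? fetch(event.request))
//     );
//   });
```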
If it's an option, you could have a native Android service do the caching part to avoid the space constraint, and then serve the data from native code to the PWA over WebSockets/secure WebSockets.
No PWA-only solution is possible for now. The File API won't help because it is sandboxed.
I'm developing an application that does lots of image processing.
The general overview of the system is:
User uploads photos to the server (raw photo, at full resolution)
Server fetches new photos and applies image processing to them
Server resizes the images and serves them (delete the full-resolution original?)
My current situation is that I have almost no expertise in image hosting or in uploading and managing large amounts of data.
What I plan to do is:
User uploads directly from the browser to Amazon S3 (full image) - see the presigned-upload sketch after this list
User notifies my server, which adds the uploaded file to the queue for my workers
When a worker receives a job, it downloads the full image (from Amazon) and processes it, updates the database, and then re-uploads the image to Cloudinary (resize on the server?)
Use the image hosted on Cloudinary from then on.
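A minimal sketch of the first step, assuming the AWS SDK for JavaScript v3; the bucket, region, and key scheme are placeholders. The server issues a short-lived presigned PUT URL, the browser uploads straight to S3, then reports the key back so a job can be queued:

```ts
// Sketch: the server hands the browser a short-lived presigned PUT URL
// so the full-resolution file goes straight to S3, not through us.
// Bucket/region/key scheme are placeholders.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const s3 = new S3Client({ region: "us-east-1" });

export async function presignUpload(
  userId: string,
  contentType: string
): Promise<{ key: string; url: string }> {
  const key = `uploads/raw/${userId}/${Date.now()}`;
  const url = await getSignedUrl(
    s3,
    new PutObjectCommand({ Bucket: "my-raw-photos", Key: key, ContentType: contentType }),
    { expiresIn: 300 } // URL is valid for 5 minutes
  );
  return { key, url };
}

// Browser side:
//   await fetch(url, { method: "PUT", headers: { "Content-Type": contentType }, body: file });
// On success, POST `key` back to the server, which enqueues a worker job.
```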
My doubts are about processing time. I don't want uploads to go directly to my server, because that would consume a lot of traffic and create a bottleneck; uploading straight to Amazon S3 avoids that. But hosting the images on Amazon alone isn't ideal either, since it doesn't provide image-specific APIs the way Cloudinary does.
Is it reasonable to use a separate service for uploads and only notify my server once the browser finishes uploading? Does using Cloudinary to host the images also make sense? And should direct uploads to my own server be avoided in favor of sending files to Amazon?
(This is more a guidance/design question)
Why wouldn't you prefer uploading directly to Cloudinary?
The image can be uploaded directly from the browser to your Cloudinary account, without any intermediate servers involved. Cloudinary then notifies you about the uploaded image and its details, and you can perform all the image processing in the cloud via Cloudinary. You can either manipulate the image while keeping the original, or replace the original with the manipulated version.
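A minimal sketch of such a direct upload, assuming an unsigned upload preset created in the Cloudinary console; the cloud name demo-cloud and preset profile_images are placeholders:

```ts
// Sketch: direct browser-to-Cloudinary upload using an unsigned preset.
// "demo-cloud" and "profile_images" are placeholders; configure a
// notification URL on the preset so your server is pinged on completion.
async function uploadToCloudinary(file: File): Promise<string> {
  const form = new FormData();
  form.append("file", file);
  form.append("upload_preset", "profile_images"); // unsigned preset, placeholder
  const res = await fetch("https://api.cloudinary.com/v1_1/demo-cloud/image/upload", {
    method: "POST",
    body: form,
  });
  if (!res.ok) throw new Error(`Upload failed: ${res.status}`);
  const data = await res.json();
  return data.secure_url; // Cloudinary returns the hosted URL plus metadata
}
```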
I don't have a server to distribute a Safari extension I made or to deploy updates. Is there a free service I can use instead of putting it on a file sharing website and posting to reddit?
I ended up using Amazon S3.
Just upload the .plist file and link everything up. With low traffic you won't be charged anything; with a few hundred users it doesn't cost me more than a few cents every month. Keep in mind that your users' browsers will query your .plist file every time they launch, so the traffic can pile up that way.
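A minimal sketch of pushing the manifest, assuming the AWS SDK for JavaScript v3; the bucket, key, and cache lifetime are placeholders, and the ACL only works if the bucket still permits object ACLs:

```ts
// Sketch: upload the update manifest to S3 so browsers can poll it.
// Bucket and key are placeholders; ACL assumes the bucket permits them.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { readFile } from "node:fs/promises";

const s3 = new S3Client({ region: "us-east-1" });

await s3.send(
  new PutObjectCommand({
    Bucket: "my-extension-updates",   // placeholder
    Key: "MyExtension/update.plist",  // placeholder
    Body: await readFile("update.plist"),
    ContentType: "text/xml",
    ACL: "public-read",               // must be readable by every user's browser
    CacheControl: "max-age=3600",     // may soften the per-launch polling traffic
  })
);
```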
I wrote a detailed tutorial here.
I am starting to use Jungle Disk to upload files to an Amazon S3 bucket which corresponds to a CloudFront distribution, i.e. I can access it via an http:// URL, and I am using Amazon as a CDN.
The problem I am facing is that Jungle Disk doesn't set 'read' permissions on the files, so when I go to the corresponding URL in a browser I get an Amazon 'AccessDenied' error. If I use a tool like BucketExplorer to set the ACL, the URL then returns a 200.
I really, really like the simplicity of dragging files to a network drive. JungleDisk is the best program I've found to do this reliably without tripping over itself and getting confused. However, it doesn't seem to have an option to make the files readable.
I really don't want to have to switch to a different tool (especially if I have to buy one) just to change permissions - and that seems really slow anyway, because such tools generally traverse the whole directory structure.
JungleDisk provides some kind of 'web access' - but this is a paid feature and I'm not sure if it will work or not.
S3 doesn't appear to propagate permissions down the tree, which is a real pain.
I'm considering writing a tool to traverse my tree and set everything to 'read', but I'd rather not do this if it's a problem someone else has already solved.
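For reference, that tool is only a few lines with the AWS SDK for JavaScript v3, assuming the bucket still allows per-object ACLs; the bucket name is a placeholder:

```ts
// Sketch of the "manual tool": walk every key in the bucket and set its
// ACL to public-read. Assumes the bucket allows object ACLs; bucket name
// is a placeholder.
import {
  S3Client,
  ListObjectsV2Command,
  PutObjectAclCommand,
} from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });
const Bucket = "my-cloudfront-origin"; // placeholder

let ContinuationToken: string | undefined;
do {
  // List one page of keys at a time (up to 1000 per page).
  const page = await s3.send(new ListObjectsV2Command({ Bucket, ContinuationToken }));
  for (const obj of page.Contents ?? []) {
    await s3.send(new PutObjectAclCommand({ Bucket, Key: obj.Key!, ACL: "public-read" }));
    console.log(`public-read: ${obj.Key}`);
  }
  ContinuationToken = page.NextContinuationToken;
} while (ContinuationToken);
```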
Disclaimer: I am the developer of this tool, but I think it may answer your question.
If you are on Windows, you can use the CloudBerry Explorer Amazon S3 client. It supports most Amazon S3 and CloudFront features, and it is freeware.
I use the Transmit Mac app to modify permissions on files I've already uploaded with JungleDisk. If you're looking for a more cross-platform solution, the S3Fox browser plugin for Firefox claims to be able to modify permissions on S3 files as well.
If you need a web-based tool, you can use S3fm, a free online Amazon S3 file manager.
It's a pure Ajax app that runs in your browser and doesn't require sharing your credentials with a third-party web site.
If you need a reliable cross-platform tool to handle permissions, you can have a look at CrossFTP Pro. It supports most of the Amazon S3 and CloudFront features as well.