We have a bunch of data on S3 (images) but just started reading about Mosso Cloud Files (Rackspace). Sometime this month they are going to add CDN capabilities, so any file you upload becomes part of the Limelight CDN.
Is anyone using this service? It's not as well documented or publicized as S3.
Yes, it's not as well documented or publicized as S3. But it has CDN support, which S3 lacks (unless you're willing to pay extra, of course). The downside is that you can't FTP into Mosso Cloud Files; you have to upload either through the web-based control panel or through the API. Still, it's cheap and worth it, especially with the CDN.
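For the API route, here is a rough upload sketch assuming the old python-cloudfiles library, with placeholder credentials and names (I may be misremembering details of its interface):

```python
import cloudfiles

conn = cloudfiles.get_connection('myusername', 'myapikey')
container = conn.create_container('images')

obj = container.create_object('photo.jpg')
obj.load_from_filename('/local/path/photo.jpg')

# Publish the whole container to the CDN (Limelight).
container.make_public()
print(container.public_uri())
```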
I am using the service and it's pretty good and cost-effective compared to S3.
We use it for all our client sites, from images to podcasts, and it's hands down the best way to distribute content and make it highly available - especially at this price!
cheers
Not sure if this belongs here or is well titled, but I'm about to finish my first Nuxt project and I'm not sure where to host it.
Usually I would use an Ionos or DigitalOcean droplet, but I was told that AWS Amplify or S3 (I have no idea about either solution) might be cheaper or maybe cost nothing, since it is a small project, because it depends on how intensive the processing is ...
If true, would that still apply when I need to run git pull and then the build/generate process once a day to get new content (via nuxt/content)?
Sorry if this is expressed poorly, and thanks in advance for any helpful suggestions.
This question doesn't really belong on Stack Overflow because it's essentially opinion-based.
By order of preference, I do personally recommend those:
Netlify
Vercel
DigitalOcean
GitHub Pages
Surge
More on the official documentation of Nuxt: https://nuxtjs.org/docs/2.x/deployment/netlify-deployment
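On the once-a-day rebuild: Netlify exposes build hooks, which are URLs you POST to in order to trigger a fresh build, so a daily cron job can re-run the generate step without keeping a server around. A minimal sketch, assuming a hook you create yourself in the Netlify UI (the hook ID below is made up):

```python
# trigger_build.py - run once a day from cron, e.g.:
#   0 6 * * * /usr/bin/python3 /home/me/trigger_build.py
import urllib.request

# Hypothetical build-hook URL; create your own under
# Site settings -> Build & deploy -> Build hooks in Netlify.
HOOK_URL = "https://api.netlify.com/build_hooks/your-hook-id"

# An empty POST is all a build hook needs to start a build.
req = urllib.request.Request(HOOK_URL, data=b"", method="POST")
with urllib.request.urlopen(req) as resp:
    print("Triggered build, HTTP status:", resp.status)
```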
I'm going to deploy a web application with multiple Pyramid application servers and nginx as a load balancer.
This application will have a feature for uploading files, which should be available for download afterwards.
The total size of uploaded files may be very large, so I'd like to deploy a separate file web server to serve these static files (this is one reason why I don't like the rsync solution proposed here).
What is the best solution for handling file upload and synchronization in this case? I was thinking about NFS or something like that, but I'm not sure it's a good way to solve the problem. I suppose there must be some best practices here, or even a tool or library, for this purpose.
UPDATE:
I don't want to use cloud services like Dropbox; it would be nicer to find some synchronization solution inside the network segment.
UPDATE2:
I ended up setting up NFS; for now it works perfectly.
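For anyone curious, a minimal sketch of what the upload view can look like under this setup: each Pyramid server writes into the shared NFS mount, and nginx serves that directory directly (route, field, and path names here are hypothetical):

```python
import os
import shutil

from pyramid.view import view_config

# Shared NFS mount, mounted on every app server; nginx serves it
# directly at /media/ so Pyramid never handles downloads itself.
UPLOAD_ROOT = "/mnt/shared-uploads"

@view_config(route_name="upload", request_method="POST", renderer="json")
def upload(request):
    field = request.POST["file"]              # FieldStorage-like object
    name = os.path.basename(field.filename)   # don't trust client paths
    dest = os.path.join(UPLOAD_ROOT, name)
    # Write to a temporary name first, then rename, so the other
    # servers never see a half-written file over NFS.
    tmp = dest + ".part"
    with open(tmp, "wb") as out:
        shutil.copyfileobj(field.file, out)
    os.rename(tmp, dest)
    return {"url": "/media/" + name}
```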
Not really a Python- or Pyramid-related question, but you should investigate distributed file systems and CDNs, both of which exist for this kind of thing. GridFS is easy enough to get going with, but there are plenty of other options. Both Amazon and Google have similar services.
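If GridFS appeals, a minimal sketch with pymongo (database and file names are placeholders); every app server pointed at the same MongoDB sees the same files:

```python
import gridfs
from pymongo import MongoClient

db = MongoClient("mongodb://localhost:27017")["uploads"]
fs = gridfs.GridFS(db)

# Store a file; any app server pointed at the same MongoDB sees it.
with open("report.pdf", "rb") as f:
    file_id = fs.put(f, filename="report.pdf")

# Stream it back, e.g. from a download view.
data = fs.get(file_id).read()
```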
I made a small backup application that simply creates an archive out of specified files and folders. Now I need an online service to back that up. Which service can I use that can be integrated into my app?
Options I considered:
Dropbox is ideal, but they have all but abandoned the desktop.
SkyDrive has no API.
I couldn't find any free, reliable backup service that uses FTP.
Anything else? It should provide 1-2 GB of free space and be reasonably reliable.
Thanks
My app is in C#, but it can be ported to any other language as well.
In your case, Amazon's S3 seems more fitting, but it's not free.
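If you do end up on S3, the archive-and-upload step is small in any language; a rough Python sketch with boto3 (the bucket name is hypothetical), easy enough to translate to the C# AWS SDK:

```python
import shutil

import boto3  # pip install boto3

# Zip up the folders the user selected...
archive = shutil.make_archive("backup", "zip", root_dir="/home/me/docs")

# ...then push the archive to S3 (bucket name is made up).
s3 = boto3.client("s3")
s3.upload_file(archive, "my-backup-bucket", "backups/backup.zip")
```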
Depending on your target audience, you can create a local archive and have that picked up by your regular backup solution. You might try Wuala or SpiderOak. Expand Wuala by adding your own space. SpiderOak is free up to 2 GB (more if you invite friends) and also provides a good alternative to Dropbox (if you want to see how to migrate from Dropbox to SpiderOak, see my blog post about that).
Try box.net, now known as box.com, or simply Box.
reference: http://developers.box.com/docs
I hope someone can help me resolve this dilemma I'm having.
I am getting ready to release my new software and its associated content files. Since the file size for the full version is huge, I can't use normal software delivery methods. The main download is 450 MB, with add-on packs at around 250 MB each.
So I plan to use Amazon S3 to host and deliver the software.
Does anyone have any real-life experience of the pros and cons of this method for the way I want to use the system?
Does Amazon S3 offer resumable downloads?
Is there an open-source tool that I can use and modify and give to my end users as a downloader, which I can program so that they can download only what they have ordered, and which gives expirable download links? (see the sketch at the end of this question)
Is there a commercial tool that's available for the above task?
I need some advice on how I would automate all this for each user. Once a customer finalizes an order, I am OK with manually processing some things for delivery, but automated is best.
thanks everyone.
DC
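On the resumable-download and expiring-link questions: S3 serves plain HTTP, so it honors Range headers (which is all resuming needs), and it can sign time-limited URLs. A minimal boto3 sketch (bucket and key names are hypothetical):

```python
import urllib.request

import boto3

s3 = boto3.client("s3")

# Expiring download link: valid for 24 hours, dead afterwards.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-software", "Key": "full-version.zip"},
    ExpiresIn=24 * 3600,
)

# Resuming: any HTTP client can ask for just the remaining bytes.
req = urllib.request.Request(url, headers={"Range": "bytes=1000000-"})
rest_of_file = urllib.request.urlopen(req).read()
```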
I am starting to use Jungle Disk to upload files to an Amazon S3 bucket which corresponds to a CloudFront distribution, i.e. I can access it via an http:// URL and I am using Amazon as a CDN.
The problem I am facing is that Jungle Disk doesn't set 'read' permissions on the files, so when I go to the corresponding URL in a browser I get an Amazon 'AccessDenied' error. If I use a tool like BucketExplorer to set the ACL, then that URL returns a 200.
I really, really like the simplicity of dragging files to a network drive. JungleDisk is the best program I've found to do this reliably without tripping over itself and getting confused. However, it doesn't seem to have an option to make the files readable.
I really don't want to have to go to a different tool (especially if I have to buy it) just to change the permissions - and this seems really slow anyway, because these tools generally seem to traverse the whole directory structure.
JungleDisk provides some kind of 'web access' - but this is a paid feature and I'm not sure if it will work or not.
S3 doesn't appear to propagate permissions down which is a real pain.
I'm considering writing a manual tool to traverse my tree and set everything to 'read' but I'd rather not do this if this is a problem someone else has already solved.
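For reference, such a tool is only a few lines with boto3; a sketch that walks a bucket and marks every object public-read (the bucket name is hypothetical):

```python
import boto3

s3 = boto3.client("s3")
BUCKET = "my-cloudfront-origin"  # hypothetical bucket name

# List every object (paginated) and set its ACL to public-read,
# so the CloudFront URLs stop returning AccessDenied.
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        s3.put_object_acl(Bucket=BUCKET, Key=obj["Key"], ACL="public-read")
        print("made public:", obj["Key"])
```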
Disclaimer: I am the developer of this tool, but I think it may answer your question.
If you are on Windows, you can use the CloudBerry Explorer Amazon S3 client. It supports most of the Amazon S3 and CloudFront features, and it is freeware.
I use the Transmit Mac app to modify permissions on files I've already uploaded with JungleDisk. If you're looking for a more cross-platform solution, the S3Fox browser plugin for Firefox claims to be able to modify permissions on S3 files as well.
If you need a web-based tool, you can use S3fm, a free online Amazon S3 file manager.
It's a pure Ajax app that runs in your browser and doesn't require sharing your credentials with a third-party website.
If you need a reliable cross-platform tool to handle permissions, you can have a look at CrossFTP Pro. It supports most of the Amazon S3 and CloudFront features as well.