I've seen the recent Google Drive pricing changes and they are amazing.
1 TB in Google Drive = $9.99
1 TB in Amazon S3 = $85 ($43 if you have more than 5,000 TB with them)
This changes everything!
We have a SaaS website on which we keep customers' files. Does anyone know if Google Drive can be used to store this kind of files/service, or is it just for personal use?
Does it have a robust API for uploading, downloading, and creating public URLs to access files, as S3 does?
Edit: I saw the SDK here (https://developers.google.com/drive/v2/reference/). The main concern is whether this service can be used for keeping customers' files, I mean, a SaaS website offering a service and keeping files there.
This doesn't really change anything.
“Google Drive storage is for users and Google Cloud Storage is for developers.”
— https://support.google.com/a/answer/2490100?hl=en
The analogous service to S3 is Google Cloud Storage, which has comparable functionality and remarkably similar pricing.
https://developers.google.com/storage/pricing
Does anyone know if Google Drive can be used to store this kind of files/service, or is it just for personal use?
Yes, you can. That's exactly why the Drive SDK exists. You can either store files under the user's own account, or under an "app" account called a Service Account.
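For a concrete picture, here's a minimal sketch (not production code) of a Service Account upload using the Python client library; the key file name, scope wiring, and file names are placeholder assumptions on my part:

```python
# Minimal sketch of a Drive v2 upload under a Service Account, using
# google-api-python-client. Key file and file names are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # hypothetical key file for the app account
    scopes=["https://www.googleapis.com/auth/drive"],
)
drive = build("drive", "v2", credentials=creds)

media = MediaFileUpload("report.pdf", mimetype="application/pdf", resumable=True)
uploaded = drive.files().insert(
    body={"title": "report.pdf"},  # v2 calls it "title" (v3 renamed it "name")
    media_body=media,
).execute()
print(uploaded["id"])  # the file's Drive ID, for later lookups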
Does it have a robust API for uploading, downloading, and creating public URLs to access files, as S3 does?
"Robust" is a bit subjective, but there is certainly an API.
There are a number of techniques you can use to access the stored files. Look at https://developers.google.com/drive/v2/reference/files to see the various URLs which are provided.
For true public access, you will probably need to have the files under a public directory. See https://support.google.com/drive/answer/2881970?hl=en
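As a rough sketch of the permissions route (reusing the drive client from the snippet above; FILE_ID is a placeholder), granting anyone-with-the-link read access and reading back the shareable link might look like:

```python
# Sketch: make a file readable by anyone with the link (Drive v2),
# then fetch its webContentLink. FILE_ID is a placeholder.
drive.permissions().insert(
    fileId=FILE_ID,
    body={"role": "reader", "type": "anyone", "withLink": True},
).execute()

meta = drive.files().get(fileId=FILE_ID).execute()
print(meta.get("webContentLink"))  # download-style link, when available
```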
NB. If you are in the TB space, be very aware that Drive has a bunch of quotas, some of which are unpublished. Make sure you test any proof of concept at full scale.
Sorry to spoil your party, but before you get too excited, look at this issue. It is in Google's own product and has been active since November 2013 (i.e. 4 months). Now imagine re-syncing a few hundred GB of files once in a while. Or better, ask your customers to do it with their files after you recommended Drive to them.
As one of my personal projects moves further along in development, I wonder how I should organize the files (images, videos, audio files) uploaded by users onto AWS S3 / GCE Cloud Storage. I'm used to seeing these kinds of URLs below:
Facebook fbcdn-sphotos-g-a.akamaihd.net/hphotos-ak-xft1/v/t1.0-9/11873531_1015...750483_5263546700711467249_n.jpg?oh=b3f06f7e...b7ebf7&oe=56392950&__gda__=1446569890_628...c7765669456
Tumblr 36.media.tumblr.com/686b47...e93fa09c2478/tumblr_nt7lnyP3ld1rqbl96o1_500.png
Twitter pbs.twimg.com/media/CMimixsV...AcZeM.jpg
Do these random characters carry some kind of meaning, or are they just "UUIDs"? Is there a performance/organization issue in using, for instance, this kind of URL below?
content.socialnetworkX.com/userY/post/customName_dinosaurs.jpg
EDIT: Let me be clear that I'm considering millions of files.
For S3, see the Performance Considerations page where it talks about object naming. Specifically, if you plan to upload objects at a high rate, you should avoid sequentially named objects, as they can be a bottleneck.
Google Cloud Storage does not have this performance bottleneck. See this answer.
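To make the S3 advice concrete, here is a small sketch of one common workaround: prefixing keys with a few characters of a hash so names don't sort sequentially (the function and path layout are just illustrative assumptions):

```python
# Sketch: spread S3 writes across the key space by hash-prefixing object names.
# Only matters at very high request rates; per the answer above, GCS doesn't
# need this trick.
import hashlib

def make_key(user_id: str, filename: str) -> str:
    prefix = hashlib.md5(f"{user_id}/{filename}".encode()).hexdigest()[:4]
    return f"{prefix}/{user_id}/{filename}"

print(make_key("userY", "dinosaurs.jpg"))  # e.g. "9c1e/userY/dinosaurs.jpg"
```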
I run a multi-gigabyte audio content subscription service. Right now all of our clients get download links via email for all of the content.
I had an idea of employing the Dropbox API after a "successful charge" webhook and giving (read-only) access to a shared Dropbox folder with all of the content. That way, the customer would stay in sync with all updates, changes etc...
The way I picture it, the user checks out and is immediately asked whether they would like to add our company's folder to their Dropbox.
Does this seem feasible/practical?
Looking at the API, I only see an option to provide a download link but not an actual shared folder. Am I correct in this observation?
That's correct, the Dropbox API doesn't currently offer any API calls for managing shared folders. It only has a way to get the read-only share links like you mentioned.
However, if you'd be interested in potentially participating in a shared folder API beta in the future, please sign up here.
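For reference, fetching one of those read-only share links through the REST API (v1 at the time of writing) might look something like this; the token and file path are placeholders:

```python
# Sketch: request a read-only share link for a file (Dropbox API v1).
# ACCESS_TOKEN and the path are placeholders.
import requests

resp = requests.post(
    "https://api.dropbox.com/1/shares/auto/content/album1.zip",
    headers={"Authorization": "Bearer ACCESS_TOKEN"},
    params={"short_url": "false"},
)
print(resp.json()["url"])  # a dropbox.com share link
```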
Greg's answer is correct, but I thought I'd mention a couple of other options:
You could use the Saver to let users save the files directly into their Dropbox. This wouldn't help you to push new content to them—they'd still have to visit your site to save the new files—but it would let you cut down on your bandwidth costs, since Dropbox would cache the files for you.
You could use a combination of /copy_ref and /fileops/copy to copy the contents from a central Dropbox account into each user's Dropbox. This wouldn't use any of your bandwidth (once the file was in the central Dropbox account); a rough sketch of this flow follows below.
Please note, however, that free Dropbox accounts only start with 2GB of storage space. Since you mentioned "multi-gigabyte," you'll need to keep in mind whether your customers will actually have sufficient Dropbox space to store the files you want to share with them. (Even if you were able to use a shared folder, they would need to have enough space left to accept the shared folder invitation.)
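Here is the promised sketch of the /copy_ref + /fileops/copy flow, assuming one OAuth token for the central account and one for the customer (tokens and paths are placeholders, and error handling is omitted):

```python
# Sketch: copy content from a central Dropbox account into a customer's
# Dropbox without re-uploading it (Dropbox API v1). Tokens/paths are placeholders.
import requests

API = "https://api.dropbox.com/1"

# 1) Central account: mint a copy ref for the shared content.
ref = requests.get(
    f"{API}/copy_ref/auto/content/album1.zip",
    headers={"Authorization": "Bearer CENTRAL_TOKEN"},
).json()["copy_ref"]

# 2) Customer account: materialize the content from that copy ref.
requests.post(
    f"{API}/fileops/copy",
    headers={"Authorization": "Bearer CUSTOMER_TOKEN"},
    params={"root": "auto", "from_copy_ref": ref, "to_path": "/album1.zip"},
)
```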
I'm trying to make a Google gadget that stores some data (say, statistics of users' actions) in a persistent way (i.e. statistics accumulate over time and over multiple users). I also want this data to be hosted on Google's free hosting, possibly together with the gadget itself.
Any ideas on how to do that?
I know the Google Gadgets API has tools for working with remote data, but then the question is where to host it. Google Wave seemed to be an option, but it is no longer supported.
You should get a server and host it there.
You then have the best control over the code, the performance, and the data itself.
There are several hosting providers out there who provide hosting for a reasonable price.
To name a few: Hostgator.com (US), Hetzner.de (DE), http://swedendedicated.com (SE; never used it, just a quick search on the internet).
I read the directions for posting, so I will be as specific as possible.
I have an S3 bucket with numerous FLV files that I will be allowing customers to stream on THEIR domains.
What I am trying to accomplish is
Setting a bucket policy that 'GRANTS' access to specific domains (a list) to stream my bucket files from their domains.
A bucket policy that restricts a user to 'one stream' per domain. In other words, for each domain listed in the above policy, they can only stream one file at a time on their site.
The premise is a video site where customers will be streaming videos specific to their niche. I host and deliver the videos, but need some control over their delivery.
All files are in ONE bucket. There aren't any weird things going on with the files. It's very straight forward.
I just need the bucket policy control that would Grant and also Restrict the ability of my customers to stream my content from their domains.
I PRAY I have been clear enough, but please don't hesitate to ask if I have confused you...
Thanks VERY much
A
I don't think you can achieve what you want by simply setting access permissions to the bucket.
I checked in AccessControlList and CannedAccessControlList.
Your best bet will be to write a webservice wrapper to access the bucket data.
You will have better control over the data you serve, and you might also explore the option of serving a cached copy of the data for better performance.
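As one hedged illustration of that wrapper idea: your service authorizes each request (and enforces the one-stream-per-domain rule) itself, then hands out a short-lived presigned S3 URL. Bucket and key names are placeholders; this uses boto3:

```python
# Sketch: hand out short-lived presigned URLs instead of exposing the bucket.
# Authorization / one-stream accounting would happen before calling this.
import boto3

s3 = boto3.client("s3")

def stream_url(bucket: str, key: str) -> str:
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=60,  # seconds; short enough that links can't be hoarded
    )

print(stream_url("my-video-bucket", "videos/intro.flv"))
```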
We know that Dropbox desktop clients use a binary diff algorithm to break down all files into blocks, and only upload blocks that it doesn't already have in the cloud (https://serverfault.com/questions/52861/how-does-dropbox-version-upload-large-files).
Nevertheless, the Dropbox API, as far as I can see, can only upload the whole file (/files_put, /files (POST)) when a sync is needed.
Is there any way to do differential/incremental syncing using the Dropbox API, i.e. upload only the changed portion of the file like the desktop clients do?
If this is not possible, then what are the best practices for periodically syncing large files that have small changes using the Dropbox API?
Unfortunately this isn't possible and I would suspect that it may never be available.
After doing a bit of research, I found a feature request for delta-syncing to be integrated into the API[*]. Dropbox hasn't responded, nor has the community upvoted this request.
I would make an educated guess that the reason why Dropbox hasn't provided this functionality, and likely never will, is because this is a dangerous feature in the hands of unknown developers.
Consider the case where you write an application that uses such a delta-change update system for updating large files. You thoroughly test your app and publish it to an app store. A couple of weeks after your initial release, and numerous downloads, you start receiving bad reviews and complaints because you managed to miss a very specific test case.
Within this specific buggy case, you've miscalculated a differential offset by one byte. Oh no! You've now corrupted thousands of files for hundreds of users!
Considering such a possibility, I think I would personally request that Dropbox NEVER provide such a dev feature. If they integrated such a function into the API, they would be breaking their #1 purpose: to provide consistent, safe, and reliable cloud backups of your important files.
[*]: This was the original reference location, but it is now a dead link.
(https://www.dropbox.com/votebox/1515/delta-sync-api-for-mobile-applications)