Cloud Storage customer access best practices

Let's say I have a use case where users can buy mp3 files inside an app. The objects are stored in GCP Cloud Storage. What is the best practice to deliver those objects only to the users that purchased them?
After researching the topic I came up with three solutions:
1. The client calls a REST service (e.g. one running inside App Engine). The service downloads the files from Cloud Storage and then sends them back to the client.
2. Instead of sending the files via the REST call, the service could send the download URL (from Cloud Storage) to the client. This would be more cost efficient, but it sounds like a security concern to me, as anyone who simply monitors their network could capture the URL.
3. Creating a (time-limited) signed URL that allows the user to download the file.
Obviously a permission check would have to happen first, e.g. against a database that records whether user X purchased mp3 Y.
This problem could also be applied to Azure Blob Storage or AWS S3...

In your use case, a few things are fixed:
- You need a backend to authenticate the user (for example, authentication performed with Cloud Identity Platform and hosted on App Engine or Cloud Run).
- You need to check the list of MP3s the user has bought (stored in Firestore, for example).
Then you need to allow the user to download the file. For this last point I recommend generating a signed URL; a minimal sketch follows below. (Download URLs exist only in the Firebase world; maybe your project is a Firebase project? In any case, they are the same thing as signed URLs.) Finally, I don't recommend proposal #1. It will work, but on a long download (over a poor network) the connection will be interrupted after 60 seconds, and it will keep your App Engine instance up for nothing (which you will pay for...).
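As a minimal sketch of the signed URL step, assuming the google-cloud-storage Python client and hypothetical bucket/object names (the permission check against Firestore is assumed to have already passed):

    from datetime import timedelta

    from google.cloud import storage  # pip install google-cloud-storage

    def make_download_url(track_id: str) -> str:
        """Return a short-lived signed URL for a purchased track."""
        client = storage.Client()  # needs a service account able to sign
        bucket = client.bucket("my-mp3-store")        # assumed bucket name
        blob = bucket.blob(f"tracks/{track_id}.mp3")  # assumed object layout
        return blob.generate_signed_url(
            version="v4",
            expiration=timedelta(minutes=15),  # link expires after 15 minutes
            method="GET",
        )

The client then downloads directly from Cloud Storage with that URL, so the backend never has to proxy the audio bytes.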

Related

Nextcloud notification on new files

I am trying to get information about newly uploaded files from our Nextcloud instance. Over the last few years I have used two approaches: the Nextcloud filesystem and the mail mechanism. At the filesystem level I can use the inotify-tools to monitor changes to the files. Nextcloud can also send mails to users, so I can intercept and parse the mails with e.g. maildrop on a local Postfix instance.
Are there other possibilities? I also use the ShareApi of Nextcloud from another server to change shares of files. Is there a similar API which can notify my client (not Android or iOS)? Or is it necessary to implement my own Nextcloud app which uses the OCS hooks on the Nextcloud installation?
I am thinking of an RSS feed that I could query with a timestamp to get the latest changes, or a REST API that I can ask for changes since a given timestamp, or an implementation of a PushApi.
There is the official notifications app, https://github.com/nextcloud/notifications - this app provides a backend and frontend for the notification API available in Nextcloud. The API is used by other apps to notify users in the web UI and sync clients about various things.
You can see some examples there and create whatever notifications you wish.
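If polling is acceptable, a non-mobile client can read that notification feed over the OCS API. A rough Python sketch, assuming the notifications app is enabled; the server URL and credentials are placeholders, and the response fields may vary across Nextcloud versions:

    import requests  # pip install requests

    BASE = "https://cloud.example.com"   # placeholder server
    AUTH = ("bot-user", "app-password")  # placeholder credentials

    def fetch_notifications():
        resp = requests.get(
            f"{BASE}/ocs/v2.php/apps/notifications/api/v2/notifications",
            auth=AUTH,
            headers={"OCS-APIRequest": "true", "Accept": "application/json"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()["ocs"]["data"]

    for n in fetch_notifications():
        print(n.get("subject"), n.get("datetime"))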

Most efficient way to send image to frontend

I have images stored in Google Cloud Storage. Whenever my frontend (Swift) requests a certain image I would like to send the image as quickly and efficiently as possible from my backend.
Conveniently, Google Cloud Storage has direct image links for every image.
Is it most efficient to send the image as multipart/form-data, the same way I send an image captured by a user from the frontend to the backend? Or is it more efficient to send the URL of the image stored in the cloud, so the frontend can download the image directly from that URL?
This can indeed be done through a signed URL, which grants limited permission and a limited time window to make a request. With signed URLs, the authentication information is contained in the query string, allowing users without credentials to perform specific actions on a resource.
I would like to point out, however, that signed URLs can only be used to access resources in Cloud Storage through XML API endpoints.
Since you are using Swift for your frontend, I would also direct you to explore the Google APIs for iOS, which are distributed via CocoaPods.
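To make the second option concrete, the backend typically returns a short-lived signed URL instead of the image bytes. A hedged sketch with Flask and the google-cloud-storage client; the route, bucket name, and object layout are assumptions:

    from datetime import timedelta

    from flask import Flask, abort, jsonify  # pip install flask
    from google.cloud import storage         # pip install google-cloud-storage

    app = Flask(__name__)
    client = storage.Client()

    @app.route("/images/<image_id>/url")
    def image_url(image_id: str):
        # Authorization check omitted; bucket/object names are hypothetical.
        blob = client.bucket("my-image-bucket").blob(f"images/{image_id}.jpg")
        if not blob.exists():
            abort(404)
        url = blob.generate_signed_url(
            version="v4", expiration=timedelta(minutes=10), method="GET"
        )
        # The Swift client GETs the image straight from Cloud Storage,
        # so the image bytes never pass through this backend.
        return jsonify({"url": url})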

Soundcloud API Download

I am asking this here because Soundcloud does not have support. I am going to build a website where people can purchase audio files, using Soundcloud to deliver the downloads (and to stream before buying). I want to be able to access the download file link via the Soundcloud API without the download link being enabled and showing in the Soundcloud UI. I can't seem to find this info in the Soundcloud API docs. I am going to redirect to the download link after a PayPal payment. I know this is a weird way of doing it, but I have certain criteria I have to meet. I would host the audio files on my own server, but they are huge. Does anyone have experience with this, or can anyone help?
I'm not sure it's possible to do what you want (very easily, at least).
There would be no way for the purchaser to access the 'download' track on Soundcloud directly unless downloads are specifically enabled for that track.
Really, the only way to not host the files and still be able to provide the download would be to use the API to download or proxy the track from Soundcloud to your server, using your credentials (because you always have access to your own tracks, whether download or stream); see the sketch after the list below. Mind you, this would use 2x the bandwidth (the server getting the track from Soundcloud, and the client downloading the track from you), and storage space would only be impacted on a temporary basis. But this is a pretty hacky way and not really a good/proper solution.
You can:
- compress/re-encode the audio so as not to use as much disk space
- pay for more storage space at your web host; it's usually pretty cheap these days.
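For what the proxy approach would look like, here is a rough Python sketch; resolve_download_url is a hypothetical helper, since the exact Soundcloud endpoint for fetching your own track depends on the API version and your credentials:

    import requests                                          # pip install requests
    from flask import Flask, Response, stream_with_context   # pip install flask

    app = Flask(__name__)

    def resolve_download_url(track_id: str) -> str:
        # Hypothetical: look up your own track's download URL via the
        # Soundcloud API using your credentials.
        raise NotImplementedError

    @app.route("/purchased/<track_id>/download")
    def proxy_download(track_id: str):
        # The payment check (e.g. the PayPal redirect) is assumed to have passed.
        upstream = requests.get(resolve_download_url(track_id), stream=True)
        upstream.raise_for_status()
        return Response(
            stream_with_context(upstream.iter_content(chunk_size=64 * 1024)),
            content_type=upstream.headers.get("Content-Type", "audio/mpeg"),
            headers={"Content-Disposition": f'attachment; filename="{track_id}.mp3"'},
        )

Note the doubled bandwidth mentioned above: every purchased download passes through this server once on the way in and once on the way out.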
So you want to charge for something free? Well, I think all the downloaders out there are middleware that stream the track from Soundcloud and respond to the client with an attachment upon request; one of many examples is http://wittysound.com. The cheapest way to get this done is to provide a direct link to the Soundcloud server, like http://soundflush.com does.

How to Give Access to non-public Amazon S3 bucket folders using Parse authenticated user

We are developing a mobile app using Parse as our BaaS solution, but using Amazon S3 for storage of our media files. All of our users upload media files into their own individual folders inside our app's bucket. As a user uploads media files, we update their records in Parse so it knows where to download the files. That's the easy part.
I've spent quite a bit of time researching the different policies for S3 buckets and I am trying to get a grip on the proper way to ensure the security of the content uploaded. If you do all of your work with DynamoDB or SimpleDB then it's easy because you're essentially adjusting your ACLs with the IAM accounts and whatnot. If you use Amazon Cognito it's also easy because authentication happens through Google, Facebook or Amazon accounts. In my case I am using Parse to authenticate users which cannot speak to Amazon directly.
My goal is that only the currently logged in Parse user with ID #1234567 can access their own 1234567 folder and files (as well as any other user given permission by this person for collaboration). Here is a post similar to what I'm trying to accomplish: amazon S3 bucket policy - restricting access by referer BUT not restricting if urls are generated via query string authentication
...but how do I accomplish this with the current user's ID number?
An even better question is whether the post mentioned above describes best practice, or whether I should instead be looking at creating an EC2 server to handle access to these files. Should I be looking at CloudFront to serve private content? Or is there another method that works better for what I am trying to accomplish? I am going in circles and my head is spinning.
Thanks to whoever can help straighten me out.
Well, since Parse is being shut down, I am migrating to another service. This question is no longer relevant.
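For anyone hitting the same pattern today (a non-AWS identity provider in front of S3), the usual answer is to mint short-lived presigned URLs server-side after your own session check. A minimal boto3 sketch; the bucket name and per-user key layout are assumptions:

    import boto3  # pip install boto3

    s3 = boto3.client("s3")

    def presigned_get(user_id: str, filename: str) -> str:
        """Presigned GET for an object under the user's own folder.

        Assumes the backend already verified that the session belongs
        to user_id; bucket and key layout are hypothetical.
        """
        return s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": "my-app-media", "Key": f"{user_id}/{filename}"},
            ExpiresIn=900,  # 15 minutes
        )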

Allowing read and write access to Google Drive files to unauthenticated clients

We have been working on a web service (http://www.genomespace.org/) that allows computational biology tools that implement our REST API to, among other things, read and write files stored on "The Cloud".
Our default file storage is a common Amazon S3 bucket, but we now allow users to mount their own private S3 bucket as well as files on Dropbox.
We are now trying to enable similar functionality for Google Drive and have run into some problems unique to Google Drive that we have not encountered with S3 or Dropbox.
The only way to allow clients that are not Google-authenticated to read files unobtrusively is to make the files "Public". Our preference would be that, once the user has authorized access to our application via OAuth2, the user's files could remain "Private" in Google Drive.
However, even though the user has already authorized our web service for offline access to their "Private" files, we have not found a way to generate a URL that a client authorized by our system can use to GET the file directly, without the client also being logged into Google.
The closest we have come to this functionality has been to change the file permissions to "Anyone with Link", except that for files greater than 20MB Google insists on returning an intermediate web page warning that the file has not been scanned for viruses. In addition to having to mess with file permissions, this would break our existing clients. Only when the file is "Public" and we utilize URLs of the form https://googledrive.com/host/PARENT_FOLDER_ID/FILENAME can non-Google clients read the files without interference.
We have not found any way for clients that are not Google-authenticated to upload a file to Google Drive. Our API allows our authorized clients to PUT files directly to the backing file storage using URLs provided by our server. However, even if a folder is marked "Public", the client requires Google authentication credentials to save to Google Drive. We could deal with both of these issues with intermediate hops through our system (e.g., our web server would first download the file from Google Drive and then allow the client to GET it), but this would be woefully inefficient and, hopefully, unnecessary. These problems have been discussed multiple times before on stackoverflow (e.g., here and here); we have read the responses very carefully but have not seen any recent discussion.
The Google folks direct their API users to post on stackoverflow for support, so I am hoping for a fresh look from insiders.
The general answer is: don't make the Drive requests through the user's browser. Instead, do everything from your servers. You are the one holding the (refresh) tokens for your users, so you should make all requests as a proxy between the user and Drive. The same goes for downloading: you download the file and return it to the user; a sketch follows below. As long as you use each user's own Drive token, there shouldn't be rate limit/quota issues.
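A sketch of the download half of that proxy with the Drive v3 Python client; how each user's stored refresh token is turned into a Credentials object is left to your backend:

    import io

    from google.oauth2.credentials import Credentials  # pip install google-auth
    from googleapiclient.discovery import build        # pip install google-api-python-client
    from googleapiclient.http import MediaIoBaseDownload

    def download_drive_file(creds: Credentials, file_id: str) -> bytes:
        """Fetch a private Drive file server-side with the user's stored token."""
        service = build("drive", "v3", credentials=creds)
        request = service.files().get_media(fileId=file_id)
        buf = io.BytesIO()
        downloader = MediaIoBaseDownload(buf, request)
        done = False
        while not done:
            _, done = downloader.next_chunk()
        return buf.getvalue()

Your server then streams those bytes back to the client over your own API, so the file never has to be made "Public" in Drive.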