I have images stored in Google Cloud Storage. Whenever my frontend (Swift) requests a certain image, I would like to send the image as quickly and efficiently as possible from my backend.
Conveniently, Google Cloud Storage has direct image links for every image.
Is it more efficient to send the image as multipart/form-data, the same way I send an image captured by a user from the frontend to the backend? Or is it more efficient to send the URL of the image stored in the cloud, so the frontend can download the image from that URL directly?
This can indeed be done with a signed URL, which grants limited permission and a limited time window to make a request. With signed URLs, the authentication information is contained in the query string, allowing users without credentials to perform specific actions on a resource.
I would like to point out, however, that signed URLs can only be used to access resources in Cloud Storage through XML API endpoints.
Since you are using Swift for your frontend, I would also direct you to explore the Google APIs for iOS, which are distributed via CocoaPods.
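For illustration, here is a minimal sketch of generating a time-limited signed URL on the backend with the google-cloud-storage Python client; the bucket and object names are placeholders:

```python
from datetime import timedelta

from google.cloud import storage

def generate_image_url(bucket_name: str, blob_name: str) -> str:
    """Return a short-lived signed URL the frontend can download from directly."""
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    return blob.generate_signed_url(
        version="v4",                      # V4 signing scheme
        expiration=timedelta(minutes=15),  # URL stops working after 15 minutes
        method="GET",                      # download only
    )

# Hypothetical usage: your API returns this URL instead of the image bytes.
url = generate_image_url("my-images-bucket", "users/42/avatar.jpg")
```

The frontend then performs a plain HTTPS GET against that URL, so the image bytes never have to pass through your backend.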
Related
I have some questions about uploading images to cloud storage; I hope someone can help me. I would like answers according to best practices:
Which is better: sending the images to my API along with the other form data and then forwarding them to the cloud, or uploading them to the cloud directly from the frontend, separately from the other form data?
Do I need to make a request for each image, or is it better to upload all the images in a single request?
The backend generates a signed URL to which the client uploads the file directly; this requires solid authentication.
Whether to batch depends on the quality of the client's internet connection, but batching uploads is usually a good idea.
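As a rough sketch of the signed-upload-URL approach (again Python with google-cloud-storage; the names are placeholders), the backend can mint one signed PUT URL per image and hand it to the frontend:

```python
from datetime import timedelta

from google.cloud import storage

def generate_upload_url(bucket_name: str, blob_name: str, content_type: str) -> str:
    """Return a signed URL the client can PUT the raw image bytes to."""
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    return blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=15),
        method="PUT",
        content_type=content_type,  # client must send the same Content-Type header
    )

# One URL per image; placeholder bucket and object names.
url = generate_upload_url("my-uploads-bucket", "uploads/photo-1.jpg", "image/jpeg")
```

The other form fields can then go to your API in a normal request, carrying only the object names of the uploaded images.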
Let's say I have a use case where users can buy MP3 files inside an app. The objects are stored in GCP Cloud Storage. What is the best practice for delivering those objects only to the users who purchased the files?
After researching the topic I came up with three solutions:
1. The client calls a REST service (e.g. one running inside App Engine). This service downloads the files from Cloud Storage and then sends them back to the client.
2. Instead of sending the files via the REST call, I could send the download URL (from Cloud Storage) to the client. This would be more cost-efficient; however, it sounds like a security concern to me, as anyone who simply monitors their network traffic could capture the URL.
3. Creating a (time-limited) signed URL to allow the user to download the file.
Obviously a permission check would have to happen first, e.g. against a database that records whether user X purchased MP3 Y.
This problem could also be applied to Azure Blob Storage or AWS S3...
In your use case, a few things are constant:
You need a backend to authenticate the user (for example, authentication performed with Cloud Identity Platform, hosted on App Engine or Cloud Run).
You need to check the list of MP3s the user has bought (stored in Firestore, for example).
And then you need to allow the user to download the file. On this last point, I recommend generating a signed URL. Download URLs exist only in the Firebase world (maybe your project is a Firebase project?), but they are the same thing as signed URLs. Finally, I don't recommend proposal #1. It will work, but in the case of a long download (because the network is poor), the connection will be interrupted after 60 seconds, and this will keep your App Engine instance up for nothing (and you will pay for it...).
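Putting those pieces together, here is a minimal sketch in Python using the google-cloud-firestore and google-cloud-storage clients; the bucket name, object layout, and the purchases/{user}/items/{mp3} Firestore schema are all assumptions for illustration:

```python
from datetime import timedelta

from google.cloud import firestore, storage

db = firestore.Client()
gcs = storage.Client()
BUCKET = "my-mp3-bucket"  # placeholder bucket name

def signed_download_url(user_id: str, mp3_id: str) -> str:
    """Check the purchase in Firestore, then mint a short-lived signed URL."""
    # Assumed schema: purchases/{user_id}/items/{mp3_id}
    purchase = db.document(f"purchases/{user_id}/items/{mp3_id}").get()
    if not purchase.exists:
        raise PermissionError("user has not purchased this file")
    blob = gcs.bucket(BUCKET).blob(f"mp3/{mp3_id}.mp3")
    return blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=10),  # time-limited, as in option 3
        method="GET",
    )
```

The signed URL is returned to the (already authenticated) client, which downloads the file directly from Cloud Storage, so no App Engine instance stays busy during the transfer.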
I have a large quantity of videos in my Vimeo account that I would like to migrate to my AWS S3 account.
Rather than go through the time-consuming process of downloading from Vimeo to my local machine and then uploading from my local machine to S3, is there a way I can do a direct transfer from Vimeo to S3?
If possible, I would like to create a script that iterates through each video via the Vimeo API, sets up the path where it should go in S3, and then initiates a direct transfer. Any ideas or suggestions would be much appreciated!
If you have a PRO account or higher, you can use the API to get download links for the videos on your account, including download links for the original source file. Those download links should be usable for importing into S3. Note that the links provided via the Vimeo API are expiring HTTP 302 redirects to the video file resource, so make sure you also take note of the expiration time provided in the response.
Download links are returned with the rest of a video's metadata, so I suggest using the fields parameter to return only the metadata you need.
http://developer.vimeo.com/api/common-formats#json-filter
https://developer.vimeo.com/api/reference/videos#GET/users/{user_id}/videos
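A sketch of such a script in Python with requests and boto3 (the token environment variable, bucket name, and S3 key layout are placeholders; verify the shape of the download field against your own API responses):

```python
import os

import boto3
import requests

VIMEO_TOKEN = os.environ["VIMEO_TOKEN"]  # PRO-or-higher account token (placeholder)
s3 = boto3.client("s3")
BUCKET = "my-video-archive"              # placeholder bucket

def migrate_videos():
    url = "https://api.vimeo.com/me/videos?fields=name,download&per_page=25"
    headers = {"Authorization": f"Bearer {VIMEO_TOKEN}"}
    while url:
        page = requests.get(url, headers=headers).json()
        for video in page["data"]:
            # Prefer the original source file; the links expire, so use them
            # promptly after fetching each page.
            source = next(
                (d for d in video.get("download", []) if d.get("quality") == "source"),
                None,
            )
            if source is None:
                continue
            with requests.get(source["link"], stream=True) as resp:
                resp.raise_for_status()
                resp.raw.decode_content = True
                # Stream straight from Vimeo into S3; no local file needed.
                s3.upload_fileobj(resp.raw, BUCKET, f"vimeo/{video['name']}.mp4")
        next_page = page.get("paging", {}).get("next")
        url = f"https://api.vimeo.com{next_page}" if next_page else None
```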
I want to use an external piece of software (Alteryx) to access the API so I can crawl some JSON data. When I call https://www.yammer.com/api/v1/messages.json, it keeps responding with "HTTP/1.1 403 Forbidden".
I guess there is something wrong with the authentication. Does anybody know how to embed the credentials in the URL? Or is there any other way to authenticate so that external software can get access?
I can do it perfectly in a normal browser after logging in.
Thanks
Yammer's REST API for retrieving data implements OAuth 2.0. This is because any application trying to access the data makes the request as an "app", which then has access to a specific user's data.
Yammer's OAuth flow is described here: https://developer.yammer.com/docs/oauth-2
I do not yet know of an easy way to implement the authentication other than going through a browser for this process.
You may be better off exporting the JSON messages to a file and then importing that file into your external software.
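For reference, once an access token has been obtained through that browser-based OAuth flow, the request itself is straightforward; a minimal Python sketch (the token value is a placeholder):

```python
import requests

# Token obtained once via Yammer's browser-based OAuth 2.0 flow (placeholder).
ACCESS_TOKEN = "your-oauth-access-token"

resp = requests.get(
    "https://www.yammer.com/api/v1/messages.json",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
resp.raise_for_status()  # without a valid token the API refuses the request, as seen above
messages = resp.json()
```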
I am creating a mobile app with Phonegap as the client and Rails as the back-end. I am deploying the app to Heroku and plan to use S3 to store the image files, because that is what is recommended across my various readings online.
I was wondering how the Rails controller could be used to send images back in response to Ajax requests from Phonegap.
I am not sure how to write the back-end API code that serves images in response to those requests.
I also read that using the send_file method without X-Sendfile enabled will slow down the server, because sending the image would block other requests until it is done.
Please let me know if you have any insights.
You could use redirects to the S3 assets here; then the browser gets the image directly from S3, rather than holding up one of your server processes while the browser slowly downloads the image.
If you need to keep your images private, you can use the signed URL feature of S3 to give signed, time-limited URLs only to the appropriate users. (See my commit to Paperclip: https://github.com/thoughtbot/paperclip/pull/292)
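The question is about Rails, but the redirect-plus-signed-URL pattern is storage- and framework-agnostic; here is a minimal sketch of the same idea in Python with Flask and boto3 (the route and bucket name are placeholders):

```python
import boto3
from flask import Flask, redirect

app = Flask(__name__)
s3 = boto3.client("s3")
BUCKET = "my-app-images"  # placeholder bucket

@app.route("/images/<path:key>")
def image(key: str):
    # Presigned GET URL: the object stays private, the link is valid for 5 minutes.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": BUCKET, "Key": key},
        ExpiresIn=300,
    )
    # Redirect so the client downloads straight from S3 and the
    # app server process is freed immediately.
    return redirect(url, code=302)
```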