Can Agora.io cloud recording save to a Google Cloud Storage bucket?

I have been able to use the agora.io REST APIs to enable cloud recording to an AWS S3 bucket as a test using your Unity SDK. For our project, we would like to store to Google Cloud Storage (GCS).
I don't see a vendor ID for GCS in the agora documentation (even though GCS is supposed to be compatible with the AWS APIs). Is GCS supported? If so, what's the vendor ID?
I saw a similar question from last year: "Cloud and REST API based Recording and storing to Google bucket or AWS S3"

It is supported now.
Documentation reference:
https://docs.agora.io/en/cloud-recording/cloud_recording_api_rest?platform=All%20Platforms

Updated December 2021
The Agora Cloud Recording service now supports Google Cloud Storage.
As mentioned in the Cloud Recording documentation, the service supports the following storage vendors:
0: Qiniu Cloud
1: Amazon S3
2: Alibaba Cloud
3: Tencent Cloud
4: Kingsoft Cloud
5: Microsoft Azure
6: Google Cloud
7: Huawei Cloud
8: Baidu AI Cloud
Documentation reference: https://docs.agora.io/en/cloud-recording/cloud_recording_api_rest?platform=All%20Platforms
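For reference, here is a minimal sketch of what the storageConfig portion of the cloud recording start request could look like with Google Cloud Storage selected via vendor 6 from the list above. The channel name, bucket name, and keys below are placeholders, and the exact field set should be confirmed against the linked REST reference:

```python
import json

# Sketch of an Agora cloud recording `start` request body, with
# storageConfig pointing at Google Cloud Storage (vendor 6 in the
# vendor list above). All values below are placeholders.
start_body = {
    "cname": "demo-channel",
    "uid": "527841",
    "clientRequest": {
        "token": "<dynamic-key-if-enabled>",
        "storageConfig": {
            "vendor": 6,  # 6 = Google Cloud, per the vendor list
            "region": 0,
            "bucket": "my-recordings-bucket",
            "accessKey": "<GCS-access-key>",
            "secretKey": "<GCS-secret-key>",
            "fileNamePrefix": ["recordings", "demo"],
        },
    },
}

# This body would be POSTed as JSON to the cloud recording
# `start` endpoint after acquiring a resource ID.
payload = json.dumps(start_body)
```

For GCS, the access/secret key pair would typically be the bucket's HMAC interoperability keys rather than a service-account JSON key, though this is worth verifying in the Agora documentation for your SDK version.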

Related

Use Google Storage Transfer API to transfer data from external GCS into my GCS

I am working on a web application that comprises a ReactJS frontend and a Java Spring Boot backend. The application requires users to upload data from their own Google Cloud Storage into my Google Cloud Storage.
The application flow will be as follows -
The frontend requests read access to the user's storage. For this I have used OAuth 2.0 access tokens as described here
The generated OAuth token is passed to the backend.
The backend will also have credentials for my service account to allow it to access my Google Cloud APIs. I have created the service account with required permissions and generated the key using the instructions from here
The backend will use the generated access token and my service account credentials to transfer the data.
In the final step, I want to create a transfer job using the google Storage-Transfer API. I am using the Java API client provided here for this.
I am having difficulty providing the authentication credentials to the transfer api.
In my understanding, two different sets of credentials are required: one for reading the user's bucket and another for starting the transfer job and writing the data into my Cloud Storage. I haven't found any relevant documentation or working examples for my use case. All the available samples assume that the same service account credentials have access to both the source and sink buckets.
tl;dr
Does the Google Storage Transfer API allow setting different source and target credentials for GCS-to-GCS transfers? If yes, how does one provide these credentials in the transfer job specification?
Any help is appreciated. Thanks!
Unfortunately, this is not allowed by the GCS Transfer API; for it to work, the service account would need access to both the source and the sink buckets, as you mentioned.
You can try opening a feature request in Google's Issue Tracker so that Google's product team can consider adding such functionality in newer versions of the API. You could also mention that this subject is not covered in the documentation, so it can be improved.
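To make the single-credential model concrete, here is a sketch of a GCS-to-GCS transferJobs.create request body. Note that the transferSpec only names buckets; there is no per-bucket credential field, which is why the one service account calling the API must be able to read the source and write the sink. Project and bucket names are placeholders:

```python
import json

# Sketch of a Storage Transfer Service transferJobs.create body.
# The spec identifies buckets only; the calling service account's
# permissions are used for BOTH source reads and sink writes.
transfer_job = {
    "projectId": "my-project-id",
    "status": "ENABLED",
    "transferSpec": {
        "gcsDataSource": {"bucketName": "users-source-bucket"},
        "gcsDataSink": {"bucketName": "my-sink-bucket"},
    },
}

body = json.dumps(transfer_job)
```

A common workaround is to have the user grant your service account read access (e.g. roles/storage.objectViewer) on their bucket, instead of passing their OAuth token to the transfer job.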

Can I use AWS S3 with Google Speech-to-Text for larger files?

I tried to use Google Cloud Speech-to-Text in my Node.js project. It works fine with smaller files that I have on disk, but I want to use longer files that are stored in AWS S3. Is that possible, or do I need to use Google Cloud Storage?
You can use the Google Cloud Storage libraries in your Node.js code to access AWS S3 storage:
"The Cloud Storage XML API is interoperable with some cloud storage tools and libraries that work with services such as Amazon Simple Storage Service (Amazon S3) and Eucalyptus Systems, Inc. To use these tools and libraries, change the request endpoint (URI) that the tool or library uses so it points to the Cloud Storage URI (https://storage.googleapis.com), and configure the tool or library to use your Cloud Storage HMAC keys." For more information please check Google documentation
For longer audio files, you can only use files stored in Google Cloud Storage; you can't use audio files stored in AWS S3. https://cloud.google.com/speech-to-text/docs/reference/rest/v1/RecognitionAudio
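The constraint is visible in the shape of the request itself: for long-running recognition, the audio field takes a Cloud Storage URI, not an arbitrary HTTP or S3 URL. A minimal sketch of the request body, with placeholder bucket and object names:

```python
import json

# Sketch of a speech:longrunningrecognize request body. For long
# audio, the "audio" field must reference a gs:// Cloud Storage
# URI; an s3:// or https:// S3 URL is not accepted.
request_body = {
    "config": {
        "encoding": "LINEAR16",
        "sampleRateHertz": 16000,
        "languageCode": "en-US",
    },
    "audio": {"uri": "gs://my-audio-bucket/long-recording.wav"},
}

payload = json.dumps(request_body)
```

So in practice the file has to be copied (or transferred) from S3 into a GCS bucket first, after which the gs:// URI can be passed to the API.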

AWS CLI - how to get when the file was changed

A client is uploading data we use to an AWS S3 bucket. I need to find out when the uploads took place in the last week (or month). How can I go about that? If I use aws s3 ls path, I get only the date of the last change.
To obtain historical information about Amazon S3 API calls, you can use AWS CloudTrail.
From Logging Amazon S3 API Calls by Using AWS CloudTrail - Amazon Simple Storage Service:
Amazon S3 is integrated with AWS CloudTrail, a service that provides a record of actions taken by a user, role, or an AWS service in Amazon S3. CloudTrail captures a subset of API calls for Amazon S3 as events, including calls from the Amazon S3 console and from code calls to the Amazon S3 APIs.
To use object-level logging, see: How Do I Enable Object-Level Logging for an S3 Bucket with AWS CloudTrail Data Events? - Amazon Simple Storage Service
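As a sketch of how such a query could look once data events are enabled, here are the parameters for CloudTrail's LookupEvents call, filtering S3 PutObject events from the last seven days. With boto3 (not imported here) this would map to boto3.client("cloudtrail").lookup_events(**lookup_params); note that PutObject events only appear if object-level (data event) logging is enabled for the bucket:

```python
from datetime import datetime, timedelta, timezone

# Sketch of CloudTrail LookupEvents parameters for finding S3
# uploads (PutObject) in the last 7 days. Requires object-level
# logging (data events) to be enabled on the bucket.
now = datetime.now(timezone.utc)
lookup_params = {
    "LookupAttributes": [
        {"AttributeKey": "EventName", "AttributeValue": "PutObject"}
    ],
    "StartTime": now - timedelta(days=7),
    "EndTime": now,
}

# With boto3 this would be:
#   events = boto3.client("cloudtrail").lookup_events(**lookup_params)
#   for e in events["Events"]:
#       print(e["EventTime"], e["EventName"])
```

An alternative for bulk analysis is S3 server access logging, which writes request records (including upload timestamps) to a logging bucket.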

Using Google Accounts for Authentication and Google APIs in Compute Engine

The Google App Engine standard environment allows easy integration with Google Accounts for user authentication, with Cloud Datastore, and with APIs such as the Gmail API and Google Calendar.
Are these same features available in Compute Engine also? I mean can I deploy a web application in a Tomcat container in a Compute Engine VM and use Google Accounts for authentication, Cloud Datastore for persistence and APIs such as Google Plus and Google Calendar for reading users' personal information?
I found this URL that says Cloud Datastore can be used from Compute Engine but could not find similar documentation about usage of Google Accounts for authentication and usage of APIs like Google Plus and Google Calendar.
Yes you can.
You can use all Google APIs (Gmail API, Google Calendar, etc.) from Tomcat or any other web container. You simply need to provide credentials to connect to the Google APIs. HERE is how you can obtain credentials in a server-side web application; note that the documentation does not refer to any specific web container.
App Engine provides, out of the box, a simpler way to authenticate Google users through the UserServiceFactory. This service is not available outside the App Engine environment because it comes with the App Engine SDK.
To use Google Cloud Datastore from outside the App Engine environment, you need to use the Remote API. With this API you will be able to access the Datastore service.
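Outside App Engine, the server-side web app flow starts by redirecting the user to Google's OAuth 2.0 authorization endpoint. A minimal sketch of building that URL, with a placeholder client ID and redirect URI (the scope shown assumes read-only Calendar access):

```python
from urllib.parse import urlencode

# Sketch of a Google OAuth 2.0 authorization URL for a server-side
# web app. This works from Tomcat, a Compute Engine VM, or any
# container; client_id and redirect_uri are placeholders.
params = {
    "client_id": "1234567890-example.apps.googleusercontent.com",
    "redirect_uri": "https://example.com/oauth2callback",
    "response_type": "code",
    "scope": "openid email https://www.googleapis.com/auth/calendar.readonly",
    "access_type": "offline",  # request a refresh token for server-side use
}
auth_url = "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)
```

After the user consents, Google redirects back to redirect_uri with an authorization code, which the backend exchanges for access and refresh tokens; those tokens are then used to call the Gmail, Calendar, or other APIs.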

Amazon Cognito development using Mobile SDK

I am looking at Mobile backend service providers and trying some sample development.
I have looked at FeedHenry, Parse, etc. I also came across Amazon Cognito. It looks like "AWS Cognito + Mobile SDK" predominantly supports MBaaS.
Could someone advise me on getting started with MBaaS using AWS Cognito?
A Cognito developer guide can be found here: http://docs.aws.amazon.com/cognito/latest/developerguide/what-is-amazon-cognito.html
Also here is a sample app: https://github.com/awslabs/aws-sdk-ios-samples/tree/developer-authenticated-identities-2-4/CognitoSync-Sample
You will likely want to integrate with API Gateway to create your backend APIs: http://docs.aws.amazon.com/apigateway/latest/developerguide/welcome.html