How to implement BigQuery with Google EKM?

I want to set up BigQuery so that it interacts with an external key manager, using the HYOK model.
I have not found any guide about configuring BigQuery with EKM (and therefore HYOK), only about configuring it with CMEK (which uses BYOK).
Can someone here help me?

To clarify some of the terminology in your question:
HYOK = Hold your own key = the ability to use keys stored outside GCP from within GCP.
Cloud EKM = Cloud External Key Manager = the service GCP uses to implement HYOK. Cloud EKM is used through the Cloud KMS API; EKM keys have the EXTERNAL protection level in Cloud KMS.
CMEK = Customer-managed encryption keys = the ability to use keys in Cloud KMS to protect data at rest in other GCP services (such as BigQuery).
Answering your question directly: this guide applies equally to all keys available through Cloud KMS, including Cloud EKM keys. The only difference when using a Cloud EKM key, as opposed to one stored inside GCP, is how you set up the key in the Cloud KMS API (see this documentation). Once it is set up, a Cloud EKM key works the same as keys at the other protection levels.
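As an illustration, here is a minimal sketch in Java (using the google-cloud-bigquery client) of creating a table protected by a Cloud KMS key; the project, dataset, table, and key names below are placeholders. An EXTERNAL (Cloud EKM) key resource name is passed in exactly the same way as a software or HSM key:

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.EncryptionConfiguration;
import com.google.cloud.bigquery.Field;
import com.google.cloud.bigquery.Schema;
import com.google.cloud.bigquery.StandardSQLTypeName;
import com.google.cloud.bigquery.StandardTableDefinition;
import com.google.cloud.bigquery.TableId;
import com.google.cloud.bigquery.TableInfo;

public class CreateCmekTable {
  public static void main(String[] args) {
    BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

    // Resource name of the Cloud KMS key; for Cloud EKM this is simply a key
    // with the EXTERNAL protection level (placeholder values below).
    String kmsKeyName =
        "projects/my-project/locations/us/keyRings/my-ring/cryptoKeys/my-ekm-key";

    EncryptionConfiguration encryption =
        EncryptionConfiguration.newBuilder().setKmsKeyName(kmsKeyName).build();

    Schema schema = Schema.of(Field.of("name", StandardSQLTypeName.STRING));
    TableId tableId = TableId.of("my_dataset", "my_table");

    TableInfo tableInfo =
        TableInfo.newBuilder(tableId, StandardTableDefinition.of(schema))
            .setEncryptionConfiguration(encryption)
            .build();

    bigquery.create(tableInfo);
  }
}
```

Whatever the protection level, remember to grant the BigQuery service account the Cloud KMS CryptoKey Encrypter/Decrypter role on the key, or the table creation will fail.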

Related

Use Google Storage Transfer API to transfer data from external GCS into my GCS

I am working on a web application comprising a ReactJs frontend and a Java SpringBoot backend. This application requires users to upload data from their own Google Cloud Storage into my Google Cloud Storage.
The application flow will be as follows:
The frontend requests read access to the user's storage. For this I have used OAuth 2.0 access tokens as described here.
The generated OAuth token will be passed to the backend.
The backend will also have credentials for my service account, allowing it to access my Google Cloud APIs. I have created the service account with the required permissions and generated the key using the instructions from here.
The backend will use the generated access token and my service account credentials to transfer the data.
In the final step, I want to create a transfer job using the Google Storage Transfer API. I am using the Java API client provided here for this.
I am having difficulty providing the authentication credentials to the Transfer API.
In my understanding, two different authentications are required: one for reading the user's bucket and another for starting the transfer job and writing the data into my Cloud Storage. I haven't found any relevant documentation or working examples for my use case. All the given samples assume that the same service account credentials have access to both the source and sink buckets.
tl;dr
Does the Google Storage Transfer API allow setting different source and target credentials for GCS-to-GCS transfers? If yes, how does one provide these credentials in the transfer job specification?
Any help is appreciated. Thanks!
Unfortunately, this is not possible with the GCS Transfer API: as you mentioned, it requires that a single service account have access to both the source and the sink buckets.
You can open a feature request in Google's Issue Tracker so that Google's product team can consider such functionality for newer versions of the API; you could also mention there that this subject is not covered in the documentation, so it can be improved.
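To make the limitation concrete, here is a rough sketch of a GCS-to-GCS transfer job using the legacy Java API client the question refers to (bucket names, project ID, and the start date are placeholders). Notice that TransferSpec only names the buckets; there is no field for per-bucket credentials, so the single service account running the transfer must be able to read the source and write the sink:

```java
import com.google.api.services.storagetransfer.v1.Storagetransfer;
import com.google.api.services.storagetransfer.v1.model.Date;
import com.google.api.services.storagetransfer.v1.model.GcsData;
import com.google.api.services.storagetransfer.v1.model.Schedule;
import com.google.api.services.storagetransfer.v1.model.TransferJob;
import com.google.api.services.storagetransfer.v1.model.TransferSpec;

public class CreateTransferJob {
  // Assumes an authenticated Storagetransfer client built elsewhere with
  // service account credentials (e.g. via GoogleCredentials).
  public static TransferJob create(Storagetransfer client) throws Exception {
    TransferJob job = new TransferJob()
        .setProjectId("my-project")  // placeholder
        .setTransferSpec(new TransferSpec()
            .setGcsDataSource(new GcsData().setBucketName("users-source-bucket"))
            .setGcsDataSink(new GcsData().setBucketName("my-sink-bucket")))
        .setSchedule(new Schedule()
            .setScheduleStartDate(new Date().setYear(2023).setMonth(1).setDay(1)))
        .setStatus("ENABLED");

    // The API offers no per-bucket credentials: the transfer service account
    // needs read access on the source and write access on the sink.
    return client.transferJobs().create(job).execute();
  }
}
```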

GCP external application to app-engine endpoint authentication

We are building a small web UI using React that will be served by GCP App Engine (standard). The UI will display a carousel of images along with some image metadata to our client's employees when they click a link inside their internal GIS system. We want to authenticate these calls since the App Engine endpoint will be exposed publicly, and we are hoping to use a GCP service account private key with which the client creates a time-limited JSON Web Token, giving temporary access to the GIS user when they open the web UI. We are following this GCP documentation. In summary:
We create a new service account with the necessary IAM permissions in GCP, along with a key.
We share the private key with the client, which they use to sign a JSON Web Token that is passed in the call to our endpoint when a user accesses our web UI from their GIS system.
The call is authenticated by the GCP backend (ESP/OpenAPI).
Question: is this a recommended approach for external system accessing GCP resources or is there a better pattern more applicable to this type of situation (external system accessing GCP resource)?
I believe this is the recommended approach for your use case; it matches the service-account JWT flow for ESP/OpenAPI described in the official documentation you linked.
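As a rough illustration of the client side of that flow, here is a sketch in Java of signing a short-lived token with the service account's private key. The jjwt library is an assumption (any RS256-capable JWT library works), and the issuer, subject, and audience values are placeholders that must match your ESP/OpenAPI configuration:

```java
import io.jsonwebtoken.Jwts;
import io.jsonwebtoken.SignatureAlgorithm;

import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.KeyFactory;
import java.security.PrivateKey;
import java.security.spec.PKCS8EncodedKeySpec;
import java.util.Base64;
import java.util.Date;

public class SignEspJwt {
  public static String signToken(String pemPath) throws Exception {
    // Load the service account private key (PKCS#8 PEM, as found in the
    // "private_key" field of the service account JSON key file).
    String pem = new String(Files.readAllBytes(Paths.get(pemPath)))
        .replace("-----BEGIN PRIVATE KEY-----", "")
        .replace("-----END PRIVATE KEY-----", "")
        .replaceAll("\\s", "");
    PrivateKey key = KeyFactory.getInstance("RSA")
        .generatePrivate(new PKCS8EncodedKeySpec(Base64.getDecoder().decode(pem)));

    long now = System.currentTimeMillis();
    return Jwts.builder()
        // iss/sub are the service account email; aud must match the ESP config.
        .setIssuer("gis-client@my-project.iam.gserviceaccount.com")   // placeholder
        .setSubject("gis-client@my-project.iam.gserviceaccount.com")  // placeholder
        .setAudience("https://my-app.appspot.com")                    // placeholder
        .setIssuedAt(new Date(now))
        .setExpiration(new Date(now + 3600 * 1000))  // time-limited: 1 hour
        .signWith(key, SignatureAlgorithm.RS256)
        .compact();
  }
}
```

The GIS system then sends the result in an Authorization: Bearer header on each call, and ESP validates the signature and expiry before the request reaches your App Engine code.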

How to retain private control over the encryption keys while using Cloud services?

All cloud service providers have their own key management systems, and customer keys can be imported when desired. However, I want to retain private control over the keys, such that they are provided externally and never imported into the KMS; the cloud provider should be able to use the external key without importing it. Another case could be where the public keys are at the cloud provider but only the client has access to the private keys externally. Any help or ideas will be appreciated.
On Google Cloud, you can do this with the External Key Manager (Cloud EKM): https://cloud.google.com/kms/docs/ekm
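If it helps, here is a hedged sketch of setting up such an externally held key with the Java Cloud KMS client (the project, key ring, key ID, and external key URI are all placeholders). The CryptoKey is created with the EXTERNAL protection level and no initial version, then a version is added that points at the key held in the external manager:

```java
import com.google.cloud.kms.v1.CreateCryptoKeyRequest;
import com.google.cloud.kms.v1.CryptoKey;
import com.google.cloud.kms.v1.CryptoKey.CryptoKeyPurpose;
import com.google.cloud.kms.v1.CryptoKeyName;
import com.google.cloud.kms.v1.CryptoKeyVersion;
import com.google.cloud.kms.v1.CryptoKeyVersionTemplate;
import com.google.cloud.kms.v1.ExternalProtectionLevelOptions;
import com.google.cloud.kms.v1.KeyManagementServiceClient;
import com.google.cloud.kms.v1.KeyRingName;
import com.google.cloud.kms.v1.ProtectionLevel;

public class CreateEkmKey {
  public static void main(String[] args) throws Exception {
    try (KeyManagementServiceClient client = KeyManagementServiceClient.create()) {
      KeyRingName ring = KeyRingName.of("my-project", "us-east1", "my-ring"); // placeholders

      // 1. Create a key with the EXTERNAL protection level and no initial version.
      CryptoKey key = CryptoKey.newBuilder()
          .setPurpose(CryptoKeyPurpose.ENCRYPT_DECRYPT)
          .setVersionTemplate(CryptoKeyVersionTemplate.newBuilder()
              .setProtectionLevel(ProtectionLevel.EXTERNAL))
          .build();
      client.createCryptoKey(CreateCryptoKeyRequest.newBuilder()
          .setParent(ring.toString())
          .setCryptoKeyId("my-ekm-key")
          .setCryptoKey(key)
          .setSkipInitialVersionCreation(true)
          .build());

      // 2. Add a version pointing at the key that stays in the external manager;
      //    the key material itself is never imported into Cloud KMS.
      CryptoKeyVersion version = CryptoKeyVersion.newBuilder()
          .setExternalProtectionLevelOptions(ExternalProtectionLevelOptions.newBuilder()
              .setExternalKeyUri("https://my-ekm.example.com/v0/keys/abc123")) // placeholder
          .build();
      client.createCryptoKeyVersion(
          CryptoKeyName.of("my-project", "us-east1", "my-ring", "my-ekm-key").toString(),
          version);
    }
  }
}
```

Once created, the key's resource name can be used anywhere CMEK is supported, which covers your second case too: GCP holds only a reference, while the private key material remains under your control.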

How to generate Google Translate API key

I am attempting to hire a developer to program a Google Translate API call into Salesforce. I need to provide him with an API key for Google Translate on our Google Cloud account but can't figure out how to generate that key. Here is a slightly outdated description of the integration I want to ask him to build:
http://adaptatechnologies.com/implementing-google-translation-services-salesforce/
Where can I find the specific instructions to generate the Google Translate API key?
API keys are the same for all GCP APIs. Instructions here:
https://cloud.google.com/docs/authentication/api-keys
API keys can be created directly from the APIs & Services section of the GCP console; the official documentation mentioned by Rob Kochman contains the step-by-step process.
Additionally, I would recommend setting some restrictions on the key in order to define the specific services it can be used with (the Translation API in this case), as well as which websites, IP addresses, or apps can use it. This adds some security to your key by avoiding public exposure, which could lead to the compromise of your account and unexpected charges.
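As a quick sketch of what the developer would then do with the key, assuming the Java google-cloud-translate (v2) client, which accepts API keys directly (the environment variable name and text are placeholders):

```java
import com.google.cloud.translate.Translate;
import com.google.cloud.translate.Translate.TranslateOption;
import com.google.cloud.translate.TranslateOptions;
import com.google.cloud.translate.Translation;

public class TranslateWithApiKey {
  public static void main(String[] args) {
    // Read the (restricted) API key from the environment rather than source code.
    Translate translate = TranslateOptions.newBuilder()
        .setApiKey(System.getenv("TRANSLATE_API_KEY"))
        .build()
        .getService();

    Translation result =
        translate.translate("Hello, world!", TranslateOption.targetLanguage("es"));
    System.out.println(result.getTranslatedText());
  }
}
```

Keeping the key out of the code and pairing it with the API restrictions above limits the damage if the deployment environment ever leaks.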

Limiting Access to API Gateway (and AWS Lambda) in a package

We have a package that we share with our customers. In the package, we have a chunk of code that makes HTTP request callouts to our central API Gateway. As of now, our API Gateway is open and accepts requests from everywhere, which is not good. I want to limit access to users of our software. The only solution I have found is using IAM and providing authorization, which would require us to include our access keys in the package. Our users can install our package in any environment they want, and we have no control over that environment. So I think a viable option is to create a generic user policy with minimal access to allow our users to call our API Gateway. However, putting an access key in the code doesn't seem like a good idea. Another option is to provide our customers with access keys, but that also has overhead. What is a better alternative that is more secure and easier to maintain?
You can use built-in API Gateway API Key functionality when IAM policies aren't possible.
Since your clients could be on any infrastructure, rather than being limited to AWS, the API Gateway service provides a generic API key solution, which allows you to restrict client traffic to your API Gateway by enforcing that client requests include API keys. This API key interface is part of the "API Usage Plans" feature.
This document explains how to use the console to set up an API Gateway to enforce that client traffic bears an API key:
To set up API keys, do the following:
Configure API methods to require an API key.
Create or import an API key for the API in a region.
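Once a method requires a key and the key is attached to a usage plan, clients simply send it in the x-api-key header. A minimal sketch with Java's built-in HTTP client (the endpoint URL and environment variable are placeholders):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ApiGatewayClient {
  public static void main(String[] args) throws Exception {
    HttpClient client = HttpClient.newHttpClient();

    HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("https://abc123.execute-api.us-east-1.amazonaws.com/prod/resource"))
        // API Gateway checks this header against the keys in the usage plan.
        .header("x-api-key", System.getenv("MY_API_KEY"))
        .GET()
        .build();

    HttpResponse<String> response =
        client.send(request, HttpResponse.BodyHandlers.ofString());
    System.out.println(response.statusCode() + " " + response.body());
  }
}
```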
Your clients can use a "secret storage" solution to avoid putting their API keys into their source code.
It certainly isn't wise for your clients to store their API keys in plain text inside their source code. Instead, they can store the keys outside of the codebase while still giving their applications access to them.
This article describes an example solution for secure secret storage (e.g. secure API key storage) that grants an application access to its secret without putting the unencrypted secret into the source code. It uses Amazon KMS + Cryptex, but the same principle can be applied with other technologies: http://technologyadvice.github.io/lock-up-your-customer-accounts-give-away-the-key/
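For example, with the AWS SDK for Java (v2) a client application could fetch the key from AWS Secrets Manager at startup instead of compiling it in. This is a different tool than the KMS + Cryptex setup in the article, but the same principle; the secret name below is a placeholder:

```java
import software.amazon.awssdk.services.secretsmanager.SecretsManagerClient;
import software.amazon.awssdk.services.secretsmanager.model.GetSecretValueRequest;
import software.amazon.awssdk.services.secretsmanager.model.GetSecretValueResponse;

public class ApiKeyLoader {
  public static String loadApiKey() {
    try (SecretsManagerClient client = SecretsManagerClient.create()) {
      GetSecretValueResponse response = client.getSecretValue(
          GetSecretValueRequest.builder()
              .secretId("prod/my-app/api-gateway-key")  // placeholder secret name
              .build());
      // The secret string is the API key, stored and rotated outside the codebase.
      return response.secretString();
    }
  }
}
```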