I am currently developing a client-side app where users log in with e-mail/password against MongoDB Atlas. The backend is completely serverless.
All logged-in users should be able to upload and retrieve images from a GCP Storage bucket without a visible login, which means the application should authenticate each user in the background.
I was thinking about using Google Service Accounts in combination with auth0, but I don't know where to start...
If someone could tell me where to start, that would be great :)
The question is difficult to answer, but here are some insights.
The preferred way is to have a serverless backend (App Engine standard, Cloud Run or Cloud Functions) for this. The user authenticates, and the frontend and backend exchange a security token. When the user wants to reach a GCP resource, the frontend asks the backend, which performs the request with its own service account.
This way, it's easy to trace each user's requests and to serve them only the resources that belong to them. And you need only one service account, for the backend.
If you grant a user access to a bucket, they could download all of its files (but maybe there is one bucket per user?). If you choose to limit object access with ACLs, the management gets complex.
You don't need a service account per user (and in any case, there is a quota of 100 service accounts per project). You can use Cloud Identity Platform (CIP) instead of your MongoDB database for authentication (CIP doesn't perform authorization; keep MongoDB for authorization and other data related to the authenticated user). CIP is Firebase Auth rebranded.
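A minimal sketch of that backend pattern, assuming a Flask app on Cloud Run or Cloud Functions, a hypothetical bucket name, and a placeholder verify_user() standing in for your MongoDB/Auth0 check:

```python
# Hypothetical backend: one service account, per-user prefixes in a single bucket.
import io
from flask import Flask, abort, request, send_file
from google.cloud import storage

app = Flask(__name__)
storage_client = storage.Client()      # authenticates as the backend's service account
BUCKET = "my-app-user-images"          # hypothetical bucket name

def verify_user(auth_header: str):
    """Placeholder: validate the token issued at login (MongoDB/Auth0/CIP)
    and return a user id, or None if the token is invalid."""
    return None                        # replace with your own verification logic

@app.route("/images/<name>", methods=["GET"])
def download_image(name):
    user_id = verify_user(request.headers.get("Authorization", ""))
    if user_id is None:
        abort(401)
    # Only serve objects under the caller's own prefix.
    blob = storage_client.bucket(BUCKET).blob(f"{user_id}/{name}")
    if not blob.exists():
        abort(404)
    return send_file(io.BytesIO(blob.download_as_bytes()), mimetype="image/jpeg")
```

An upload route works the same way: verify the user, then call blob.upload_from_file() under their prefix.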
Related: Security Cloud Run services for end-users and other services
I'm using:
Firebase Auth to generate id tokens for users with Google, Microsoft, GitHub ... identities
Cloud Endpoints on Cloud Run to invoke (Cloud Run) gRPC services
Firebase Auth users are auth'd by one of my services
Where I'm struggling....
My app provides 1 or more Cloud Run services that the app's users should be able to curl. But authenticating to Cloud Run services requires per-service id tokens: the id token's audience must be the Cloud Run service URL, and that URL is service-specific.
It seems as though I ought to be able to exchange the Firebase Auth id token for (Google Account) id tokens (with appropriate audiences) that can then be used to invoke the Cloud Run services. The proxy could also run on Cloud Run, and it would use my app's auth service to verify whether the id token's user should be issued a Google id token.
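Roughly, I imagine the proxy, after verifying the caller's Firebase id token, minting a Google-signed id token whose audience is the target service's URL. A sketch (the service URL is hypothetical, and the minted token identifies the proxy's own service account, which the target Cloud Run service would have to allow as an invoker):

```python
# Hypothetical proxy step: mint a Google id token for a specific Cloud Run audience.
import google.auth.transport.requests
from google.oauth2 import id_token

TARGET_URL = "https://my-grpc-service-abc123-uc.a.run.app"   # hypothetical service URL

auth_request = google.auth.transport.requests.Request()
google_token = id_token.fetch_id_token(auth_request, TARGET_URL)  # audience = service URL
# Return google_token to the verified user so they can
#   curl -H "Authorization: Bearer <google_token>" <TARGET_URL>/...
```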
Guillaume Blaquire's answer proposes either Cloud Endpoints or a proxy similar to what I describe above. However, Cloud Endpoints requires that the backend services be known at deploy time (which these Cloud Run services won't be), and I want to provide the user with the id token so that they can use curl or some other tool to make the auth'd request.
Cloud Run has some compelling documentation for Authenticate (sic.) but I want something between:
Authenticating users -- I have the JWT but I want to receive a Google id token for the Cloud Run service
Authenticating service-to-service, which is Guillaume's alternative proposal in the answer.
Rather than place your Cloud Run behind Cloud Endpoints, where you have to know the Cloud Run instances ahead of time, you can handle the request and authentication inside the Cloud Run instance itself.
To be able to handle Firebase Authentication tokens inside the Cloud Run instance, the services must be set up so that they can be invoked unauthenticated. Then, inside the Cloud Run instance, launch a web server, parse the incoming request (paying attention to the Authorization header - see the Firebase Auth sample) and then either act on or terminate the request.
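A minimal sketch of that pattern with Flask and the Firebase Admin SDK, assuming the Cloud Run service allows unauthenticated invocations and does the token check itself:

```python
# Hypothetical Cloud Run handler: verify the Firebase id token from the Authorization header.
import firebase_admin
from firebase_admin import auth
from flask import Flask, abort, request

firebase_admin.initialize_app()    # uses the service's default credentials
app = Flask(__name__)

@app.route("/")
def handler():
    header = request.headers.get("Authorization", "")
    if not header.startswith("Bearer "):
        abort(401)
    try:
        decoded = auth.verify_id_token(header.split(" ", 1)[1])
    except Exception:
        abort(401)
    # Token is valid: act on the request for this Firebase user.
    return {"uid": decoded["uid"]}

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```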
To achieve this, take a look at this thread for details on how you can handle both HTTP and service-to-service requests. Alternatively, you could just deploy the Functions Framework image on which that thread's code is based.
If you want cleaner URLs, host multiple endpoints within a single Cloud Run instance and then place that instance behind Cloud Endpoints, or take a more manual approach via a custom domain using a service like Firebase Hosting.
After reading each of these Q&As,
google api machine learning can I use an API KEY?
how to use google AI platform online predictions?
How to authenticate GCP AI Platform Predictions using HTTP requests
I am still at a loss as to how to enable simple authentication for the AI Platform Predict API that doesn't require any sign-in or OAuth screens.
My scenario is the following: we have a static website which allows the user to enter some data, the website (client) sends the data to the model for prediction via the API, and when the results come back, the website shows them to the user. We don't want the user to have to sign in or identify themselves in any way. Just input some data, push a button, and get the results.
However, as far as I've been able to find, there is no way of doing this (the documentation on authentication is, in my view, confusing: there are multiple overlapping articles and it is difficult to determine what applies in a specific case); you have to use some sort of OAuth that makes the user sign in with a Google account.
Is there really no way to have the website itself authenticated but not the individual users? E.g. using an API key or service account key?
If OAuth is the only way, does that mean users who want to use the website must have a Google Account? And how do I enable it: should I create an OAuth Client ID, or is it the OAuth consent screen?
The recommended practice here is that all the OAuth should happen server-side, where the GCP Service Account JSON key is stored on some backend server.
I am going to answer your question by assuming that your website is hosted on App Engine, but it could be hosted anywhere: on other GCP products such as Cloud Run, or with any other hosting provider.
In the backend web server you can make the AI Platform predict request using the Service Account JSON key, and then configure your website to talk to this backend.
Website ----HTTP request----> App Engine (backend code) ----predict request----> AI Platform
So the App Engine backend performs the authentication on behalf of the website client, as Lak clarifies here: because the backend holds your GCP Service Account JSON key, your users only send plain HTTP requests to that backend, which in turn makes the authenticated AI Platform calls.
In your case, you do not want the users to access your Google data; you simply want to provide them access to your own AI Platform model.
Basically you can just use the client library on the server side and it handles OAuth for you automatically, as long as the GOOGLE_APPLICATION_CREDENTIALS environment variable points at the Service Account key.
Note: you only need to do a Google OAuth sign-in flow IF you want access to a person's Google resources (e.g. Google Docs, Calendar, a GCP project, etc.).
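A minimal sketch of that backend, assuming a Flask endpoint and GOOGLE_APPLICATION_CREDENTIALS pointing at the key; the project and model names are hypothetical:

```python
# Hypothetical backend endpoint that the static website calls instead of AI Platform directly.
from flask import Flask, jsonify, request
from googleapiclient import discovery

app = Flask(__name__)
MODEL_NAME = "projects/my-project/models/my_model"   # hypothetical

@app.route("/predict", methods=["POST"])
def predict():
    instances = request.get_json()["instances"]       # data sent by the website
    ml = discovery.build("ml", "v1")                  # picks up the service account credentials
    result = ml.projects().predict(name=MODEL_NAME, body={"instances": instances}).execute()
    return jsonify(result)
```

The website then POSTs to /predict on your backend and never sees any Google credentials.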
I have a simple and free Google user account like this: my.name@gmail.com.
I'm working with SomeCompany, which has a billable Google account. This company exposed a bucket to which I'm supposed to upload someFile.txt. The bucket URL looks like this: https://console.cloud.google.com/storage/browser/SomeCompany-multi-44444
or, alternatively, with gsutil:
gs://SomeCompany-multi-44444
I can access and use this bucket (after auth prompt) from my browser.
Question: Can I access this bucket using the API (preferably via Python oauth2client or gcloud) without creating a (billable) service account of my own? How? I fail to understand how to authenticate API calls to this bucket without creating a service account, which requires a credit card. Is there something that SomeCompany has to do in order for me to succeed?
Yes, it's possible and reasonable.
Service accounts and user accounts are all Google identities (as are Groups).
The difference is that service accounts use two-legged auth and have a simpler flow, while with a user account you need three-legged auth: you exchange your credentials for an access token that you then use to authenticate to the service. But a user account is a valid identity, and yours has been authorized to use the bucket.
Here's a link to the Python Cloud Client Library section on using 3-legged (User) auth.
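A sketch of that 3-legged flow with the Python client libraries, assuming you create an OAuth "Desktop app" client ID (client_secret.json); the bucket itself stays in SomeCompany's billable project:

```python
# Hypothetical 3-legged (user) auth: the browser consent prompt runs as your Gmail account.
from google_auth_oauthlib.flow import InstalledAppFlow
from google.cloud import storage

SCOPES = ["https://www.googleapis.com/auth/devstorage.read_write"]

flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
credentials = flow.run_local_server(port=0)     # opens the familiar Google consent screen

# The project is only a default for project-scoped calls; access to the bucket
# comes from the IAM grant SomeCompany gave your user account.
client = storage.Client(project="some-project-id", credentials=credentials)
client.bucket("SomeCompany-multi-44444").blob("someFile.txt").upload_from_filename("someFile.txt")
```

Alternatively, running gcloud auth application-default login and then constructing storage.Client(project="some-project-id") gives you the same user-credential flow without writing the OAuth code yourself.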
I have a mobile app which authenticates users on my server. I'd like to store images of authenticated users in a Google Cloud Storage bucket, but I'd like to avoid routing the images through my server; they should be uploaded to (and downloaded from) the bucket directly.
(I also don't want to display another Google login to users to grant access to their bucket)
So my best-case scenario would be that when a user authenticates to my server, my server also generates a short-lived access token for a specific Google Storage bucket with read and write access.
I know that service accounts can generate access tokens, but I couldn't find any documentation on whether it is good practice to pass these access tokens from the server to the client app, and whether it is possible to limit the scope of the access token to a specific bucket.
I found the authorization documentation quite confusing, so I'm asking here: what would be the best-practice approach to provide access to Cloud Storage in my case?
I think you are looking for signed urls.
A signed URL is a URL that provides limited permission and time to make a request. Signed URLs contain authentication information in their query string, allowing users without credentials to perform specific actions on a resource.
Here you can see more about them in GCP, and here is an explanation of how you can adapt them for your program.
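A minimal sketch of generating such URLs on your server, assuming it runs with a service account key (signing needs a private key or the IAM signBlob permission); bucket and object names are hypothetical:

```python
# Hypothetical server-side code: hand short-lived signed URLs to the authenticated app user.
from datetime import timedelta
from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-user-images").blob("user-123/photo.jpg")

# URL the mobile app can PUT the image to, valid for 15 minutes.
upload_url = blob.generate_signed_url(
    version="v4",
    expiration=timedelta(minutes=15),
    method="PUT",
    content_type="image/jpeg",
)

# URL the mobile app can GET the image from, also time-limited.
download_url = blob.generate_signed_url(
    version="v4",
    expiration=timedelta(minutes=15),
    method="GET",
)
```

Your server returns these URLs to the authenticated user, and the app then uploads and downloads directly against the bucket without ever holding Google credentials.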
I'm a newbie to AWS infrastructure, and I can't figure out how to build the auth process I want.
I want something similar to what other cloud storage services, like Box, Dropbox and OneDrive, have:
the developer registers an OAuth app with a set of permissions
the client can, with one click, give consent for this app to have the listed permissions on their own account and its content, indefinitely, until the consent is deliberately withdrawn
Now, as far as I understand, the client has to go to the console, create a user, create a role for it, and then send this user's ID and key to my app, which is not that convenient. I'm looking for the easiest and simplest way to do this.
I've tested "Login with Amazon" + "Amazon Cognito", but it turned out to be the opposite mechanism: the client would have to set up Login with Amazon and link it to Cognito in order to give me one-click access.
So, is it even possible? What is the best way to implement such an auth process?
There isn't a way to do what you're trying to do, and I would suggest that there's a conceptual problem with comparing Amazon S3 to Dropbox, Box, or Onedrive -- it's not the same kind of service.
S3 is a service that you could use to build a service like those others (among other purposes, of course).
Amazon Simple Storage Service (Amazon S3), provides developers and IT teams with secure, durable, highly-scalable cloud storage.
https://aws.amazon.com/s3/
Note the target audience -- "developers and IT teams" -- not end-users.
Contrast that with Amazon Cloud Drive, another service from Amazon -- but not part of AWS.
The Amazon Cloud Drive API and SDKs for Android and iOS enable your users to access the photos, videos, and documents that they have saved in the Amazon Cloud Drive, and provides you the ability to interact with millions of Amazon customers. Access to the free Amazon Cloud Drive API and SDKs for Android and iOS enable you to place your own creative spin on how users upload, view, edit, download, and organize their digital content using your app.
https://developer.amazon.com/public/apis/experience/cloud-drive/
The only ways for your app to access one of your users' buckets would be for the user to configure and provide your app with a key and secret, to configure their bucket policy to allow the operation by your app's credentials, to create an IAM role and allow your app to assume it on their behalf (sketched below), or something similar within AWS's authentication and authorization mechanisms... none of which sounds like a good idea.
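For illustration, the cross-account IAM role option would look roughly like this from your app's side (ARNs and the external id are hypothetical, and the client still has to create and trust that role in their own account, so it is nothing like an OAuth consent screen):

```python
# Hypothetical cross-account access: assume a role the client created in their AWS account.
import boto3

sts = boto3.client("sts")
assumed = sts.assume_role(
    RoleArn="arn:aws:iam::111122223333:role/PartnerAppAccess",   # hypothetical role in the client's account
    RoleSessionName="partner-app-session",
    ExternalId="some-shared-external-id",                        # hypothetical shared secret
)
creds = assumed["Credentials"]

s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
s3.upload_file("someFile.txt", "client-bucket-name", "someFile.txt")
```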
There's no OAuth mechanism for allowing access to resources in an AWS account.