Monitor secured REST APIs/URLs with Prometheus/Grafana

Can anyone suggest the steps to monitor URLs and APIs with Prometheus/Grafana?
Plan:
Generate an API token.
Use the generated token to access the APIs and URLs, scrape the metrics with Prometheus, and plot the graphs in Grafana.
How can I automate the API token generation so that Prometheus can make the API calls needed to check whether an API/URL is up or not?
I can't find any details or documentation on monitoring APIs/URLs that are secured with short-TTL tokens in Prometheus and Grafana.
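One pattern that works for short-TTL tokens (a sketch, not an official recipe; the token endpoint, credentials, and file path below are placeholders): run a small refresher process alongside Prometheus that renews the token before it expires and writes it to a file, then point the scrape job at that file with bearer_token_file, which Prometheus reads again at scrape time so the rotated token is picked up without a reload.

```python
"""Sidecar sketch: keep a short-TTL API token fresh for Prometheus.

Prometheus reads `bearer_token_file` when it scrapes, so rewriting the file
is enough to rotate the credential without reloading Prometheus.
All URLs, credentials, and paths here are placeholders.
"""
import os
import tempfile
import time

import requests

TOKEN_URL = os.environ.get("TOKEN_URL", "https://auth.example.com/oauth/token")
CLIENT_ID = os.environ["CLIENT_ID"]
CLIENT_SECRET = os.environ["CLIENT_SECRET"]
TOKEN_FILE = os.environ.get("TOKEN_FILE", "/etc/prometheus/secrets/api.token")
REFRESH_MARGIN = 60  # refresh this many seconds before the token expires


def fetch_token() -> tuple[str, int]:
    """Request a new token; return (token, lifetime_in_seconds)."""
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
        },
        timeout=10,
    )
    resp.raise_for_status()
    payload = resp.json()
    return payload["access_token"], int(payload.get("expires_in", 300))


def write_atomically(path: str, token: str) -> None:
    """Write via temp file + rename so Prometheus never reads a partial token."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path))
    with os.fdopen(fd, "w") as fh:
        fh.write(token)
    os.replace(tmp, path)


if __name__ == "__main__":
    while True:
        token, ttl = fetch_token()
        write_atomically(TOKEN_FILE, token)
        time.sleep(max(ttl - REFRESH_MARGIN, 30))
```

In the corresponding scrape_config you would set scheme: https and bearer_token_file: /etc/prometheus/secrets/api.token; if you probe the URLs through the blackbox exporter instead, its HTTP module accepts the same bearer-token client options. Grafana then just plots the resulting up/probe_success and latency metrics.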

Related

Using Firebase Auth id tokens to authenticate (multiple) Cloud Run services

Related to Securing Cloud Run services for end-users and other services
I'm using:
Firebase Auth to generate id tokens for users with Google, Microsoft, GitHub ... identities
Cloud Endpoints on Cloud Run to invoke (Cloud Run) gRPC services
Firebase Auth users are auth'd by one of my services
Where I'm struggling....
My app provides one or more Cloud Run services that the app's users should be able to curl. But authenticating to Cloud Run services requires per-service id tokens; the id token's audience must be the Cloud Run service URL, and that URL is service-specific.
It seems as though I ought to be able to exchange the Firebase Auth id token, via some proxy, for (Google Account) id tokens (with appropriate audiences) that can then be used to invoke the Cloud Run service. The proxy could also run on Cloud Run, and it would use my app's auth service to verify whether the id token's user should be issued a Google id token.
Guillaume Blaquiere's answer proposes either Cloud Endpoints or a proxy similar to what I describe above. However, Cloud Endpoints requires that the backend services be known at deploy time (which these Cloud Run services won't be), and I want to provide the user with the id token so that they can use curl or some other tool to make the auth'd request.
Cloud Run has some compelling documentation for Authenticate (sic.) but I want something between:
Authenticating users -- I have the JWT but I want to receive a Google id token for the Cloud Run service
Authenticating service-to-service, which is Guillaume's alternative proposal in the answer.
Rather than place your Cloud Run behind Cloud Endpoints, where you have to know the Cloud Run instances ahead of time, you can handle the request and authentication inside the Cloud Run instance itself.
To be able to handle Firebase Authentication tokens inside the Cloud Run instance, the services must be set up so that they can be invoked unauthenticated. Then, inside the Cloud Run service, launch a web server, parse the incoming request (paying attention to the Authorization header - Firebase Auth sample) and then either action or terminate the request.
To achieve this, take a look at this thread for details on how you can handle both HTTP and service-to-service requests. Alternatively, you could just deploy the Functions Framework image on which that thread's code is based.
If you want cleaner URLs, host multiple endpoints within a single Cloud Run instance and then place that instance behind Cloud Endpoints, or take a more manual approach with a custom domain using a service like Firebase Hosting.
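A minimal sketch of the "verify the Authorization header inside the service" pattern described above, using Flask and the Firebase Admin SDK (the route and claims used are illustrative):

```python
"""Sketch: verifying Firebase Auth ID tokens inside a Cloud Run service.

The service is deployed so it can be invoked unauthenticated; authorization
is enforced here by checking the bearer token in the Authorization header.
"""
import firebase_admin
from firebase_admin import auth
from flask import Flask, abort, jsonify, request

firebase_admin.initialize_app()  # uses Application Default Credentials on Cloud Run
app = Flask(__name__)


def verify_request() -> dict:
    """Return the decoded Firebase ID token, or abort with 401/403."""
    header = request.headers.get("Authorization", "")
    if not header.startswith("Bearer "):
        abort(401, "Missing bearer token")
    try:
        return auth.verify_id_token(header.removeprefix("Bearer "))
    except Exception:
        abort(403, "Invalid or expired token")


@app.route("/")
def handler():
    claims = verify_request()
    # Action the request only for verified users; anything else was terminated above.
    return jsonify({"hello": claims.get("email", claims["uid"])})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

With this, the user can curl the service with their Firebase ID token in the Authorization header, and no per-service Google id token is needed because the Cloud Run service itself is invokable unauthenticated.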

Google AI Predict API: anonymous authentication for website

After reading each of these Q&As,
google api machine learning can I use an API KEY?
how to use google AI platform online predictions?
How to authenticate GCP AI Platform Predictions using HTTP requests
I am still at a loss to know how to enable simple authentication for the AI Platform Predict API that doesn't need any sign-ins or OAuth screens.
My scenario is the following: we have a static website which allows the user to enter some data, the website (client) sends the data to the model for prediction via the API, and when the results come back, the website shows them to the user. We don't want the user to have to sign in or identify themselves in any way. Just input some data, push a button, and get the results.
However, as far as I've been able to search, there is no way of doing this (the documentation on authentication is, in my view, confusing: there are multiple overlapping articles, and it is difficult to determine what applies in a specific case); you have to use some sort of OAuth, which makes the user sign in with a Google account.
Is there really no way to have the website itself authenticated but not the individual users? E.g. using an API key or service account key?
If OAuth is the only way, does that mean users who want to use the website must have a Google Account? And how do I enable it: should I create an OAuth Client ID, or is it the OAuth consent screen?
The recommended practice here is that all the OAuth should happen server-side, where the GCP Service Account JSON key is stored on some backend server.
I am going to answer your question assuming that your website is hosted on App Engine, but it could be hosted anywhere: on other GCP products such as Cloud Run, or with any other hosting provider.
In the backend web server you can make the AI Platform predict request using the Service Account JSON key; you then configure your website to talk to this backend.
Website ----HTTP request to App Engine URL----> App Engine (backend code) ----> AI Platform
So the App Engine backend performs the authentication on behalf of the website client, as Lak clarifies here: because the backend holds your GCP Service Account JSON key, your users only need to send plain HTTP requests to your backend server, which then makes the authenticated AI Platform calls.
In your case you do not want the users to access your Google data; you simply want to provide them access to your own AI Platform model.
Basically, you can just use the client library on the server side and it automatically handles OAuth for you, as long as the GOOGLE_APPLICATION_CREDENTIALS environment variable is set to the Service Account key file.
Note: You only need to do Google OAuth IF you want access to a person's Google resources (e.g Google Doc, Calendar, GCP project, etc)
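A minimal sketch of that backend piece (assuming the legacy AI Platform "ml" v1 API via the Google API client library; the project and model names are placeholders, and GOOGLE_APPLICATION_CREDENTIALS points at the service account JSON key):

```python
"""Backend sketch: the website POSTs instances here; this server calls AI Platform.

Authentication is handled by the Google client library via the service account
key referenced by GOOGLE_APPLICATION_CREDENTIALS; no end-user sign-in is needed.
Project and model names below are placeholders.
"""
from flask import Flask, jsonify, request
from googleapiclient import discovery

PROJECT = "my-project"  # placeholder
MODEL = "my-model"      # placeholder

app = Flask(__name__)
ml = discovery.build("ml", "v1")  # AI Platform Prediction (legacy) API client


@app.route("/predict", methods=["POST"])
def predict():
    instances = request.get_json(force=True)["instances"]
    name = f"projects/{PROJECT}/models/{MODEL}"  # optionally append /versions/<v>
    response = (
        ml.projects()
        .predict(name=name, body={"instances": instances})
        .execute()
    )
    if "error" in response:
        return jsonify({"error": response["error"]}), 500
    return jsonify(response["predictions"])


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```

The static website then just POSTs {"instances": [...]} to /predict with fetch; since this endpoint itself is unauthenticated, consider locking down CORS and adding rate limiting or quotas.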

Authentication with Prometheus Azure Monitor ConfigMap

Is it possible to add a token / credential for authenticated /metrics endpoints? There is no option for it in the configmap template:
https://github.com/microsoft/OMS-docker/blob/ci_feature_prod/Kubernetes/container-azm-ms-agentconfig.yaml
This is easy to do with Prometheus itself (e.g. via bearer_token_file in the scrape config). This Azure Monitor feature seems rather pointless if you can't use it to scrape authenticated metrics endpoints.
Thanks!

Security considerations for API Gateway clustering?

Clients communicate with a RESTful API over HTTPS through a single point of entry, an API Gateway.
API Gateway: API keys for tracking and analytics, OAuth for API platform authentication.
A User microservice provides user authentication and authorization and generates a JWT that is signed and encrypted (JWS, JWE).
Other microservices determine permissions based on claims inside the JWT (see the sketch below).
Microservices communicate internally via PUB/SUB, passing the JWT and other info in the message. Each microservice could be scaled out to multiple instances (a cluster behind a load balancer).
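A minimal sketch of that claims check (assuming an RS256-signed JWT and a hypothetical "permissions" claim; the JWE decryption layer is omitted):

```python
"""Sketch: a downstream microservice deriving permissions from JWT claims.

Assumes an RS256-signed JWT (JWS); if the token is also encrypted (JWE),
it would need to be decrypted first, which is omitted here.
"""
import jwt  # PyJWT

# Public key of the issuing User microservice (placeholder path).
PUBLIC_KEY = open("jwt_public_key.pem").read()


def require_permission(token: str, needed: str) -> dict:
    """Verify the token signature and check the required permission claim."""
    claims = jwt.decode(
        token,
        PUBLIC_KEY,
        algorithms=["RS256"],
        audience="my-api",      # placeholder audience
        issuer="user-service",  # placeholder issuer
    )
    if needed not in claims.get("permissions", []):
        raise PermissionError(f"missing permission: {needed}")
    return claims
```

Because verification only needs the issuer's public key, every instance in a scaled-out microservice cluster can check the token locally without sharing secrets.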
Question: Can I cluster the API Gateway and put a load balancer in front of it? What do I need to consider with respect to managing authentication, i.e. sharing API keys across the API Gateway cluster?
Extra notes: I'm planning on terminating SSL at the gateway and using bcrypt for passwords in the DB.
Any feedback would be great, thank you.
Can I cluster the API Gateway and have the load balancer in front of it?
Yes, you can. Most good API Gateway solutions provide clustering, e.g. https://getkong.org/docs/0.9.x/clustering/, or you can use a cloud-based API Gateway such as Azure API Management or AWS API Gateway.
What do I need to consider with respect to managing authentication?
The specifics depend on your choice of API Gateway solution.

Monitoring access to AWS API Gateway resources using api-keys

I have built a gateway (using aws api gateway) in front of my rest api. I want to monitor the usage of resources on that api using the api-keys generated by api gateway. By 'usage' I mean which resources were requested and served to clients associated with an api key. Amazon claims that cloudtrail can be used to track gateway requests but the x-api-key header does not show up in cloudtrail logs. Has amazon provided an idiomatic way of doing this? Has anyone implemented this functionality in a custom manner? It seems reasonable that this functionality should be built in, however I cannot find how to do this anywhere.