Creating Google API credentials (from a service account with scope and delegated account) with oauth2client - google-oauth

To access the Gmail API (and impersonate users when making calls) I'm using a service account (created in Google Cloud Platform). The JSON file I have looks like this:
{
  "type": "service_account",
  "project_id": "[PROJECT-ID]",
  "private_key_id": "[KEY-ID]",
  "private_key": "-----BEGIN PRIVATE KEY-----\n[PRIVATE-KEY]\n-----END PRIVATE KEY-----\n",
  "client_email": "[SERVICE-ACCOUNT-EMAIL]",
  "client_id": "[CLIENT-ID]",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://accounts.google.com/o/oauth2/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/[SERVICE-ACCOUNT-EMAIL]"
}
I'm also using the oauth2client library to make it easier but I can't find a way to create the credentials and then specify a scope and a delegated account.
I tried
from oauth2client import service_account
from googleapiclient import discovery

self._credentials = service_account.ServiceAccountCredentials.from_json(SERVICE_ACCOUNT_JSON_CONTENT)
self._credentials = self._credentials.create_scoped([u'https://www.googleapis.com/auth/gmail.send'])
self._credentials = self._credentials.create_delegated(MY_USER)
self._client = discovery.build(u'gmail', u'v1', credentials=self._credentials)
But I get an error because it expects a PKCS-8 key.
How can I do that? (My code runs on App Engine Flex, if that helps.)
Thanks

Finally, since oauth2client is now deprecated in favor of google-auth, I did the following:
from googleapiclient import discovery
from google.oauth2.service_account import Credentials
credentials = Credentials.from_service_account_file(
    PATH_TO_SERVICE_ACCOUNT_JSON,
    scopes=[u'https://www.googleapis.com/auth/gmail.send'])
delegated_credentials = credentials.with_subject(MY_USER)
client = discovery.build(u'gmail', u'v1', credentials=delegated_credentials)
and it worked ;-)
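For completeness, here is a minimal sketch of sending a message with that client (the recipient address and message text are placeholders; MY_USER is the delegated mailbox from above):

import base64
from email.mime.text import MIMEText

# Build a simple MIME message and base64url-encode it, as the Gmail API expects
message = MIMEText("Hello from a delegated service account")
message["to"] = "recipient@example.com"  # placeholder address
message["from"] = MY_USER
message["subject"] = "Test"
raw = base64.urlsafe_b64encode(message.as_bytes()).decode()

# userId="me" refers to the delegated user the credentials act as
client.users().messages().send(userId="me", body={"raw": raw}).execute()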

Related

AWS Amplify integration with React Native token - unauthorized

I have configured my React Native application to use a Cognito user pool that is used for user identity management and authentication using AWS Amplify. I am using Custom Authentication for user management. I am able to register, log in and perform other user management-related tasks. I also have an API gateway and a few Lambda Functions which I want to access through my React Native app. When I sign in, I receive a JWT Token which I want to send to the API gateway to access my Lambdas, but no matter what I do I get an 'unauthorized' 403 or 401 message from my API Gateway.
My question is: how can I expose the API Gateway/Lambdas to the Cognito user pool users, and why is the token generated by Cognito unauthorized to access my API Gateway?
P.S. I used Postman with the right auth URL and settings, and the token Postman obtains is authorized to access the API Gateway and Lambdas. (The user credentials are the same as the ones I use with the React Native app.)
I have spent a few days on this; any pointers in the right direction would be very helpful.
Thanks in advance.
NPN
Amplify Config:
const awsmobile = {
  aws_project_region: 'us-XXXX-X',
  aws_cognito_region: 'us-XXXX-X',
  aws_user_pools_id: 'us-XXXX-XXXXXX',
  aws_user_pools_web_client_id: 'XXXXh1i5nXXXX',
  //aws_user_pools_web_client_secret: 'XXXXXoofuu0lXXXX',
  oauth: {
    domain: 'XXXXXXXX.us-XXXX-X.amazoncognito.com',
    scope: ["email", "openid", "aws.cognito.signin.user.admin"]
  },
  aws_cognito_username_attributes: ['EMAIL'],
  aws_cognito_social_providers: ['GOOGLE'],
  aws_cognito_signup_attributes: ['XXXXX', 'XXXXX', 'EMAIL', 'XXXXXX'],
  aws_cognito_mfa_configuration: 'OFF',
  aws_cognito_mfa_types: [],
  aws_cognito_password_protection_settings: {
    passwordPolicyMinLength: 8,
    passwordPolicyCharacters: ['REQUIRES_LOWERCASE', 'REQUIRES_UPPERCASE', 'REQUIRES_NUMBERS', 'REQUIRES_SYMBOLS'],
  },
  aws_cognito_verification_mechanisms: ['EMAIL'],
};
export default awsmobile;
import { Auth } from 'aws-amplify';

const login = async (username: string, password: string) => {
  const response = await Auth.signIn(username, password);
  console.log(response.data.signInUserSession.accessToken.jwtToken);
  return response;
};

Airflow authentication with RBAC and Keycloak

I want to implement RBAC-based auth in Airflow with Keycloak. Can someone help me with it?
I have created the webserver_config file and I am using Docker to bring up the Airflow webserver.
from airflow.www_rbac.security import AirflowSecurityManager
from flask_appbuilder.security.manager import AUTH_OAUTH
import os
import json

AUTH_TYPE = AUTH_OAUTH
AUTH_USER_REGISTRATION_ROLE = "Admin"
OAUTH_PROVIDERS = [
    {
        'name': 'keycloak',
        'icon': 'fa-user-circle',
        'token_key': 'access_token',
        'remote_app': {
            'base_url': 'http://localhost:8180/auth/realms/airflow/protocol/openid-connect/',
            'request_token_params': {
                'scope': 'email profile'
            },
            'request_token_url': None,
            'access_token_url': 'http://localhost:8180/auth/realms/airflow/protocol/openid-connect/token',
            'authorize_url': 'http://localhost:8180/auth/realms/airflow/protocol/openid-connect/auth',
            'consumer_secret': "98ec2e89-9902-4577-af8c-f607e34aa659"
        }
    }
]
I have also set these options in airflow.cfg:
rbac = True
authenticate = True
But it's still not redirecting to Keycloak when Airflow loads.
I use:
docker build --rm --build-arg AIRFLOW_DEPS="datadog,dask" --build-arg PYTHON_DEPS="flask_oauthlib>=0.9" -t airflow .
and
docker run -d -p 8080:8080 airflow webserver
to execute it.
I may be coming late to this one, and my answer may not work exactly as written since I'm using a different auth provider; however, it's still OAuth2, and in a previous life I used Keycloak, so my solution should also work there.
My answer makes use of authlib (at the time of writing, newer versions of Airflow have switched to it; I am on 2.1.2).
I've raised a feature request against Flask-AppBuilder, which Airflow uses, as its OAuth handler should really take care of this when the scope includes openid (you'd need to add this to your scopes).
From memory, Keycloak returns an id_token alongside the access_token and refresh_token, so this code simply decodes what has already been returned.
import os
import logging
import re
import base64
import yaml
from flask import session
from airflow.www.security import AirflowSecurityManager
from flask_appbuilder.security.manager import AUTH_OAUTH

log = logging.getLogger(__name__)

basedir = os.path.abspath(os.path.dirname(__file__))

MY_PROVIDER = 'keycloak'

class customSecurityManager(AirflowSecurityManager):
    def oauth_user_info(self, provider, resp):
        if provider == MY_PROVIDER:
            log.debug("{0} response received : {1}".format(provider, resp))
            id_token = resp["id_token"]
            log.debug(str(id_token))
            # Reuse the existing Azure JWT parsing to decode the id_token payload
            me = self._azure_jwt_token_parse(id_token)
            log.debug("Parse JWT token : {0}".format(me))
            if not me.get("name"):
                firstName = ""
                lastName = ""
            else:
                firstName = me.get("name").split(' ')[0]
                lastName = me.get("name").split(' ')[-1]
            return {
                "username": me.get("email"),
                "email": me.get("email"),
                "first_name": firstName,
                "last_name": lastName,
                "role_keys": me.get("groups", ['Guest'])
            }
        else:
            return {}

AUTH_TYPE = AUTH_OAUTH
AUTH_USER_REGISTRATION = True
AUTH_USER_REGISTRATION_ROLE = "Guest"
AUTH_ROLES_SYNC_AT_LOGIN = True
CSRF_ENABLED = True

AUTH_ROLES_MAPPING = {
    "Airflow_Users": ["User"],
    "Airflow_Admins": ["Admin"],
}

# WELL_KNOWN_URL, CLIENT_ID and CLIENT_SECRET are placeholders for your realm's values
OAUTH_PROVIDERS = [
    {
        'name': MY_PROVIDER,
        'icon': 'fa-circle-o',
        'token_key': 'access_token',
        'remote_app': {
            'server_metadata_url': WELL_KNOWN_URL,
            'client_id': CLIENT_ID,
            'client_secret': CLIENT_SECRET,
            'client_kwargs': {
                'scope': 'openid groups',
                'token_endpoint_auth_method': 'client_secret_post'
            },
            'access_token_method': 'POST',
        }
    }
]

SECURITY_MANAGER_CLASS = customSecurityManager
Ironically, the Azure provider already returns an id_token and it's handled, so my code makes use of that existing parsing.
The code decodes the id_token.
Note you can turn on debug logging with the environment variable AIRFLOW__LOGGING__FAB_LOGGING_LEVEL set to DEBUG.
If you switch on debug logs and see an entry like the following (note the id_token), you can probably use the code I've supplied.
DEBUG - OAUTH Authorized resp: {'access_token': '<redacted>', 'expires_in': 3600, 'id_token': '<redacted>', 'refresh_token': '<redacted>', 'scope': 'openid groups', 'token_type': 'Bearer', 'expires_at': <redacted - unix timestamp>}
The id_token has three parts joined by full stops (.). The middle part contains the user data and is simply base64-encoded.
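As a minimal sketch of that last step (standard library only, assuming the usual base64url encoding of JWT payloads):

import base64
import json

def jwt_payload(id_token):
    # A JWT is header.payload.signature; the payload is base64url-encoded JSON
    payload = id_token.split(".")[1]
    padded = payload + "=" * (-len(payload) % 4)  # restore the stripped padding
    return json.loads(base64.urlsafe_b64decode(padded))

# e.g. jwt_payload(resp["id_token"]).get("email")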

What type of app/authentication flow should I select to read my cloud OneNote content using a Python script and a personal Microsoft account?

I am completely new to MS Identity services and I am overwhelmed by the many options I have for what I need to do.
Here is what I am trying to achieve: I have a OneNote personal account and notes stored in the MS cloud (OneDrive, I guess). I need to be able to run a Python script, get the content of my notes, do some processing and save them back. This will be from the command line on a home Windows 10 computer.
My question: what type of application should I register in MS AD and what type of authentication flow should I use for the above?
I have tried many things and this is as far as I could get:
- I registered an app with Azure AD (tried both a personal and an AD app)
- I configured the app as a Windows app
- I selected a device authentication flow
I tried this code with both types of app
import requests
import json
from msal import PublicClientApplication

tenant = "5fae6798-ca1a-49d4-a5fb-xxxxxxx"  # regular app
client_id = "d03a79d3-1de0-494c-8eb0-xxx"   # personal app
#client_id = "bbd3d6df-f5f3-4206-8bd5-xxxxxx"

scopes = ["Notes.ReadWrite.All", "Notes.Read.All", "Notes.Read", "Notes.Create", "Notes.ReadWrite",
          "Notes.ReadWrite.CreatedByApp", "Notes.Read", "Notes.Create", "Notes.ReadWrite",
          "Notes.ReadWrite.CreatedByApp", "Notes.Read.All", "Notes.ReadWrite.All"]

endpoint = "https://graph.microsoft.com/v1.0/me"
authority = "https://login.microsoftonline.com/" + tenant

app = PublicClientApplication(client_id=client_id, authority=authority)
flow = app.initiate_device_flow(scopes=scopes)
if "user_code" not in flow:
    raise ValueError(
        "Fail to create device flow. Err: %s" % json.dumps(flow, indent=4))
print(flow["message"])

result = app.acquire_token_by_device_flow(flow)

endpoint = "https://graph.microsoft.com/v1.0/users/c5af8759-4785-4abf-9434-xxxx/onenote/notebooks"
if "access_token" in result:
    # Calling Graph using the access token
    graph_data = requests.get(  # Use token to call downstream service
        endpoint,
        headers={'Authorization': 'Bearer ' + result['access_token']},).json()
    print("Graph API call result: %s" % json.dumps(graph_data, indent=2))
else:
    print(result.get("error"))
    print(result.get("error_description"))
    print(result.get("correlation_id"))  # You may need this when reporting a bug
Regular application
To sign in, use a web browser to open the page https://microsoft.com/devicelogin and enter the code AH2UHFDXB to authenticate.
Graph API call result: {
"error": {
"code": "30108",
"message": "OneDrive for Business for this user account cannot be retrieved.",
"innerError": {
"request-id": "016910d2-c193-4e3f-9d51-52fce86bfc72",
"date": "2020-05-14T16:45:44"
}
}
}
Personal application output
Fail to create device flow. Err: {
"error": "invalid_request",
"error_description": "AADSTS9002331: Application 'bbd3d6df-f5f3-4206-8bd5-xxxxxxx'(OneNotePersonal) is configured for use by Microsoft Account users only. Please use the /consumers endpoint to serve this request.\r\nTrace ID: 1c4047e6-98a8-4615-9a0c-4b0dc9ba5600\r\nCorrelation ID: a6733520-6df9-422a-a6b4-e8f4e2de1265\r\nTimestamp: 2020-05-14 16:56:27Z",
"error_codes": [
9002331
],
"timestamp": "2020-05-14 16:56:27Z",
"trace_id": "1c4047e6-98a8-4615-9a0c-4b0dc9ba5600",
"correlation_id": "a6733520-6df9-422a-a6b4-e8f4e2de1265",
"interval": 5,
"expires_in": 1800,
"expires_at": 1589477187.9909642,
"_correlation_id": "a6733520-6df9-422a-a6b4-e8f4e2de1265"
}
This was solved as follows:
That error message suggests creating your authority string as
authority = "https://login.microsoftonline.com/consumers"
because you were using the client_id of a "personal app". Change that authority, and you can proceed.
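As a minimal sketch (reusing client_id from the question's code and a single placeholder scope), the change amounts to:

from msal import PublicClientApplication

# Personal (consumer) Microsoft accounts must go through the /consumers endpoint
authority = "https://login.microsoftonline.com/consumers"
app = PublicClientApplication(client_id=client_id, authority=authority)
flow = app.initiate_device_flow(scopes=["Notes.ReadWrite"])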

Google API Key + Access Token for Cloud Natural Language API

I need a Google API key and a Google access token to add to some sample code. However, when I create credentials for a Google Cloud NLP project, I get a JSON file that contains the code posted below. Which one is the API key and which one is the access token? I'm so confused, thanks!
{
  "type": "service_account",
  "project_id": "project-id",
  "private_key_id": "some_number",
  "private_key": "-----BEGIN PRIVATE KEY-----\n....
=\n-----END PRIVATE KEY-----\n",
  "client_email": "<api-name>api@project-id.iam.gserviceaccount.com",
  "client_id": "...",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://accounts.google.com/o/oauth2/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/...<api-name>api%40project-id.iam.gserviceaccount.com"
}
You have to use the Private Key in order to make a signed JWT (JSON Web Token). You then use this to request a new token. After you get the token from Google, you use that for subsequent requests by adding the token to your HTTP header:
Header name   | Value
--------------|----------------
Authorization | Bearer <token>
See these Google docs for all the details.
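In practice you rarely build the signed JWT by hand. As a minimal sketch, the google-auth library can exchange the service-account key for a short-lived access token, which then goes in the Authorization header (the file path and sample text below are placeholders):

import requests
from google.oauth2 import service_account
from google.auth.transport.requests import Request

# Load the downloaded service-account JSON (path is a placeholder)
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/cloud-platform"])

# Signs a JWT under the hood and exchanges it for an OAuth2 access token
credentials.refresh(Request())

response = requests.post(
    "https://language.googleapis.com/v1/documents:analyzeSentiment",
    headers={"Authorization": "Bearer " + credentials.token},
    json={"document": {"type": "PLAIN_TEXT", "content": "I love this!"}})
print(response.json())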

OAuth2Decorator: Using developer's token to run API calls for user

For the "normal" oauth2 dance, I get to specify the user and get a corresponding token.
This allows me to make API calls masquerading as that user, i.e. on his behalf.
It can also allow the user to make calls masquerading as me.
A use case is BigQuery, where I don't have to grant table access to the user and I can specify my own preferred level of control.
Using the simplified OAuth2Decorator, I don't seem to have this option.
Am I right to say that?
Or is there a work-around?
In general, what is the best practice? To use the full OAuth flow (comprising Flow, Credentials and Storage)? Or to use OAuth2Decorator?
Thank you very much.
You can certainly use an OAuth2Decorator
Here is an example:
main.py
import bqclient
import httplib2
import os
from django.utils import simplejson as json
from google.appengine.api import memcache
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app
from oauth2client.appengine import oauth2decorator_from_clientsecrets

PROJECT_ID = "xxxxxxxxxxx"
DATASET = "your_dataset"
QUERY = "select columns from dataset.table"
CLIENT_SECRETS = os.path.join(os.path.dirname(__file__), 'client_secrets.json')

http = httplib2.Http(memcache)
decorator = oauth2decorator_from_clientsecrets(CLIENT_SECRETS,
    'https://www.googleapis.com/auth/bigquery')
bq = bqclient.BigQueryClient(http, decorator)

class MainHandler(webapp.RequestHandler):
    @decorator.oauth_required
    def get(self):
        data = {'data': json.dumps(bq.Query(QUERY, PROJECT_ID))}
        template = os.path.join(os.path.dirname(__file__), 'index.html')
        # render() is assumed to be your own template-rendering helper
        self.response.out.write(render(template, data))

application = webapp.WSGIApplication([('/', MainHandler),], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()
bqclient.py, which gets imported in your main.py and handles the BigQuery actions:
from apiclient.discovery import build

class BigQueryClient(object):
    def __init__(self, http, decorator):
        """Creates the BigQuery client connection"""
        self.service = build('bigquery', 'v2', http=http)
        self.decorator = decorator

    def Query(self, query, project, timeout_ms=10):
        query_config = {
            'query': query,
            'timeoutMs': timeout_ms
        }
        decorated = self.decorator.http()
        queryReply = (self.service.jobs()
                      .query(projectId=project, body=query_config)
                      .execute(decorated))
        jobReference = queryReply['jobReference']
        while not queryReply['jobComplete']:
            queryReply = self.service.jobs().getQueryResults(
                projectId=jobReference['projectId'],
                jobId=jobReference['jobId'],
                timeoutMs=timeout_ms).execute(decorated)
        return queryReply
where all your authentication details are kept in a JSON file, client_secrets.json:
{
  "web": {
    "client_id": "xxxxxxxxxxxxxxx",
    "client_secret": "xxxxxxxxxxxxxxx",
    "redirect_uris": ["http://localhost:8080/oauth2callback"],
    "auth_uri": "https://accounts.google.com/o/oauth2/auth",
    "token_uri": "https://accounts.google.com/o/oauth2/token"
  }
}
Finally, don't forget to add these lines to your app.yaml:
- url: /oauth2callback
  script: oauth2client/appengine.py
Hope that helps.
I am not sure I completely understand the use case, but if you are creating an application for others to use without their having to authorize access based on their own credentials, I would recommend using App Engine service accounts.
An example of this type of auth flow is described in the App Engine service accounts + Prediction API article.
Also, see this part and this part of the App Engine Datastore to BigQuery codelab, which also uses this authorization method.
The code might look something like this:
import httplib2
# Available in the google-api-python-client lib
from apiclient.discovery import build
from oauth2client.appengine import AppAssertionCredentials
# BigQuery Scope
SCOPE = 'https://www.googleapis.com/auth/bigquery'
# Instantiate and authorize a BigQuery API client
credentials = AppAssertionCredentials(scope=SCOPE)
http = credentials.authorize(httplib2.Http())
bigquery_service = build("bigquery", "v2", http=http)
# Make some calls to the API
jobs = bigquery_service.jobs()
result = jobs.insert(projectId='some_project_id', body='etc, etc').execute()