Can an application in the public cloud be authorized to fetch data from a government tenant via the Graph API? - azure-gov

I'm trying to fetch an email list from a government tenant via the Graph API, and it worked fine until last week. I'm using the client credentials flow. Last week I started getting the following error when trying to authorize my app in government tenants:
oauthlib.oauth2.rfc6749.errors.InvalidClientIdError: (invalid_request) AADSTS900441: Requests to applications hosted in the public cloud are not supported for USGov tenants.
Is there a way to authorize an application from the public Azure cloud to read data from a government tenant?
EDIT: code example and debug logs
import requests_oauthlib
from oauthlib.oauth2 import BackendApplicationClient

import config  # holds CLIENT_ID, CLIENT_SECRET, TOKEN_ENDPOINT, RESOURCE and API_VERSION

client = BackendApplicationClient(client_id=config.CLIENT_ID)
MSGRAPH = requests_oauthlib.OAuth2Session(client=client)

# Client credentials flow against the US Government login endpoint.
token = MSGRAPH.fetch_token(
    'https://login.microsoftonline.us' + '/<tenant>' + config.TOKEN_ENDPOINT,
    client_id=config.CLIENT_ID,
    client_secret=config.CLIENT_SECRET,
    include_client_id=True,
    scope=['https://graph.microsoft.us/.default'])

endpoint = config.RESOURCE + config.API_VERSION + '/users'
graphdata = MSGRAPH.get(endpoint).json()
DEBUG:requests_oauthlib.oauth2_session:Requesting url https://login.microsoftonline.us/<tenant-id>/oauth2/v2.0/token using method POST.
DEBUG:requests_oauthlib.oauth2_session:Supplying headers {u'Content-Type': u'application/x-www-form-urlencoded;charset=UTF-8', u'Accept': u'application/json'} and data {u'client_secret': u'...', u'grant_type': u'client_credentials', u'client_id': u'...', u'scope': u'https://graph.microsoft.us/.default'}
DEBUG:requests_oauthlib.oauth2_session:Passing through key word arguments {'verify': True, 'json': None, 'proxies': None, 'timeout': None, 'auth': None}.
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): login.microsoftonline.us:443
DEBUG:urllib3.connectionpool:https://login.microsoftonline.us:443 "POST /<tenant-id>/oauth2/v2.0/token HTTP/1.1" 400 522
DEBUG:requests_oauthlib.oauth2_session:Prepared fetch token request body grant_type=client_credentials&client_id=...&client_secret=...&scope=https%3A%2F%2Fgraph.microsoft.us%2F.default
DEBUG:requests_oauthlib.oauth2_session:Request to fetch token completed with status 400.
Basically, I see this error when I'm trying to fetch the access token. Admin consent was already granted to my application by the tenant admin.
This code worked for Gov tenants for a month or so and then suddenly stopped working.

AAD started enforcing this about a month ago: GCC High/DoD tenants cannot use confidential apps published in the commercial cloud. You need to publish your app from a GCC High/DoD tenant.
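Once the app registration lives in a GCC High/DoD tenant, the same client-credentials request against the US Government endpoints should work again. A minimal sketch using MSAL for Python (MSAL is an alternative to requests_oauthlib, not something this answer requires; the tenant ID, client ID and secret are placeholders):

import msal

# US Government authority and Graph US Gov scope, as in the question;
# the app registration must live in the GCC High/DoD tenant.
app = msal.ConfidentialClientApplication(
    client_id='<client-id>',
    client_credential='<client-secret>',
    authority='https://login.microsoftonline.us/<tenant-id>')

result = app.acquire_token_for_client(
    scopes=['https://graph.microsoft.us/.default'])
print(result.get('access_token') or result.get('error_description'))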

Related

YouTube Analytics API authorization issue?

I work with analytics for organizational YouTube channels for my employer. I've been successfully retrieving data using the YouTube Data API; however, I am unable to obtain metrics from the YouTube Analytics API (à la reports.query).
Here are the particulars...
Authentication: SAML (organizational GSuite account)
Authorization: OAuth2
My organizational account is an "owner" (though not primary owner) of the YouTube channel.
Google Cloud Platform project: internal (organizational)
I've tried various SCOPE combinations.
When I specify ids="channel==<channel_id>", I receive a 403 (forbidden) response.
When I specify ids="channel==MINE", I get a 200 status with headers but depending on dimension and metric selections, either no records, or a single record with zeros for each metric. My suspicion is that "channel==MINE" is looking for "my" channel rather than the brand account's channel as it wouldn't know which brand channel otherwise.
My question is, how can I assure that my organizational account can obtain data using YouTube Analytics API as well as YouTube Reporting API? Is there an administrative site that assigns these access rights? If so, where is it?
The following code is essentially sample code from the API Explorer...
import os

import google_auth_oauthlib.flow
import googleapiclient.discovery
import googleapiclient.errors

scopes = ["https://www.googleapis.com/auth/youtube.readonly"]

def main():
    # Disable OAuthlib's HTTPS verification when running locally.
    # *DO NOT* leave this option enabled in production.
    os.environ["OAUTHLIB_INSECURE_TRANSPORT"] = "1"

    api_service_name = "youtubeAnalytics"
    api_version = "v2"
    client_secrets_file = "client_secret.json"

    # Get credentials and create an API client
    flow = google_auth_oauthlib.flow.InstalledAppFlow.from_client_secrets_file(
        client_secrets_file, scopes)
    # credentials = flow.run_console()
    credentials = flow.run_local_server()
    youtube_analytics = googleapiclient.discovery.build(
        api_service_name, api_version, credentials=credentials)

    request = youtube_analytics.reports().query(
        ids="channel==MINE",
        startDate="2020-09-01",
        endDate="2020-09-30",
        dimensions="day",
        metrics="views",
    )
    response = request.execute()
    print(response)

if __name__ == "__main__":
    main()
The code as shown yields...
{'kind': 'youtubeAnalytics#resultTable', 'columnHeaders': [{'name': 'day', 'columnType': 'DIMENSION', 'dataType': 'STRING'}, {'name': 'views', 'columnType': 'METRIC', 'dataType': 'INTEGER'}], 'rows': []}
With ids set to "channel==UC6L0DBYWqAkmwfawTUMaR3g", the following is returned...
googleapiclient.errors.HttpError: <HttpError 403 when requesting https://youtubeanalytics.googleapis.com/v2/reports?ids=channel%3D%3DUC6L0DBYWqAkmwfawTUMaR3g&startDate=2020-09-01&endDate=2020-09-30&dimensions=day&metrics=views&alt=json returned "Forbidden">

Polarion ALM API occasionally does not authorize any request

I have written some Python code that logs in and reads some data from a Polarion ALM server via its API (more information about the Polarion API: https://almdemo.polarion.com/polarion/sdk/index.html). In my code I use the zeep Python package to handle SOAP.
My algorithm is simple (a sketch of the flow follows the list):
1) Log in via the logIn web service (https://almdemo.polarion.com/polarion/sdk/doc/javadoc/com/polarion/alm/ws/client/session/SessionWebService.html#logIn-java.lang.String-java.lang.String-)
2) Add the current session to the request header, so the session remains alive.
3) Try to read some data, for example via the getRootProjectGroup web service (https://almdemo.polarion.com/polarion/sdk/doc/javadoc/com/polarion/alm/ws/client/projects/ProjectWebService.html#getRootProjectGroup--).
4) Regardless of what happens, close the current session via the endSession web service (https://almdemo.polarion.com/polarion/sdk/doc/javadoc/com/polarion/alm/ws/client/session/SessionWebService.html#endSession--).
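A minimal sketch of that flow with zeep is shown below (not my exact code; the service root and the sessionID header namespace are assumptions based on the public Polarion demo server and SDK docs, so adjust them to your installation):

from zeep import Client
from zeep.plugins import HistoryPlugin

BASE = "https://almdemo.polarion.com/polarion/ws/services"  # assumed service root
history = HistoryPlugin()

session_svc = Client(BASE + "/SessionWebService?wsdl", plugins=[history])
project_svc = Client(BASE + "/ProjectWebService?wsdl")

# 1) Log in; the server returns the session ID in a SOAP header of the response.
session_svc.service.logIn("user", "password")

# 2) Grab that header so it can be re-attached to every later call.
#    The namespace is an assumption; inspect history.last_received to confirm it.
session_header = history.last_received["envelope"].find(
    ".//{http://ws.polarion.com/session}sessionID")

# 3) Read some data while the session is alive.
root_group = project_svc.service.getRootProjectGroup(_soapheaders=[session_header])
print(root_group)

# 4) Always end the session.
session_svc.service.endSession(_soapheaders=[session_header])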
What I observed:
Occasionally, at step 3 I receive a response with an authorization error (response snippet):
<soapenv:Fault>\n <faultcode>soapenv:Server.generalException</faultcode>\n <faultstring>Not authorized.</faultstring>\n <detail>\n <ns1:stackTrace xmlns:ns1="http://xml.apache.org/axis/">Not authorized.\n\tat com.polarion.alm.ws.providers.DoAsUserWrapper.invoke(DoAsUserWrapper.java:37)\n\tat org.apache.axis.strategies.InvocationStrategy.visit(InvocationStrategy.java:32)\n\t..
or everything is good and I receive:
{
    'groupURIs': {
        'SubterraURI': [
            'subterra:data-service:objects:/default/${ProjectGroup}Group'
        ]
    },
    'location': None,
    'name': 'ROOT_CTX_NAME',
    'parentURI': None,
    'projectIDs': None,
    'uri': 'subterra:data-service:objects:${ProjectGroup}Group',
    'unresolvable': False
}
What surprises me the most:
- I always use the same credentials (username and password)
- the session ID in the request (step 3) is the same as in the server response during login (step 1), so the session should remain alive
- if I put my code in a loop (for example 1000 executions), the result of all attempts is always the same (1000 successes or 1000 failures), even if I add a wait (e.g. 1 s) between attempts
I would like to know why the server rejects some of the requests. Is it some kind of Polarion server issue? How could I work around it so that I can connect to the server and read data even if it rejects my first request?
It appears that it is an issue with the SOAP client (and a relatively popular one at that). To fix it, I turned off TLS verification. More details in:
https://python-zeep.readthedocs.io/en/master/transport.html
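For reference, with zeep this is done through a custom transport backed by a requests session, roughly as below (a sketch following the linked docs; disabling certificate verification is only advisable against servers you trust, and the WSDL URL is a placeholder):

from requests import Session
from zeep import Client
from zeep.transports import Transport

session = Session()
session.verify = False  # skip TLS certificate verification

transport = Transport(session=session)
client = Client(
    "https://almdemo.polarion.com/polarion/ws/services/SessionWebService?wsdl",
    transport=transport)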

AAD Authentication with Azure Data Explorer (Kusto) not working for simple query via API

I'm attempting to access Kusto via the API with Python (a "headless" script, in other words), and would like to use an AAD application for authentication. I'm specifically working with the sample code on https://github.com/Azure/azure-kusto-python/blob/master/azure-kusto-data/tests/sample.py, which attempts to query the Samples > StormEvents table on the cluster https://help.kusto.windows.net. I can run the query in the Kusto explorer just fine, but I'm getting "Caller is not authorized to perform this action" when trying to run the sample code.
I followed the instructions on https://kusto.azurewebsites.net/docs/management/access-control/aad.html and https://kusto.azurewebsites.net/docs/management/access-control/how-to-provision-aad-app.html to create an AAD application on the Azure portal and add API permissions for Azure Data Explorer. In the code, I have the "Application (client) ID" from the portal in the client_id field, and the appropriate secret in the client_secret field. The authority_id field is set to 72f988bf-86f1-41af-91ab-2d7cd011db47, which is what's shown on the portal as well as in the table on https://kusto.azurewebsites.net/docs/management/access-control/aad.html#authenticating-with-aad-programmatically. The app name (and client ID) is accepted on https://www.analytics.msftcloudes.com/support/directory just fine.
The code is thus as follows (omitting the imports and the specific secrets):
cluster = "https://help.kusto.windows.net"
client_id = "<omitted>"
client_secret = "<omitted>"
authority_id = "72f988bf-86f1-41af-91ab-2d7cd011db47"
kcsb = KustoConnectionStringBuilder.with_aad_application_key_authentication(
    cluster, client_id, client_secret, authority_id
)
client = KustoClient(kcsb)
db = "Samples"
query = "StormEvents | take 10"
response = client.execute(db, query)
The failure output is:
azure.kusto.data.exceptions.KustoServiceError: (KustoServiceError(...), [{'error': {'code': 'Forbidden', 'message': 'Caller is not authorized to perform this action', '#type': 'Kusto.DataNode.Exceptions.UnauthorizedDatabaseAccessException', '#message': "Principal 'AAD app id=(omitted)' is not authorized to access database 'Samples'.", '#context': {'timestamp': '2019-06-05T19:39:17.3493255Z', 'serviceAlias': 'HELP', 'machineName': 'KEngine000000', 'processName': 'Kusto.WinSvc.Svc', 'processId': 18832, 'threadId': 25568, 'appDomainName': 'Kusto.WinSvc.Svc.exe', 'clientRequestd': 'KPC.execute;9ede2b2d-5fba-478c-ad8f-8306284cf6e9', 'activityId': 'efdb96c9-da46-4d5f-b739-54661e7002e3', 'subActivityId': '33f89e2b-2347-447a-abe9-81e586d0e2a0', 'activityType': 'DN-FE-ExecuteQuery', 'parentActivityId': '438b2bb3-26fb-4f7e-813d-bc8a5c39ce1c', 'activityStack': '(Activity stack: CRID=KPC.execute;9ede2b2d-5fba-478c-ad8f-8306284cf6e9 ARID=efdb96c9-da46-4d5f-b739-54661e7002e3 > KD-Query-Client-ExecuteQueryAsKustoDataStream/5ddd9239-e742-4edc-ab3e-55d59a1f2c99 > P-WCF-Service-ExecuteQueryInternalAsKustoDataStream--IClientServiceCommunicationContract/438b2bb3-26fb-4f7e-813d-bc8a5c39ce1c > DN-FE-ExecuteQuery/33f89e2b-2347-447a-abe9-81e586d0e2a0)'}, '#permanent': True}}])
I've also added the sample cluster in Kusto Explorer, like the docs say.
Am I still missing something?
https://help.kusto.windows.net is the URL of an ADX cluster that serves as an exploratory aid, and it only allows interactive access by AAD users (not AAD applications).
For running automation using AAD application authentication, you should point your code at your own cluster/database, on which you grant your AAD application the necessary permissions (database user/viewer).
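As an illustration, the flow against your own cluster could look roughly like this. The cluster, database and table names are placeholders, and the one-time grant must be run by a principal with admin rights on the database (depending on your azure-kusto-data version, the imports may live under azure.kusto.data.request instead):

from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

cluster = "https://<your-cluster>.<region>.kusto.windows.net"
database = "<your-database>"

kcsb = KustoConnectionStringBuilder.with_aad_application_key_authentication(
    cluster, "<client-id>", "<client-secret>", "<tenant-id>")
client = KustoClient(kcsb)

# One-time grant, executed by a database admin (not by the app itself):
# .add database ['<your-database>'] viewers ('aadapp=<client-id>;<tenant-id>')

# Afterwards the application principal can query its own database.
response = client.execute(database, "<YourTable> | take 10")
for row in response.primary_results[0]:
    print(row)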

How to let Power BI consume my RESTful service, which is secured by an access token?

We have a RESTful API that allows multiple customers to retrieve data. Before that, customers need to authenticate with their credentials and get an access token to access the API. The access token expires every 30 minutes, so customers need to log in again to get a new one.
The RESTful service uses the access token to determine which customer's data to return.
We want to use Power BI to present customers' data.
My question is: how do we integrate our authentication process with Power BI? What type of dataset do we need to create?
let
    Query2 = let
        url = "http://api.XXXXX.com/api/1.0/authentication/login",
        body = "{
            ""userName"":""XXX"",
            ""password"":""XXXX""
        }",
        jsonResult = Json.Document(Web.Contents(url, [Headers = [#"Content-Type" = "application/json"], Content = Text.ToBinary(body), Timeout = #duration(0,2,0,0)])),
        token = jsonResult[accessToken],
        location_url = "http://api.XXXXX.com/api/1.0/cts/sites",
        sites = Json.Document(Web.Contents(location_url, [Headers = [Accept = "application/json", Authorization = token]]))
    in
        sites[result]
in
    Query2

HTTP error 500 when requesting google big query API using service account

I have been using BigQuery to generate reports through a web service for a year now; however, in the past month or so I have noticed HTTP 500 errors in response to most of my query requests, even though no changes have been made to the web service. In my current setup I make 5 simultaneous queries, and often 4 out of the 5 fail with a 500 error. At times all 5 queries are returned, but recently this rarely happens, rendering my application almost unusable.
I use server-to-server authentication with my service account token, and my BigQuery client app is closely modeled on the example given here:
https://developers.google.com/bigquery/articles/dashboard#class
Here is the full error message:
HttpError: https://www.googleapis.com/bigquery/v2/projects/1021946877460/queries?alt=json returned "Unexpected. Please try again.">
Snippet of my BigQuery client:
# Imports assumed for this snippet (oauth2client-era APIs).
import httplib2
import socks
from apiclient.discovery import build
from oauth2client.client import SignedJwtAssertionCredentials

def generateToken():
    """
    Generates OAuth 2.0 token/credentials for login to Google BigQuery.
    """
    credentials = SignedJwtAssertionCredentials(
        SERVICE_ACCOUNT_EMAIL,
        KEY,
        "https://www.googleapis.com/auth/bigquery")
    return credentials

class BigQueryClient(object):
    def authenticate(self, credentials):
        # Route requests through the proxy, then let the credentials sign them.
        http = httplib2.Http(proxy_info=httplib2.ProxyInfo(
            socks.PROXY_TYPE_HTTP,
            PROXY_IP,
            PROXY_PORT))
        http = credentials.authorize(http)
        return http

    def __init__(self, credentials, project):
        http = self.authenticate(credentials)
        self.service = build('bigquery', 'v2', http=http)
Please let me know if I am doing something incorrectly here, or if anything has changed on the BigQuery backend, such as limits on the number of query requests allowed over a certain period of time.
Thanks.
500s are always BigQuery bugs. I believe I've tracked down one of your errors in the BigQuery server logs, and am investigating.
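Since the error body itself says "Unexpected. Please try again.", a common client-side mitigation while such bugs are investigated is to retry transient 500s with exponential backoff. A minimal sketch (the helper name and retry parameters are illustrative, not part of the original code or of BigQuery's API):

import random
import time

from apiclient.errors import HttpError

def run_with_retries(execute_query, max_attempts=5):
    """Call a function that issues the BigQuery request, retrying on 5xx errors."""
    for attempt in range(max_attempts):
        try:
            return execute_query()
        except HttpError as err:
            if err.resp.status < 500 or attempt == max_attempts - 1:
                raise
            # Exponential backoff with jitter before the next attempt.
            time.sleep((2 ** attempt) + random.random())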