Logic App query Azure Table using HTTP and Managed Identity authentication - azure-storage

I am trying to query Azure Table using the HTTP connector without success.
According to this document managed identity authentication is now possible for Azure Table:
https://learn.microsoft.com/en-us/azure/storage/tables/authorize-managed-identity
I have authorized the consumption logic app's managed identity on the Azure table using PowerShell, as the documentation suggests:
https://learn.microsoft.com/en-us/azure/storage/tables/assign-azure-role-data-access?tabs=powershell
New-AzRoleAssignment -ObjectID xxxxxxxxxxxxxxxx `
-RoleDefinitionName "Storage Table Data Contributor" `
-Scope "/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/tableServices/default/tables/<table>"
Then in the logic app I've filled the request as documented in:
https://learn.microsoft.com/en-us/rest/api/storageservices/query-tables#request-headers
The run fails with an authentication error complaining about the Authorization header:
"body": {
  "odata.error": {
    "code": "AuthenticationFailed",
    "message": {
      "lang": "en-US",
      "value": "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.\nRequestId:8d5dbe66-d002-0005-26e6-45da23000000\nTime:2022-04-01T16:35:57.2213453Z"
    }
  }
}
Any ideas?

So basically, with the setup below I was able to successfully query the Azure Table over HTTP.
Headers and Result: (shown as screenshots in the original answer)
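For reference, a minimal sketch of what those request headers typically look like for an AAD-authenticated Query Tables call. The function name is mine and the token placeholder must come from the Logic App's managed identity (token audience https://storage.azure.com/); the `x-ms-version` value is an assumption of a recent service version that supports AAD auth.

```python
# Sketch: headers for GET https://<account>.table.core.windows.net/<table>()
# when Authorization carries an AAD bearer token from a managed identity.
from email.utils import formatdate

def build_table_query_headers(bearer_token: str) -> dict:
    return {
        "Authorization": f"Bearer {bearer_token}",
        # AAD auth against Table storage requires x-ms-version 2017-11-09 or later
        "x-ms-version": "2020-04-08",
        "Accept": "application/json;odata=nometadata",
        # RFC 1123 timestamp required by the Table service
        "x-ms-date": formatdate(usegmt=True),
    }

headers = build_table_query_headers("<token-from-managed-identity>")
```

In the Logic App HTTP action these same names go into the Headers grid, with authentication type set to Managed Identity and the audience set to https://storage.azure.com/.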


Error Trying to copy a sharepoint file to DL

Hello, could someone please help me with this?
I am trying to copy SharePoint files to my DL (Data Lake) but it's not working; I get this error:
{
  "errorCode": "2200",
  "message": "ErrorCode=HttpRequestFailedWithClientError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Http request failed with client error, status code 403 Forbidden, please check your activity settings. If you configured a baseUrl that includes path, please make sure it ends with '/'.\nRequest URL: ,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (403) Forbidden.,Source=System,'",
  "failureType": "UserError",
  "target": "Copy SharepointFile",
  "details": []
}
To copy data from SharePoint to Azure Data Lake, you first need to grant permissions to the app by following the procedure below:
Register an app in Azure AD and create a client secret for it; copy the secret value, AppId, and TenantId.
To grant the permission, open the URL below:
https://[your_site_url]/_layouts/15/appinv.aspx
A form appears (shown as a screenshot in the original answer); fill in its fields. Permission Request XML:
<AppPermissionRequests AllowAppOnlyPolicy="true">
  <AppPermissionRequest Scope="http://sharepoint/content/sitecollection/web" Right="Read"/>
</AppPermissionRequests>
Click on Trust It.
After that, create a pipeline and add a Web activity with the URL and body below, replacing Tenant-ID, Client-ID, Client-Secret, and Tenant-Name with your own values:
URL: https://accounts.accesscontrol.windows.net/[Tenant-ID]/tokens/OAuth/
Body: grant_type=client_credentials&client_id=[Client-ID]@[Tenant-ID]&client_secret=[Client-Secret]&resource=00000003-0000-0ff1-ce00-000000000000/[Tenant-Name].sharepoint.com@[Tenant-ID]
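The Web activity's token call can be sketched as below. The builder function is hypothetical; the fixed GUID is the well-known SharePoint principal ID, and to the best of my knowledge ACS expects the `ClientID@TenantID` form in both `client_id` and `resource`:

```python
# Sketch of the ACS token request the Web activity issues.
# Everything passed in is a placeholder you must supply.
SHAREPOINT_PRINCIPAL = "00000003-0000-0ff1-ce00-000000000000"

def build_acs_token_request(tenant_id, client_id, client_secret, tenant_name):
    url = f"https://accounts.accesscontrol.windows.net/{tenant_id}/tokens/OAuth/"
    body = {
        "grant_type": "client_credentials",
        # ACS identifies the app as ClientID@TenantID
        "client_id": f"{client_id}@{tenant_id}",
        "client_secret": client_secret,
        # SharePoint principal / your tenant's SharePoint host @ tenant
        "resource": f"{SHAREPOINT_PRINCIPAL}/{tenant_name}.sharepoint.com@{tenant_id}",
    }
    return url, body
```

The `access_token` field of the JSON response is what the later Copy activity forwards in its Authorization header.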
After the Web activity succeeds, add a Copy data activity, and create linked services for HTTP and Data Lake.
HTTP linked service:
Base url: https://[site-url]/_api/web/GetFileByServerRelativeUrl('[relative-path-to-file]')/$value
Datalake linked service:
Create a dataset for the source as below (configuration shown as a screenshot in the original answer):
Additional headers: #{concat('Authorization: Bearer ', activity('<Web-activity-name>').output.access_token)}
Sink: the Data Lake dataset.
In my case, when I debugged the pipeline without entering Additional headers, I got the error below; after entering Additional headers and debugging again, it executed successfully.
For more clarification, you can refer to this.
AFAIK, for copying files from SharePoint to Azure Data Lake, Logic Apps are a better fit than Azure Data Factory or Synapse. You can try with Logic Apps here

Moving data from Google Big Query to Azure Data Lake Store using Azure Data Factory

I have a scenario where I need to connect Azure Data Factory (v2) to Google BigQuery to move data to my Azure Data Lake, but it doesn't work.
When I create a Linked Service, I choose BigQuery as the source and fill in all the BigQuery information (project name, type of connection, etc.), but when I click the Validate button I get a message like this:
UserError: ERROR [28000] [Microsoft][BigQuery] (80) Authentication failed: invalid_grant ERROR [28000] [Microsoft][BigQuery] (80) Authentication failed: invalid_grant'Type=,Message=ERROR [28000] [Microsoft][BigQuery] (80) Authentication failed: invalid_grant ERROR [28000] [Microsoft][BigQuery] (80) Authentication failed: invalid_grant,Source=,', activityId: 05ff5ce1-5262-4794-853c-c9e39b7f4b71
Any idea? Someone already tested this connector?
Tks.
Peeter Bonomo
The documentation for the ADF connector to BigQuery explains what the parameters mean, but it doesn't give any guidance on how to obtain the values for those parameters.
I spent a couple of days on this and finally got it to work. I used "User Authentication" because I wanted to use a cloud-based IR. The "Client Id" and "Client Secret" can be obtained by creating new credentials in the GCP Console. But to get the "Refresh Token", you have to do the OAuth2 dance with Google and intercept the token.
I put together a PowerShell script to do this for you and wrote up a post on GitHub that walks you through the process of authenticating ADF v2 to Google BigQuery:
https://github.com/AnalyticJeremy/ADF_BigQuery
This is the error you get for any access issue, unfortunately. It's also the error you get when your refresh token has expired, which it always does after 60 minutes. That is incredibly curious: this, like so many sets of instructions on OAuth 2.0 authentication for ADF, never mentions that all this work is to get a code that expires in 60 minutes. Without some method of refreshing it every time you connect, this is worthless. At least the following link mentions this error and that you get it because the token has expired; it's the only blog post (or Microsoft documentation) that bothers to mention this big issue: https://medium.com/@varunck/data-transfer-from-google-bigquery-to-azure-blob-storage-using-azure-data-factory-adf-80659c078f83
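For what it's worth, exchanging a stored refresh token for a fresh access token is a single POST against Google's OAuth 2.0 token endpoint. This is a sketch only; the function name is mine and the credentials are placeholders:

```python
# Sketch: exchanging a long-lived refresh token for a new access token
# before each run. Field names follow Google's OAuth 2.0 token endpoint.
TOKEN_ENDPOINT = "https://oauth2.googleapis.com/token"

def build_refresh_request(client_id, client_secret, refresh_token):
    body = {
        "grant_type": "refresh_token",
        "client_id": client_id,
        "client_secret": client_secret,
        "refresh_token": refresh_token,
    }
    return TOKEN_ENDPOINT, body
```

POSTing that form body returns JSON containing a short-lived `access_token`; the refresh token itself is what you store in the linked service.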
Here's a different method, which is what I will be attempting; it uses a service account and an IR: https://www.youtube.com/watch?v=oRqRt7ya_DM
According to https://learn.microsoft.com/en-us/azure/data-factory/connector-google-bigquery, to connect to Google BigQuery via cloud IR, you need to provide the below information:
{
  "name": "GoogleBigQueryLinkedService",
  "properties": {
    "type": "GoogleBigQuery",
    "typeProperties": {
      "project": "<project ID>",
      "additionalProjects": "<additional project IDs>",
      "requestGoogleDriveScope": true,
      "authenticationType": "UserAuthentication",
      "clientId": "<id of the application used to generate the refresh token>",
      "clientSecret": {
        "type": "SecureString",
        "value": "<secret of the application used to generate the refresh token>"
      },
      "refreshToken": {
        "type": "SecureString",
        "value": "<refresh token>"
      }
    }
  }
}
Also note that the user you use to grant permission to the application should have access to the project you want to query.
Thanks,
Eva

Azure AD Bearer invalid_token error using Postman

I am really new to Azure AD. I have read the Azure AD documentation, which provides information on authentication and accessing web APIs.
What I want to do: I want to use the Dynamics CRM API to create a lead or contact through AWS Lambda. Meaning, whenever the Lambda function runs, it should call the CRM API. The way I need to create a lead is with username and password credentials included in Lambda. I am not sure which application scenario I should use when AWS Lambda is the source accessing the web API. I want to pass the user credentials with a POST request.
Creating an application in Azure AD: I am not sure which application type I need to use (Web API or Native App?), and what the sign-on URL or Redirect URI should be.
I have tried creating an application and used Postman as a temporary way just to test whether I can get the access token and access the web API. I was able to get the access token, but when I tried to access the API it said
Bearer Error invalid_token, error validating token!
I have given enough permissions while creating application in Azure AD to access Dynamics CRM API. But still unable to access the API.
POST request to get access token through Postman:
request: POST
URL: https://login.windows.net/<tenant-id>/oauth2/token
Body:
grant_type: client_credentials
username: xxxxx
password: xxxxxxx
client_id: <app id>
resource: <resource> //I am not sure what to include here
client_secret: <secret_key>
I get the access token in the response. Sending the second POST request using the access token
request: POST
URL: https://xxx.api.crm.dynamics.com/api/data/v8.2/accounts
Headers:
Content-type: application/json
OData-MaxVersion: 4.0
OData-Version: 4.0
Authorization: Bearer <access_token>
Body:
{
  "name": "Sample Account",
  "creditonhold": false,
  "address1_latitude": 47.639583,
  "description": "This is the description of the sample account",
  "revenue": 5000000,
  "accountcategorycode": 1
}
It would really help me if I can get a bit more information on where I am stuck. I have already used my one week of time to get this done. Any help will be appreciated.
To do Server-to-Server (S2S) authentication, the application is authenticated based on a service principal identified by an Azure AD Object ID value, which is stored in the Dynamics 365 application user record. Please click here and here for detailed steps and code samples.
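A hedged sketch of the S2S token request this implies: the client-credentials grant with the Dynamics organization URL as the resource, no username or password involved. The function name and all values are placeholders:

```python
# Sketch: client-credentials (S2S) token request for Dynamics 365.
# The app registration's service principal must exist as an
# "application user" in the Dynamics instance.
def build_s2s_token_request(tenant_id, client_id, client_secret, org_url):
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # resource is the Dynamics org, e.g. https://xxx.crm.dynamics.com
        "resource": org_url,
    }
    return url, body
```

The resulting bearer token then goes into the `Authorization` header of the Web API POST, exactly as in the question's second request.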

Google Big-query api 403-Forbidden Exception

I am getting the following JSON exception while executing a query from my Java application against the BigQuery API:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "accessDenied",
        "message": "Access Denied: Job eco-span-847:job_LyHmZIvlY1_0J8JQ3pxThEBf19I: The user does not have permission to run a query in project eco-span-847"
      }
    ],
    "code": 403,
    "message": "Access Denied: Job eco-span-847:job_LyHmZIvlY1_0J8JQ3pxThEBf19I: The user does not have permission to run a query in project eco-span-847"
  }
}
Why is this occurring, and how can I resolve it?
To insert a query job in a project the calling user must be at least a Reader on the project.
When using the BigQuery API, you need to construct the client with OAuth credentials for some user. For programmatic access, this is often a Service Account identity. When you create a new Service Account, that account is not automatically granted a membership role on your project.
To update the users and service accounts that are members of your project, go to https://console.developers.google.com/ for your project, select "Permissions" in the navigation panel, and make sure the user or service account identity you are calling with is a "Reader" on the project.
Check whether the email/service account is added in Google Cloud's IAM & Admin.
Verify that its role/permissions allow use of the API: access control
If the above are not met, ask the Owner/Admin of the project to add them.
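Since the 403 payload above has a stable shape, client code can pull out the `reason` field to distinguish a missing project role from other failures. A small sketch (the function name is mine; nothing here calls the API):

```python
import json

# Sketch: extract code/reason/message from a BigQuery error payload
# like the one quoted in the question.
def classify_bigquery_error(payload: str):
    err = json.loads(payload)["error"]
    first = err["errors"][0]
    return err["code"], first["reason"], first["message"]

code, reason, msg = classify_bigquery_error(
    '{"error": {"errors": [{"domain": "global", "reason": "accessDenied",'
    ' "message": "Access Denied"}], "code": 403, "message": "Access Denied"}}'
)
# accessDenied -> the caller needs at least a Reader role on the project
```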

Google Purchase Status API HTTPS request

I am currently researching a way to use the Google Purchase Status API with just HTTP request calls, and I have hit a brick wall. I have an app setup with Google Play, and ownership of the Google Console account.
Basically, I just would like to check the status of a user's purchase on my server. The only information I should be using is the purchase token, product ID, and product package.
I have followed all the documentation on doing this at developer.android.com/google/play/billing/gp-purchase-status-api.html
The HTTPS request call I am attempting to make is this (product names and real strings substituted):
googleapis.com/androidpublisher/v1.1/applications/(com.product.myproduct)/inapp/(com.product.myproduct.product1)/purchases/(myproductpurchasestring)?access_token=(myaccesstokenstring)
and my response is always this:
{
"error": {
"errors": [
{
"domain": "androidpublisher",
"reason": "developerDoesNotOwnApplication",
"message": "This developer account does not own the application."
}
],
"code": 401,
"message": "This developer account does not own the application."
}
}
When polling my access token through this http request call:
googleapis.com/oauth2/v1/tokeninfo?access_token=(myaccesstokenstring)
this is my response:
{
"issued_to": "12345.apps.googleusercontent.com",
"audience": "12345.apps.googleusercontent.com",
"scope": "https://www.googleapis.com/auth/androidpublisher",
"expires_in": 3319,
"access_type": "offline"
}
So according to the documentation at https://developers.google.com/accounts/docs/OAuth2#webserver, I need to:
Authorise myself and retrieve a refreshable access token generated from 'Client ID for web applications' in the API access section of the Google API Console. I have done this.
Utilise this access token for Google API calls in either of two ways: appending it to the 'Authorization' HTTP header, or including it in the HTTPS request itself with the property access_token=(mytokenstring). This part does not work for me; I always get an unauthorised message.
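For illustration, a sketch of the header-based variant of the call, which is generally preferred over the query parameter. The builder function is hypothetical and every argument is a placeholder:

```python
# Sketch: Purchase Status API (androidpublisher v1.1) request with the
# token in the Authorization header rather than ?access_token=...
def build_purchase_status_request(package, product_id, purchase_token, access_token):
    url = (
        "https://www.googleapis.com/androidpublisher/v1.1/applications/"
        f"{package}/inapp/{product_id}/purchases/{purchase_token}"
    )
    headers = {"Authorization": f"Bearer {access_token}"}
    return url, headers

url, headers = build_purchase_status_request(
    "com.product.myproduct",
    "com.product.myproduct.product1",
    "<purchase-token>",
    "<access-token>",
)
```

Either way, the token must belong to an account that actually owns the app in the Play Console, which is what the accepted fix below comes down to.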
My question I guess would be: is it possible to use a simple HTTPS request call (without external library support) to retrieve the status of a purchased item without user interaction on backend servers?
I would really appreciate any help, most of the other threads are about how to go about getting a refresh token, but I have covered that already.
OK, I figured out my own problem with the help of a colleague. Basically, my access token was being generated under an account which wasn't linked to the project in any way. It is safest to use the project owner's Google account when generating the access token.
Phew!