Error trying to copy a SharePoint file to Data Lake - sharepoint-2010

Hello, could someone please help me with this?
I am trying to copy SharePoint files to my Data Lake but it's not working; I get this error:
{ "errorCode": "2200", "message": "ErrorCode=HttpRequestFailedWithClientError,
'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,
Message=Http request failed with client error, status code 403 Forbidden, please check your activity settings. If you configured a baseUrl that
includes path, please make sure it ends with '/'.\nRequest URL: ,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,
Message=The remote server returned an error: (403) Forbidden.,Source=System,'", "failureType": "UserError", "target": "Copy SharepointFile", "details": [] }

To copy data from SharePoint to Azure Data Lake, you first need to grant permissions to the app by following the procedure below:
Register an app in Azure and create a client secret for it; copy the secret value, the App ID, and the Tenant ID.
To grant the permission, open the URL below:
https://[your_site_url]/_layouts/15/appinv.aspx
Look up the app by the App ID you copied, fill in the fields, and paste the following into the Permission Request XML field:
<AppPermissionRequests AllowAppOnlyPolicy="true">
  <AppPermissionRequest Scope="http://sharepoint/content/sitecollection/web" Right="Read"/>
</AppPermissionRequests>
Then click Trust It.
After that, create a pipeline and add a Web activity with the following URL and body:
URL: https://accounts.accesscontrol.windows.net/[Tenant-ID]/tokens/OAuth/2
Body: grant_type=client_credentials&client_id=[Client-ID]@[Tenant-ID]&client_secret=[Client-Secret]&resource=00000003-0000-0ff1-ce00-000000000000/[Tenant-Name].sharepoint.com@[Tenant-ID]
Replace Tenant-ID, Client-Secret, Client-ID, and Tenant-Name with your details.
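In plain terms, the Web activity just performs a client-credentials token request against ACS. A minimal Python sketch of the equivalent call (same placeholder values as above; this is an illustration, not the ADF implementation itself):
import requests

tenant_id = "[Tenant-ID]"
client_id = "[Client-ID]"
client_secret = "[Client-Secret]"
tenant_name = "[Tenant-Name]"  # e.g. contoso for contoso.sharepoint.com

# Same endpoint and body the Web activity sends for SharePoint app-only access.
token_url = f"https://accounts.accesscontrol.windows.net/{tenant_id}/tokens/OAuth/2"
body = {
    "grant_type": "client_credentials",
    "client_id": f"{client_id}@{tenant_id}",
    "client_secret": client_secret,
    "resource": f"00000003-0000-0ff1-ce00-000000000000/{tenant_name}.sharepoint.com@{tenant_id}",
}

response = requests.post(token_url, data=body)
response.raise_for_status()
access_token = response.json()["access_token"]  # the same field the pipeline reads via output.access_token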
After the Web activity succeeds, add a Copy data activity.
Create HTTP and Data Lake linked services for it.
HTTP linked service:
Base URL: https://[site-url]/_api/web/GetFileByServerRelativeUrl('[relative-path-to-file]')/$value
Data Lake linked service: point it at your Data Lake Storage account.
Create the dataset for the source on the HTTP linked service with the following setting:
Additional headers: #{concat('Authorization: Bearer ', activity('<Web-activity-name>').output.access_token)}
Sink: a dataset on the Data Lake linked service.
In my case, when I debugged the pipeline without entering the Additional headers, I got the 403 error shown above. After entering the Additional headers and debugging again, the pipeline executed successfully.
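For reference, the copy activity's HTTP source is effectively doing a GET against the SharePoint REST endpoint with that bearer token. A rough Python sketch (placeholder site and file paths, placeholder token):
import requests

access_token = "<token-from-the-web-activity>"  # placeholder; see the token sketch above
site_url = "https://[Tenant-Name].sharepoint.com/sites/[site-name]"    # placeholder
relative_path = "/sites/[site-name]/Shared Documents/[file-name]"      # placeholder
file_url = f"{site_url}/_api/web/GetFileByServerRelativeUrl('{relative_path}')/$value"

# Without this Authorization header SharePoint returns 403 Forbidden, which matches the error above.
headers = {"Authorization": f"Bearer {access_token}"}

response = requests.get(file_url, headers=headers)
response.raise_for_status()
with open("downloaded_file.bin", "wb") as f:
    f.write(response.content)  # in the pipeline, the Data Lake sink writes this content instead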
For more clarification, you can refer to this.
AFAIK, for copying files from SharePoint to Azure Data Lake, Logic Apps are a better option than Azure Data Factory or Synapse. You can try Logic Apps here.

Related

DataFlow: missing required authentication credential

I am getting the following error while running a Dataflow pipeline:
Error reporting inventory checksum: code: "Unauthenticated", message: "Request is missing required authentication credential.
Expected OAuth 2 access token, login cookie or other valid authentication credential.
We have created the service account dataflow@12345678.iam.gserviceaccount.com with the following roles:
BigQuery Data Editor
Cloud KMS CryptoKey Decrypter
Dataflow Worker
Logs Writer
Monitoring Metric Writer
Pub/Sub Subscriber
Pub/Sub Viewer
Storage Object Creator
And in our Python code we are using import google.auth.
Any idea what I am missing here?
I do not believe I need to create a key for the SA; however, I am not sure whether an "OAuth 2 access token" needs to be created for the SA. If yes, how?
This was the issue in my case https://cloud.google.com/dataflow/docs/guides/common-errors#lookup-policies
If you are trying to access a service through HTTP with a custom request (not using a client library), you can obtain an OAuth2 token for that service account using the metadata server of the worker VM. See this example for Cloud Run; you can use the same code snippet in Dataflow to get a token and use it with your custom HTTP request:
https://cloud.google.com/run/docs/authenticating/service-to-service#acquire-token
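As a concrete illustration of that approach, here is a short Python sketch based on the Cloud Run example linked above (the receiving-service URL is a placeholder). It fetches a token for the worker's service account from the metadata server and attaches it to a custom HTTP request:
import requests

RECEIVING_SERVICE_URL = "https://example-service-xyz.a.run.app"  # placeholder audience/URL

# The metadata server is only reachable from inside the worker VM (Dataflow, GCE, Cloud Run, ...).
metadata_url = (
    "http://metadata.google.internal/computeMetadata/v1/instance/"
    f"service-accounts/default/identity?audience={RECEIVING_SERVICE_URL}"
)
token = requests.get(metadata_url, headers={"Metadata-Flavor": "Google"}).text

# Attach the token to the custom HTTP request as a bearer token.
response = requests.get(RECEIVING_SERVICE_URL, headers={"Authorization": f"Bearer {token}"})
print(response.status_code)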

Logic App query Azure Table using HTTP and Managed Identity authentication

I am trying to query Azure Table using the HTTP connector without success.
According to this document managed identity authentication is now possible for Azure Table:
https://learn.microsoft.com/en-us/azure/storage/tables/authorize-managed-identity
I have authorized the managed identity of the Consumption logic app on the Azure table using PowerShell, as the documentation suggests:
https://learn.microsoft.com/en-us/azure/storage/tables/assign-azure-role-data-access?tabs=powershell
New-AzRoleAssignment -ObjectID xxxxxxxxxxxxxxxx `
-RoleDefinitionName "Storage Table Data Contributor" `
-Scope "/subscriptions/<subscription>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>/tableServices/default/tables/<table>"
Then in the logic app I've filled in the request as documented in:
https://learn.microsoft.com/en-us/rest/api/storageservices/query-tables#request-headers
The run fails with a forbidden error complaining about the Authorization header:
"body": {
"odata.error": {
"code": "AuthenticationFailed",
"message": {
"lang": "en-US",
"value": "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.\nRequestId:8d5dbe66-d002-0005-26e6-45da23000000\nTime:2022-04-01T16:35:57.2213453Z"
}
}
}
Any ideas?
So basically, with the setup below (an HTTP action using managed identity authentication and the request headers required by the Table service) I was able to successfully query the Azure Table over HTTP.
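A hedged Python sketch of an equivalent request made outside Logic Apps (hypothetical account and table names; DefaultAzureCredential standing in for the Logic App's managed identity):
import requests
from datetime import datetime, timezone
from azure.identity import DefaultAzureCredential

account = "<storage-account>"  # placeholder
table = "<table>"              # placeholder

# Get an AAD token for the storage audience (the managed identity does this for you in the Logic App).
token = DefaultAzureCredential().get_token("https://storage.azure.com/.default").token

headers = {
    "Authorization": f"Bearer {token}",
    "x-ms-version": "2019-02-02",  # AAD auth requires a recent service version
    "x-ms-date": datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT"),
    "Accept": "application/json;odata=nometadata",
}

url = f"https://{account}.table.core.windows.net/{table}()"
response = requests.get(url, headers=headers)
print(response.status_code, response.json())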

Azure storage account container access via browser - getting authentication error

We want to be able to give some users access to a container so that they can see a list of their files in the browser and click on them to read them.
I created a storage account and a container with public access, created a few blobs in the container, and set the container's public access level to read.
But when I try to access that container in the browser, I get the following error. I created a SAS token for it in Storage Explorer:
https://teststorageaccount21.blob.core.windows.net/publicc?sv=2019-12-12&st=2021-02-08T20%3A58%3A42Z&se=2021-03-09T20%3A58%3A00Z&sr=c&sp=rl&sig=AjF0IpWIBGZtBeKcOodd8HieENZ0F3Cuig54Y8e0oIM%3D
<Error>
<Code>AuthenticationFailed</Code>
<Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:9cb791c8-a01e-0016-625f-fe693a000000 Time:2021-02-08T21:16:09.2176268Z</Message>
<AuthenticationErrorDetail>Signature did not match. String to sign used was rl 2021-02-08T20:58:42Z 2021-03-09T20:58:00Z /blob/teststorageaccount21/$root 2019-12-12 c </AuthenticationErrorDetail>
</Error>
When I try to access a blob within that container, I can access it without any problems.
This link can be used to see the file
https://teststorageaccount21.blob.core.windows.net/publicc/MyTest1.csv?sv=2019-12-12&st=2021-02-08T21%3A20%3A53Z&se=2021-03-09T21%3A20%3A00Z&sr=b&sp=r&sig=%2Bw8oi73NUU4w%2FqgSAsNvjVHwBi0SgaoQAK6%2BF8P8QQs%3D
Why am I not able to see the list of blobs at the container level with public access? Also, I need help understanding that authentication error; I'm not sure what signature didn't match.
Your SAS token is at the blob level; you need to generate a SAS token at the container level:
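If you prefer to do this in code instead of Storage Explorer, a minimal sketch with the azure-storage-blob SDK (the account key is a placeholder) could look like this:
from datetime import datetime, timedelta, timezone
from azure.storage.blob import ContainerSasPermissions, generate_container_sas

# Container-level SAS with read + list permissions, valid for 30 days.
sas_token = generate_container_sas(
    account_name="teststorageaccount21",
    container_name="publicc",
    account_key="<account-key>",  # placeholder
    permission=ContainerSasPermissions(read=True, list=True),
    expiry=datetime.now(timezone.utc) + timedelta(days=30),
)
print(sas_token)  # append this to the container URL as shown below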
If you list blobs with https://<storage-account-name>.blob.core.windows.net/<container>?<sas-token> in the browser, you will receive an error:
<Error>
<Code>ContainerNotFound</Code>
<Message>The specified container does not exist. RequestId:54174bad-401e-0046-218e-fe53dd000000 Time:2021-02-09T02:52:30.1104394Z</Message>
</Error>
So you need to use the List Blobs API; use a URL in this format:
https://<storage-account-name>.blob.core.windows.net/<container>?restype=container&comp=list&<sas-token>
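To show what that URL returns, here is a small Python sketch (placeholder SAS value) that calls the List Blobs API and prints the blob names from the XML response:
import requests
import xml.etree.ElementTree as ET

account = "teststorageaccount21"
container = "publicc"
sas_token = "sv=...&sr=c&sp=rl&sig=..."  # container-level SAS with list permission (placeholder)

url = f"https://{account}.blob.core.windows.net/{container}?restype=container&comp=list&{sas_token}"
response = requests.get(url)
response.raise_for_status()

# The List Blobs API returns XML; print the name of each blob in the container.
for name in ET.fromstring(response.text).iter("Name"):
    print(name.text)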

Azure File Storage the server responded with a status of 400 (Condition headers are not supported.)

I have some PNG files stored in Azure File Storage, and I'm retrieving and displaying them from my MVC web project in the browser. But sometimes I get the error message below in the browser console.
Failed to load resource: the server responded with a status of 400
(Condition headers are not supported.)
If I refresh the page, the error message disappears automatically. But that doesn't solve my problem: when I run my MVC project again, the same error comes back. How can I solve it?
It's actually a common issue with Azure Storage, and it is listed in the official reference Common REST API Error Codes.
It means that the Get File REST API does not support request headers (such as condition headers) that are not listed in its Request Headers section.
There is a similar SO thread, Azure File Storage Error: Condition Headers Are Not Supported, which hit the same issue as yours. The behavior seems to differ across browsers when you get a file from Azure File Storage.
I could not reproduce this issue with a file URL plus SAS token, but I really recommend storing static files like images in Azure Blob Storage instead, which as far as I know is the Azure best practice, and serving an image by its URL with a SAS token or from a public container.

Moving data from Google Big Query to Azure Data Lake Store using Azure Data Factory

I have a scenario where I need to connect Azure Data Factory (v2) to Google BigQuery to move data to my Azure Data Lake, but it doesn't work.
When I create a Linked Service, I choose BigQuery as the source and enter all the information about BigQuery, such as the project name, connection type, etc., but when I click the Validate button the following message is shown (for example):
UserError: ERROR [28000] [Microsoft][BigQuery] (80) Authentication
failed: invalid_grant ERROR [28000] [Microsoft][BigQuery] (80)
Authentication failed: invalid_grant'Type=,Message=ERROR [28000]
[Microsoft][BigQuery] (80) Authentication failed: invalid_grant ERROR
[28000] [Microsoft][BigQuery] (80) Authentication failed:
invalid_grant,Source=,', activityId:
05ff5ce1-5262-4794-853c-c9e39b7f4b71
Any idea? Has someone already tested this connector?
Thanks,
Peeter Bonomo
The documentation for the ADF connector to BigQuery explains what the parameters mean, but it doesn't give any guidance on how to obtain the values for those parameters.
I spent a couple of days on this and finally got it to work. I used "User Authentication" because I wanted to use a cloud-based IR. The "Client Id" and "Client Secret" can be obtained by creating new credentials in the GCP Console. But to get the "Refresh Token", you have to do the OAuth2 dance with Google and intercept the token.
I put together a PowerShell script to do this for you and wrote up a post on GitHub that walks you through the process of authenticating ADF v2 to Google BigQuery:
https://github.com/AnalyticJeremy/ADF_BigQuery
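If you'd rather not use PowerShell, a rough Python sketch of the same OAuth2 dance (assuming you created a "Desktop app" OAuth client in the GCP Console and downloaded its client_secrets.json) could look like this:
from google_auth_oauthlib.flow import InstalledAppFlow

SCOPES = ["https://www.googleapis.com/auth/bigquery"]

# Opens a browser for consent; the library requests offline access by default,
# which is what makes Google return a refresh token on the first consent.
flow = InstalledAppFlow.from_client_secrets_file("client_secrets.json", scopes=SCOPES)
credentials = flow.run_local_server(port=0)

print("Client ID:     ", credentials.client_id)
print("Client Secret: ", credentials.client_secret)
print("Refresh Token: ", credentials.refresh_token)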
This is the error you get for any access issue, unfortunately. It's also the error you get when your refresh token has expired, which it always does after 60 minutes. Which is incredibly curious... like so many sets of instructions on OAuth 2.0 authentication for ADF, they never mention that all this work is to get a code that expires in 60 minutes. Without some method of refreshing it every time you connect, this is worthless. At least the following link mentions this error and that you get it because the token has expired; it's the only blog post (or piece of Microsoft documentation) that bothers to mention this super big issue: https://medium.com/@varunck/data-transfer-from-google-bigquery-to-azure-blob-storage-using-azure-data-factory-adf-80659c078f83
Here's a different method, which is what I will be attempting; it uses a service account and an IR: https://www.youtube.com/watch?v=oRqRt7ya_DM
According to https://learn.microsoft.com/en-us/azure/data-factory/connector-google-bigquery, to connect to Google BigQuery via a cloud IR you need to provide the information below:
{
    "name": "GoogleBigQueryLinkedService",
    "properties": {
        "type": "GoogleBigQuery",
        "typeProperties": {
            "project": "<project ID>",
            "additionalProjects": "<additional project IDs>",
            "requestGoogleDriveScope": true,
            "authenticationType": "UserAuthentication",
            "clientId": "<id of the application used to generate the refresh token>",
            "clientSecret": {
                "type": "SecureString",
                "value": "<secret of the application used to generate the refresh token>"
            },
            "refreshToken": {
                "type": "SecureString",
                "value": "<refresh token>"
            }
        }
    }
}
Also note that the user you use to grant permission to the application should have access to the project you want to query.
Thanks,
Eva