While deploying the template through Visual Studio, I am getting an error.
VM has reported a failure when processing extension 'dscExtension'. Error message:
The DSC Extension failed to execute: Error downloading
https://stagef848a9a999ac4175a5c.blob.core.windows.net/myresourcegroup-stageartifactsscripts/WebServerConfig.ps1.zip
after 2 attempts: AuthenticationFailed: Server
failed to authenticate the request. Make sure the value of
Authorization header is formed correctly including the signature.
13:58:38 - RequestId:d731468a-601e-00be-15ea-b36a90000000
13:58:38 - Time:2018-03-04T18:58:06.8482865Z
Signature did not match. String to sign used was r
When I looked for a solution, someone suggested appending
&comp=list&restype=container to the artifacts location. I tried that, but it failed.
What is the exact location in the JSON (which specific parameter) where I should add this string?
According to your error message, it seems that the blob SAS token you supplied for downloading the blob is invalid. You could try downloading the blob directly with the SAS token URL to check it. If it is invalid, you can easily generate a new blob SAS token with the Azure portal or Azure Storage Explorer.
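You can also script that check; here is a minimal PowerShell sketch, assuming you have the storage account key (the key, container, and blob path below are placeholders, not values from your template):
# Build a storage context from the account key (placeholder values)
$ctx = New-AzureStorageContext -StorageAccountName "stagef848a9a999ac4175a5c" -StorageAccountKey "<account-key>"
# Generate a read-only SAS URI for the artifact blob, valid for one hour
$sasUri = New-AzureStorageBlobSASToken -Container "<container>" -Blob "<path/to/WebServerConfig.ps1.zip>" -Permission r -ExpiryTime (Get-Date).AddHours(1) -FullUri -Context $ctx
# Try downloading with the SAS URI to verify the token works
Invoke-WebRequest -Uri $sasUri -OutFile WebServerConfig.ps1.zip
If the download succeeds, the token itself is fine and the problem is more likely in how the template concatenates the artifacts location with the SAS token (typically the _artifactsLocation and _artifactsLocationSasToken parameters in a Visual Studio-generated template).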
Looking for some help to resolve the errors I'm facing. Let me explain the scenario. I'm trying to sync one of the ADLS Gen2 containers to Azure Blob Storage. I have AzCopy 10.4.3 and I'm using azcopy sync to do this, with the command below:
azcopy sync 'https://ADLSGen2.blob.core.windows.net/testsamplefiles/SAMPLE' 'https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE' --recursive
When I run this command, I get the error below:
REQUEST/RESPONSE (Try=1/71.0063ms, OpTime=110.9373ms) -- RESPONSE SUCCESSFULLY RECEIVED
PUT https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE/SampleFile.parquet?blockid=ZDQ0ODlkYzItN2N2QzOWJm&comp=block&timeout=901
X-Ms-Request-Id: [378ca837-d01e-0031-4f48-34cfc2000000]
ERR: [P#0-T#0] COPYFAILED: https://ADLSGen2.blob.core.windows.net/testsamplefiles/SAMPLE/SampleFile.parquet: 404 : 404 The specified resource does not exist.. When Staging block from URL. X-Ms-Request-Id: [378ca837-d01e-0031-4f48-34cfc2000000]
Dst: https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE/SampleFile.parquet
REQUEST/RESPONSE (Try=1/22.9854ms, OpTime=22.9854ms) -- RESPONSE SUCCESSFULLY RECEIVED
GET https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE/SampleFile.parquet?blocklisttype=all&comp=blocklist&timeout=31
X-Ms-Request-Id: [378ca84e-d01e-0031-6148-34cfc2000000]
So far I have checked and ensured the following:
I logged into the correct tenant when logging into AzCopy
The Storage Blob Data Contributor role was granted to my AD credentials
Not sure what else I'm missing, as the file exists in the source and I keep getting the same error. I tried with SAS but received a different error. I cannot proceed with SAS due to vendor policy, so I need to get this working with OAuth. Any input is really appreciated.
For the 404 error, you may check whether there is a typo in the command and whether the path /testsamplefiles/SAMPLE exists on both the source and destination accounts. Also, please note this tip from the documentation:
Use single quotes in all command shells except for the Windows Command
Shell (cmd.exe). If you're using a Windows Command Shell (cmd.exe),
enclose path arguments with double quotes ("") instead of single
quotes ('').
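For example, the sync command from the question written for the Windows Command Shell would use double quotes:
azcopy sync "https://ADLSGen2.blob.core.windows.net/testsamplefiles/SAMPLE" "https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE" --recursive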
From the azcopy sync supported scenarios:
Azure Blob <-> Azure Blob (Source must include a SAS or is publicly
accessible; either SAS or OAuth authentication can be used for
destination)
So we must include a SAS token in the source. Still, I tried the command below with AD authentication:
azcopy sync "https://[account].blob.core.windows.net/[container]/[path/to/blob]?[SAS]" "https://[account].blob.core.windows.net/[container]/[path/to/blob]"
but got the same 400 error as in the GitHub issue.
Thus, in this case, after my validation, you can use the following command to sync the ADLS Gen2 container to Azure Blob Storage without running azcopy login. If you have logged in, you can run azcopy logout first.
azcopy sync "https://nancydl.blob.core.windows.net/container1/sample?sv=xxx" "https://nancytestdiag244.blob.core.windows.net/container1/sample?sv=xxx" --recursive --s2s-preserve-access-tier=false
I am trying to run this PowerShell cmdlet:
Get-AzureRmDataLakeStoreChildItem -AccountName "xxxx" -Path "xxxxxx"
It fails with an access error. It does not really make sense because I have complete access to the ADLS account. I can browse it in the Azure portal. It does not even work with an AzureRunAsConnection from an Automation account. But it works perfectly for my colleague. What am I doing wrong?
Error:
Operation: LISTSTATUS failed with HttpStatus:Forbidden
RemoteException: AccessControlException LISTSTATUS failed with error
0x83090aa2 (Forbidden. ACL verification failed. Either the resource
does not exist or the user is not authorized to perform the requested
operation.).
[1f6e5d40-9be1-4682-84be-d538dfca0d19][2019-01-24T21:12:27.0252648-08:00]
JavaClassName: org.apache.hadoop.security.AccessControlException.
Last encountered exception thrown after 1 tries. [Forbidden (
AccessControlException LISTSTATUS failed with error 0x83090aa2
(Forbidden. ACL verification failed. Either the resource does not
exist or the user is not authorized to perform the requested
operation.).
I don't see any firewall restrictions.
I resolved the problem by providing read and execute access to all parent folders in the path. Since ADLS uses the POSIX standard, it does not inherit permissions from parent folders. So, even though the SPN (generated by the Automation account) I was using had read/execute access to the specific folder I was interested in, it did not have access to the other folders in that path.
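A minimal PowerShell sketch of granting those ACL entries, assuming the AzureRM DataLakeStore module; the folder names and the SPN object ID below are hypothetical:
# Object ID of the SPN (hypothetical placeholder)
$spnId = "00000000-0000-0000-0000-000000000000"
# Each parent folder in the path needs at least read + execute for the SPN
foreach ($folder in @("/", "/data", "/data/reports")) {
    Set-AzureRmDataLakeStoreItemAclEntry -AccountName "xxxx" -Path $folder -AceType User -Id $spnId -Permissions ReadExecute
}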
I am trying to binary-copy a few .ZIP files sequentially from FTP to ADLS. Sometimes it fails, sometimes not; it's really strange to me. I get this type of error only when working with this external FTP server.
Error type:
{
"errorCode": "2200",
"message": "Failure happened on 'Sink' side. ErrorCode=UserErrorFailedToReadFtpData,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to read data from ftp: The remote server returned an error: (530) Not logged in.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (530) Not logged in.,Source=System,'",
"failureType": "UserError",
"target": "Copy from FTP"
}
The connection is good; as I said, sometimes it copies the files without any errors. This is a simple activity, so I don't know what could cause this type of error.
Sometimes it throws the error after copying 50 MB to ADLS.
Can it be related to the FTP server?
A possible root cause could be:
Your FTP server does not support SSL, but you enabled SSL in the FTP linked service. If so, you can disable SSL in the FTP linked service. Check out the FTP properties here: https://learn.microsoft.com/en-us/azure/data-factory/data-factory-ftp-connector
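A hedged sketch of redeploying the linked service with SSL disabled, using the Data Factory (v1) PowerShell cmdlet; the resource names, host, and credentials are placeholders, and the typeProperties follow the FTP connector page linked above:
# FtpLinkedService.json (shown here as comments; all values are placeholders)
# {
#   "name": "FtpLinkedService",
#   "properties": {
#     "type": "FtpServer",
#     "typeProperties": {
#       "host": "myftpserver.com",
#       "authenticationType": "Basic",
#       "username": "myuser",
#       "password": "mypassword",
#       "enableSsl": false
#     }
#   }
# }
# Create or replace the linked service from the JSON definition
New-AzureRmDataFactoryLinkedService -ResourceGroupName "myRG" -DataFactoryName "myDF" -Name "FtpLinkedService" -File ".\FtpLinkedService.json" -Force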
Telemetry shows that Copy can sometimes pass or fail with the same payload, so it looks like a transient failure. But it is hard to determine the root cause from the error message ("530 Not logged in"). What I suspect is that Copy hit throttling or a similar transient issue on the FTP server, which blocked the read request midway.
To troubleshoot further, could you check on the FTP server side whether there is a detailed failure log? Besides, it would be a great help if I could get a test account to examine the FTP server behavior and try to reproduce the issue. Please let me know if that is possible for you.
Regards,
Gary
The ReportViewer is working fine on my local system.
When I set the report path and remote server URL details:
serverurl : http://sriventech.in/ReportServer
reportpath : /invoice.rdlc
and upload the website,
it displays this error:
The attempt to connect to the report server failed. Check your connection information and that the report server is a compatible version.
The request failed with HTTP status 404: Not Found.
Is there any problem with the web server, or with my report path and server URL details?
Unless you are in Local Mode, you have to provide credentials to the reporting server to run reports. The generic 404 is probably masking an authentication error. Make sure you are in the correct Active Directory group if you are using integrated authentication. Otherwise the page should prompt you to log in.
I created an SSIS package that uses FTP to pull files from an FTP server and save them to my local drive, but I'm getting this issue.
With the same error message I was previously getting only a warning, but today the job fails.
Message:
Executed as user: cam\Package.Runner. Microsoft (R) SQL Server Execute Package Utility Version 10.0.4000.0 for 64-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. Started: 10:00:00 AM Error: 2012-02-15 10:00:00.61 Code: 0xC0016016 Source: Description: Failed to decrypt protected XML node "DTS:Password" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available. End Error Error: 2012-02-15 10:00:00.62 Code: 0xC0016016 Source: Description: Failed to decrypt protected XML node "DTS:Property" with error 0x8009000B "Key not valid for use in specified state.". You may not be authorized to access this information. This error occurs when there is a cryptographic error. Verify that the correct key is available. End Error Error: 2012-02-15 10:00:33.53 Code: 0xC0029183 Source: Principal Balance File FTP Get FTP Task Description: File represented by "/Concerto/Virtus_Reports/Concerto Principal Balance Report*.pdf" does not exist. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 10:00:00 AM Finished: 10:00:33 AM Elapsed: 33.088 seconds. The package execution failed. The step failed.
You can fix this issue by setting the ProtectionLevel property:
ProtectionLevel: DontSaveSensitive
With this setting, the package will not be password protected, and another server can access and execute the job with other credentials; sensitive values such as the FTP password then have to be supplied at run time, as sketched below.
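For instance, you could pass the password on the dtexec command line with /SET; a sketch, where the package path, connection manager name, and password are illustrative:
dtexec /FILE "C:\path\to\package.dtsx" /SET "\Package.Connections[FTP Connection Manager].Properties[ServerPassword]";MyFtpPassword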
While importing the package to SQL Server choose Protection Level:
Either
1- Don't save sensitive data.
Or
2- Rely on Server Storage and roles for access control.
Screenshot from SSIS Project Package Properties (image omitted).
Please try saving your package with the option "EncryptSensitiveWithPassword".
Step 1: Right-click your FTP connection manager, go to its Properties (the very bottom, not the Edit button), and type in the password.
Step 2: Save your package with EncryptSensitiveWithPassword.
Step 3: Now edit the command line in the SQL Server Agent job as below:
/FILE "C:\Fullpath of SSIS pkg.dtsx" /DECRYPT password
Before building and deploying the package, please be sure you've changed this solution property:
Run64BitRuntime = False
I got the same error message for FTP connections. I think it was caused by opening the package while running BIDS under different credentials from those I created it with.
As a clunky workaround, I deleted and re-created the FTP connection. It worked fine afterwards.
The main part of your SSIS job error is:
" 0xC0029183 Source: Principal Balance File FTP Get FTP Task Description: File represented by "/Concerto/Virtus_Reports/Concerto Principal Balance Report*.pdf" does not exist. End Error DTExec: The package execution returned DTSER_FAILURE (1). Started: 10:00:00 AM Finished: 10:00:33 AM Elapsed: 33.088 seconds. The package execution failed. "
It seems that there is no PDF file at the path you have configured in your SSIS package.
Please check the path and the PDF files to import.
Best regards,
Branislav
In the case of an FTP connection, you can create a Script Task before the FTP task and set the password there, for example:
// Look up the FTP connection manager and inject the password from a package variable at run time
ConnectionManager FTPConn = Dts.Connections["FTP Connection Manager"];
FTPConn.Properties["ServerPassword"].SetValue(FTPConn, Dts.Variables["FtpPwd"].Value.ToString());
In the case of OLE DB, you can just add the password to the connection string of the OLE DB connection.
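For illustration, an OLE DB connection string with the password embedded might look like this (the provider, server, database, and credentials are placeholders):
Provider=SQLNCLI11.1;Data Source=myServer;Initial Catalog=myDatabase;User ID=myUser;Password=myPassword;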
These are the steps that worked for me:
Copied the package to another folder (just to be safe)
Set the protection level to EncryptAllWithPassword and gave it a password like 'Test'
Changed the creator name to another user that I created
Recreated the job
Created a proxy to run the job
It worked!