We are using Azure Data Factory to copy SQL data (stored in ORC files) from Azure Data Lake into Azure SQL Data Warehouse. We keep getting the following error on one specific table:
"errorCode": "2200",
"message": "ErrorCode=UserErrorJavaInvocationException,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=An error happened when invoking java, message: java.lang.OutOfMemoryError:Java heap space.,Source=Microsoft.DataTransfer.Richfile.OrcTransferPlugin,''Type=Microsoft.DataTransfer.Richfile.JniExt.JavaBridgeException,Message=,Source=Microsoft.DataTransfer.Richfile.HiveOrcBridge,'",
"failureType": "UserError",
"target": "ADLToADWCopy"
This table gave the same error when we copied from the source SQL DB into Azure Data Lake, and we were able to resolve that by tweaking the JVM memory allocation on the self-hosted Integration Runtime machine:
setx _JAVA_OPTIONS "-Xms256m -Xmx16g" /M
Is there a way to pass something similar to the Azure-hosted Integration Runtime?
When I try to query our Serverless SQL pool in Azure Synapse Analytics I get the following error:
"Content of directory on path 'https://xxxxxx.dfs.core.windows.net/dataverse-xxxxxx-org5a2bcccf/account/Snapshot/2018-08_1656570292/*.csv' cannot be listed.".
I have checked the following link for clues as to what the cause could be:
https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/resources-self-help-sql-on-demand?tabs=x80070002
It suggests that the error is due to permissions. However, I believe I have the correct permissions.
I get this error whether I try to execute the query in SSMS or Synapse Workspace.
The error in SSMS is as follows:
Warning: Unable to resolve path https://xxxxx.dfs.core.windows.net/dataverse-xxxxx-org5a2bcccf/account/Snapshot/2018-10_1657304551/*.csv. Error number 13807, Level 16, State 1, Message "Content of directory on path 'https://xxxxxx.dfs.core.windows.net/dataverse-xxxxx-org5a2bcccf/account/Snapshot/2018-10_1657304551/*.csv' cannot be listed.".
Can someone let me know how to resolve this?
The query that I'm attempting to execute can be located here:
https://github.com/slavatrofimov/Synapse-Link-for-Dataverse-data-enrichment-in-Serverless-SQL-Pools/blob/main/SQL/Enrich%20Synapse%20Link%20for%20Dataverse%20Entities%20with%20Human-Readable%20Labels.sql
Is there a definitive way to determine if the problem is due to lack of permissions?
Update:
I have just realised that the issue is accessing the lake at https://xxxxxx.dfs.core.windows.net/dataverse-xxxxxx-org5a2bcccf/
Therefore, please take a look at my permissions on the lake and let me know whether they are sufficient.
This issue occurs when the user trying to query the external table does not have the relevant permissions, or when a firewall is enabled on your storage network.
Looking at the permissions you have provided, I see that Storage Blob Data Reader and Storage Blob Data Contributor have been granted.
Ref doc: Control storage account access for serverless SQL pool in Azure Synapse Analytics
If your storage account is firewall-protected, you will have to follow the steps described in this document to overcome the issue: Access storage that is protected with the firewall
Here are a couple of relevant articles that might help you configure your storage firewall to overcome this issue:
Storage configuration for external table is not accessible while query on Serverless
Synapse Studio error while trying to read data from Storage Account using SQL On Demand
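To answer the question about a definitive check: one option is to list the same folder outside of Synapse with the same identity. The following is a minimal sketch, assuming the azure-identity and azure-storage-file-datalake Python packages and the placeholder account/container names from the question. If this listing succeeds but the serverless query still fails, the storage firewall (rather than RBAC/ACLs) is the more likely cause.
# Minimal sketch: check whether the identity you use in SSMS/Synapse Studio can
# list the Snapshot folder directly. Account and container names below are the
# placeholders from the question.
from azure.identity import InteractiveBrowserCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = InteractiveBrowserCredential()  # sign in as the same AAD user
service = DataLakeServiceClient(
    account_url="https://xxxxxx.dfs.core.windows.net",
    credential=credential,
)
file_system = service.get_file_system_client("dataverse-xxxxxx-org5a2bcccf")
try:
    for item in file_system.get_paths(path="account/Snapshot", recursive=False):
        print(item.name)
    print("Listing succeeded - RBAC/ACLs look sufficient; check the storage firewall next.")
except Exception as exc:  # a 403 points at permissions; a network/timeout error points at the firewall
    print("Listing failed:", exc)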
I have a service principal with which I am trying to create an external table over Azure Data Lake Gen1. The external table creation fails with the error:
Error occurred while accessing HDFS: Java exception raised on call to HdfsBridge_IsDirExist.
Java exception message:
HdfsBridge::isDirExist - Unexpected error encountered checking whether directory exists or not:
IOException: Server returned HTTP response code: 401
I understand that this is an unauthorized error, but I have checked that this service principal has the proper role assignment on the Azure Data Lake Gen1 storage. What else could be causing the unauthorized issue here? Does the Synapse SQL instance where I am creating the external table also need access to ADLS Gen1?
Please note that the Synapse SQL instance and the ADLS Gen1 instance are in different resource groups.
I just checked the service principal I was using to create the database scoped credential: its secret had expired on a periodic schedule. Renewing the secret and using the updated value fixed the issue.
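For anyone hitting the same 401: a quick way to confirm that an expired secret is the culprit, before touching the database scoped credential, is to request a token with the same service principal outside of Synapse. Below is a minimal sketch, assuming the azure-identity Python package and placeholder tenant/client/secret values; https://datalake.azure.net/ is the ADLS Gen1 token resource.
# Minimal sketch: request a token for the ADLS Gen1 resource with the service
# principal's current secret. Tenant/client/secret values are placeholders.
from azure.identity import ClientSecretCredential
from azure.core.exceptions import ClientAuthenticationError

credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<client-id>",
    client_secret="<client-secret>",
)
try:
    token = credential.get_token("https://datalake.azure.net/.default")
    print("Secret is valid; token expires at", token.expires_on)
except ClientAuthenticationError as exc:
    print("Authentication failed - the secret has likely expired:", exc)
If the token request fails, renew the secret and update the database scoped credential with the new value.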
Using the preview of Synapse Analytics Workspace and the related Synapse Studio, I have created a Data Flow that simply loads a Parquet file from a Data Lake Gen2 store into a table inside a SQL pool. Running the pipeline that contains only this Data Flow, I got the error:
Livy Id=[0] Job failed during run time with state=[dead].
In Synapse Studio, under Monitor -> Apache Spark applications, I found the driver stderr log for the failed Spark application. There was a row stating:
ERROR Dataflow AppManager: name=AppManager.main, opId=AppManager fail, unexpected:java.lang.NoSuchMethodError: com.microsoft.azure.kusto.ingest.IngestionProperties.setJsonMappingName(Ljava/lang/String;)V, message=adfadf
Has any of you ever seen such an error?
I'm looking for some help resolving the errors I'm facing. Let me explain the scenario: I'm trying to sync one of the ADLS Gen2 containers to Azure Blob Storage. I have AzCopy 10.4.3 and I'm using azcopy sync to do this, with the command below:
azcopy sync 'https://ADLSGen2.blob.core.windows.net/testsamplefiles/SAMPLE' 'https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE' --recursive
When I run this command I get the error below:
REQUEST/RESPONSE (Try=1/71.0063ms, OpTime=110.9373ms) -- RESPONSE SUCCESSFULLY RECEIVED
PUT https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE/SampleFile.parquet?blockid=ZDQ0ODlkYzItN2N2QzOWJm&comp=block&timeout=901
X-Ms-Request-Id: [378ca837-d01e-0031-4f48-34cfc2000000]
ERR: [P#0-T#0] COPYFAILED: https://ADLSGen2.blob.core.windows.net/testsamplefiles/SAMPLE/SampleFile.parquet: 404 : 404 The specified resource does not exist.. When Staging block from URL. X-Ms-Request-Id: [378ca837-d01e-0031-4f48-34cfc2000000]
Dst: https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE/SampleFile.parquet
REQUEST/RESPONSE (Try=1/22.9854ms, OpTime=22.9854ms) -- RESPONSE SUCCESSFULLY RECEIVED
GET https://AzureBlobStorage.blob.core.windows.net/testsamplefiles/SAMPLE/SampleFile.parquet?blocklisttype=all&comp=blocklist&timeout=31
X-Ms-Request-Id: [378ca84e-d01e-0031-6148-34cfc2000000]
So far I have checked and ensured the following:
I logged into the correct tenant when logging into AzCopy.
The Storage Blob Data Contributor role was granted to my AD credentials.
I'm not sure what else I'm missing, as the file exists in the source and I keep getting the same error. I tried with SAS but received a different error. I cannot proceed with SAS due to vendor policy, so I need to ensure this works with OAuth. Any input is really appreciated.
For the 404 error, you may check whether there is a typo in the command and whether the path /testsamplefiles/SAMPLE exists on both the source and destination account. Also, please note the following from the tips:
Use single quotes in all command shells except for the Windows Command Shell (cmd.exe). If you're using a Windows Command Shell (cmd.exe), enclose path arguments with double quotes ("") instead of single quotes ('').
From the azcopy sync supported scenarios:
Azure Blob <-> Azure Blob (Source must include a SAS or is publicly accessible; either SAS or OAuth authentication can be used for destination)
We must include a SAS token in the source, but I tried the command below with AD authentication:
azcopy sync "https://[account].blob.core.windows.net/[container]/[path/to/blob]?[SAS]" "https://[account].blob.core.windows.net/[container]/[path/to/blob]"
but got the same 400 error as in the GitHub issue.
Thus, in this case, after my validation, you could use the following command to sync an ADLS Gen2 container to Azure Blob Storage without executing azcopy login. If you have already logged in, you can run azcopy logout first.
azcopy sync "https://nancydl.blob.core.windows.net/container1/sample?sv=xxx" "https://nancytestdiag244.blob.core.windows.net/container1/sample?sv=xxx" --recursive --s2s-preserve-access-tier=false
I am trying to connect to Data Lake Gen2 storage from Databricks (Python); unfortunately, I am running into an error.
Code:
dbutils.fs.ls("abfss://<fsystem name>#<storage name>.dfs.core.windows.net/<folder name>")
Error Message:
Configuration property .dfs.core.windows.net not found.
I wonder whether it has something to do with my mount code. Additionally, I have added the tenant ID to the container's "Manage Access" using Storage Explorer.
Here is my mount code:
configs = {"fs.azure.account.auth.type": "OAuth",
"fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
"fs.azure.account.oauth2.client.id": "<client ID>",
"fs.azure.account.oauth2.client.secret": "secret",
"fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/directory id/oauth2/token"}
dbutils.fs.mount( source = "abfss://filesystem name#<storage name>.dfs.core.windows.net/", mount_point = /mnt/soldel", extra_configs = configs)
The mount code ran fine, with no errors. Please advise.
Note: You cannot access the Azure Data Lake Gen2 account without configuring the storage account with Databricks.
This is an expected error message because you haven't configured the storage account with Databricks to list the filesystem.
Kindly review the error message and follow the correct process for listing a filesystem in Databricks.
For more details, refer "Databricks - Azure Data Lake Storage Gen2".
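As an illustration of the direct-access pattern described there (not a verbatim excerpt from the doc), here is a minimal sketch that configures session-level OAuth access to the storage account in a notebook and then lists the filesystem over abfss. The placeholder names follow the question; note that abfss URIs use '@' between the filesystem (container) name and the storage account name.
# Minimal sketch (placeholders in angle brackets): configure session-level OAuth
# access to the ADLS Gen2 account, then list the filesystem directly over abfss.
storage_account = "<storage name>"
client_id = "<client ID>"
client_secret = dbutils.secrets.get(scope="<scope>", key="<secret key>")  # avoid hard-coding the secret
tenant_id = "<directory id>"

spark.conf.set(f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net", "OAuth")
spark.conf.set(f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net",
               "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider")
spark.conf.set(f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net", client_id)
spark.conf.set(f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net", client_secret)
spark.conf.set(f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net",
               f"https://login.microsoftonline.com/{tenant_id}/oauth2/token")

# Note the '@' between the filesystem name and the storage account.
display(dbutils.fs.ls(f"abfss://<fsystem name>@{storage_account}.dfs.core.windows.net/<folder name>"))
If you prefer the mount approach instead, the same OAuth values can be passed through extra_configs as in the question, and the mounted path is then listed with dbutils.fs.ls("/mnt/soldel").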
Hope this helps.