Virto Commerce catalog export to CSV issue with blob/local storage setting on Azure deployment - virtocommerce

I have an Azure deployment; however, when I export the catalog to CSV, the export produces a download link pointing to localhost (see the paste below).
I have tried to find the setting to change this to blob storage, but I am new to the software. If I change localhost to the URL of the deployment, it gives a 404 error. I believe there is a setting somewhere to configure local or blob storage; could you please direct us to where this is?
By the way, this is currently a default Azure deployment.
Thanks for your help.
Here is a paste of what it returns:
Catalog to CSV export
Start export
Start: 4:00:39 AM
End: 4:01:33 AM
Total count: 5
Processed count: 5
Error count: 0
Download Url: http://localhost/admin/Assets/temp/Catalog-newjersey-export.csv

How to switch from local to Azure Blob Storage is described here: Working with blob storages.
To solve your problem and switch to Azure Blob Storage, take the following steps:
Open your Azure storage account (blob) access keys and copy the primary connection string.
In the Virto Commerce Manager web.config, replace the existing AssetsConnectionString line with:
<add name="AssetsConnectionString" connectionString="provider=AzureBlobStorage; { your storage account connection string }" />
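If you prefer to pull the key with PowerShell rather than copy it from the portal, here is a minimal sketch (the resource group and account names are placeholders, and the provider=AzureBlobStorage prefix matches the line above):

# Placeholders: replace with your own resource group and storage account names
$rg = "<resource-group>"
$acct = "<storageaccount>"

# Grab the primary key and build a standard storage connection string
$key = (Get-AzStorageAccountKey -ResourceGroupName $rg -Name $acct)[0].Value
$connectionString = "DefaultEndpointsProtocol=https;AccountName=$acct;AccountKey=$key;EndpointSuffix=core.windows.net"

# Value to place after "provider=AzureBlobStorage;" in the AssetsConnectionString entry
"provider=AzureBlobStorage;$connectionString"

Re-running the export afterwards should produce a download URL that points at the blob endpoint instead of localhost.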

Related

Not able to get Azure SQL Server Extended Events to work when Blob Storage is set to Enabled from selected virtual networks and IP addresses

So I have an Azure Database and want to test extended events with the database.
I was able to set up my Blob Storage container and get Extended Events from the Azure database to work as long as the Blob Storage Public network access setting is Enabled from all networks. If I set it to Enabled from selected virtual networks and IP addresses, with Microsoft network routing checked and the Resource type set to Microsoft.Sql/servers with its value as All in current subscription, it still doesn't work.
I'm not exactly sure what I'm doing wrong and I'm not able to find any documentation on how to make it work without opening up to all networks.
The error I'm getting is:
The target, "5B2DA06D-898A-43C8-9309-39BBBE93EBBD.package0.event_file", encountered a configuration error during initialization. Object cannot be added to the event session. (null) (Microsoft SQL Server, Error: 25602)
Edit - Steps to fix the issue
@Imran: Your answer led me to get everything working. The information you gave and the link provided were enough for me to figure it out.
However, for anyone in the future I want to give better instructions.
The first step was to run:
Set-AzSqlServer -ResourceGroupName [ResourceGroupName] -ServerName [AzureSQLServerName] -AssignIdentity
This assigns the SQL Server an Azure Active Directory identity. After running the above command, you can see the new identity in Azure Active Directory under Enterprise applications: where you see the Application type == Enterprise Applications filter, change it to Managed Identities and click Apply. You should see your new identity.
The next step is to give your new identity the Storage Blob Data Contributor role on your container in Blob Storage. Go to your container and click Access Control (IAM) => Role assignments => Add => Add role assignment => Storage Blob Data Contributor => Managed identity => Select members => select your new identity => Select => Review + assign.
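If you prefer to script that role assignment rather than click through the portal, a rough Az PowerShell equivalent is below; the object ID and scope are placeholders for your SQL Server's managed identity and your container:

# Placeholders: the SQL Server identity's object ID and the container's resource ID
$identityObjectId = "<object-id-of-the-sql-server-identity>"
$scope = "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storageaccount>/blobServices/default/containers/<container>"

# Grant Storage Blob Data Contributor on the container to the managed identity
New-AzRoleAssignment -ObjectId $identityObjectId -RoleDefinitionName "Storage Blob Data Contributor" -Scope $scope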
The last step is to get SQL Server to use that identity when connecting to Blob Storage.
You do that by running the command below on your Azure SQL Server database.
CREATE DATABASE SCOPED CREDENTIAL [https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>]
WITH IDENTITY = 'Managed Identity';
GO
You can see your new credential by running:
SELECT * FROM sys.database_scoped_credentials
The last thing I want to mention: when you create Extended Events for an Azure SQL Server using SSMS, it gives you this link, which only works if you leave your Blob Storage wide open. I think this is a disservice, and I wish they provided instructions for keeping Blob Storage locked down by using RBAC instead of SAS.
I tried to reproduce the same scenario in my environment and got it to work successfully.
To resolve this issue, check whether your account type is StorageV2 (general-purpose v2). If you have a general-purpose v1 or blob storage account, upgrade it:
In the storage account -> under Settings, Configuration -> Upgrade.
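If you would rather check and upgrade the account kind from PowerShell than click through the portal, a small sketch with placeholder names (note that upgrading to StorageV2 cannot be reversed):

# Placeholders for your resource group and storage account
$rg = "<resource-group>"
$acct = "<storageaccount>"

# Check the current account kind; StorageV2 is what you want here
(Get-AzStorageAccount -ResourceGroupName $rg -Name $acct).Kind

# Upgrade a general-purpose v1 or blob storage account to StorageV2 (irreversible);
# -AccessTier sets the default blob access tier for the upgraded account
Set-AzStorageAccount -ResourceGroupName $rg -Name $acct -UpgradeToStorageV2 -AccessTier Hot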
Check that you have chosen Allow trusted Microsoft services to access this storage account under Exceptions; I also added my client IP address range and VNet to the firewall (a PowerShell sketch of these settings follows below).
Make sure you have the Microsoft.Authorization/roleAssignments/write permission on your storage account.
After enabling the firewall, write access to the storage account for audit logs is lost; re-saving the audit settings from the portal is required for auditing to function.
Note: Auditing to storage behind firewalls using the user managed identity authentication type is not presently supported.
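The firewall settings above can also be scripted; this is not from the original answer, just a rough Az.Storage sketch with placeholder values:

# Placeholders for your resource group and storage account
$rg = "<resource-group>"
$acct = "<storageaccount>"

# Keep the default action as Deny but let trusted Microsoft services through
Update-AzStorageAccountNetworkRuleSet -ResourceGroupName $rg -Name $acct -DefaultAction Deny -Bypass AzureServices

# Allow your client IP address range (public IPv4 address or CIDR)
Add-AzStorageAccountNetworkRule -ResourceGroupName $rg -Name $acct -IPAddressOrRange "<public-ip-or-cidr>"

# Allow a virtual network subnet (pass the subnet's full resource ID)
Add-AzStorageAccountNetworkRule -ResourceGroupName $rg -Name $acct -VirtualNetworkResourceId "<subnet-resource-id>"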
When I tried to connect, I got a successful result.
Reference:
Configure extended events in SQL Azure to the blob storage with Private Endpoint - Microsoft Community Hub by Sakshi Gupta

Content of directory on path 'https://xxxxxxx.dfs.core.windows.net/dataverse-xxxx-org5a2/account/Snapshot/2018-08_1656570292/*.csv' cannot be listed

When I try to query our Serverless SQL pool in Azure Synapse Analytics I get the following error:
"Content of directory on path 'https://xxxxxx.dfs.core.windows.net/dataverse-xxxxxx-org5a2bcccf/account/Snapshot/2018-08_1656570292/*.csv' cannot be listed.".
I have checked out the following link for clues as to what could be the cause:
https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/resources-self-help-sql-on-demand?tabs=x80070002
It is suggested that the error is due to permissions. However, I believe I have the correct permissions.
I get this error whether I try to execute the query in SSMS or Synapse Workspace.
The error in SSMS is as follows:
Warning: Unable to resolve path https://xxxxx.dfs.core.windows.net/dataverse-xxxxx-org5a2bcccf/account/Snapshot/2018-10_1657304551/*.csv. Error number 13807, Level 16, State 1, Message "Content of directory on path 'https://xxxxxx.dfs.core.windows.net/dataverse-xxxxx-org5a2bcccf/account/Snapshot/2018-10_1657304551/*.csv' cannot be listed.".
Can someone let me know how to resolve this?
The query that I'm attempting to execute can be located here:
https://github.com/slavatrofimov/Synapse-Link-for-Dataverse-data-enrichment-in-Serverless-SQL-Pools/blob/main/SQL/Enrich%20Synapse%20Link%20for%20Dataverse%20Entities%20with%20Human-Readable%20Labels.sql
Is there a definitive way to determine if the problem is due to lack of permissions?
Update: I have just realised that the issue is accessing the lake at https://xxxxxx.dfs.core.windows.net/dataverse-xxxxxx-org5a2bcccf/
Therefore, please take a look at my permissions on the lake and let me know whether they are sufficient.
This issue occurs when the user trying to query the external table does not have the relevant permissions, or when a firewall is enabled on your storage network.
Looking at the permissions you have provided, I see that Storage Blob Data Reader and Storage Blob Data Contributor have been granted.
Ref doc: Control storage account access for serverless SQL pool in Azure Synapse Analytics
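If you want a definitive check of what is actually assigned on the lake (the question asks exactly this), a rough Az PowerShell sketch with a placeholder scope:

# Placeholder: resource ID of the ADLS Gen2 account behind the dataverse container
$scope = "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storageaccount>"

# List role assignments that apply at this scope and keep only the Storage Blob Data roles;
# the identity you query with (or the workspace managed identity) should appear here
Get-AzRoleAssignment -Scope $scope |
    Where-Object { $_.RoleDefinitionName -like "Storage Blob Data*" } |
    Select-Object DisplayName, SignInName, RoleDefinitionName, Scope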
If your storage account is firewall-protected, you will have to follow the steps described in this document to overcome the issue: Access storage that is protected with the firewall.
Here are a couple of relevant articles that might help you configure your storage firewall to overcome this issue:
Storage configuration for external table is not accessible while query on Serverless
Synapse Studio error while trying to read data from Storage Account using SQL On Demand

Can I restrict a Batch account's linked auto-storage with firewall and Azure virtual network settings?

I have a Batch account with linked auto-storage where the application packages are stored. I want to restrict access to this Batch-linked auto-storage with virtual network settings.
I tried adding a VNet rule and allowed the subnet of my self-hosted virtual machine scale set agents. From a DevOps pipeline I am trying to execute a PowerShell script which uploads the application package to the Batch account using the command below:
New-AzBatchApplicationPackage -AccountName $BatchAccountName -ResourceGroupName $ResourceGroupName -ApplicationId $ApplicationName -ApplicationVersion $newVersionNumber -Format zip -FilePath $PackageFilePath
This command works when the storage network setting is Enabled from all networks, but when I switch to selected networks, the command fails to upload the package with the error:
Failed to add application package DataExportProcessor version 89.0. The auto storage account keys are invalid, please sync auto storage keys.
In the storage account's selected networks I am allowing my DevOps scale set agent subnet, but I am not uploading the package directly to storage from the scale set machine; the New-AzBatchApplicationPackage command uploads the application package to storage. I am not sure which IPs I should whitelist in my storage account so that the Batch account can upload the application package.
Please note that, when setting the firewall of the storage account, you need to select All networks.
If you want to choose selected networks instead, then you have to add your public IP address and the list of BatchNodeManagement IPs to your storage account firewall.
To get the list of those IPs, you can refer to this blog by Amine Charot.
Make sure to add those IPs to the firewall (one way to script this is sketched below).
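As a rough illustration (not part of the original answer), you could pull the regional BatchNodeManagement prefixes from the service tag list and add them as storage IP rules with Az PowerShell; the names below are placeholders, only public IPv4 ranges are accepted by the storage firewall, and the number of prefixes may run into the firewall's rule limit:

# Placeholders: your storage account, resource group and the Batch account's region
$rg = "<resource-group>"
$acct = "<storageaccount>"
$location = "<region>"

# Pull the service tags for the region and pick the BatchNodeManagement.<region> entry
$tags = Get-AzNetworkServiceTag -Location $location
$batchTag = $tags.Values | Where-Object { $_.Name -eq "BatchNodeManagement.$location" }

# Add each IPv4 prefix as a storage firewall rule (IPv6 prefixes are skipped)
foreach ($prefix in $batchTag.Properties.AddressPrefixes) {
    if ($prefix -notmatch ":") {
        Add-AzStorageAccountNetworkRule -ResourceGroupName $rg -Name $acct -IPAddressOrRange $prefix
    }
}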
To resolve the "Failed to add application package DataExportProcessor version 89.0. The auto storage account keys are invalid, please sync auto storage keys" error, check whether the keys in the storage account and the Batch account are the same.
If they are not, sync them:
Go to the Azure portal -> your Batch account -> Storage account -> Sync keys.
Reference:
Package deployment failures (microsoft.com)

Azure ADLS Storage Account changing Firewall and Network settings through ARM or PowerShell

In an Azure ADLS Storage account (Gen2) we want to make a small change to the network and firewall settings, adding an exception to enable reading of storage logs.
We want to do this as part of the ARM template or through a PowerShell script whenever the ADLS storage account is provisioned. I am unable to find documentation on this; can someone help me with how we can achieve this setting change through PowerShell or an ARM template?
The following is what is needed in your ARM template to set that checkbox and allow the logs to be accessed (firewall and network setting).
Basically, the "bypass" line checks both boxes:
"networkAcls": {
"bypass": "Logging, AzureServices",
"virtualNetworkRules": [],
"ipRules": [
{
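For the PowerShell route the question also asks about, a minimal sketch using Az.Storage with placeholder names; -Bypass takes the same values as the ARM "bypass" property, so this mirrors the "Logging, AzureServices" line above:

# Placeholders for your resource group and ADLS Gen2 account name
$rg = "<resource-group>"
$acct = "<storageaccount>"

# Tick both exceptions: trusted Azure services and read access to storage logging
Update-AzStorageAccountNetworkRuleSet -ResourceGroupName $rg -Name $acct -Bypass AzureServices,Logging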

Error connecting to Data Lake (ADLS Gen2) store from Databricks

I am trying to connect to Data Lake Gen2 storage from Databricks Python; unfortunately, I am running into an error.
Code:
dbutils.fs.ls("abfss://<filesystem name>@<storage name>.dfs.core.windows.net/<folder name>")
Error Message:
Configuration property .dfs.core.windows.net not found.
I suspect it is something to do with my mount code. Additionally, I have added the tenant ID to the container's "Manage Access" using Storage Explorer.
Here is my mount code:
# Service principal (OAuth) settings for ADLS Gen2; values in angle brackets are placeholders
configs = {"fs.azure.account.auth.type": "OAuth",
           "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
           "fs.azure.account.oauth2.client.id": "<client ID>",
           "fs.azure.account.oauth2.client.secret": "<client secret>",
           "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<directory ID>/oauth2/token"}

dbutils.fs.mount(
    source = "abfss://<filesystem name>@<storage name>.dfs.core.windows.net/",
    mount_point = "/mnt/soldel",
    extra_configs = configs)
The mount code ran fine, with no errors. Please suggest a fix.
Note: You cannot access the Azure Data Lake Gen2 account without configuring the storage account with Databricks.
This is an expected error message, because you haven't configured the storage account with Databricks in a way that lets it list the filesystem.
Kindly check the error message and follow the correct process for listing the filesystem in Databricks.
For more details, refer to "Databricks - Azure Data Lake Storage Gen2".
Hope this helps.