Error while accessing the image directly using blob URL

I am new to Azure and have activated a free subscription to learn it. I created a storage account in the portal, and in that account I created a container.
I uploaded an image to that container successfully. When I click that image, its properties show a URL that I can use to access it via a browser.
But I get the error below when I open that URL:
<Error>
<Code>ResourceNotFound</Code>
<Message>
The specified resource does not exist. RequestId:<Guid> Time:2022-05-17T11:20:46.2299517Z
</Message>
</Error>
But the image does exist in that container. Why am I getting the above error? How do I avoid that error?

I have tested this in my environment. Note that when you create a new container, its access level is Private by default.
Check whether the access level of your container is Private:
Go to Azure Portal -> Storage Accounts -> Your Storage Account -> Containers -> Your Container -> "Public access level"
If the access level is Private and you access a blob in that container directly via a browser, you will get a ResourceNotFound error like the one above.
To change the access level of your container:
Select your container (this enables the Change access level option) -> Select Change access level -> Select Blob or Container from the dropdown -> Click OK
Now you can access the blob URL directly via a browser with no issues.
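If you prefer scripting the change, here is a minimal PowerShell sketch using the Az.Storage module, assuming hypothetical account, key, and container names:

# Hypothetical names; -Permission Blob allows anonymous read of blobs only,
# while -Permission Container also allows anonymous listing of the container.
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" `
    -StorageAccountKey "<account-key>"
Set-AzStorageContainerAcl -Name "mycontainer" -Permission Blob -Context $ctx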

Related

Not able to get Azure SQL Server Extended Events to work when Blob Storage is set to Enabled from selected virtual networks and IP addresses

So I have an Azure Database and want to test extended events with the database.
I was able to set up my Blob Storage container and got Extended Events from the Azure database to work as long as the Blob Storage network setting "Public network access" is set to "Enabled from all networks". If I set it to "Enabled from selected virtual networks and IP addresses", with "Microsoft network routing" checked and a Resource type of Microsoft.Sql/servers with its value set to "All in current subscription", it still doesn't work.
I'm not exactly sure what I'm doing wrong and I'm not able to find any documentation on how to make it work without opening up to all networks.
The error I'm getting is:
The target, "5B2DA06D-898A-43C8-9309-39BBBE93EBBD.package0.event_file", encountered a configuration error during initialization. Object cannot be added to the event session. (null) (Microsoft SQL Server, Error: 25602)
Edit - Steps to fix the issue
@Imran: Your answer led me to get everything working. The information you gave and the link provided were enough for me to figure it out.
However, for anyone in the future I want to give better instructions.
The first step was to run:
Set-AzSqlServer -ResourceGroupName [ResourceGroupName] -ServerName [AzureSQLServerName] -AssignIdentity
This assigns the SQL Server an Azure Active Directory identity. After running the above command, you can find the new identity in Azure Active Directory under Enterprise applications: where you see the Application type == Enterprise Applications filter, click it, change it to Managed Identities, and click Apply. You should see your new identity.
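You can also verify the identity from PowerShell instead of the portal; a small sketch, assuming hypothetical resource names:

# Hypothetical names; the Identity property holds the system-assigned identity
(Get-AzSqlServer -ResourceGroupName "MyResourceGroup" `
    -ServerName "my-sql-server").Identity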
The next step is to give your new identity the Storage Blob Data Contributor role on your container in Blob Storage. Go to your new container and click Access Control (IAM) => Role assignments => Add => Add role assignment => Storage Blob Data Contributor => Managed identity => Select member => click your new identity, click Select, and then Review + assign.
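The same role assignment can be scripted; a sketch, assuming hypothetical subscription, account, and container names:

# Hypothetical names; scope the role assignment to the single container
$principalId = (Get-AzSqlServer -ResourceGroupName "MyResourceGroup" `
    -ServerName "my-sql-server").Identity.PrincipalId
New-AzRoleAssignment -ObjectId $principalId `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope "/subscriptions/<sub-id>/resourceGroups/MyResourceGroup/providers/Microsoft.Storage/storageAccounts/<account>/blobServices/default/containers/<container>"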
The last step is to get SQL Server to use an identity when connecting to Blob Storage.
You do that by running the command below on your Azure SQL Server database.
CREATE DATABASE SCOPED CREDENTIAL [https://<mystorageaccountname>.blob.core.windows.net/<mystorageaccountcontainername>]
WITH IDENTITY = 'Managed Identity';
GO
You can see your new credential by running:
SELECT * FROM sys.database_scoped_credentials
The last thing I want to mention: when creating Extended Events for an Azure SQL database using SSMS, it gives you this link. That only works if you want your Blob Storage wide open. I think this is a disservice, and I wish there were instructions for when you don't want your Blob Storage wide open, using RBAC instead of SAS.
I tried to reproduce the same in my environment and got the result successfully, like below:
To resolve this issue, check whether your account type is StorageV2 (general-purpose v2). If you have a general-purpose v1 or Blob storage account, upgrade it like below:
In the storage account -> under Settings, Configuration -> Upgrade
Check that you have chosen Allow trusted Microsoft services to access this storage account under Exceptions, and add your client IP address range and VNet to the firewall, like below.
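For reference, the same exception and firewall rules can be set from PowerShell; a sketch with hypothetical names and an example IP range:

# Hypothetical names; allow trusted Microsoft services through the firewall
Update-AzStorageAccountNetworkRuleSet -ResourceGroupName "MyResourceGroup" `
    -Name "mystorageaccount" -Bypass AzureServices -DefaultAction Deny
# Add a client IP range (example range; substitute your own)
Add-AzStorageAccountNetworkRule -ResourceGroupName "MyResourceGroup" `
    -Name "mystorageaccount" -IPAddressOrRange "203.0.113.0/24"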
Make sure you have the Microsoft.Authorization/roleAssignments/write permission on your storage account.
After enabling the firewall, write access to the storage account for audit logs can be lost; re-saving the audit settings from the portal is required for auditing to function, like below.
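Re-saving the audit settings can also be done from PowerShell; a sketch, assuming hypothetical names:

# Hypothetical names; re-applies blob storage auditing so it picks up
# the new firewall settings
Set-AzSqlServerAudit -ResourceGroupName "MyResourceGroup" -ServerName "my-sql-server" `
    -BlobStorageTargetState Enabled `
    -StorageAccountResourceId "/subscriptions/<sub-id>/resourceGroups/MyResourceGroup/providers/Microsoft.Storage/storageAccounts/mystorageaccount"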
Note: Auditing to storage behind firewalls using the user-assigned managed identity authentication type is not presently supported.
When I tried to connect, I got the result successfully, like below:
Reference:
Configure extended events in SQL Azure to the blob storage with Private Endpoint - Microsoft Community Hub by Sakshi Gupta

Content of directory on path 'https://xxxxxxx.dfs.core.windows.net/dataverse-xxxx-org5a2/account/Snapshot/2018-08_1656570292/*.csv' cannot be listed

When I try to query our Serverless SQL pool in Azure Synapse Analytics I get the following error:
"Content of directory on path 'https://xxxxxx.dfs.core.windows.net/dataverse-xxxxxx-org5a2bcccf/account/Snapshot/2018-08_1656570292/*.csv' cannot be listed.".
I have checked the following link for clues as to what could be the cause:
https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/resources-self-help-sql-on-demand?tabs=x80070002
It is suggested that the error is due to permissions. However, I believe I have the correct permissions.
I get this error whether I try to execute the query in SSMS or Synapse Workspace.
The error in SSMS is as follows:
Warning: Unable to resolve path https://xxxxx.dfs.core.windows.net/dataverse-xxxxx-org5a2bcccf/account/Snapshot/2018-10_1657304551/*.csv. Error number 13807, Level 16, State 1, Message "Content of directory on path 'https://xxxxxx.dfs.core.windows.net/dataverse-xxxxx-org5a2bcccf/account/Snapshot/2018-10_1657304551/*.csv' cannot be listed.".
Can someone let me know how to resolve this?
The query that I'm attempting to execute can be located here:
https://github.com/slavatrofimov/Synapse-Link-for-Dataverse-data-enrichment-in-Serverless-SQL-Pools/blob/main/SQL/Enrich%20Synapse%20Link%20for%20Dataverse%20Entities%20with%20Human-Readable%20Labels.sql
Is there a definitive way to determine if the problem is due to lack of permissions?
Update to the question:
I have just realised that the issue is accessing the lake at https://xxxxxx.dfs.core.windows.net/dataverse-xxxxxx-org5a2bcccf/
Therefore, please take a look at my permissions on the lake and let me know if they are sufficient.
This issue occurs when the user querying the external table does not have the relevant permissions, or when a firewall is enabled on your storage network.
Looking at the permissions you have provided, I see that Storage Blob Data Reader and Storage Blob Data Contributor have been granted.
Ref doc: Control storage account access for serverless SQL pool in Azure Synapse Analytics
If your storage account is firewall-protected, you will have to follow the steps described in this document to overcome the issue: Access storage that is protected with the firewall.
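One approach described there is a resource instance rule that lets a specific Synapse workspace through the storage firewall; a PowerShell sketch with hypothetical names:

# Hypothetical names; allows only this Synapse workspace through the firewall
Add-AzStorageAccountNetworkRule -ResourceGroupName "MyResourceGroup" `
    -Name "mystorageaccount" `
    -TenantId (Get-AzContext).Tenant.Id `
    -ResourceId "/subscriptions/<sub-id>/resourceGroups/MyResourceGroup/providers/Microsoft.Synapse/workspaces/my-workspace"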
Here are a couple of relevant articles that might help you configure your storage firewall to overcome this issue:
Storage configuration for external table is not accessible while query on Serverless
Synapse Studio error while trying to read data from Storage Account using SQL On Demand

External tables not working when "Deny public network access" is set to Yes

I have enabled Private Link by setting the "Deny public network access" knob to Yes in the firewall settings on my Azure SQL Database server. Everything works as expected except external data sources (external tables). The external tables are simply links to tables in another Azure SQL database that belongs to the same server. Before I enabled Private Link, everything worked fine. If I try to query the external tables I get this error message:
"Error retrieving data from [mydbserver].database.windows.net.[mydbname]. The underlying error message received was: 'Reason: An instance-specific error occurred while establishing a connection to SQL Server. Connection was denied since Deny Public Network Access is set to Yes (https://learn.microsoft.com/azure/azure-sql/database/connectivity-settings#deny-public-network-access). To connect to this server, use the Private Endpoint from inside your virtual network (https://learn.microsoft.com/azure/sql-database/sql-database-private-endpoint-overview#how-to-set-up-private-link-for-azure-sql-database)."
I can't find anything in the docs about any limitation regarding external data sources and external tables in combination with Private Link setup.
The external tables were created the standard way: "CREATE EXTERNAL DATA SOURCE" and "CREATE EXTERNAL TABLE". I have also tried to recreate the data source and the tables after enabling Private Link, but the error remains.
Want to reiterate the answer to the same question posted on Microsoft Q&A: External tables not working when “Deny public network access” is set to Yes
The limitation is with PolyBase, as it does not currently support Private Link. As per the PG:
Polybase does not support using private link at this time. Please direct the customer to use Managed Identity to secure the connection to Azure Storage.
Albeit this may not be a workable solution for you, if the data you need to access is extracted to a storage account and then imported via the method referenced by the PG, this could work. The same process is reversed with flip/flop endpoints and could be done within the security of a VNet + managed identity.
You need to use the private-link name of the server, e.g. mydbserver.privatelink.database.windows.net.
Afterwards you may receive another error saying this name is incorrect. In that case you are experiencing a DNS problem, and you need to add an entry to the hosts file of your VM with the IP of the private endpoint. If your VM is outside of that VNet, it's another story: then you need to add the public IP of your endpoint to your hosts file. I'm still trying to solve this with a proper DNS setup and haven't figured it out yet.
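For illustration, the hosts-file entry can be added from an elevated PowerShell session; the IP and server name below are hypothetical placeholders for your private endpoint:

# Hypothetical endpoint IP and server name; run from an elevated session
Add-Content -Path "$env:SystemRoot\System32\drivers\etc\hosts" `
    -Value "10.0.0.5  mydbserver.privatelink.database.windows.net"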
For more information, see:
https://techcommunity.microsoft.com/t5/azure-database-support-blog/lesson-learned-126-deny-public-network-access-allow-azure/ba-p/1244037

Azure ADLS Storage Account changing Firewall and Network setting thru ARM or Powershell

In an Azure ADLS (Gen2) storage account, we want to make a small change to the network & firewall settings, adding an exception to enable Allow reading of storage logs (as shown in the screenshot below).
We want to do this as part of the ARM template or through a PowerShell script whenever the Azure ADLS storage account is provisioned. I am unable to find documentation on this; can someone help me with how to achieve this setting change through PowerShell or an ARM template?
The following is what is needed in your ARM template to set that checkbox and allow the logs to be accessed (firewall and network settings). The original snippet was truncated, so it is completed below with a placeholder ipRules entry and a defaultAction; basically, the "bypass" line checks both boxes:
"networkAcls": {
"bypass": "Logging, AzureServices",
"virtualNetworkRules": [],
"ipRules": [
{
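Since the question also asks for PowerShell, the same exception can be set after provisioning; a sketch, assuming hypothetical resource names:

# Hypothetical names; "Logging,AzureServices" mirrors the bypass line above
Update-AzStorageAccountNetworkRuleSet -ResourceGroupName "MyResourceGroup" `
    -Name "myadlsaccount" -Bypass "Logging,AzureServices"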

Full Public Read Access on Azure Storage Emulator

I am putting together a website for which I would like to host static content via Azure Blob storage. The documentation is very clear on how to set "Public read access for blobs only" on a container, via this document: https://learn.microsoft.com/en-us/azure/storage/storage-manage-access-to-resources
In my development environment I am using the Azure storage emulator (https://learn.microsoft.com/en-us/azure/storage/storage-use-emulator).
My question is: How can I set the permission of a container in the emulator to "Public read access for blobs only"?
I think I answered my own question. Unlike the Azure Portal, the emulator does not provide a mechanism to create a container and, by extension, to set/modify the access policy. Using a third-party tool such as CloudBerry Explorer allows setting the policy when a container is created. Additionally, it is possible to set/modify the policy through code:
https://learn.microsoft.com/en-us/azure/storage/storage-manage-access-to-resources
// Requires the classic WindowsAzure.Storage package (Microsoft.WindowsAzure.Storage.Blob).
public static void SetPublicContainerPermissions(CloudBlobContainer container)
{
    // Fetch the current permissions, then raise the public access level.
    BlobContainerPermissions permissions = container.GetPermissions();
    // Container = anonymous read + list; use BlobContainerPublicAccessType.Blob
    // for "public read access for blobs only".
    permissions.PublicAccess = BlobContainerPublicAccessType.Container;
    container.SetPermissions(permissions);
}
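For completeness, a PowerShell sketch against the local development storage account (assumes the emulator, or its successor Azurite, is running; the container name is hypothetical):

# -Local targets the development storage account; -Permission Blob gives
# "public read access for blobs only"
$ctx = New-AzStorageContext -Local
New-AzStorageContainer -Name "static-content" -Permission Blob -Context $ctx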