Unable to connect to Azure Storage Table resource using SAS key - azure-storage

I have been provided a SAS key for an Azure Storage Table resource and I am trying to connect to it using Azure Storage Explorer, but I keep getting the error:
Error occurred while adding new connection: SyntaxError: Table name format is incorrect.
I even updated my Azure Storage Explorer version, but the issue remains. The SAS key is correct. In fact, I have not been able to connect to any Azure Storage Table resource with a SAS key using Azure Storage Explorer.

Related

Getting data from on-prem to Azure Synapse - "unexpected metadata of Synapse Link was detected in the source database"

I'm trying to move data from table A in our on-prem database into an equivalent table A in an Azure Synapse Analytics (ASA) dedicated pool. I've set up the integration runtime and selected my on-prem table from ASA. However, when I run a link connection I see the following error:
Failed to enable Synapse Link on the source due to 'Unexpected metadata of Synapse Link was detected in the source database.'.
Failed to disable Synapse Link on the source due to 'Failed to drop the link topic in the source database: Failed to enable Synapse Link on the source due to 'Unexpected metadata of Synapse Link was detected in the source database.'.'.
Continuous run ID: e52df111-9947-401e-97cb-4ef3f4532934
I'm expecting Table A in ASA to be populated with data from on-prem.
What does this mean? I'm very new to ASA, so I might have overlooked some setup.
I tried to reproduce the error you got and ended up with a similar error.
The main cause of this error is that your Azure Synapse workspace managed identity does not have permission to access the Azure Data Lake Storage Gen2 storage account.
As per the official Microsoft documentation:
Make sure that you've granted your Azure Synapse workspace managed identity permissions to the Azure Data Lake Storage Gen2 storage account
To grant the managed identity of the Azure Synapse workspace access to the Azure Data Lake Storage Gen2 storage account, follow the steps below (an equivalent Az PowerShell sketch follows these steps):
Go to your Azure Data Lake Storage Gen2 account >> Access Control (IAM) >> Add >> Add role assignment
Pick the Storage Blob Data Contributor role.
Under Assign access to, choose Managed identity.
Then under Members, choose your Azure Synapse workspace.
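If you prefer to script this instead of clicking through the portal, here is a minimal Az PowerShell sketch. The workspace, storage account and resource group names are placeholders, and it assumes the Az.Synapse, Az.Storage and Az.Resources modules plus a signed-in session (Connect-AzAccount):
# Look up the Synapse workspace and the Data Lake Storage Gen2 account (placeholder names).
$workspace = Get-AzSynapseWorkspace -Name "my-synapse-workspace" -ResourceGroupName "my-rg"
$storage   = Get-AzStorageAccount -Name "mydatalakegen2" -ResourceGroupName "my-rg"
# Grant the workspace's system-assigned managed identity the Storage Blob Data Contributor role,
# scoped to the storage account.
New-AzRoleAssignment -ObjectId $workspace.Identity.PrincipalId `
    -RoleDefinitionName "Storage Blob Data Contributor" `
    -Scope $storage.Id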

Cannot open backup device - SQL Server on-premise backup database to Azure storage

I have a database running on an on-premise SQL Server instance. I've set up a SQL Agent job to back up the database every night and store the backup in a container in Azure. However, I'm seeing the following error after the job runs:
Message
Executed as user: NT SERVICE\SQLSERVERAGENT. Cannot open backup device 'https://mystorageaccount.blob.core.windows.net/mystoragecontainer/20200102/MYDATABASE_0.bak'. Operating system error 50(The request is not supported.). [SQLSTATE 42000] (Error 3201) BACKUP DATABASE is terminating abnormally. [SQLSTATE 42000] (Error 3013)
The Azure storage account is Storage (general purpose v1).
SQL Server 13.0.5233.0
Microsoft SQL Server Management Studio 14.0.17213.0
Microsoft Analysis Services Client Tools 14.0.1016.232
Microsoft Data Access Components (MDAC) 10.0.14393.0
Microsoft MSXML 3.0 6.0
Microsoft Internet Explorer 9.11.14393.0
Microsoft .NET Framework 4.0.30319.42000
Operating System 6.3.14393
Is there a way of configuring NT SERVICE\SQLSERVERAGENT to connect to the Azure storage container?
All the comments are valid to a certain extent. I finally fixed (term used loosely) backing up my SQL Server database to an Azure storage container using SAS (shared access signature) credentials.
Deleted the existing credential in SQL Server (under Security > Credentials)
In Azure, created an access policy under Storage Account > Container. It's important to define start and expiry dates/times (and their time zone) along with the read/write permission levels.
In Azure, generate a SAS token for the container (a PowerShell sketch for this step follows the backup script below). Ensure you set the appropriate start and expiry dates/times along with the time zone; don't rely on just UTC.
Upload a file to the container to make sure it's all ok.
In SQL Server, create the credentials. The secret should be the SAS token without the preceding ?, so just "sv=...." .
IF NOT EXISTS
(SELECT * FROM sys.credentials
WHERE name = 'https://mystorageaccount.blob.core.windows.net/mycontainer')
CREATE CREDENTIAL [https://mystorageaccount.blob.core.windows.net/mycontainer]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=_my_sas_key_without_?';
Back up the database. The URL and container must match the credential name - in this case, https://mystorageaccount.blob.core.windows.net/mycontainer .
BACKUP DATABASE [mydatabase]
TO URL = 'https://mystorageaccount.blob.core.windows.net/mycontainer/mydatabase_03012020120400.bak'
WITH FORMAT,
COMPRESSION,
STATS=5,
BLOCKSIZE=65536,
MAXTRANSFERSIZE=4194304;
GO
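For the SAS generation step above, the container SAS token can also be created with Az PowerShell rather than the portal. A rough sketch, assuming the Az.Storage module and placeholder account/container names:
# Build a storage context from the account name and key (placeholders).
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account key>"
# Create a container-level SAS token with read/write/delete/list permissions and an explicit
# start/expiry window (Get-Date uses local time; keep the time-zone caveat above in mind).
$sas = New-AzStorageContainerSASToken -Name "mycontainer" -Permission rwdl `
    -StartTime (Get-Date).AddHours(-1) -ExpiryTime (Get-Date).AddYears(1) -Context $ctx
# Depending on the module version the token may start with '?'; strip it before using it
# as the SECRET of the SQL Server credential.
$sas.TrimStart('?')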
This seems like an access-related issue. To be able to back up to Azure Blob Storage, your backup command must use a valid credential that has access to the blob storage. The credential can be created in two ways:
Shared Access Signature
Identity and Access Key
Review your credentials, make sure your access key is correct and, if you're using SAS, make sure that the access policy is defined and you are pointing to the correct container it has access to. For more information on URL backups as well as script samples, please refer to:
https://learn.microsoft.com/en-us/sql/relational-databases/backup-restore/sql-server-backup-to-url?view=sql-server-2017
Additional reference:
https://blog.pythian.com/how-to-fix-sql-backup-to-url-failure-operating-system-error-50/
Hope it helps.

Is There a Local Emulator for the Azure Data Lake Store

When developing for Azure storage accounts, I can run the Microsoft Storage Emulator to locally keep Blobs, Queues, and Tables without having to connect to Azure online.
Is there something equivalent for the Azure Data Lake Store? It would be nice to develop locally for a while without having to connect to Azure online.
Have you tried Visual Studio with the Azure Data Lake Tools plug-in?
As pointed out by David, you can develop Azure Data Lake Analytics (ADLA) projects locally without needing connectivity to Azure for the ADLA account or the associated Azure Data Lake Store (ADLS) account. Is there some other application you would like to use with ADLS?
Thanks,
Sachin Sheth
Azure Data Lake team
Same problem here.
AFAIK the Storage Emulator is not yet able to really handle Data Lake (ADLS Gen2) requests.
This Uri works (but looks for a file, not a dir):
http://127.0.0.1:10000/devstoreaccount1/packages-container/Dir/SubDir?sv=2020-04-08&se=2022-10-13T14%3A43%3A39Z&sr=b&sp=rcwl&sig=d2SxwYCkJGyx%2BHac9vntYQZOTt5QVs1bKgKb4%2FgcQ9k%3D
This one doesn't:
http://127.0.0.1:10000/devstoreaccount1/packages-container/Dir/SubDir?sv=2020-04-08&se=2022-10-13T14%3A43%3A39Z&sr=d&sp=rcwl&sdd=2&sig=KU%2Fcu6W0Nsv8CucMgusubo8RbXWabFO8nDMkFxU1tTw%3D
Error: Status: 403 (Server failed to authenticate the request. Make sure the value of the Authorization header is formed correctly including the signature.)
ErrorCode: AuthorizationFailure
The difference is that the second one uses the resource 'sr=d' (directory) while the first uses 'sr=b' (blob).
Both items are working on real Azure Storage (with ADLS Gen2).
The request is already tracked here: https://github.com/Azure/Azurite/issues/553
Tested on VS 2022 17.3.6 using Server: Azurite-Blob/3.18.0

How to write sqlcmd results directly to Azure Storage using Azure PowerShell?

Current story:
Moving our overall BI solution fully to Azure cloud services. Building a new Azure DW and loading data from an Azure DB. Currently, Azure DW doesn't support linked servers and/or elastic query (this is only supported in Azure DB). Due to price, we cannot use Data Factory or an instance of SSIS. We can't use bcp as we don't have a local directory to hold the file in between loads.
Is it possible to use Azure PowerShell with sqlcmd to write results of a query directly to Azure Storage, without having to write to a file on a local directory in between?
Are there other options that aren't mentioned above?
Thank you for any input.
The current Azure Storage PowerShell cmdlet (Set-AzureStorageBlobContent) only supports uploading a blob from a local file.
The Azure Storage Client Library (https://github.com/Azure/azure-storage-net) supports uploading a blob from a stream; could you try developing your own application with the Azure Storage Client Library?
If your data is large, you can also try https://github.com/Azure/azure-storage-net-data-movement/, which has better performance when uploading big blobs.
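A rough sketch of that suggestion, assuming the classic Azure.Storage PowerShell module (which wraps the azure-storage-net client library) and placeholder server, database, account and container names. It captures the sqlcmd output in memory and writes it straight to a blob, with no local file in between:
# Run the query; sqlcmd returns the result set as an array of comma-separated text lines.
$lines = sqlcmd -S "myserver.database.windows.net" -d "mydb" -U "myuser" -P "mypassword" `
    -Q "SELECT * FROM dbo.MyTable" -s "," -W
$csv = $lines -join [Environment]::NewLine
# Get a reference to the target container and upload the text directly - no local file involved.
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account key>"
$container = Get-AzureStorageContainer -Name "exports" -Context $ctx
$blob = $container.CloudBlobContainer.GetBlockBlobReference("mytable.csv")
$blob.UploadText($csv)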

How to import SQL database from file to Azure

How do I load my SQL database, created in MySQL Workbench, onto the Azure cloud?
I created a database which consists of some tables - for now, there is no data in them; it's just a small script created by MySQL Workbench. I also created a database on the Azure cloud and created a login and password, but when I try to use the 'automated export' option (I have a Storage account and enter a valid login with password) I get the error:
'Could not find any bacpac files in the specified storage account.'
I tried googling this phrase but I completely do not understand the idea behind these bacpac files, and I do not know what to do with them. Can anyone describe, step by step, how to put my database on the Azure cloud?
I want to connect to this DB on Azure in the future because I would like to build a web application and an Android app which will use a remote DB available online.
Azure SQL Database is a custom SQL Server, so if you want to use MySQL you should create a ClearDB database (ClearDB is a Microsoft partner that offers MySQL on Azure). Another option is to create a Virtual Machine and install MySQL yourself.
After that, you can import your tables / records.