External Database Scoped Credential via SAS - azure-synapse

Is it possible to create a database scoped credential for Azure Blob Storage in Synapse?
I tried this scenario:
CREATE DATABASE SCOPED CREDENTIAL <credential-name>
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<your SAS secret without the leading ?>';
CLOSE MASTER KEY; -- only necessary if you need to close the master key context (it closes with the session anyway)
But it fails.

A SAS database scoped credential for an Azure Storage account is supported only in Azure SQL Database, not in Synapse.
In Synapse you need to use the storage account key instead.
In Synapse you would get the error below:
Secret provided can not contain more than 120 characters. Please provide a valid credential.
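As a rough sketch (placeholder names and key, not from the original answer), a storage-account-key credential plus a PolyBase external data source in a Synapse dedicated SQL pool would look something like this:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong-password>'; -- only if the database has no master key yet
CREATE DATABASE SCOPED CREDENTIAL StorageKeyCredential
WITH IDENTITY = '<storage-account-name>', -- for an account key the identity string is not validated
SECRET = '<storage-account-access-key>'; -- key from the Azure portal, well under the 120-character limit
CREATE EXTERNAL DATA SOURCE BlobStorage
WITH (
TYPE = HADOOP, -- PolyBase external data sources in dedicated pools use TYPE = HADOOP
LOCATION = 'wasbs://<container>@<storage-account-name>.blob.core.windows.net',
CREDENTIAL = StorageKeyCredential
);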

Azure Synapse Copy Data from BigQuery, Source ERROR [HY000] [Microsoft][BigQuery] (131) Unable to authenticate with Google BigQuery Storage API

I am getting this error on the Source tab with Use query set to Query, in a Copy Data activity in an Azure Synapse pipeline.
Unable to authenticate with Google BigQuery Storage API.
The strange thing is that I can preview data in the Source dataset, and I can also preview data when I select the Use query Table option.
I can even run a query against INFORMATION_SCHEMA to list the tables:
SELECT *
FROM `3082`.INFORMATION_SCHEMA.TABLES
WHERE table_type = 'BASE TABLE'
but I get this authentication error when selecting columns:
SELECT *
FROM `3082.gcp_billing_export_v1_019F74_6EA5E8_C96548`;
ERROR [HY000] [Microsoft][BigQuery] (131) Unable to authenticate with Google BigQuery Storage API. Check your account permissions
The above error is due to an authentication issue with the BigQuery Storage API. The permissions required to read data through the BigQuery Storage API are:
bigquery.readsessions.create
bigquery.readsessions.getData
bigquery.readsessions.update
Granting the BigQuery User role provides these permissions.
References:
Google Cloud doc on Access Control - BigQuery User
MS doc on the Google BigQuery connector issue

Unable to Query Serverless Pool View in Azure Synapse using SQL Admin Credentials

I have set up a serverless SQL pool in Azure Synapse that queries a view I created over a linked Azure Data Lake.
CREATE VIEW DeviceTelemetryView
AS
SELECT corporationid, deviceid, version,
       CONVERT(datetime, dateTimestamp, 126) AS dateTimeStamp,
       deviceData
FROM OPENROWSET(
    BULK 'https://test123.dfs.core.windows.net/devicetelemetry/*/*/*/*/*/',
    FORMAT = 'PARQUET'
) AS [result]
GO
Using my Azure AD credentials from within Synapse Studio or SSMS, I have no issues querying this view. When I try to query it using my SQL admin account I get the following error:
Cannot find the CREDENTIAL 'https://test123.dfs.core.windows.net/devicetelemetry/*/*/*/*/*/', because it does not exist or you do not have permission.
It is important that I am able to query using SQL admin credentials, as we want to query this view from our application for various reports and therefore don't want to use AAD credentials.
I have tried the SO solution provided here: GRANT Database Scoped Credential syntax gives mismatched input error
GRANT REFERENCES ON DATABASE SCOPED CREDENTIAL::[WorkspaceSystemIdentity] TO [sqlAdmin];
This seems to be the default credential that was created when linking my Data Lake to Synapse. However, it gives me the following error when run against the database where my view exists:
Cannot find the database scoped credential 'WorkspaceSystemIdentity', because it does not exist or you do not have permission.
You would need to create a server-scoped credential to allow access to the storage files.
Server-scoped credentials
These are used when a SQL login calls the OPENROWSET function without a DATA_SOURCE to read files on some storage account. The name of the server-scoped credential must match the base URL of the Azure storage account (optionally followed by a container name). Note that SQL users can't use Azure AD authentication to access storage, and the serverless SQL pool doesn't return subfolders unless you specify /** at the end of the path.
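A minimal sketch of such a server-scoped credential, assuming a SAS token with read and list permissions on the container (the token value is a placeholder); it is created in the master database of the serverless pool, and its name must match the storage URL used by the view:
USE master;
CREATE CREDENTIAL [https://test123.dfs.core.windows.net/devicetelemetry]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<SAS token without the leading ?>';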

Error in SSMS when running query from SQL On-Demand endpoint

I am attempting to pull in data from a CSV file that is stored in an Azure Blob container and when I try to query the file I get an error of
File 'https://<storageaccount>.blob.core.windows.net/<container>/Sales/2020-10-01/Iris.csv' cannot be opened because it does not exist or it is used by another process.
The file does exist and, as far as I know, it is not being used by anything else.
I am using SSMS and also a SQL On-Demand endpoint from Azure Synapse.
What I did in SSMS was run the following commands after connecting to the endpoint:
CREATE DATABASE [Demo2];
CREATE EXTERNAL DATA SOURCE AzureBlob WITH ( LOCATION = 'wasbs://<container>@<storageaccount>.blob.core.windows.net/' )
SELECT * FROM OPENROWSET (
BULK 'Sales/2020-10-01/Iris.csv',
DATA_SOURCE = 'AzureBlob',
FORMAT = '*'
) AS tv1;
I am not sure of where my issue is at or where to go next. Did I mess up anything with creating the external data source? Do I need to use a SAS token there and if so what is the syntax for that?
@Ubiquitinoob44, you need to create a database scoped credential:
https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/develop-storage-files-storage-access-control?tabs=shared-access-signature
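A hedged sketch of what that document describes, with placeholder names and secret: a SAS-based database scoped credential plus an external data source that references it:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong-password>';
CREATE DATABASE SCOPED CREDENTIAL AzureBlobSas
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<SAS token without the leading ?>';
CREATE EXTERNAL DATA SOURCE AzureBlobWithSas
WITH (
LOCATION = 'https://<storageaccount>.blob.core.windows.net/<container>',
CREDENTIAL = AzureBlobSas
);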
I figured out what the issue was. I haven't tried Armando's suggestion yet.
First I had to go to the container and edit its IAM role assignments to give my Azure Active Directory login the Storage Blob Data Contributor role. The user to grant access to is the email address you use to sign in to the Azure portal.
https://learn.microsoft.com/en-us/azure/storage/common/storage-auth-aad-rbac-portal?toc=/azure/synapse-analytics/toc.json&bc=/azure/synapse-analytics/breadcrumb/toc.json
After that I had to reconnect to the On-Demand endpoint in SSMS. Make sure you log in through the Azure AD - MFA option. Originally I was using the On-Demand endpoint username and password, which had not been granted the Storage Blob Data Contributor role on the container.
https://learn.microsoft.com/en-us/azure/synapse-analytics/sql/resources-self-help-sql-on-demand

Can't change anything in Encryption after deleting a vault key

I deleted a key vault that was used in a storage account.
Now if I try to change anything in the Encryption section of the storage (like change the encryption type or using a new key), I am getting:
The operation failed because the specified key vault key 'https://dev-certs2.vault.azure.net/keys/<my-previous-key>/xxxxxxxxxxxxxxxx' was not found
Is there a way to change it without having to create a new storage account?
By default, soft delete is enabled when you create a key vault, with a default retention period of 90 days. If your key vault was deleted within the past 90 days, you can follow the steps below; if more than 90 days have passed, there seems to be no way to do this without creating a new storage account (not 100% sure, you may need to contact Azure support).
1. Use PowerShell to check whether the key vault is in the Removed state; if there is no output, more than 90 days have passed.
Get-AzKeyVault -VaultName joyk -Location <the same location with the storage> -InRemovedState
2. Use PowerShell to recover the previously deleted key vault.
Undo-AzKeyVaultRemoval -VaultName joyk -ResourceGroupName <group-name> -Location <the same location with the storage>
3. Navigate to the storage account in the portal -> Encryption. You will now be able to change the encryption type or use a new key. After configuring it, you can delete the key vault again.

Can't CREATE EXTERNAL DATA SOURCE in SQL

I'm trying to create an external data source to access Azure Blob Storage. However, I'm having issues with creating the actual data source.
I've followed the instructions located here:
Examples of Bulk Access to Data in Azure Blob Storage and
CREATE EXTERNAL DATA SOURCE (Transact-SQL). I'm using SQL Server 2016 on a VM, accessed via SSMS on a client machine with Windows Authentication, with no issues. The instructions say creating this external data source works for SQL Server 2016 and Azure Blob Storage.
I have created the Master Key:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<password>';
and, the database scoped credential
CREATE DATABASE SCOPED CREDENTIAL UploadCountries
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<key>';
I have verified both of these exist in the database by querying sys.symmetric_keys and sys.database_scoped_credentials.
However, when I try executing the following code it fails with Incorrect syntax near 'EXTERNAL':
CREATE EXTERNAL DATA SOURCE BlobCountries
WITH (
TYPE = BLOB_STORAGE,
LOCATION = 'https://<somewhere>.table.core.windows.net/<somewhere>',
CREDENTIAL = UploadCountries
);
Your thoughts and help are appreciated!
Steve.
In “Examples of Bulk Access to Data in Azure Blob Storage”, we can find:
Bulk access to Azure blob storage from SQL Server requires at least SQL Server 2017 CTP 1.1.
And in the Arguments section of “CREATE EXTERNAL DATA SOURCE (Transact-SQL)”, we can find similar information:
Use BLOB_STORAGE when performing bulk operations using BULK INSERT or OPENROWSET with SQL Server 2017
You are using SQL Server 2016, which is why you get the Incorrect syntax near 'EXTERNAL' error when you create an external data source for Azure Blob storage.
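As a hedged sketch of the working pattern on SQL Server 2017 or later (placeholder names; note the blob endpoint rather than the table endpoint used in the question):
CREATE EXTERNAL DATA SOURCE BlobCountries
WITH (
TYPE = BLOB_STORAGE,
LOCATION = 'https://<somewhere>.blob.core.windows.net/<container>',
CREDENTIAL = UploadCountries
);
-- then read the file with, for example (hypothetical table and file names):
BULK INSERT dbo.Countries
FROM 'Countries.csv'
WITH (DATA_SOURCE = 'BlobCountries', FORMAT = 'CSV', FIRSTROW = 2);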