Azure File Storage, using SAS

I am confused about using Azure File Storage and SAS. I thought the point of a SAS was to be able to have access to a resource without needing the account key.
I am looking to do a simple file upload. When looking into this, the examples provided all start by creating a CloudStorageAccount with .Parse on the connection string, which contains the account key.
Can anyone point to a sample project or sample code that uploads a file to an Azure File Storage account but doesn't use the account key?

You can use the code below to construct a CloudFile object from a SAS token:
var cloudFile = new CloudFile(new Uri(fileUri), new StorageCredentials(sasToken));
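
A fuller sketch of an upload that never touches the account key (assuming the WindowsAzure.Storage package and a service SAS with write permission on the share; the URI, SAS token, and local path below are placeholders):

using System;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.File;

class SasUpload
{
    static void Main()
    {
        // Placeholders: substitute your own file URI and SAS token.
        var fileUri = "https://myaccount.file.core.windows.net/myshare/report.txt";
        var sasToken = "?sv=2017-04-17&sig=...";

        // The SAS token carries the authorization, so no account key is needed.
        var cloudFile = new CloudFile(new Uri(fileUri), new StorageCredentials(sasToken));
        cloudFile.UploadFromFile(@"C:\temp\report.txt");
    }
}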

Related

How to write to Blob Storage from Azure SQL Server using T-SQL?

I'm creating a stored procedure that gets executed when a CSV is uploaded to Blob Storage. This file is then processed using T-SQL, and I wish to write the result to a file.
I have been able to read and process a file using DATA_SOURCE, a database scoped credential, and an external data source. However, I'm stuck on writing the output back to a different blob container. How would I do this?
If it were me, I'd use Azure Data Factory: you can create a pipeline that's triggered when a file is added to a blob container, have it import that file, run a stored procedure, and export the results to a blob.
Alternatively, that could be an Azure Function that is triggered on changes to a blob container, along the lines of the sketch below.
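
A minimal sketch of that Function route (C# class-library model; the container names, connection setting, and stored procedure are all hypothetical):

using System;
using System.Data.SqlClient;
using System.IO;
using Microsoft.Azure.WebJobs;

public static class ProcessCsv
{
    // Fires when a CSV lands in the "incoming" container and writes the
    // stored procedure's result to the "processed" container.
    [FunctionName("ProcessCsv")]
    public static void Run(
        [BlobTrigger("incoming/{name}")] Stream input,
        [Blob("processed/{name}", FileAccess.Write)] Stream output,
        string name)
    {
        using (var conn = new SqlConnection(
            Environment.GetEnvironmentVariable("SqlConnectionString")))
        using (var cmd = new SqlCommand("EXEC dbo.ProcessCsvFile @file", conn))
        {
            cmd.Parameters.AddWithValue("@file", name);
            conn.Open();

            // Stream each row of the procedure's result set to the output blob.
            using (var reader = cmd.ExecuteReader())
            using (var writer = new StreamWriter(output))
            {
                while (reader.Read())
                    writer.WriteLine(reader.GetString(0));
            }
        }
    }
}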

Is there any way to check programmatically whether a SAS URL for Azure Blob Storage has expired?

Is there any way to check programmatically whether a SAS URL for Azure Blob Storage has expired or not? I've looked through MSDN but couldn't find any useful information on that.
It depends on how your SAS is generated. If it's generated directly with the account key, you can check the se query parameter for the expiry time.
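A minimal sketch of that check (assuming .NET; System.Web supplies the query-string parser, and the helper name is mine):

using System;
using System.Web;

static class SasHelper
{
    // Returns true if the SAS URL's "se" (signed expiry) has passed.
    public static bool IsExpired(string sasUrl)
    {
        var query = HttpUtility.ParseQueryString(new Uri(sasUrl).Query);
        var se = query["se"]; // ISO 8601 expiry, e.g. 2018-01-31T00:00:00Z
        if (se == null)
            return false; // no "se": the expiry likely lives in a stored access policy

        return DateTimeOffset.Parse(se).ToUniversalTime() <= DateTimeOffset.UtcNow;
    }
}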
If your SAS is generated from a stored access policy, the expiry time can't be read directly from the SAS string; you have to fetch the properties of the corresponding stored access policy from the Azure Blob Storage service (the Get Container ACL API).
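For that case, a sketch like the following reads the expiry from the container's ACL. Note this needs the account key, since a SAS alone cannot read the ACL; the container and policy names are placeholders:

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class PolicyExpiry
{
    static void Main()
    {
        var account = CloudStorageAccount.Parse("<storage-connection-string>");
        var container = account.CreateCloudBlobClient()
                               .GetContainerReference("mycontainer");

        // Fetch the container's stored access policies and read the expiry.
        BlobContainerPermissions permissions = container.GetPermissions();
        SharedAccessBlobPolicy policy = permissions.SharedAccessPolicies["mypolicy"];
        Console.WriteLine(policy.SharedAccessExpiryTime <= DateTimeOffset.UtcNow
            ? "expired" : "still valid");
    }
}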

Azure SQL Server: Cannot drop the external data source because it is being used

I am trying to import data into Azure SQL Server using OPENROWSET from a file that was uploaded to Azure Storage.
I basically follow the same flow as Microsoft's example:
-- Master key to encrypt the credential secret.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<password>';

-- Credential used to authenticate against the storage account.
CREATE DATABASE SCOPED CREDENTIAL ElasticDBQueryCred
WITH IDENTITY = '<username>',
SECRET = '<password>';

-- External data source pointing at the storage location.
CREATE EXTERNAL DATA SOURCE MyElasticDBQueryDataSrc
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'Azure Storage URL',
    CREDENTIAL = ElasticDBQueryCred
);
Since my secret is only valid for one day, each time my batch runs I drop and recreate everything from the EXTERNAL DATA SOURCE down to the MASTER KEY.
Everything ran fine until suddenly one of the EXTERNAL DATA SOURCEs I created could not be dropped, failing with the error:
Cannot drop the external data source 'xxx' because it is being used.
This leaves my CREDENTIAL stuck as well, since I must drop the EXTERNAL DATA SOURCE before I can drop the CREDENTIAL.
I think this might be a bug in Azure SQL Server. Any idea how I can drop it properly and prevent this error in the future?
I could leave the hanging resource alone and create a new one for my import, but is that good practice?
The error is also mentioned here: https://social.msdn.microsoft.com/Forums/office/en-US/6e255ed0-21ad-4a52-b3b4-03fe94e05820/azure-sql-i-cant-drop-external-blobstorage-data-source-though-sysexternaltables-is-empty?forum=ssdsgetstarted
But rescaling the database or recreating it each time the error happens might not be a good idea.
Edit: I still haven't found a solution to this problem. In my case, the external data source hung because of a mistake I made inside the FORMAT file (the one that defines the layout of the import file, which I also store in Azure Storage) while implementing new features. Even after I fixed the FORMAT file, the hanging external data source was still there and could not be dropped.
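
For reference, a guarded version of the daily teardown described above, run from a .NET batch (a sketch only; it does not cure the "being used" error, and the names come from the CREATE script earlier):

using System;
using System.Data.SqlClient;

class Teardown
{
    static void Main()
    {
        const string sql = @"
            IF EXISTS (SELECT 1 FROM sys.external_data_sources
                       WHERE name = 'MyElasticDBQueryDataSrc')
                DROP EXTERNAL DATA SOURCE MyElasticDBQueryDataSrc;
            IF EXISTS (SELECT 1 FROM sys.database_scoped_credentials
                       WHERE name = 'ElasticDBQueryCred')
                DROP DATABASE SCOPED CREDENTIAL ElasticDBQueryCred;";

        using (var conn = new SqlConnection("<azure-sql-connection-string>"))
        using (var cmd = new SqlCommand(sql, conn))
        {
            conn.Open();
            // Still fails with 'Cannot drop the external data source ...
            // because it is being used' while another session references it.
            cmd.ExecuteNonQuery();
        }
    }
}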

U-SQL with the Azure Data Lake Store .NET SDK

Can you please advise whether I can use U-SQL with the Azure Data Lake Store .NET SDK?
I need to upload some files to Data Lake Store, so I need to use the Azure Data Lake Store SDK, and I also need to add some records to an Azure SQL Server database.
So I created a class library with a function that uploads files to Data Lake Store, as described in the link below:
https://learn.microsoft.com/en-us/azure/data-lake-store/data-lake-store-get-started-net-sdk
Then I call this function from U-SQL, but it does not work and throws an error.
Can you please advise whether this is actually possible, or whether I need to use another approach?
Thanks
The SDK is meant to be used from outside U-SQL, e.g., from your client machine or a VM outside of ADL, where the data that you want to upload lives.
If your files live inside ADLS already, see the answer to this question.
For more information why you cannot connect to web end points, see this reply.
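For completeness, a typical client-side upload with the SDK looks roughly like this (a sketch, assuming the Microsoft.Azure.DataLake.Store package and an Azure AD service principal; all names are placeholders):

using System.IO;
using Microsoft.Azure.DataLake.Store;
using Microsoft.Rest.Azure.Authentication;

class AdlsUpload
{
    static void Main()
    {
        // Authenticate as a service principal (tenant/app/secret are placeholders).
        var creds = ApplicationTokenProvider
            .LoginSilentAsync("<tenant>", "<app-id>", "<secret>").Result;
        var client = AdlsClient.CreateClient("myadls.azuredatalakestore.net", creds);

        // Create a file in the store and write to it from the client machine.
        using (var stream = client.CreateFile("/data/upload.txt", IfExists.Overwrite))
        using (var writer = new StreamWriter(stream))
        {
            writer.WriteLine("hello from outside U-SQL");
        }
    }
}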

Storing files in Azure SQL Database

I have a VB.NET-based application that references an Azure SQL Database. I have set up a storage account to which I would like to store files from the application, but I am not sure how to create the link between the database and the storage account.
Going through the "SQL Server Data Files in Windows Azure Storage service" tutorial, I cannot create a URI for the storage blob. Using Azure Storage Explorer, I select my container, go into security, and generate a signature, which all works fine. When I test the URI with the "Test in Browser" button I get this error:
<Error>
  <Code>AuthenticationFailed</Code>
  <Message>
    Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:22ab2830-0001-001d-67a0-5460bb000000 Time:2014-10-17T14:06:11.9864269Z
  </Message>
  <AuthenticationErrorDetail>
    Signature did not match. String to sign used was r 2014-10-17T06:00:00Z 2014-10-25T06:00:00Z /macrocitrus/$root 2014-02-14
  </AuthenticationErrorDetail>
</Error>
I have no idea what this means. I am a completely new user of Windows Azure, so I am not even sure that I am on the right track.
Is there any documentation that actually explains the steps or what one would require to allow storage access from an SQL DB to an Azure Storage account?
I would not recommend saving the binary content in SQL Database. Instead, I would recommend that you save it in blob storage. Here are my reasons:
Blob storage is designed for that purpose.
Storing data in blob storage is much, much cheaper than storing it in SQL Database.
By storing binary data with other data, you're unnecessarily making your data access layer bulkier as all the data will be streamed through your database.
The general approach in these kinds of scenarios is to keep binary data in blob storage as blobs (think of blobs as files in the cloud). Since each blob gets a unique URL, you can just store the URL in your SQL Database table. So with this approach, you first upload the blob to blob storage, get its URL, and then update the database, along the lines of the sketch below.
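A minimal sketch of that flow (assuming the WindowsAzure.Storage client library; the table, container, blob name, and connection strings are placeholders):

using System;
using System.Data.SqlClient;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class UploadAndRecord
{
    static void Main()
    {
        // Upload the file as a block blob.
        var account = CloudStorageAccount.Parse("<storage-connection-string>");
        var container = account.CreateCloudBlobClient()
                               .GetContainerReference("files");
        var blob = container.GetBlockBlobReference("myawesomepicture.png");
        blob.UploadFromFile(@"C:\temp\myawesomepicture.png");

        // Store only the blob's URL in the database; the bytes stay in blob storage.
        using (var conn = new SqlConnection("<sql-connection-string>"))
        using (var cmd = new SqlCommand(
            "INSERT INTO Files (Url) VALUES (@url)", conn))
        {
            cmd.Parameters.AddWithValue("@url", blob.Uri.ToString());
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}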
If you search for uploading files to blob storage, you will find many more complete examples with source code.
Now, coming to the error you're getting: the link you created using Azure Storage Explorer is known as a Shared Access Signature (SAS) URL, which grants time-limited, permission-bound access to your storage account. Azure Storage Explorer gave you a SAS URL for the container. There are two ways you can use that URL (assuming you granted Read & List permissions when creating it):
To list the blobs in that container, append restype=container&comp=list to your URL, paste it in the browser, and you will see an XML listing of all blobs.
To download a blob, you would need to insert the name of the blob in the URL. So if your URL is like https://[youraccount].blob.core.windows.net/[yourcontainer]?[somestuffhere] and your blob name is myawesomepicture.png, your SAS URL for viewing the file in the browser would be https://[youraccount].blob.core.windows.net/[yourcontainer]/myawesomepicture.png?[somestuffhere]
I wrote a blog post on using Shared Access Signatures which you may find useful: http://gauravmantri.com/2013/02/13/revisiting-windows-azure-shared-access-signature/.