I would like to access/read/store a file from a linked (external) blob storage account - not from the primary Data Lake Store account. I added a new data source (external blob) in the Azure Portal.
How can I specify that I need to read from external blob?
I found the syntax in the documentation (example below), but I do not understand what BlobContainerName means:
wasb://<BlobContainerName>#<StorageAccountName>.blob.core.windows.net/Samples/Data/SearchLog.tsv
Thank you in advance
Peter
It's the name of your container. For example, if the file lives in a container named samples inside the storage account mystorageaccount, the path would be wasb://samples#mystorageaccount.blob.core.windows.net/Samples/Data/SearchLog.tsv. See Blob Service Concepts, and also WASB_Path_URI (Windows Azure Blob Storage Path URI).
Related
I have a Stream Analytics job with "Use System-assigned Managed Identity" enabled, and I would like to output its results to Data Lake Storage Gen2.
As far as I understand, I should only need to go into the storage account's IAM settings and add the Stream Analytics identity as a Storage Blob Data Owner. However, I don't see a category for Stream Analytics jobs in the dropdown, and I can't seem to find the service principal in any of the other ones.
Am I missing something here or is this scenario just not supported yet?
When adding the role assignment, in the Select field search for the name of your Stream Analytics job; you will find it there and can add it.
I have incoming blobs in an Azure storage account for every day and hour. Now I want to change the structure of the JSON inside those blobs and ingest them into Azure Data Lake.
I am using Azure Data Factory and Databricks.
Can someone let me know how to proceed? I have mounted the blob container to Databricks, but how do I create the new structure and then do the mapping?
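For orientation, here is a minimal PySpark sketch of that kind of flow, assuming the blob container is already mounted at /mnt/blob and an Azure Data Lake path is mounted at /mnt/datalake; the paths and JSON field names (id, payload) are placeholders for your own schema:

from pyspark.sql import functions as F

# Read the hourly JSON blobs; the wildcards pick up the day/hour folder structure (placeholder layout).
raw = spark.read.json("/mnt/blob/input/*/*.json")

# Reshape the JSON into the structure you want to land in the lake:
# rename fields, flatten nested values, and keep the original payload if needed.
reshaped = raw.select(
    F.col("id").alias("event_id"),                        # placeholder source field
    F.col("payload.timestamp").alias("event_time"),       # placeholder nested field
    F.to_json(F.struct("payload")).alias("raw_payload"),  # original payload kept as a string
)

# Write the new structure to the mounted data lake path.
reshaped.write.mode("append").parquet("/mnt/datalake/curated/events")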
Is it possible to use U-SQL to create a table under a specified blob storage account?
If so, could you provide me with an example?
If you mean Azure Tables, there is no capability in U-SQL yet to access Azure Tables. You should use Azure Data Factory to move data from ADLS to Azure Tables.
If you mean Azure Blob Storage, you can register your blob storage account with your ADL account (via the portal or the relevant Azure ADL PowerShell command) and then create a file inside the blob store with OUTPUT #result TO "wasb://container#account/path" USING ....
I've been building a Data Factory pipeline to move data from my Azure Table storage to a Data Lake Store, but the tasks fail with an exception that I can't find any information on. The error is:
Copy activity encountered a user error: ErrorCode=UserErrorTabularCopyBehaviorNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CopyBehavior property is not supported if the source is tabular data source.,Source=Microsoft.DataTransfer.ClientLibrary,'.
I don't know whether the problem lies in the datasets, the linked services, or the pipeline, and I can't seem to find any info at all on the error I'm seeing in the console.
Since copying from Azure Table Storage directly to Azure Data Lake Store is not currently supported, as a temporary workaround you could go from Azure Table Storage to Azure Blob Storage, and then from Azure Blob Storage to Azure Data Lake Store:
Azure Table Storage to Azure Blob Storage
Azure Blob Storage to Azure Data Lake Store
I know this is not an ideal solution, but if you are under time constraints it is just an intermediate step to get the data into the data lake.
HTH
The 'CopyBehavior' property is not supported when Table storage (which is not a file-based store) is used as the source in an ADF copy activity. That is why you are seeing this error message.
I have a VB.NET-based application which references an Azure SQL Database. I have set up a storage account to which I would like to store files from the application, but I am not sure how to create the link between the database and the storage account.
Going through the "SQL Server Data Files in Windows Azure Storage service" tutorial, I cannot create a URI for the storage blob. Using Azure Storage Explorer, I select my container, go into security, and generate a signature, which all works fine. When I test the URI with the "Test in Browser" button I get this error:
<Error>
<Code>AuthenticationFailed</Code>
<Message>
Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature. RequestId:22ab2830-0001-001d-67a0-5460bb000000 Time:2014-10-17T14:06:11.9864269Z
</Message>
<AuthenticationErrorDetail>
Signature did not match. String to sign used was r 2014-10-17T06:00:00Z 2014-10-25T06:00:00Z /macrocitrus/$root 2014-02-14
</AuthenticationErrorDetail>
</Error>
As to what this means, I have no idea. I am a completely new user of Windows Azure, so I am not even sure that I am on the right track.
Is there any documentation that actually explains the steps, or what one would need, to allow storage access from a SQL database to an Azure storage account?
I would not recommend saving the binary content in your SQL Database. Instead, I would recommend saving it in blob storage. Here are my reasons:
Blob storage is designed for that purpose.
Storing data in blob storage is much, much cheaper than storing the data in SQL Database.
By storing binary data with other data, you're unnecessarily making your data access layer bulkier as all the data will be streamed through your database.
The general approach in these kinds of scenarios is to keep binary data in blob storage as blobs (think of blobs as files in the cloud). Since each blob gets a unique URL, you can just store that URL in your SQL Database table. With this approach, you first upload the file as a blob to blob storage, get the blob's URL, and then update the database.
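As a rough illustration of that flow (the question's app is VB.NET, but the steps are the same with the .NET storage SDK), here is a minimal sketch using the current Python SDK; the connection strings, container name, table, and row are placeholders:

import pyodbc
from azure.storage.blob import BlobServiceClient

blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
blob_client = blob_service.get_blob_client(container="uploads", blob="myawesomepicture.png")

# 1. Upload the file to blob storage.
with open("myawesomepicture.png", "rb") as data:
    blob_client.upload_blob(data, overwrite=True)

# 2. Each blob gets a unique URL; record that URL in the SQL Database row.
with pyodbc.connect("<sql-database-connection-string>") as conn:
    conn.execute(
        "UPDATE dbo.Documents SET BlobUrl = ? WHERE DocumentId = ?",  # placeholder table/row
        blob_client.url, 42,
    )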
If you search for uploading files to blob storage, I am sure you will find plenty of complete examples with source code, so I won't walk through a full sample here.
Now coming to the error you're getting: the link you created using Azure Storage Explorer is known as a Shared Access Signature (SAS) URL, which grants time-limited, permission-bound access to your storage account. Azure Storage Explorer gave you a SAS URL for the container. There are two ways you can use that URL, assuming you granted Read & List permissions when creating it (both are shown in the short sketch below):
To list the blobs in that container, just append restype=container&comp=list to the URL's query string, paste it in the browser, and you will see an XML listing of all blobs.
To download a blob, you would need to insert the name of the blob in the URL. So if your URL is like https://[youraccount].blob.core.windows.net/[yourcontainer]?[somestuffhere] and your blob name is myawesomepicture.png, your SAS URL for viewing the file in the browser would be https://[youraccount].blob.core.windows.net/[yourcontainer]/myawesomepicture.png?[somestuffhere]
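To make that concrete, here is a small sketch of both options in Python using the requests package; the account, container, SAS token, and blob name are placeholders taken from the examples above:

import requests

# Container-level SAS URL from Azure Storage Explorer (placeholder values).
sas_url = "https://youraccount.blob.core.windows.net/yourcontainer?sv=...&sig=..."
base, query = sas_url.split("?", 1)

# 1. List the blobs in the container: append restype=container&comp=list to the query string.
listing = requests.get(f"{base}?restype=container&comp=list&{query}")
print(listing.text)  # XML listing of all blobs

# 2. Download a specific blob: insert the blob name into the path before the query string.
blob = requests.get(f"{base}/myawesomepicture.png?{query}")
with open("myawesomepicture.png", "wb") as f:
    f.write(blob.content)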
I wrote a blog post on using Shared Access Signatures which you may find useful: http://gauravmantri.com/2013/02/13/revisiting-windows-azure-shared-access-signature/.