Can I use a Databricks notebook to restructure blobs and save them in another Azure storage account?

I have incoming blobs arriving in an Azure storage account for every day-hour. Now I want to restructure the JSON inside the blobs and ingest them into Azure Data Lake.
I am using Azure Data Factory and Databricks.
Can someone let me know how to proceed? I have mounted the blob container to Databricks, but how do I create the new structure and then do the mapping?
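A minimal PySpark sketch of that flow, assuming both the blob container and the Data Lake are already mounted; the mount points, input path, and field names below are placeholders for illustration only:

```python
from pyspark.sql import functions as F

# `spark` is predefined in a Databricks notebook.
# Read the raw JSON blobs from the mounted blob storage path (placeholder path).
raw = spark.read.json("/mnt/incoming-blobs/2019/05/01/*.json")

# Reshape the JSON: rename, nest, or flatten fields as needed.
# All column names here are illustrative.
restructured = (
    raw
    .withColumn("event_time", F.to_timestamp("timestamp"))
    .select(
        "event_time",
        F.col("device.id").alias("device_id"),
        F.struct("payload.temperature", "payload.humidity").alias("readings"),
    )
)

# Write the restructured data to the mounted Data Lake path,
# partitioned by date so each day-hour batch stays separate.
(restructured
    .withColumn("date", F.to_date("event_time"))
    .write
    .partitionBy("date")
    .mode("append")
    .json("/mnt/datalake/restructured/"))
```

From Azure Data Factory you can then trigger the notebook with a Databricks Notebook activity on your hourly schedule, or write Parquet instead of JSON if the downstream consumers support it.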

Related

Reading data from Oracle storage cloud to external table

Hi, I have a CSV file in object storage in Oracle Cloud. I want to store this data in an external table that lives outside the cloud. Can anybody guide me on this?
How can I read the data from the cloud and store it in a table? I am using Oracle Gen 2 Cloud.
DBMS_CLOUD is the solution for the above issue.
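If the database can reach the object store, a minimal sketch looks like the following; it is run through python-oracledb purely for illustration, the credential name, file URL, and column list are placeholders, and DBMS_CLOUD.CREATE_CREDENTIAL is assumed to have been set up already:

```python
import oracledb

# Connect to a database where the DBMS_CLOUD package is available
# (connection details are placeholders).
conn = oracledb.connect(user="admin", password="<password>", dsn="mydb_high")
cur = conn.cursor()

# Create an external table over the CSV file in object storage.
# DBMS_CLOUD reads the file in place; no data is copied into the database.
cur.execute("""
BEGIN
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name      => 'SALES_EXT',
    credential_name => 'OBJ_STORE_CRED',
    file_uri_list   => 'https://objectstorage.<region>.oraclecloud.com/n/<namespace>/b/<bucket>/o/sales.csv',
    format          => json_object('type' value 'csv', 'skipheaders' value '1'),
    column_list     => 'ORDER_ID NUMBER, AMOUNT NUMBER, ORDER_DATE DATE'
  );
END;""")

# Query the external table like any other table.
cur.execute("SELECT COUNT(*) FROM sales_ext")
print(cur.fetchone())
```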

How to create a table in a blob storage using U-SQL

Is it possible to use U-SQL to create a table under a specified blob storage account?
If so, could you provide me an example?
If you mean Azure Tables, there is no capability in U-SQL yet to access Azure Tables. You should use Azure Data Factory to move data from ADLS to Azure Tables.
If you mean Azure Blob Storage, you can register your blob storage account with your ADL account (via the portal or the relevant Azure ADL PowerShell command) and then create a file inside the blob store with OUTPUT @result TO "wasb://container@account/path" USING ....

How to move a SharePoint list or Excel file to Azure SQL DW?

I want to copy data from SharePoint to Microsoft Azure SQL DW using Azure Data Factory or an alternative service. Can I do this? Please, can anyone help me with this?
You can do this by setting up an Azure Data Factory pipeline that copies the data to Azure Blob Storage. Afterwards, you can use Azure's fast PolyBase technology to load the data from blob storage into your SQL Data Warehouse instance.
Can I ask how much data you intend to load into the DW? Azure SQL Data Warehouse is intended for use with at least terabyte-level data, up to petabyte-scale compute and storage. I only ask because each SharePoint list or Excel file has a maximum size of 2 GB.

How to back up Azure Blob Storage?

Is there a way to back up Azure Blob Storage?
If we have to maintain a copy in another storage account or subscription, the cost will be doubled, right?
Is there a way to perform the backup at a reduced cost instead of double the cost?
Is there any other built-in functionality available in Azure, such as backing up zipped/compressed blobs, for backup purposes?
Configure your blob storage with the Cool access tier and copy it to another Cool-tier account, or to another storage provider such as Google Nearline or S3 IA.
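As a rough sketch, the copy to a second account can be scripted with the azure-storage-blob Python SDK; the connection strings and container names below are placeholders, and the source blobs are assumed to be reachable by URL (public, or with a SAS token appended):

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection strings for the primary and backup accounts.
src = BlobServiceClient.from_connection_string("<source-connection-string>")
dst = BlobServiceClient.from_connection_string("<backup-connection-string>")

src_container = src.get_container_client("data")
dst_container = dst.get_container_client("data-backup")

for blob in src_container.list_blobs():
    # Server-side, asynchronous copy; the data does not flow through this client.
    source_url = f"{src_container.url}/{blob.name}"  # append a SAS token if the source is private
    dst_container.get_blob_client(blob.name).start_copy_from_url(source_url)
```

Setting the backup account's default access tier to Cool (or archiving the copies with set_standard_blob_tier) keeps the storage cost of the second copy well below that of the primary Hot-tier data.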

Error trying to move data from Azure Table storage to Data Lake Store with Data Factory

I've been building a Data Factory pipeline to move data from my Azure Table storage to a Data Lake store, but the tasks fail with an exception that I can't find any information on. The error is:
Copy activity encountered a user error: ErrorCode=UserErrorTabularCopyBehaviorNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CopyBehavior property is not supported if the source is tabular data source.,Source=Microsoft.DataTransfer.ClientLibrary,'.
I don't know whether the problem lies in the datasets, the linked services, or the pipeline, and I can't seem to find any information at all on the error I'm seeing in the console.
Since the copy behavior from Azure Table Storage to Azure Data Lake Store is not currently supported, as a temporary workaround you could go from Azure Table Storage to Azure Blob Storage to Azure Data Lake Store:
Azure Table Storage to Azure Blob Storage
Azure Blob Storage to Azure Data Lake Store
I know this is not an ideal solution, but if you are under time constraints, it is just an intermediary step to get the data into the data lake.
HTH
The 'CopyBehavior' property is not supported when Table storage (which is not a file-based store) is used as the source in an ADF copy activity. That is why you are seeing this error message.