Does Data Lake Analytics have support for creating relational databases? - azure-data-lake

It seems Data Lake Analytics has support for creating databases, but can we create relational databases too?

Each Azure Data Lake Analytics account can contain databases that hold relational data, which you can create, manage, and query with U-SQL.
That data is stored in a scaled-out fashion inside Azure Data Lake Store, so in that sense it is creating relational databases.
However, it does not give you the ability to create a SQL Server, Oracle, or MySQL database.
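For illustration, a minimal U-SQL sketch of creating such a database and a managed table inside it (the database, table, and column names below are hypothetical):

    // Minimal U-SQL sketch: create an ADLA database and a managed table in it.
    // Names (SampleDb, dbo.Events, columns) are placeholders.
    CREATE DATABASE IF NOT EXISTS SampleDb;
    USE DATABASE SampleDb;

    CREATE TABLE IF NOT EXISTS dbo.Events
    (
        EventId int,
        EventDate DateTime,
        Payload string,
        // Managed U-SQL tables require a clustered index and a distribution scheme.
        INDEX idx_Events CLUSTERED (EventId ASC)
            DISTRIBUTED BY HASH (EventId)
    );

The table is then queried with U-SQL scripts, while the underlying data remains stored in the Data Lake Store account.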

Related

Access Azure Table Storage in SQL Server

I'm trying to access Azure Table Storage in a Gen 2 data lake from Azure SQL Server, but I can't find any documentation. There is plenty on how to get to CSVs in blob storage, but nothing on Azure tables.
Any ideas?
John
Your requirement isn't feasible.
Azure Table storage is a service that stores non-relational structured data (also known as structured NoSQL data) in the cloud, providing a key/attribute store with a schemaless design.
Since Table storage can't be queried using SQL, there is no way to access it directly from SQL Server.
I recommend that you first go through the Table storage concepts before looking at how to query it.
Once you understand the Table Storage structure, you can query the tables either through the REST API or the Cosmos DB Table API, depending on your application. Refer to Querying tables and entities.
You can also follow the complete tutorial Quickstart: Build a Table API app with the .NET SDK and Azure Cosmos DB to create a basic application using Table Storage for learning purposes.

Exporting some tables from Azure SQL to Cosmos DB for PowerBI visualization

I have a situation where most of my organization's data sits in Azure SQL. Now we are adding one more data source which produces 1 GB of structured data daily. My objective is to merge this 1 GB with the Azure SQL data and hand it to Power BI. I can't do the merge in Power BI itself because it is too slow.
Should I choose Cosmos DB as a sink and replicate some of the Azure SQL tables to Cosmos DB, so that I can have a stored procedure in Cosmos DB that does the required merge and places the result in final containers for Power BI to pick up?
Or should I choose some other sink? (I could use an append blob, but it has a 200 GB upper limit; I could use Azure Table storage, but then how would I merge it with the Azure SQL data?)
Any recommendations on what to do?

How to see Azure table data in SSMS

As the title suggests, I want to merge my Azure table data with the data residing in Azure SQL. However, I don't want to replicate the Azure SQL data into Azure Table storage.
Is there any way to get the Azure table into SSMS so that I could create a view over the Azure table and Azure SQL together?
Azure Table Storage is not a relational database and does not support join queries. To join existing data in Azure SQL Database with data in Azure Table Storage, you will have to replicate/import your Azure Table Storage data into Azure SQL and then use SSMS to perform the join. That's the only solution.
Elastic Database queries on Azure SQL Database allow cross-database queries involving Azure SQL Database tables or Azure SQL Data Warehouse tables only.
What you want for something like this is Elastic Query. You can read more about it here. Pay attention to the warning about data movement, though. It is good for querying, but it's not a great way to do large-scale data moves.
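For context, a hedged T-SQL sketch of what an elastic query setup looks like between two Azure SQL databases (all server, database, credential, table, and column names below are placeholders; note this works only against Azure SQL Database or SQL Data Warehouse, not Table storage):

    -- Run in the local Azure SQL database. All names are hypothetical.
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

    CREATE DATABASE SCOPED CREDENTIAL RemoteDbCred
    WITH IDENTITY = 'remote_login', SECRET = '<remote password>';

    -- Point at the remote Azure SQL database that holds the other tables.
    CREATE EXTERNAL DATA SOURCE RemoteDbSrc
    WITH
    (
        TYPE = RDBMS,
        LOCATION = 'remoteserver.database.windows.net',
        DATABASE_NAME = 'RemoteDb',
        CREDENTIAL = RemoteDbCred
    );

    -- The external table definition must match the remote table's schema.
    CREATE EXTERNAL TABLE dbo.RemoteOrders
    (
        OrderId INT,
        CustomerId INT,
        Amount DECIMAL(18, 2)
    )
    WITH (DATA_SOURCE = RemoteDbSrc);

    -- The external table can now be joined with local tables from SSMS.
    SELECT c.CustomerName, SUM(o.Amount) AS Total
    FROM dbo.Customers AS c
    JOIN dbo.RemoteOrders AS o ON o.CustomerId = c.CustomerId
    GROUP BY c.CustomerName;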

Access Azure Data Lake Analytics Tables from SQL Server Polybase

I need to export a multi-terabyte dataset processed via Azure Data Lake Analytics (ADLA) into a SQL Server database.
Based on my research so far, I know that I can write the ADLA output to Data Lake Store or WASB using built-in outputters and then read the output data from SQL Server using Polybase.
However, materializing the result of the ADLA processing as an ADLA table seems pretty enticing to us. It is a clean solution: no files to manage, multiple readers, built-in partitioning, distribution keys, and the potential for other processes to access the tables.
If we use ADLA tables, can I access them via Polybase? If not, is there any way to access the files underlying the ADLA tables directly from Polybase?
I know that I can probably do this using ADF, but at this point I want to avoid ADF as much as possible, both to minimize costs and to keep the process simple.
Unfortunately, Polybase support for ADLA tables is still on the roadmap and not yet available. Please file a feature request through the SQL Data Warehouse UserVoice page.
The suggested workaround is to produce the output as CSV in ADLA, create the partitioned and distributed table in SQL DW, and then use Polybase to read the data and fill the SQL DW managed table.
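As a rough sketch of that workaround (all account names, paths, credentials, and columns below are hypothetical): on the ADLA side, OUTPUT the result set to a folder with Outputters.Csv(); on the SQL DW side, a Polybase external table over that folder plus a CTAS load might look like this:

    -- SQL DW side of the workaround. Assumes a database master key already exists.
    -- Service principal identity/secret, store name, path, and columns are placeholders.
    CREATE DATABASE SCOPED CREDENTIAL AdlsCredential
    WITH IDENTITY = '<client_id>@<oauth_2.0_token_endpoint>',
         SECRET = '<client_secret>';

    CREATE EXTERNAL DATA SOURCE AdlsStore
    WITH
    (
        TYPE = HADOOP,
        LOCATION = 'adl://mydatalakestore.azuredatalakestore.net',
        CREDENTIAL = AdlsCredential
    );

    CREATE EXTERNAL FILE FORMAT CsvFormat
    WITH
    (
        FORMAT_TYPE = DELIMITEDTEXT,
        FORMAT_OPTIONS (FIELD_TERMINATOR = ',', STRING_DELIMITER = '"')
    );

    -- External table over the CSV folder produced by the ADLA OUTPUT statement.
    CREATE EXTERNAL TABLE dbo.ResultsExternal
    (
        Id BIGINT,
        EventDate DATETIME2,
        Payload NVARCHAR(4000)
    )
    WITH
    (
        LOCATION = '/output/results/',
        DATA_SOURCE = AdlsStore,
        FILE_FORMAT = CsvFormat
    );

    -- Load into a distributed, managed SQL DW table.
    CREATE TABLE dbo.Results
    WITH (DISTRIBUTION = HASH(Id), CLUSTERED COLUMNSTORE INDEX)
    AS
    SELECT Id, EventDate, Payload
    FROM dbo.ResultsExternal;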

Azure Data Factory Copy Activity to Copy to Azure Data Lake Table

I need to copy data incrementally from an on-premises SQL Server into a table in Azure Data Lake Store.
But when creating the Copy Activity using the Azure portal, I only see folders in the destination (no option for tables).
How can I do scheduled on-premises table to Data Lake table syncs?
Data Lake Store does not have a notion of tables; it is a file storage system (like HDFS). You can, however, use capabilities such as Hive or Data Lake Analytics on top of the data stored in Data Lake Store to conform it to a schema. In Hive, you can do that using external tables, while in Data Lake Analytics you can run a simple extract script.
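For the Hive route, a minimal HiveQL sketch (store name, path, and columns are hypothetical) that projects a schema over files already copied into Data Lake Store:

    -- HiveQL: expose files in Data Lake Store as an external table (no data is moved).
    CREATE EXTERNAL TABLE customers_staged
    (
        customer_id   INT,
        modified_date TIMESTAMP,
        amount        DECIMAL(18, 2)
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION 'adl://mydatalakestore.azuredatalakestore.net/staging/customers/';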
I hope this helps!
Azure Data Lake Analytics (ADLA) does have the concept of databases, which have tables. However, they are not currently exposed as a target in Data Factory. I believe it's on the backlog, although I can't find the reference right now.
What you could do is use Data Factory to copy the data into Data Lake Store and then run a U-SQL script that imports it into the ADLA database.
If you do feel this is an important feature, you can create a request here and vote for it:
https://feedback.azure.com/forums/327234-data-lake
ADLA databases and tables are described in the U-SQL documentation.
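As a rough illustration of that pattern (the file path, database, table, and column names below are hypothetical, and the target table is assumed to already exist), the U-SQL script run after the Data Factory copy might look like this:

    // U-SQL sketch: read the staged file that Data Factory copied into Data Lake Store
    // and insert it into an existing ADLA table. All names are placeholders.
    USE DATABASE SampleDb;

    @staged =
        EXTRACT CustomerId int,
                ModifiedDate DateTime,
                Amount decimal
        FROM "/staging/onprem/customers.csv"
        USING Extractors.Csv(skipFirstNRows: 1);

    INSERT INTO dbo.Customers
    SELECT CustomerId,
           ModifiedDate,
           Amount
    FROM @staged;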