Azure Data Lake - Data Security and Compliance

If the data in the Azure Data Lake is deleted, is the data fully deleted and non-retrievable?

Unfortunately, yes: once data in the Azure Data Lake is deleted, it is fully deleted and cannot be retrieved, though you may have been prompted to create a backup first. If you have a backup, try restoring from it; that always works.

Related

Azure Synapse Lake Database Not Appearing in Built-in Serverless Pool List

I have created a new Azure Lake database using the following procedure.
The Lake database is called TestLakeDB.
However, when I check the list of databases available under Use database, TestLakeDB doesn't appear.
Any thoughts?
Thanks for the valuable discussion. Posting your conversation as an answer to help other community members who face similar issues.
When we create a Lake database after connecting to GitHub, it won't show up under Use database, because it was created in GitHub mode.
To make the Lake database appear, create it in Synapse live mode and then connect to GitHub. Now the database named Lake_Database1, which was created in Synapse live mode, shows up under Use database.
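Once the database exists in live mode, a quick way to confirm that the built-in serverless pool can see it is to query the serverless SQL endpoint directly. A minimal T-SQL sketch, assuming only the TestLakeDB database from the question:

-- Run against the built-in serverless SQL pool.
-- sys.databases lists every database the pool can see.
SELECT name FROM sys.databases;

-- If TestLakeDB appears in the list above, switching to it should now succeed.
USE TestLakeDB;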

Write to Data Lake in Stream Analytics

Is there a way to have an output to Data Lake from Stream Analytics that uses an AAD app, or something else other than my personal account, to write to the Data Lake? It is not practical to have a user account be the identity that writes to the data lake.
According to your description, I checked and tested the Azure Data Lake Store output for Azure Stream Analytics, and I found that this output uses my current account for authorization, as you mentioned.
Moreover, the Renew Data Lake Store authorization section mentions the following:
Currently, there is a limitation where the authentication token needs to be manually refreshed every 90 days for all jobs with Data Lake Store output.
For your requirement, I would suggest adding feedback here. Alternatively, you could choose another output type to temporarily store your results, then use a background task that reads the records from the temporary store and writes them to your Data Lake. For that approach, you could leverage service-to-service authentication with Data Lake Store.
For now the answer is no, but support is planned to be available by the end of 2018:
https://feedback.azure.com/forums/270577-stream-analytics/suggestions/15367185-please-provide-support-for-azure-data-lake-store-o

Azure Data Factory Copy Activity to Copy to Azure Data Lake Table

I need to copy data incrementally from an on-prem SQL Server into a table in Azure Data Lake Store.
But when creating the Copy Activity in the Azure portal, in the destination I only see folders (no option for tables).
How can I do scheduled on-prem table to Data Lake table syncs?
Data Lake Store does not have a notion of tables. It is a file storage system (like HDFS). You can, however, use capabilities such as Hive or Data Lake Analytics on top of the data stored in Data Lake Store to conform your data to a schema. In Hive, you can do that using external tables (see the sketch after this answer), while in Data Lake Analytics you can run a simple extract script.
I hope this helps!
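To make the external-table approach concrete, here is a minimal Hive sketch, assuming an HDInsight cluster attached to the Data Lake Store account; the store name, folder path, and columns are illustrative, not from the question:

-- HiveQL: project a schema onto CSV files already sitting in the lake.
-- Dropping an EXTERNAL table removes only the metadata, not the files.
CREATE EXTERNAL TABLE clickstream (
    user_id    STRING,
    event_time STRING,
    url        STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION 'adl://mystore.azuredatalakestore.net/clickstream/';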
Azure Data Lake Analytics (ADLA) does have the concept of databases, which have tables. However, they are not currently exposed as a target in Data Factory. I believe it's on the backlog, although I can't find the reference right now.
What you could do is use Data Factory to copy the data into Data Lake Store, then run a U-SQL script which imports it into the ADLA database (a sketch follows at the end of this answer).
If you do feel this is an important feature, you can create a request here and vote for it:
https://feedback.azure.com/forums/327234-data-lake
ADLA Databases and tables:
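For illustration, a hedged U-SQL sketch of that copy-then-import step; the database name, schema, and staging path are assumptions, not taken from the question:

// U-SQL: create an ADLA database and table, then import the file
// that the Data Factory copy activity landed in the lake.
CREATE DATABASE IF NOT EXISTS SalesDb;
USE DATABASE SalesDb;

CREATE TABLE IF NOT EXISTS dbo.Orders
(
    OrderId int,
    Amount decimal,
    INDEX idx_orders CLUSTERED (OrderId ASC)
    DISTRIBUTED BY HASH (OrderId)
);

// Read the staged CSV, skipping its header row...
@rows =
    EXTRACT OrderId int,
            Amount decimal
    FROM "/staging/orders.csv"
    USING Extractors.Csv(skipFirstNRows : 1);

// ...and land the rows in the ADLA table.
INSERT INTO dbo.Orders
SELECT * FROM @rows;

Scheduling the copy activity plus a U-SQL activity like this in one Data Factory pipeline gives the recurring on-prem-to-table sync the question asks about.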

Error trying to move data from Azure Table to Data Lake Store with Data Factory

I've been building a Data Factory pipeline to move data from my Azure Table storage to a Data Lake store, but the tasks fail with an exception that I can't find any information on. The error is:
Copy activity encountered a user error: ErrorCode=UserErrorTabularCopyBehaviorNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=CopyBehavior property is not supported if the source is tabular data source.,Source=Microsoft.DataTransfer.ClientLibrary,'.
I don't know whether the problem lies in the datasets, the linked services, or the pipeline, and I can't seem to find any info at all on the error I'm seeing in the console.
Since copying directly from Azure Table Storage to Azure Data Lake Store is not currently supported, as a temporary workaround you could go from Azure Table Storage to Azure Blob Storage to Azure Data Lake Store:
Azure Table Storage to Azure Blob Storage
Azure Blob Storage to Azure Data Lake Store
I know this is not an ideal solution, but if you are under time constraints, it is just an intermediary step to get the data into the data lake.
HTH
The 'copyBehavior' property is not supported when the source is a tabular data store such as Table storage (which is not a file-based store), and that is the source you are using in your ADF copy activity. That is why you are seeing this error message.

Is Azure Table storage a column-oriented database like HBase

I wonder how data is stored on disk in Azure Table storage. Is it stored in a columnar format like HBase?
Microsoft Azure Table is a form of Microsoft Azure Storage, a scalable cloud storage system. There are three layers within an Azure Storage stamp; the stream layer stores the bits on disk and is in charge of distributing and replicating the data across many servers to keep the data durable within a stamp. Please see the "Stream Layer" section in the following paper (http://sigops.org/sosp/sosp11/current/2011-Cascais/11-calder-online.pdf) to understand how the data is managed on the hardware.
I can't say for sure, but I don't think so. Azure Table Storage is a key-value store. For column-family storage on Azure, the closest analogue is HBase itself, running on HDInsight (Azure's managed Hadoop service).