Accessing Azure Data Lake folders from Windows Explorer - azure-data-lake

Is it possible to access Azure Data Lake folders from Windows Explorer through SMB or a file share, like we can with Azure File storage?

No, that is not possible.
You can try to write your own application to do it, but it's not an easy job.
You can also use Visual Studio or the Azure portal to access your data.
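If you do write your own application, a minimal sketch of listing folders might look like the following, using the azure-storage-file-datalake Python SDK (this targets ADLS Gen2; the account URL, file system name, and credential setup are assumptions for illustration):

from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Sketch: list top-level "folders" in an ADLS Gen2 file system.
# The account URL and file system name below are placeholders.
service = DataLakeServiceClient(
    account_url="https://<your-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client(file_system="<your-filesystem>")

# get_paths walks the hierarchy; is_directory distinguishes folders from files.
for path in fs.get_paths(recursive=False):
    if path.is_directory:
        print(path.name)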

Related

How can we get data from an Azure file share to on-premises

I'm investigating whether there is a feature to copy multiple folders
(Exports from Collections) from an Azure file share to an on-premises Accelerate file share (a Windows share).
Azure file shares are indeed supported by the Import/Export process:
"Azure Import/Export service is used to securely import large amounts of data to Azure Blob storage and Azure Files by shipping disk drives to an Azure datacenter."
You can read more about the feature and when it's best used here.
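Import/Export is aimed at bulk transfers by shipping physical disks; for a scripted copy of folders out of a file share, a rough sketch like this might serve (using the azure-storage-file-share Python SDK; the connection string, share, folder, and local path are all placeholders):

import os
from azure.storage.fileshare import ShareClient

# Sketch: recursively download a folder from an Azure file share to a
# local (on-premises) directory. All names below are placeholders.
share = ShareClient.from_connection_string(
    conn_str="<storage-connection-string>", share_name="<share-name>")

def download_dir(dir_client, local_root):
    os.makedirs(local_root, exist_ok=True)
    for item in dir_client.list_directories_and_files():
        if item["is_directory"]:
            download_dir(dir_client.get_subdirectory_client(item["name"]),
                         os.path.join(local_root, item["name"]))
        else:
            with open(os.path.join(local_root, item["name"]), "wb") as f:
                dir_client.get_file_client(item["name"]).download_file().readinto(f)

download_dir(share.get_directory_client("<export-folder>"), r"C:\exports")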

Sandbox environment for Data Lake Store and Analytics

Is there any sandbox environment for Data Lake Store and Analytics so that I don't have to use my Azure credits?
Azure Data Lake Analytics (ADLA) does have a mode for local execution. You install an emulator and this enables you to run your U-SQL scripts from Visual Studio either against your local instance or your Azure ADLA account.
Some reading on the topic:
https://azure.microsoft.com/en-gb/blog/run-u-sql-scripts-locally-with-updated-azure-data-lake-tools-for-visual-studio/
https://learn.microsoft.com/en-us/azure/data-lake-analytics/data-lake-analytics-u-sql-sdk
You can use the ADL tools with Visual Studio Community Edition, which gives you this experience for free.

Is There a Local Emulator for the Azure Data Lake Store

When developing for Azure storage accounts, I can run the Microsoft Storage Emulator to locally keep Blobs, Queues, and Tables without having to connect to Azure online.
Is there something equivalent for the Azure Data Lake Store? It would be nice to develop locally for a while without having to connect to Azure online.
Have you tried Visual Studio with the Azure Data Lake Tools plug-in?
As pointed out by David, you can develop Azure Data Lake Analytics (ADLA) projects locally without needing connectivity to Azure for the ADLA account or the associated Azure Data Lake Store (ADLS) account. Is there some other application you would like to use with ADLS?
Thanks,
Sachin Sheth
Azure Data Lake team
Same problem here.
AFAIK the Storage Emulator is not yet able to really handle Data Lake (ADLS Gen2) requests.
This URI works (but it looks for a file, not a directory):
http://127.0.0.1:10000/devstoreaccount1/packages-container/Dir/SubDir?sv=2020-04-08&se=2022-10-13T14%3A43%3A39Z&sr=b&sp=rcwl&sig=d2SxwYCkJGyx%2BHac9vntYQZOTt5QVs1bKgKb4%2FgcQ9k%3D
This one doesn't:
http://127.0.0.1:10000/devstoreaccount1/packages-container/Dir/SubDir?sv=2020-04-08&se=2022-10-13T14%3A43%3A39Z&sr=d&sp=rcwl&sdd=2&sig=KU%2Fcu6W0Nsv8CucMgusubo8RbXWabFO8nDMkFxU1tTw%3D
Error: Status: 403 (Server failed to authenticate the request. Make sure the value of the Authorization header is formed correctly including the signature.)
ErrorCode: AuthorizationFailure
The difference is that the second one uses the resource 'sr=d' (directory) while the first uses 'sr=b' (blob).
Both URIs work against real Azure Storage (with ADLS Gen2).
The request is already tracked here: https://github.com/Azure/Azurite/issues/553
Tested on VS 2022 17.3.6 using Server: Azurite-Blob/3.18.0
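To reproduce the difference programmatically, a small sketch like the following can be used (the SAS tokens are placeholders for ones you generate yourself; Azurite's default blob endpoint is assumed):

import requests

# Compare a blob-scoped (sr=b) and a directory-scoped (sr=d) SAS against
# a local Azurite blob endpoint. Replace the placeholders with real tokens.
base = "http://127.0.0.1:10000/devstoreaccount1/packages-container/Dir/SubDir"
for label, sas in [("sr=b", "<blob-scoped-sas>"), ("sr=d", "<directory-scoped-sas>")]:
    resp = requests.get(f"{base}?{sas}")
    # On Azurite the sr=b request authenticates, while sr=d currently
    # fails with 403 AuthorizationFailure; both work on real Azure.
    print(label, resp.status_code)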

How to write sqlcmd results directly to Azure Storage using Azure PowerShell?

Current story:
We are moving our overall BI solution fully to Azure cloud services: building a new Azure DW and loading data from an Azure DB. Currently, Azure DW doesn't support linked servers or elastic query (these are only supported in Azure DB). Due to price, we cannot use Data Factory or an instance of SSIS. We can't use bcp, as we don't have a local directory to hold the file between loads.
Is it possible to use Azure PowerShell with sqlcmd to write the results of a query directly to Azure Storage, without writing to a file in a local directory in between?
Are there other options that aren't mentioned above?
Thank you for any input.
The current Azure Storage PowerShell cmdlet (Set-AzureStorageBlobContent) only supports uploading a blob from a local file.
The Azure Storage Client Library (https://github.com/Azure/azure-storage-net) supports uploading a blob from a stream; could you try developing your own application with it?
If your data is big, you can also try https://github.com/Azure/azure-storage-net-data-movement/, which has better performance for uploading large blobs.
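The suggestions above are .NET libraries; to illustrate the same upload-from-stream idea in Python, a rough sketch using pyodbc and azure-storage-blob might look like this (connection strings, container, and query are placeholders; the result set is buffered in memory, so this suits modest extracts rather than huge ones):

import csv, io
import pyodbc
from azure.storage.blob import BlobClient

# Sketch: run a query and upload the results to Blob storage as CSV
# without touching a local file. All connection details are placeholders.
conn = pyodbc.connect("<azure-sql-odbc-connection-string>")
cursor = conn.cursor()
cursor.execute("SELECT * FROM dbo.SourceTable")

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow([col[0] for col in cursor.description])  # header row
writer.writerows(cursor.fetchall())

blob = BlobClient.from_connection_string(
    "<storage-connection-string>",
    container_name="staging", blob_name="extract.csv")
blob.upload_blob(buf.getvalue().encode("utf-8"), overwrite=True)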

Importing database from an sFTP server in Windows Azure

I'm building a website that will surface data from a third-party system. The third party will provide a copy of all the data I need as a SQL restore file (*.bak) inside a rar file on their SFTP server. The data changes every day, so my application will need to connect to the SFTP site, get the file, extract it, then restore it into my database server every night. I'm fairly comfortable scripting this in a standard Windows environment, but the customer would prefer the application to be built on the MS Azure cloud, which doesn't seem to support a common solution to the problem. It's possible we could abandon Azure, but I'd like to know what the best strategy would be for implementing this in Azure, if it's possible.
This depends on whether you are trying to use Azure PaaS (cloud service and SQL Azure) or IaaS (VMs). If you are using VMs on Windows Azure, there is going to be no difference between Windows Azure and your familiar Windows environment - so yes, you can do this on Windows Azure.
This can't really be done in Azure cloud services and SQL Azure (SQL Azure cannot restore a .bak file). But your application doesn't seem to be the kind that would run as a cloud service anyway.
Stick to doing it on VMs, and it will work in the way you are familiar with.
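To make the VM route concrete, here is a rough sketch of the nightly job in Python (paramiko for SFTP, rarfile for extraction; every hostname, path, and credential below is a placeholder, and rarfile needs an unrar backend installed on the VM):

import subprocess
import paramiko
import rarfile  # requires an unrar/bsdtar backend on the machine

# Sketch: fetch the nightly .rar over SFTP, extract the .bak, and
# restore it into the local SQL Server instance via sqlcmd.
transport = paramiko.Transport(("sftp.thirdparty.example", 22))
transport.connect(username="<user>", password="<password>")
sftp = paramiko.SFTPClient.from_transport(transport)
sftp.get("/exports/daily.rar", r"C:\staging\daily.rar")
sftp.close()
transport.close()

rarfile.RarFile(r"C:\staging\daily.rar").extractall(r"C:\staging")

restore_sql = (r"RESTORE DATABASE [ThirdPartyData] "
               r"FROM DISK = N'C:\staging\daily.bak' WITH REPLACE")
subprocess.run(["sqlcmd", "-S", "localhost", "-Q", restore_sql], check=True)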