Error with Azure Synapse DEP-enabled workspaces

I have a Synapse workspace with data exfiltration protection (DEP) enabled. Since PyPI libraries cannot be installed directly in the Spark pool, I used the method described below. Although the installation worked fine, I am still not able to use the REST API.
https://learn.microsoft.com/en-us/azure/synapse-analytics/spark/apache-spark-manage-python-packages#install-wheel-files
Would be great if you could help.
Thanks

This is a known limitation for Synapse workspaces with data exfiltration protection enabled.
Users can provide an environment configuration file to install Python packages from public repositories like PyPI. In data exfiltration protected workspaces, connections to outbound repositories are blocked, so Python libraries installed from public repositories like PyPI are not supported.
As an alternative, users can upload workspace packages or create a private channel within their primary Azure Data Lake Storage account. For more information, visit Package management in Azure Synapse Analytics.
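As a rough sketch of the workspace-package route, assuming the Az.Synapse PowerShell module in your environment exposes the workspace-package cmdlets used below (New-AzSynapseWorkspacePackage, and the -PackageAction/-Package parameters of Update-AzSynapseSparkPool), and with placeholder workspace, pool, and wheel file names:

```powershell
# Sketch only - the cmdlet and parameter names are assumptions about the
# Az.Synapse module; workspace, pool, and wheel file names are placeholders.

# Upload a locally downloaded wheel file as a workspace package
$package = New-AzSynapseWorkspacePackage -WorkspaceName "my-dep-workspace" `
    -Package ".\mylibrary-1.0.0-py3-none-any.whl"

# Assign the uploaded package to the Spark pool so notebooks can import it
Update-AzSynapseSparkPool -WorkspaceName "my-dep-workspace" -Name "mysparkpool" `
    -PackageAction Add -Package $package
```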

Related

Azure Synapse is failing to perform the deployment giving error "type Microsoft.Data.Tools.Schema.Sql.SqlDwDatabaseSchemaProvider is not valid"

I am working on Azure Synapse. I am able to build the Azure Synapse project successfully using the Azure CI pipeline's MS Build task.
But when I try to deploy Azure Synapse using the Azure CD pipeline, I get the following error.
Internal Error. The database platform service with type Microsoft.Data.Tools.Schema.Sql.SqlDwDatabaseSchemaProvider is not valid. You must make sure the service is loaded, or you must provide the full type name of a valid database platform service.
I am deploying the DACPAC using the following task; I hope that is not a concern.
https://github.com/DrJohnT/AzureDevOpsExtensionsForSqlServer/tree/master/extensions/PublishDacPac
This is a weird error, because the same deployment succeeded a couple of days ago.
Please help!
I have resolved the issue. After investigating, I found that it is related to the agent I am using in the Azure DevOps pipeline.
One more thing: I am deploying the SQL database and Azure Synapse using different pipelines, but with the same agent.
The Azure SQL database pipeline deploys successfully with the agent specification vs2017-win2016, while the Azure Synapse pipeline deploys successfully with the agent specification windows-2019.
The gist is that the same agent pool is used, but the agent specification differs between the Azure Synapse and SQL database pipelines, even though both projects are built with VS2019.
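For reference, in a YAML pipeline the agent specification is set through the pool's vmImage. A minimal sketch of pinning the Synapse deployment job to windows-2019 (the surrounding steps are placeholders) looks like this:

```yaml
# Sketch: pin the hosted agent image used by the Synapse DACPAC deployment job
pool:
  vmImage: 'windows-2019'   # agent specification that worked for the Synapse deployment

steps:
  # ... the existing DACPAC publish task goes here unchanged ...
```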

Grab a bash script from a storage account and install it on a Linux VMSS

I have a bash script in an Azure storage account and I want to run that script on an Azure VMSS whenever a new version of the script is available. Is this possible?
Thanks
The Custom Script Extension can be used to run the script on a VMSS.
The Custom Script Extension downloads and executes scripts on Azure virtual machines. This extension is useful for post deployment configuration, software installation, or any other configuration or management tasks. Scripts can be downloaded from Azure storage or GitHub, or provided to the Azure portal at extension run time. The Custom Script Extension integrates with Azure Resource Manager templates, and can be run using the Azure CLI, PowerShell, Azure portal, or the Azure Virtual Machine REST API.
It can be added to a virtual machine scale set using Add-AzVmssExtension.
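A minimal PowerShell sketch, assuming a publicly readable script URL and placeholder resource names (for a private container you would pass the storage account name and key via -ProtectedSetting instead):

```powershell
# Placeholder names - adjust the resource group, scale set, and script URL.
$resourceGroup = "my-rg"
$vmssName      = "my-vmss"

# Custom Script Extension settings: where to download the script from and the
# command to run after it is downloaded.
$settings = @{
    fileUris         = @("https://mystorageaccount.blob.core.windows.net/scripts/install.sh")
    commandToExecute = "bash install.sh"
}

# Fetch the scale set model, add the extension, and push the updated model.
$vmss = Get-AzVmss -ResourceGroupName $resourceGroup -VMScaleSetName $vmssName

$vmss = Add-AzVmssExtension -VirtualMachineScaleSet $vmss `
    -Name "CustomScript" `
    -Publisher "Microsoft.Azure.Extensions" `
    -Type "CustomScript" `
    -TypeHandlerVersion "2.1" `
    -Setting $settings

Update-AzVmss -ResourceGroupName $resourceGroup -VMScaleSetName $vmssName -VirtualMachineScaleSet $vmss
```

Note that the extension runs the script when it is added or when its settings change; to rerun it after uploading a new version of the script, you typically change something in the settings (or use a force-update tag) and update the scale set again.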

How do I import one workspace into another in Synapse Analytics

How do I import one workspace into another workspace in Synapse Analytics? For example, I would like to import dev synapse into qa synapse.
It should behave like Azure Data Factory, where you can import and export an ARM template.
Unfortunately, you cannot transfer an entire Azure Synapse Analytics workspace to another Azure Synapse Analytics workspace/subscription.
I would suggest voting up these ideas submitted by other Azure customers:
https://feedback.azure.com/forums/307516-azure-synapse-analytics/suggestions/36256231-enable-support-for-cross-subscription-restore
https://feedback.azure.com/forums/307516-azure-synapse-analytics/suggestions/40528870-connectivity-to-code-repositories-similar-to-data
All of the feedback you share in these forums will be monitored and reviewed by the Microsoft engineering teams responsible for building Azure.
SQL scripts and notebooks can be imported/exported from Azure Synapse Studio.
For pipelines and linked services, you can download the support files, copy the JSON definition, and create a new pipeline in the target workspace from that JSON.
You can also use the Azure PowerShell Az.Synapse module to export and import artifacts, as sketched below.
For more details, refer to Az.Synapse.
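As a rough sketch of the PowerShell route, assuming the Az.Synapse module is installed, that Export-AzSynapseNotebook writes .ipynb files, and using placeholder workspace names (dev-synapse / qa-synapse):

```powershell
# Sketch with placeholder workspace names; requires the Az.Synapse module.

# Export every notebook from the dev workspace to a local folder
Get-AzSynapseNotebook -WorkspaceName "dev-synapse" | ForEach-Object {
    Export-AzSynapseNotebook -WorkspaceName "dev-synapse" -Name $_.Name -OutputFolder ".\notebooks"
}

# Recreate the notebooks in the QA workspace from the exported files
Get-ChildItem ".\notebooks" -Filter *.ipynb | ForEach-Object {
    Set-AzSynapseNotebook -WorkspaceName "qa-synapse" -Name $_.BaseName -DefinitionFile $_.FullName
}
```

Similar Get-/Set- cmdlet pairs exist for other artifacts such as pipelines and linked services (for example Get-AzSynapsePipeline / Set-AzSynapsePipeline), so the same export-then-recreate pattern can be applied to those as well.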

How can I use NiFi to ingest data from/to ADLS

I would like to use NiFi to connect to ADLS. My scenario is this: NiFi is installed and running on a Windows machine, and I want to move data from a local Windows directory to ADLS. I am not using any Hadoop components for now. From ADLS, I then want to move that data to SQL Server, which is also in Azure.
How can I connect NiFi running on Windows to ADLS? All the instructions I have found involve configuring core-site.xml and copying JARs into a NiFi-specific folder. Since I don't have Hadoop running (and therefore no core-site.xml file), how can I connect NiFi to ADLS?
Can anyone please share pointers on how this can be done?
Thanks in advance.
You can try the ExecuteGroovyScript processor together with the native Azure Data Lake Store library to work with ADLS.
Here is a Java example:
https://github.com/Azure-Samples/data-lake-store-java-upload-download-get-started/blob/master/src/main/java/com/contoso/sample/UploadDownloadApp.java
It can easily be converted to a Groovy script.

Azure Gov Cloud and Azure Functions trigger on Storage

I am having a hard time with Azure Functions on Azure Government. I need to create a C# trigger-based process on Azure Storage. The goal is to automate loading files into Azure SQL DB when a file is dropped into Azure Storage.
Since Azure Functions in Azure Government are not fully comparable to Azure Functions in regular Azure, and not all UIs are the same, I can't deploy the function to trigger on a storage file.
I was able to build the process in regular Azure following the instructions from https://github.com/yorek/AzureFunctionUploadToSQL, but since Azure Government is missing the UI for Azure Functions, I'm having a hard time replicating the process in Azure Government.
Portal UI support is not yet available in Azure Government, but it is coming soon. Additionally, Azure Government currently supports the "App Service plan" (the "Consumption plan" is coming soon).
In the meantime, you can do everything you need. First, provision your Azure Function in Azure Gov via the Azure CLI by following this Quickstart example for Functions on Azure Gov. That same link also shows you how you can use Visual Studio to set up your triggers (in your case, a Blob trigger).
Once complete, deploy your Function to Azure Gov with Visual Studio.
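If you prefer Azure PowerShell over the CLI for the provisioning step, a rough equivalent would be the sketch below; resource names and the region are placeholders, and the app is attached to an App Service plan since the Consumption plan may not be available:

```powershell
# Sketch with placeholder names; requires the Az.Accounts, Az.Resources,
# Az.Storage, and Az.Functions modules.

# Sign in to the Azure Government cloud rather than the public cloud
Connect-AzAccount -Environment AzureUSGovernment

$rg       = "my-functions-rg"
$location = "usgovvirginia"

New-AzResourceGroup -Name $rg -Location $location

# Storage account used by the Function App (and the source of the blob trigger)
New-AzStorageAccount -ResourceGroupName $rg -Name "myfuncstorage01" `
    -Location $location -SkuName Standard_LRS

# App Service plan, since the Consumption plan may not be available in the region
New-AzFunctionAppPlan -ResourceGroupName $rg -Name "my-functions-plan" `
    -Location $location -Sku B1 -WorkerType Windows

# The Function App that will host the C# blob-triggered function
New-AzFunctionApp -ResourceGroupName $rg -Name "my-blob-to-sql-func" `
    -PlanName "my-functions-plan" -StorageAccountName "myfuncstorage01" `
    -Runtime DotNet
```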