How does data from Synapse SQL DW get transferred to Power BI?

I am importing data from SQL DW to Power BI using SQL Server authentication credentials.
I read in this Microsoft doc that VNets can be used as data gateways for various Power BI data sources. Can this be applied here? Will the transfer of data from Synapse SQL DW to the Power BI service always happen over the public internet, or can it also happen through VNets?
I am new to these services, so my question could be silly!

Yes, you can connect over the public internet as well as from a private VNet (via a data gateway).
Virtual network data gateways allow import or DirectQuery datasets to connect to data services within an Azure VNet without the need for an on-premises data gateway.
As per the doc you are following, VNet data gateways support connectivity to the following Azure data services:
1. Azure SQL
2. Azure Synapse Analytics
3. Azure Data Explorer (Kusto)
4. Azure Table Storage
5. Azure Blob Storage
6. Azure HDInsight (Spark)
7. Azure Data Lake (Gen2)
8. Cosmos DB
Note: The virtual network (VNet) data gateway is still in preview. It is a premium-only feature and will be available only in Power BI Premium workspaces and Premium Per User (PPU) during the public preview. However, licensing requirements might change when VNet data gateways become generally available.
Reference
Create virtual network data gateways

Related

How to allow SQL requests from clients on Azure Data Lake

I use Azure Data Lake Gen2, I transform data with Databricks, and I have Delta tables which are sent to Power BI. But clients also need to be able to query my tables in SQL.
What is the best practice? Is it possible with Databricks, or do I have to use something else?
Thank you in advance for helping me!
With a premium workspace, you can let user credentials pass through to the storage account from within Azure Databricks.
Go to Compute --> Cluster --> Advanced options, and you'll see a checkbox Enable credential passthrough for user-level data access.
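Once passthrough is enabled, one common pattern is to register the Delta files as a table in the Databricks metastore so clients can query them with plain SQL. A minimal Spark SQL sketch, assuming passthrough is enabled on the cluster; the storage path and table name below are hypothetical:

-- Hypothetical container/path; assumes credential passthrough is enabled on the cluster
CREATE TABLE IF NOT EXISTS sales_gold
USING DELTA
LOCATION 'abfss://gold@mydatalake.dfs.core.windows.net/sales';

-- Clients attached to the cluster can then query it directly in SQL
SELECT region, SUM(amount) AS total_sales
FROM sales_gold
GROUP BY region;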

TIBCO Spotfire, TIBCO Analyst / TDV connectivity to Azure Data Lake

TIBCO Spotfire is good for dashboards, but I can't see any Azure data-source adapters in TDV. Is there any way to seamlessly connect Azure with Spotfire for real-time dashboards, perhaps without Synapse?
You can refer to this tutorial: Visualize Azure Data Lake Storage Data in TIBCO Spotfire through ADO.NET:
This article walks you through using the CData ADO.NET Provider for
Azure Data Lake Storage in TIBCO Spotfire. You will establish a
connection and create a simple dashboard.
It walks through an example and may give you some guidance.

SQL Server 2019 - Connecting to Databricks Cluster using SSMS or Azure Data Studio

Has anyone used SSMS v18.2 or Azure Data Studio to connect to a Databricks cluster and query Databricks tables and/or the Databricks File System (DBFS)?
I would like to know how to set this up so that a Databricks server shows up in the connections list, and how to use PolyBase to connect to DBFS.
I can connect to ADLS using PolyBase commands like the following:
-- Scoped credential
CREATE DATABASE SCOPED CREDENTIAL myScopedCredential
WITH
    IDENTITY = '<MyId>#https://login.microsoftonline.com/<Id2>/oauth2/token',
    SECRET = '<MySecret>';

-- External data source
CREATE EXTERNAL DATA SOURCE myDataSource
WITH
(
    TYPE = HADOOP,
    LOCATION = 'adl://mydatalakeserver.azuredatalakestore.net',
    CREDENTIAL = myScopedCredential
);

-- Something similar to set up for DBFS?
-- What IDENTITY is used for the scoped credential?
To my knowledge, Azure Databricks cannot be connected to from SQL Server 2019 using SSMS or Azure Data Studio.
For a complete list of data sources that can be used with Azure Databricks, see Data sources for Azure Databricks.
The Spark connector for Microsoft SQL Server and Azure SQL Database enables Microsoft SQL Server and Azure SQL Database to act as input data sources and output data sinks for Spark jobs. It allows you to use real-time transactional data in big data analytics and persist results for ad hoc queries or reporting.
For more details, refer to "Connecting to Microsoft SQL Server and Azure SQL Database with Spark connector".
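Note that this connection runs in the opposite direction to what was asked: Databricks reads from SQL Server, not SSMS from Databricks. As a sketch of that direction, Spark's built-in JDBC data source (used here instead of the dedicated Spark connector mentioned above) can expose a SQL Server table inside Databricks from plain SQL; the server, database, table, and user names below are hypothetical:

-- Hypothetical server/database/table; uses Spark's generic JDBC source,
-- not the dedicated SQL Server Spark connector
CREATE TEMPORARY VIEW sqlserver_orders
USING org.apache.spark.sql.jdbc
OPTIONS (
  url "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb",
  dbtable "dbo.Orders",
  user "myuser",
  password "<MySecret>"
);

SELECT COUNT(*) FROM sqlserver_orders;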
Hope this helps.
This doesn't seem possible without the use of third-party tools or custom applications. Databricks SQL just doesn't expose the protocols necessary.
There are third-party tools (e.g., from CData) that can help you here. See this article: https://www.cdata.com/kb/tech/databricks-odbc-linked-server.rst
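For illustration, once a Databricks ODBC DSN is configured (as the article above describes), a linked server can be created over it and queried with OPENQUERY. A minimal T-SQL sketch; the DSN, linked server, and table names are hypothetical:

-- Hypothetical DSN and names; assumes a system ODBC DSN for Databricks already exists
EXEC master.dbo.sp_addlinkedserver
    @server = N'DATABRICKS',
    @srvproduct = N'',
    @provider = N'MSDASQL',
    @datasrc = N'Databricks DSN';

-- Pass a query through to Databricks
SELECT *
FROM OPENQUERY(DATABRICKS, 'SELECT * FROM default.diamonds LIMIT 10');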

Data Refresh in Power BI from Azure SQL DB

I have data in an Azure SQL DB. I used import (not DirectQuery) while making reports in Power BI Desktop. When I published them to the Power BI service, I don't see an option to schedule data refresh for the reports.
I have tried installing the on-premises data gateway, but it fails to configure.

SQL Azure Inter Geographic Data-Center Transfer Pricing

We all know that data transfer from Azure to SQL Azure is free, but data access from outside an Azure data center is charged (per gigabyte).
Is data transfer between Azure data centers also free?
i.e.: Azure in the Europe data center and SQL Azure in the Asia data center: is this also free?
References :
SQL Azure Pricing Explained
I don't think it is free.
Per the Microsoft explanation:
Data transfers measured in GB (transmissions to and from the Windows Azure datacenter): Data transfers are charged based on the total amount of data going in and out of the Azure services via the internet in a given billing period. Data transfers within a sub region are free.
It is not free.
Our company has servers and databases in both US North and US South datacenters and we are charged for any SQL Azure bandwidth sent between the two.