Power BI refreshes failing if Synapse is updated by Dynamics - is this to be expected?

We have recently introduced an Azure Synapse workspace into our reporting landscape. Its purpose is to store Dynamics data to be reported on by Power BI. We previously used the Data Export Service (DES) to move data from Dynamics to Azure SQL and reported from there, but Microsoft has deprecated DES and we now use Synapse as the substitute.
I have found that, roughly 50% of the time during working hours, when I amend a Power BI report and the Power Query step re-evaluates itself, the re-evaluation requests fresh data from the source (the Azure Synapse workspace) and fails with 'Operating system error code 12'. The error message, with sensitive text scrubbed, is below.
Googling the error suggests that a read (i.e. a Power BI refresh/re-evaluation) cannot take place against the Synapse workspace while Synapse is being updated by Dynamics at the same time.
Is this correct? I can't believe Microsoft would devise a DES replacement that cannot be read from while it is being updated from the source. The source (Dynamics) is updated throughout the working day, so this would mean that no one can read from Synapse during the working day.
I'm wondering if further configuration is required within Synapse to allow reads.
If you can confirm whether what I'm facing is expected, and/or advise me on how to remedy it, that would be greatly appreciated.
Thanks.
I'm stumped as to what to do. I have verified that when the Power BI report fails its refresh/re-evaluation, the failing entity has indeed just been updated in the Synapse workspace's CSV file, so the explanation above seems to be correct.
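For reference, a common client-side mitigation for this kind of mid-export collision is simply to wait and retry the read. Below is a minimal Python sketch, assuming the workspace's serverless SQL endpoint exposes the exported entities; the server, database, and table names are placeholders, not values from the post above.

```python
import time

import pyodbc

# Placeholders: your workspace's serverless SQL endpoint and database.
CONN_STR = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=myworkspace-ondemand.sql.azuresynapse.net;"
    "Database=dataverse_myorg;"
    "Authentication=ActiveDirectoryInteractive;"
    "Encrypt=yes;"
)

def query_with_retry(sql: str, attempts: int = 3, wait_seconds: int = 30):
    """Run a query, retrying when the read collides with an in-flight
    CSV update from the Dynamics side (the 'error code 12' case)."""
    for attempt in range(1, attempts + 1):
        try:
            with pyodbc.connect(CONN_STR) as conn:
                return conn.cursor().execute(sql).fetchall()
        except pyodbc.Error:
            if attempt == attempts:
                raise
            time.sleep(wait_seconds)  # give the export a moment to finish

# 'account' stands in for whichever entity fails to refresh.
rows = query_with_retry("SELECT TOP 10 * FROM dbo.account;")
```

The same idea applies to the Power Query evaluation itself: since the failure is intermittent, re-running the refresh after the export completes usually succeeds.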

Related

Usage Tracking in Azure Synapse Analytics

Can anyone share a Kusto (KQL) query that I can use in Log Analytics to return some usage-tracking stats?
I am trying to identify which views and tables are used the most. I am also trying to find out who the power users are and which commands/queries are run against the tables.
Any insights would be appreciated.
You can use the functions below to gather the usage statistics:
DiagnosticMetricsExpand()
DiagnosticLogsExpand()
ActivityLogRecordsExpand()
Then create target tables to store the output of those functions and analyse the usage information.
Refer to the Azure documentation for complete details: https://learn.microsoft.com/en-us/azure/data-explorer/ingest-data-no-code?tabs=activity-logs
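If the Synapse diagnostic logs are already flowing into Log Analytics, you can also query them there directly. Here is a minimal Python sketch using the azure-monitor-query package; the table and column names (SynapseSqlPoolExecRequests, Command, LoginName) are assumptions based on resource-specific diagnostic logging, so check which tables actually exist in your workspace and adjust.

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

# Placeholder: the GUID of your Log Analytics workspace.
WORKSPACE_ID = "<log-analytics-workspace-id>"

# Rough "hot object" / "power user" ranking: count statements per login and
# extract the object each command touches.
KQL = """
SynapseSqlPoolExecRequests
| where Command has_any ("SELECT", "INSERT", "UPDATE", "DELETE")
| extend Object = extract(@"(?i)(?:from|into|update)\\s+([\\w\\.\\[\\]]+)", 1, Command)
| summarize Executions = count() by LoginName, Object
| order by Executions desc
"""

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(WORKSPACE_ID, KQL, timespan=timedelta(days=7))
for table in response.tables:  # assumes the query fully succeeded
    for row in table.rows:
        print(row)
```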

Confusion About Azure Synapse Analytics

Can anyone please help me understand what components/services Azure Synapse Analytics includes?
From what I have read on both the Microsoft website and in other reviews, it is the new SQL Data Warehouse; however, it is also said to bring together data ingestion (like Azure Data Factory), data warehousing, and big data analytics (like a data lake).
So what components exactly does Azure Synapse Analytics include when you purchase it?
Thanks.
The Azure Synapse Analytics service currently (as of 6 May 2020) refers to Azure SQL Data Warehouse, more specifically to the "gen2" version of it. Microsoft announced the new name "Azure Synapse Analytics" and the upcoming features for the service at the Ignite 2019 event in November 2019. The new features are currently available only in private preview, but I would assume they will be released in public preview soon. Access to the private preview is already closed to new users, even though some Microsoft material still hints that you could apply.
You can already find information about the new features in the documentation and other material. The confusing part is that you cannot find them in the portal yet if you are not part of the private preview. This makes it really hard for new users to understand what is actually available and what is not.
A good starting point for information on the situation and the features of both versions can be found here:
Blog post Azure SQL Data Warehouse is now Azure Synapse Analytics
SQL DW documentation
Synapse new features documentation
Microsoft has made the release of this update very confusing. I assume they wanted to communicate early, at Ignite 2019, that they had a competitive offering coming. Compared to some other cloud-native data warehousing solutions, the old version of Azure SQL Data Warehouse was clearly behind in many areas, e.g. flexible scalability. The new Synapse Analytics capabilities look good and could bring Microsoft back to the lead in this area.

Power BI web scheduled refresh doesn't work

I connected my PostgreSQL DB to Power BI and then built a dashboard. Now I want the dashboard to refresh automatically once a day. I saw that there is an option to do this in the Power BI web version, so I published the dashboard. In addition, I pinned the report using the live view option. Then I set up the scheduled refresh and added a new row to the DB. After the refresh time passed, nothing happened. I pressed the Refresh button in the Power BI web version and still nothing happened. However, when I went back to the Power BI desktop application and pressed the refresh button, the dashboard did refresh.
What am I doing wrong?
Thanks!
It looks like your PostgreSQL database is installed on-premises, and the Power BI online service, which runs in the cloud, can't connect to your database, which runs in your internal network.
You need to install and configure a Power BI Gateway to allow Power BI to connect to your database.
Your Power BI desktop app is pulling data directly from the source. The online web version requires a data gateway, either personal or on-premises, depending on your business needs. Remember that you can only use live refresh with a single source.
https://powerbi.microsoft.com/en-us/gateway/

Refreshing Power BI SSAS dataset

I'm using Power BI web preview and have SSAS as a data source. When my underlying SSAS tabular data model changes, I don't see my reports or the dataset in Power BI get updated.
Since manually "refreshing" the Power BI data set is not supported as of now, how do I make sure that my Power BI reports are in sync with the latest data?
There's a great blog post on the Analysis Services connector here:
http://blogs.msdn.com/b/powerbi/archive/2015/03/11/power-bi-analysis-services-connector-deep-dive.aspx
It includes the top troubleshooting tips. If you're still having problems, please use the "?" at the top of the Power BI preview UI to ask for support.
HTH,
Lukasz
As of now you can use the Power BI Gateway, which gives you the option to update cubes at scheduled intervals; you can also live connect.
Are you using the Import or DirectQuery method? If you are using Import mode, you can set up a scheduled refresh. If you are using DirectQuery, your data is queried live, so it is always up to date.
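If you need to push a refresh on demand rather than wait for a schedule, the Power BI REST API exposes a refresh endpoint for published datasets. Here is a minimal Python sketch; the workspace ID, dataset ID, and access token are placeholders, and the token must be acquired separately (e.g. via MSAL) with dataset write permissions.

```python
import requests

# Placeholders: workspace (group) ID, dataset ID, and an AAD access token.
GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"
ACCESS_TOKEN = "<aad-access-token>"

# Queue an on-demand refresh of the published dataset.
url = (
    "https://api.powerbi.com/v1.0/myorg/groups/"
    f"{GROUP_ID}/datasets/{DATASET_ID}/refreshes"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()  # HTTP 202 means the refresh was accepted and queued
```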

The Pentaho BI Platform Workflow Issue

I have been working with Pentaho for the last few days. I have been able to set up Pentaho Report Designer to generate a sample report by following the documentation. I then followed this article http://www.robertomarchetto.com/www/how_to_use_pentaho_report_designer_tutorial and managed to export the report to the Pentaho BI Server.
What I don't understand is the Pentaho workflow. What process should I follow? In other words, what is the purpose of exporting the report to the Pentaho BI Server? Why is there a Data Integration tool? Why is there a BI Server when I can generate the report from the Designer tool?
Requirement
All I want to do is retrieve data from the MySQL DB, put it into a data mart, and then generate a report from the data mart. (According to what I have read, creating a data mart is the efficient way.)
How can I get it done?
Pentaho Data Integration can be used to automate this report generation.
In Report Designer you pass a parameter or set of parameters to generate a single report output.
With Data Integration you can generate the reports for different sets of parameters. For example, if reports are generated on a daily basis, you can automate them for the whole month, so there is no need to generate the reports daily and manually (see the sketch below).
And using the Pentaho Business Intelligence Server you can schedule all of these operations.
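To make the "whole month at a go" idea concrete, here is a minimal Python sketch that drives a PDI job from the command line via kitchen.sh, once per day of the month. The paths and the REPORT_DATE parameter name are assumptions; match them to your own .kjb job.

```python
import subprocess
from datetime import date, timedelta

# Placeholders: adjust the paths and parameter name to your own PDI job.
KITCHEN = "/opt/pentaho/data-integration/kitchen.sh"
JOB = "/srv/pentaho/jobs/daily_report.kjb"

# Run the job once per day of January 2024, passing the date as a parameter,
# instead of triggering each daily report manually.
day = date(2024, 1, 1)
while day.month == 1:
    subprocess.run(
        [KITCHEN, f"-file={JOB}", f"-param:REPORT_DATE={day.isoformat()}"],
        check=True,  # stop the batch if any daily run fails
    )
    day += timedelta(days=1)
```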
To generate data/tables (fact tables/dimension tables) in the MySQL DB from different sources such as files or other databases - this is where the Data Integration tool comes into the picture.
To create a schema on top of the fact tables - Mondrian.
To handle users/roles on top of the created cubes - Metadata Editor.
To create simple reports on top of small tables - Report Designer.
For sequential execution (in one go) of DI jobs/transformations, reports, and JavaScript - Design Studio.
thanks to user surya.thanuri # forums.pentaho.com
The Data Integration tool is mostly for ETL; it's a separate tool, and you can ignore it unless you are doing complex analysis of data from multiple dissimilar data sources. You don't need to "export" reports to the Pentaho server; you can write them directly to a directory and then refresh the repository from inside the Pentaho web application. Exporting them is just one workflow technique.
You're going to find that there are about a dozen ways to do any one thing in Pentaho. For instance, I use CDA data sources with my reports instead of placing the SQL code inside the report. Alternatively, you can link up to a Data Integration server to execute Data Integration scripts and view a result set.
Just to answer your data mart question: in general, a data mart should probably be populated by either the Data Integration tool (depending on your situation, I don't exactly recommend this) or database functions/replication streams (recommended).
Just to hazard a guess, it sounds like someone tossed you a project saying: "We need a BI system; here's the database where the data is stored; here are the reports we're already getting. X looked at Pentaho and liked it. You should use that."
The first thing you need to do is understand the shape of the data: volume, tables, interrelations. Figure out what the real questions they want to answer are. Determine whether they need real-time reporting, etc. Just getting the data mart together, if you even need one, can take quite a while. I think you may have jumped the gun on Pentaho itself.
thanks to user flamierd # forums.pentaho.com