How can I preserve old data in Power BI? - api

I'm using a web API to import data into Power BI. After every refresh, the old data is replaced by the new data from the web API, so my question is: how can I store that old data in Power BI?

Power BI will not keep old data unless you have a query source that supports incremental refresh.
https://learn.microsoft.com/en-us/power-bi/admin/service-premium-incremental-refresh
It would be best to use a tool like Azure Functions, Azure Logic Apps or Power Automate to get the data and save it as a file in a folder, then import the data from that folder. Another option would be to move the data into a database table to preserve the history.
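For example, a minimal sketch of the Azure Functions route in Python, assuming the requests and azure-storage-blob packages; the API URL, container name and connection-string setting are placeholders, not part of the original question:

```python
# A timer-triggered Azure Function could run this daily.
import datetime
import os

import requests
from azure.storage.blob import BlobServiceClient

API_URL = "https://example.com/api/report"   # placeholder web API
CONTAINER = "powerbi-history"                # placeholder container


def save_snapshot() -> None:
    """Fetch the current API payload and keep it as a dated blob."""
    payload = requests.get(API_URL, timeout=30).text

    service = BlobServiceClient.from_connection_string(
        os.environ["STORAGE_CONNECTION_STRING"])
    blob_name = f"snapshot-{datetime.date.today():%Y-%m-%d}.json"

    # Each run adds a new dated file (re-runs within a day replace that day's
    # snapshot), so Power BI's Folder/Blob connector sees the full history.
    service.get_blob_client(CONTAINER, blob_name).upload_blob(
        payload, overwrite=True)
```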

Related

Load multiple files using Azure Data Factory or Synapse

I am moving from SSIS to Azure.
We have hundreds of files and MSSQL tables that we want to push into a Gen2 data lake,
using three zones and then SQL Data Lake,
the zones being Raw, Staging & Presentation (change the names as you wish).
What is the best process to automate this as much as possible?
For example: build a table listing the files / folders / tables to bring into the Raw zone,
then have Synapse bring in these objects with either a full or incremental load,
then process them into the next two zones, I guess with more custom code as we progress.
Your requirement can be accomplished using multiple activities in Azure Data Factory.
To migrate SSIS packages, you need to use the SSIS Integration Runtime (IR). ADF supports SSIS integration, which can be configured by creating a new SSIS integration runtime: click Configure SSIS Integration, provide the basic details and create the new runtime.
Refer to this third-party tutorial by SQLShack on moving local SSIS packages to Azure Data Factory.
Now, to copy the data to the different zones, use the Copy activity. You can make as many copies of your data as you require with it. Refer to Copy data between Azure data stores using Azure Data Factory.
ADF also supports incrementally loading data using Change Data Capture (CDC).
Note: Both Azure SQL MI and SQL Server support the Change Data Capture technology.
A tumbling window trigger and the CDC window parameters need to be configured to automate the incremental load. Check this official tutorial.
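Outside of ADF's CDC feature, the same incremental idea can be sketched as a high-watermark query in Python; the connection string, table and column names below are placeholders:

```python
# Read only rows changed since the last successful load (watermark pattern).
import pyodbc

SOURCE_DSN = "Driver={ODBC Driver 18 for SQL Server};Server=srv;Database=db"  # placeholder


def read_changes(last_watermark):
    """Return only the rows modified since the previous successful load."""
    with pyodbc.connect(SOURCE_DSN) as conn:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT * FROM dbo.Orders WHERE ModifiedDate > ?", last_watermark)
        return cursor.fetchall()
```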
The last part:
then process them into the next 2 zones
This you will need to manage programmatically, as there is no feature in ADF that can update the other copies of the data based on CDC. You either need to create a separate CDC process for those zones or handle it in your own logic.
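A minimal sketch of that programmatic part using the azure-mgmt-datafactory SDK; the subscription, resource group, factory and pipeline names are placeholders for pipelines you would build yourself:

```python
# Kick off the downstream-zone pipelines once the Raw load has finished.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "rg-datalake"          # placeholder
FACTORY_NAME = "adf-datalake"           # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Run the Staging and Presentation loads in order.
for pipeline_name in ("PL_Raw_To_Staging", "PL_Staging_To_Presentation"):
    run = client.pipelines.create_run(
        RESOURCE_GROUP, FACTORY_NAME, pipeline_name,
        parameters={"LoadType": "incremental"})
    print(pipeline_name, run.run_id)
```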

Excel into Azure Data Factory into SQL

I read a few threads on this but noticed most are outdated, with Excel becoming an integration in 2020.
I have a few Excel files stored in Dropbox. I would like to automate the extraction of that data into Azure Data Factory, perform some ETL functions with data coming from other sources, and finally push the final, complete table to Azure SQL.
What is the most efficient way of doing this?
Would it be to automate a Logic App to extract the xlsx files into Azure Blob, use Data Factory for the ETL, join with other SQL tables, and finally push the final table to Azure SQL?
Appreciate it!
Before using a Logic App to extract the Excel files, review the known issues and limitations of the Excel connectors.
If you are importing large files with a Logic App, then depending on their size, consider this thread as well - logic apps vs azure functions for large files.
To summarize the approach, the steps are listed below (with a rough sketch after them):
Step 1: Use an Azure Logic App to upload the Excel files from Dropbox to Blob storage.
Step 2: Create a Data Factory pipeline with a Copy data activity.
Step 3: Use the Blob storage service as the source dataset.
Step 4: Create the SQL database with the required schema.
Step 5: Do the schema mapping.
Step 6: Finally, use the SQL database table as the sink.
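As a rough illustration of steps 3-6 in plain Python (not the Copy activity itself), assuming pandas, openpyxl, sqlalchemy and azure-storage-blob are installed; the container, file and table names and connection settings are placeholders:

```python
import io
import os

import pandas as pd
from azure.storage.blob import BlobServiceClient
from sqlalchemy import create_engine

service = BlobServiceClient.from_connection_string(
    os.environ["STORAGE_CONNECTIONION_STRING".replace("ION_", "_")])  # placeholder setting
blob = service.get_blob_client("excel-landing", "sales.xlsx")          # placeholders

# Source dataset: read the workbook straight from Blob storage (step 3).
frame = pd.read_excel(io.BytesIO(blob.download_blob().readall()))

# Schema mapping here is simply "DataFrame columns match table columns" (step 5);
# the Azure SQL table is the sink (step 6).
engine = create_engine(os.environ["AZURE_SQL_CONNECTION_URL"])
frame.to_sql("Sales", engine, if_exists="append", index=False)
```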

Storing Data for Google Data Studio

I have an application that makes reports available over HTTP in either CSV or JSON format. I want this data to be accessible to Google Data Studio. I was considering building a connector to access the data, but the number of rows that can be accessed at any given time is quite small and there is a daily data limit. So I want to build a system that downloads the reports daily and stores them to be accessed by Data Studio. I created a script to load the reports into a Google Cloud SQL instance, but this is quite expensive because of the base cost of running a Cloud SQL machine. Any ideas on how else to deal with a situation like this?
You can use the Firebase Realtime Database.
I used it before to store 1 GB of data and 20k rows.
I have code samples for that.
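A minimal sketch of that approach, assuming the firebase-admin and requests packages, a service-account key file and a Realtime Database URL; the report URL and database path are placeholders:

```python
import datetime

import requests
import firebase_admin
from firebase_admin import credentials, db

cred = credentials.Certificate("service-account.json")                 # placeholder
firebase_admin.initialize_app(
    cred, {"databaseURL": "https://your-project.firebaseio.com"})      # placeholder

# Pull the day's JSON report and store it under a dated key, so history
# accumulates without keeping a Cloud SQL instance running.
report = requests.get("https://example.com/reports/daily.json", timeout=60).json()
db.reference(f"reports/{datetime.date.today():%Y-%m-%d}").set(report)
```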

Change BigQuery Datasource in Google Data Studio

I need to change the data source for the BigQuery connector in a Data Studio dashboard, as I am moving from the QA to the production environment; the schema is the same for both data sources.
Is there any way to achieve this so that I don't have to create the custom metrics again for the new data source?
If the schema is exactly the same, I believe you can just edit the data source, choose reconnect and then select the new data source.
Be warned though, this will change that data source for all reports that use it.

Is Tableau able to access data dynamically?

Usually a Tableau dashboard operates on "static" data that is "attached" to the published dashboard. I wonder if it is possible to make Tableau read data on the fly (when a user interacts with it). By that I mean that the data to be visualized is taken from a database that can be "dynamic". That means, for example, that the data shown by Tableau today and yesterday should not be the same, because the content of the database might change. Alternatively, we might try to retrieve data from an API. For example, Tableau sends a request to an HTTP server, gets a data table in the form of JSON and then visualizes it. Is Tableau able to do that?
Yes, Tableau can connect to live data sources such as any number of database technologies. No, it cannot send HTTP requests for JSON directly. It does have a Web Data Connector feature if you or someone else has built that web service. Here are some tips on when to use live connections versus taking an extract: http://mindmajix.com/use-direct-connection-data-extract-tableau/
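If the data really has to come from an HTTP API, one workaround (not a live connection or a Web Data Connector, just a sketch) is to rebuild a .hyper extract on a schedule with the Tableau Hyper API and point the workbook at the file; the API URL and field names below are placeholders:

```python
# Pull JSON from an API and write it into a Tableau extract.
import requests
from tableauhyperapi import (Connection, CreateMode, HyperProcess, Inserter,
                             SqlType, TableDefinition, TableName, Telemetry)

rows = requests.get("https://example.com/api/metrics", timeout=30).json()

table = TableDefinition(TableName("metrics"), [
    TableDefinition.Column("name", SqlType.text()),
    TableDefinition.Column("value", SqlType.double()),
])

with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
    with Connection(hyper.endpoint, "api_data.hyper",
                    CreateMode.CREATE_AND_REPLACE) as connection:
        connection.catalog.create_table(table)
        with Inserter(connection, table) as inserter:
            inserter.add_rows([(r["name"], r["value"]) for r in rows])
            inserter.execute()
```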