Change BigQuery Datasource in Google Data Studio - google-bigquery

I need to change the data source for the BigQuery connector in a Data Studio dashboard, as I am moving from the QA to the production environment. The schema is the same for both data sources.
Is there any way to achieve this so that I don't have to recreate the custom metrics for the new data source?

If the schema is exactly the same, I believe you can just edit the data source, choose Reconnect, and then select the new data source.
Be warned though, this will change that data source for all reports that use it.

Related

Is there a way to add custom queries to existing datasources in Google Data Studio?

I have a PostgreSQL database that's already connected to Google Data Studio as a data source. Is there a way for me to write SQL-style queries within Google Data Studio and use the results as a data source, rather than adding a new data source, connecting the same database, and then adding a custom query every time? Thanks!

Load multiple files using Azure Data Factory or Synapse

I am moving from SSIS to Azure.
We have hundreds of files and MSSQL tables that we want to push into a Gen2 data lake, using three zones and then a SQL data lake, the zones being Raw, Staging & Presentation (change the names as you wish).
What is the best process to automate this as much as possible? For example, build a table listing the files / folders / tables to bring into the Raw zone, then have Synapse bring these objects in with either a full or incremental load, then process them into the next 2 zones, which I guess means more custom code as we progress.
Your requirement can be accomplished using multiple activities in Azure Data Factory.
To migrate SSIS packages, you need to use the SSIS Integration Runtime (IR). ADF supports SSIS integration, which can be configured by creating a new SSIS Integration Runtime: click Configure SSIS Integration, provide the basic details, and create the runtime.
Refer to this third-party tutorial by SQLShack on moving local SSIS packages to Azure Data Factory.
Now, to copy the data to the different zones, use the Copy activity. You can make as many copies of your data as you need with the Copy activity. Refer to Copy data between Azure data stores using Azure Data Factory.
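If you prefer to set this up in code rather than in the ADF UI, the sketch below uses the azure-mgmt-datafactory Python SDK to define a pipeline with a single Copy activity that lands one SQL table in the Raw zone. The subscription, resource group, factory, dataset, and activity names are placeholders for illustration, and the referenced datasets are assumed to already exist in the factory.

```python
# A minimal sketch, not a complete solution: define a Copy activity that moves one
# SQL table into the Raw zone. Resource group, factory, and dataset names are assumed.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource,
    CopyActivity,
    DatasetReference,
    SqlServerSource,
    ParquetSink,
)

subscription_id = "<subscription-id>"
resource_group = "rg-datalake"      # placeholder
factory_name = "adf-ingestion"      # placeholder

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Copy from a SQL Server dataset into a Parquet dataset that points at the Raw zone.
copy_to_raw = CopyActivity(
    name="CopyCustomersToRaw",
    inputs=[DatasetReference(type="DatasetReference", reference_name="SqlServerCustomers")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="RawZoneCustomersParquet")],
    source=SqlServerSource(),
    sink=ParquetSink(),
)

pipeline = PipelineResource(activities=[copy_to_raw])
adf_client.pipelines.create_or_update(resource_group, factory_name, "LoadRawZone", pipeline)
```

In practice you would generate one such activity (or a parameterized pipeline) per entry in the control table of files/folders/tables described in the question.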
ADF also supports incrementally loading data using Change Data Capture (CDC).
Note: Both Azure SQL MI and SQL Server support the Change Data Capture technology.
A tumbling window trigger and the CDC window parameters need to be configured to automate the incremental load. Check this official tutorial.
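For intuition, this is roughly the query the CDC-based incremental load runs per tumbling window: map the window start/end times to LSNs and read only the changes in between. Below is a hedged sketch using pyodbc, assuming a CDC-enabled SQL Server table whose capture instance is named dbo_Customers and a placeholder connection string; in ADF the window bounds would come from the trigger.

```python
# A minimal sketch of one tumbling-window increment against a CDC-enabled table.
# Connection string and capture instance name (dbo_Customers) are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=SourceDb;"
    "UID=loader;PWD=<secret>"
)

window_start = "2024-01-01T00:00:00"   # in ADF: trigger().outputs.windowStartTime
window_end = "2024-01-01T01:00:00"     # in ADF: trigger().outputs.windowEndTime

sql = """
DECLARE @from_lsn binary(10) = sys.fn_cdc_map_time_to_lsn('smallest greater than or equal', ?);
DECLARE @to_lsn   binary(10) = sys.fn_cdc_map_time_to_lsn('largest less than or equal', ?);
SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_Customers(@from_lsn, @to_lsn, 'all');
"""

cursor = conn.cursor()
cursor.execute(sql, window_start, window_end)
for row in cursor.fetchall():
    # Each row carries __$operation (1=delete, 2=insert, 3/4=update) plus the table columns.
    print(row)
```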
The last part:
then process them into the next 2 zones
This you need to manage programmatically, as there is no feature in ADF that can update the other copies of the data based on CDC. You need to either create a separate CDC process for those zones or handle it in your own logic.

Is sharing a dataset in BigQuery a migration?

We need to migrate the data from the old GCP instance to a new instance (with a new organization node). I am using the "share dataset" option to move the data; it is a very convenient approach. Do you think this is a good way to migrate data, or should we create new tables and then load the data into them?
Thanks in advance!
It depends on what you want to achieve. The shared dataset feature allows others to access the data because you have granted them permission.
However, the data doesn't move and still belongs to the old GCP project. If you remove the project, you remove the data. In addition, it's still the old project that pays for the data storage; the new one pays only for the data processing.
If you plan to shut down the old project, you have to copy the data: automatically with the Data Transfer Service, or by querying it if you want to filter/transform the existing data before storing it in the new project.
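As an illustration of the "copy the data" route, here is a minimal sketch using the google-cloud-bigquery Python client that copies every table of a dataset from the old project into a dataset that already exists in the new project. The project and dataset IDs are placeholders; for cross-region copies or scheduled runs, the Data Transfer Service mentioned above is the managed alternative.

```python
# A minimal sketch: copy all tables of one dataset across projects.
# Project and dataset IDs are placeholders; the target dataset must already exist.
from google.cloud import bigquery

client = bigquery.Client(project="new-project")    # jobs run (and are billed) here

source_dataset = "old-project.analytics"
target_dataset = "new-project.analytics"

for table in client.list_tables(source_dataset):
    source_id = f"{source_dataset}.{table.table_id}"
    target_id = f"{target_dataset}.{table.table_id}"
    job = client.copy_table(source_id, target_id)   # returns a CopyJob
    job.result()                                    # wait for completion
    print(f"Copied {source_id} -> {target_id}")
```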

How can I preserve old data in Power BI?

I'm using a web API to import data into Power BI. After every refresh, the old data is replaced by the new data from the web API, so my question is: how can I keep that old data in Power BI?
Power BI will not store historical data unless you have a query source that supports incremental refresh.
https://learn.microsoft.com/en-us/power-bi/admin/service-premium-incremental-refresh
It would be best to use a tool like Azure Functions, Azure Logic Apps, or Power Automate to get the data and save it as a file to a folder, then import the data from the folder. Another option would be to move the data to a database table to preserve the history.
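For example, a small script like the hedged sketch below (which an Azure Function timer or any scheduler could run) pulls the web API and writes a timestamped snapshot file into a folder; Power BI can then import the whole folder and keep the history. The endpoint URL and folder path are placeholders.

```python
# A minimal sketch: fetch the web API and keep a timestamped snapshot per refresh.
# API_URL and SNAPSHOT_DIR are placeholders for your own endpoint and import folder.
import json
from datetime import datetime, timezone
from pathlib import Path

import requests

API_URL = "https://example.com/api/metrics"     # placeholder endpoint
SNAPSHOT_DIR = Path("snapshots")                # folder Power BI imports from

def save_snapshot() -> Path:
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()

    SNAPSHOT_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    out_file = SNAPSHOT_DIR / f"metrics_{stamp}.json"
    out_file.write_text(json.dumps(response.json()), encoding="utf-8")
    return out_file

if __name__ == "__main__":
    print(f"Wrote {save_snapshot()}")
```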

How to move data between Azure SQL database instances

I have a simple Azure website and Azure SQL database. I have now created a new (empty) Azure SQL database and I want to copy the contents of the old database into the new one. The data is only a few kB in size but it would be a pain to do it manually. What is a quick and easy way to do this, using simple tools like Visual Studio and Azure portal?
So, just to be clear, I want to copy all tables and rows to the new DB.
It turns out you can do this using the Azure portal.
On the original database, choose Export. You need a storage account in the same region as the database for this.
After exporting the database to a .bacpac file, choose New / SQL Database / Import and point it to the .bacpac file. If this is in a different data centre, it will incur data bandwidth charges.
Very simple and this worked great; just a little difficult to find at first.
When I need to do such an operation, I use SQL Server Data Tools. It's a Visual Studio plugin that allows you to copy data and schema, and to migrate from one version to another.
http://blogs.msdn.com/b/ssdt/
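If you would rather script the copy, here is a hedged Python/pyodbc sketch for very small databases like this one: it copies every row of every user table from the old database to the new one, assuming the target schema already exists and ignoring identity columns and foreign-key ordering. The connection strings are placeholders.

```python
# A minimal sketch for a few kB of data: copy all rows of all user tables.
# Assumes identical table/column definitions already exist in the target database.
import pyodbc

SRC = ("DRIVER={ODBC Driver 18 for SQL Server};SERVER=old-server.database.windows.net;"
       "DATABASE=OldDb;UID=admin;PWD=<secret>")
DST = ("DRIVER={ODBC Driver 18 for SQL Server};SERVER=new-server.database.windows.net;"
       "DATABASE=NewDb;UID=admin;PWD=<secret>")

src, dst = pyodbc.connect(SRC), pyodbc.connect(DST)
dst_cur = dst.cursor()
dst_cur.fast_executemany = True

table_cur = src.cursor()
table_cur.execute(
    "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES "
    "WHERE TABLE_TYPE = 'BASE TABLE'"
)
for schema, table in table_cur.fetchall():
    rows = src.cursor().execute(f"SELECT * FROM [{schema}].[{table}]").fetchall()
    if not rows:
        continue
    placeholders = ", ".join("?" * len(rows[0]))
    dst_cur.executemany(
        f"INSERT INTO [{schema}].[{table}] VALUES ({placeholders})",
        [tuple(r) for r in rows],
    )
    dst.commit()
    print(f"Copied {len(rows)} rows into [{schema}].[{table}]")
```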