Extracting and loading data from a REST API with Data Factory automatically whenever there is a new report ID

I am aiming to create a service combining Azure Data Factory and Azure Databricks where one or several reports are extracted from a REST API (Swagger) and stored. I have managed to obtain the report data manually, one ID at a time, using Copy Data against the REST API. However, I'd like to obtain all the reports that share the same name and, whenever there is a new one (a new ID), automatically extract and store it.
I assume this could be made dynamic by querying the IDs for that specific name on a periodic trigger; whenever a new ID appears, the data would be queried using the updated list of IDs. However, I can't get my mind around how to use this list as parameters, and I haven't found a function that could do this.
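In Data Factory terms this maps to a Lookup activity (fetch the current list of IDs) feeding a ForEach activity, where each iteration's item() drives the Copy activity's relative URL and the ForEach items come from an expression like @activity('LookupIds').output.value (the activity name is illustrative). For illustration only, here is a minimal Python sketch of the same diff-and-fetch logic; the /reports endpoints, the name query parameter, and the local state file are assumptions, not the real API:

# Minimal sketch of the polling logic. The endpoints /reports?name=<name>
# (list the IDs for a report name) and /reports/<id> (fetch one report)
# are hypothetical stand-ins for the real Swagger-documented API.
import json
import requests

BASE_URL = "https://example.com/api"   # placeholder for the real API host
STATE_FILE = "seen_ids.json"           # IDs already extracted on earlier runs

def load_seen_ids():
    try:
        with open(STATE_FILE) as f:
            return set(json.load(f))
    except FileNotFoundError:
        return set()

def store(report_id, data):
    # Stand-in for writing to Blob Storage / Data Lake from Databricks.
    with open(f"report_{report_id}.json", "w") as f:
        json.dump(data, f)

def poll(report_name):
    seen = load_seen_ids()
    resp = requests.get(f"{BASE_URL}/reports", params={"name": report_name})
    resp.raise_for_status()
    current_ids = {r["id"] for r in resp.json()}
    for report_id in sorted(current_ids - seen):   # only the new IDs
        data = requests.get(f"{BASE_URL}/reports/{report_id}").json()
        store(report_id, data)
    with open(STATE_FILE, "w") as f:
        json.dump(sorted(current_ids), f)

Run on a schedule (an ADF schedule trigger or a Databricks job), this only ever extracts reports it has not seen before.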
I have also tried Azure Functions, but I am not sure whether it would cover this need, and I gave up because I struggled with the configuration.
Could I get some help please?
Thanks in advance!
Santiago

Related

REST API parameters

I am trying to get data from a REST API using Excel's Power Query or Power BI.
According to the developer portal, a string parameter ID is required. However, I don't want to extract data on only one purchase (in this case); I want to extract a whole list of purchases. Furthermore, I don't know the ID; it's a very long, hard-to-read ID (e.g. 75e5cff4-103f-43d9-8d46-3835d96b49cf).
How can I write code to extract all the data even if I don't know the IDs?
I've tried using wildcards (*), but that doesn't work.
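Wildcards won't help here: the ID is a single parameter value the server looks up, not a pattern it matches. What you want is a collection endpoint that returns all purchases (usually paginated), which the developer portal may document alongside the single-purchase call. A hedged Python sketch of that pattern; the /purchases path and the page/pageSize parameters are assumptions, not the real API:

# Page through a hypothetical /purchases collection endpoint instead of
# requesting one known ID at a time.
import requests

BASE_URL = "https://example.com/api"

def fetch_all_purchases(page_size=100):
    purchases = []
    page = 1
    while True:
        resp = requests.get(f"{BASE_URL}/purchases",
                            params={"page": page, "pageSize": page_size})
        resp.raise_for_status()
        batch = resp.json()
        if not batch:              # empty page: everything has been read
            break
        purchases.extend(batch)
        page += 1
    return purchases

In Power Query the equivalent is typically a List.Generate loop around Web.Contents calls; either way, each purchase in the result carries its own ID, so you never need to know them up front.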

How to Populate REST APIs with Azure Data Factory Pipeline with Multiple Entries

I am trying to establish a way of ingesting data for multiple companies with Azure Data Factory using an HTTP linked service.
A single data source, or a single entry, is ingested using an HTTP link such as
https://duedil.io/v4/company/gb/06999618.json
The above would give me the financials for a single company. If we want to ingest the financials of another company, with company ID 07803944, we would have to replace the above link with
https://duedil.io/v4/company/gb/07803944.json
For obvious reasons it would not be practical to have a copy activity or pipeline for every company we need the financials for.
Therefore, can someone let me know if it's possible to parameterize the linked service to ingest multiple companies in the same copy activity? Alternatively, can I simply add individual company IDs to the linked service?
I tried the following, but it didn't work.
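It is possible: both the linked service and the dataset's relative URL can be parameterized, and a ForEach activity can iterate over a list of company IDs so a single Copy activity serves every company. The relative URL would use an expression along the lines of @concat('v4/company/gb/', item().companyId, '.json'), where companyId is whatever your items array calls the field. For illustration, here is the loop such a pipeline would effectively run, sketched in Python; the hard-coded ID list and the absence of authentication are simplifications:

# The same fan-out a parameterized dataset + ForEach performs: one base URL,
# a per-company relative path built from a list of IDs.
import requests

BASE_URL = "https://duedil.io/v4/company/gb"
COMPANY_IDS = ["06999618", "07803944"]   # in ADF, a pipeline parameter or Lookup output

def ingest_financials(company_ids):
    results = {}
    for company_id in company_ids:
        resp = requests.get(f"{BASE_URL}/{company_id}.json")
        resp.raise_for_status()
        results[company_id] = resp.json()
    return results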

Access Database - Create a table using data from two different sources

I am a new grad and my programming/database skills are very rudimentary. I was tasked with creating a database and ran into this one issue that I can't solve. I have to create a report that shows some testing results. There are two types of tests: custom tests and core tests. All custom tests have a core test attached to them, so I have two test IDs, a Core ID and a Custom ID. The Custom ID always has a Core ID it can be traced back to.
I can't find a way to consolidate both custom and core results in one place and use that as the record source for my report. I tried making a temp table to hold custom and core, but then I can't consolidate the data that overlaps when a custom result has a core ID attached to it as well. Should I look into using VBA? I've tried update, union, and append queries, but I can't reach a solution.
How can I create a table that extracts data from two different sources and removes the duplicates? I've used a union query (tried both UNION and UNION ALL), but it omits data that has both a core and a custom. Some guidance will be greatly appreciated.
This is where I get into trouble: the custom is related to the core, and the data I need to fetch is in the core table. I have a table that links each custom to a specific core, but then how do I tell Access to go to the core table to fetch more details? I'm having trouble expressing that logic.
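One query usually covers this: join the custom results to their core rows, then UNION on the core-only results while excluding cores already represented by a custom, which is the overlap a plain UNION appeared to mishandle. A hedged sketch, run from Python via pyodbc against the Access file; the table and column names (CustomTests, CoreTests, CustomID, CoreID, Result) are placeholders for the real schema:

# Combine custom results (pulling details from the linked core row) with
# core-only results, without duplicating cores that have a custom attached.
import pyodbc

conn = pyodbc.connect(
    r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
    r"DBQ=C:\path\to\database.accdb;"
)

SQL = """
SELECT cu.CustomID AS TestID, co.Result
FROM CustomTests AS cu
INNER JOIN CoreTests AS co ON cu.CoreID = co.CoreID
UNION
SELECT co.CoreID AS TestID, co.Result
FROM CoreTests AS co
WHERE co.CoreID NOT IN (SELECT CoreID FROM CustomTests)
"""

for row in conn.cursor().execute(SQL):
    print(row.TestID, row.Result)

The same SQL can be pasted into an Access query in SQL View and used directly as the report's record source, so no VBA is required.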

SESSION_USER equivalent from BigQuery in Data Studio reports

We are creating dashboards for clients using Data Studio.
Each client should see their own data in the dashboard, based on their login credentials. It is simple to create an authorized view in BigQuery to let certain users see certain rows of an underlying shared table. But how would one then move this into a dashboard that can be shared with every client, yet shows only that individual client's data instead of the data that was visible to the report creator?
So let's say we have a large table with a bunch of columns, one of which, email, contains the user's email. We want the dashboard to show metrics for each user based on this email column.
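On the BigQuery side, the usual building block is a view that filters the shared table with SESSION_USER(), which returns the email of the account executing the query. A hedged sketch; the project, dataset, table, and column names are placeholders:

# Create a view each viewer can query: SESSION_USER() resolves to the email
# of whoever runs the query, so each user sees only their own rows.
from google.cloud import bigquery

client = bigquery.Client()
client.query("""
CREATE OR REPLACE VIEW `my_project.client_views.user_metrics` AS
SELECT *
FROM `my_project.shared.metrics`
WHERE email = SESSION_USER()
""").result()

The remaining step is to make Data Studio run the query as the viewer rather than as the report creator: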
In Data Studio, at the data source schema review step, make sure the USING VIEWER'S CREDENTIALS flag is on. With it on, the query is executed with the viewer's credentials instead of those of the owner who created the report.
After you finish building the visualizations in Data Studio, the final step is to share the report, e.g. with store managers, using the share option, which works much like sharing a Google Doc. You can confidently share it with the whole organization or with an email group such as the store managers, since permissions are already controlled at the data level.
Read more about this topic here.

Filtered one-way synchronization of Azure SQL database

We have a multi-tenant, single-database application where some customers have expressed the desire to get direct access to their own data.
It has been suggested that I look into Azure Data Sync to achieve a setup where each customer gets their own Azure SQL instance, to which we set up a one-way synchronization of their data from the master database.
I managed to find some documentation on this, but once I got around to trying it out in a lab setup, it looks like the ability to filter rows in the sync job has been removed in a later iteration of the Azure Data Sync service.
Am I wrong or is that feature really gone? If so, what would be your suggestions to achieve something similar on Azure?
You cannot filter rows using Azure SQL Data Sync. However, you can build a custom solution based on Sync Framework as explained here.
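For completeness, here is the shape of such a custom solution, sketched in Python with pyodbc rather than the .NET Sync Framework: a scheduled job that copies one tenant's rows from the master database into that tenant's own Azure SQL database. The connection strings, table, and columns are placeholders, and a production job would track changes (e.g. via a rowversion column) instead of the naive full refresh shown:

# One-way, row-filtered copy from the master DB to a single tenant's DB.
import pyodbc

MASTER_CONN = ("DRIVER={ODBC Driver 18 for SQL Server};"
               "SERVER=master-server.database.windows.net;DATABASE=MasterDb;"
               "UID=sync_user;PWD=...")          # placeholder credentials
TENANT_CONN = ("DRIVER={ODBC Driver 18 for SQL Server};"
               "SERVER=tenant-server.database.windows.net;DATABASE=TenantDb;"
               "UID=sync_user;PWD=...")

def sync_tenant(tenant_id):
    src = pyodbc.connect(MASTER_CONN)
    dst = pyodbc.connect(TENANT_CONN)
    rows = src.cursor().execute(
        "SELECT Id, Payload FROM dbo.Orders WHERE TenantId = ?", tenant_id
    ).fetchall()
    cur = dst.cursor()
    cur.execute("DELETE FROM dbo.Orders")         # naive full refresh
    cur.executemany("INSERT INTO dbo.Orders (Id, Payload) VALUES (?, ?)",
                    [(r.Id, r.Payload) for r in rows])
    dst.commit()

An Azure Data Factory copy activity with a per-tenant filtered source query achieves the same one-way flow without custom code.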