Connecting Azure blob storage to PowerApps

I know it is possible to use blob storage as a data source in PowerApps, but is it possible to get data from, e.g., an Excel file that is stored in a blob and show it on a form?

Currently, using an Excel file stored in blob storage as a data source is not possible in PowerApps. You can consider creating a new feature request for this on the PowerApps Ideas board.

Related

How to write to Blob Storage in Azure SQL Server using TSql?

I'm creating a stored procedure which gets executed when a CSV is uploaded to Blob Storage. The file is then processed using T-SQL, and I wish to write the result to a file.
I have been able to read the file and process it using DATA_SOURCE, a database scoped credential, and an external data source. However, I'm stuck on writing the output back to a different blob container. How would I do this?
If it were me, I'd use Azure Data Factory: you can create a pipeline that's triggered when a file is added to a blob container, have it import that file, run a stored procedure, and export the results to a blob.
Alternatively, that could be an Azure Function triggered on changes to a blob container.
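For the reading side the asker describes, a minimal T-SQL sketch (storage account, container, table, and the SAS token are all placeholders, not from the thread):

```sql
-- Credential holding a SAS token for the storage account (placeholder secret)
CREATE DATABASE SCOPED CREDENTIAL BlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<sas-token-without-leading-question-mark>';

-- External data source pointing at the container the CSVs are uploaded to
CREATE EXTERNAL DATA SOURCE CsvBlobSource
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://mystorageaccount.blob.core.windows.net/incoming',
      CREDENTIAL = BlobCredential);

-- Read the uploaded file into a staging table for the stored procedure to process
BULK INSERT dbo.CsvStaging
FROM 'upload.csv'
WITH (DATA_SOURCE = 'CsvBlobSource',
      FORMAT = 'CSV',
      FIRSTROW = 2);  -- skip the header row
```

Writing the result back out is the part plain T-SQL cannot do directly in Azure SQL Database, which is why the answers above reach for Data Factory or an Azure Function for the export step.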

Excel into Azure Data Factory into SQL

I read a few threads on this but noticed most are outdated, since Excel became a supported integration in 2020.
I have a few Excel files stored in Dropbox. I would like to automate the extraction of that data into Azure Data Factory, perform some ETL functions with data coming from other sources, and finally push the final, complete table to Azure SQL.
What is the most efficient way of doing so?
Would it be to automate a Logic App to extract the xlsx files into Azure Blob, use Data Factory for ETL, join with other SQL tables, and finally push the final table to Azure SQL?
Appreciate it!
Before using a Logic App to extract the Excel files, review the known issues and limitations of the Excel connectors.
If you are importing large files using a Logic App, also consider this thread, depending on the size of the files you are importing: logic apps vs azure functions for large files.
To summarize the approach, I have listed the steps below:
Step 1: Use an Azure Logic App to upload the Excel files from Dropbox to blob storage.
Step 2: Create a Data Factory pipeline with a Copy Data activity.
Step 3: Use the blob storage service as the source dataset.
Step 4: Create the SQL database with the required schema.
Step 5: Do the schema mapping.
Step 6: Finally, use the SQL database table as the sink.
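For Step 4, the target schema is whatever the Excel columns map to; a hypothetical T-SQL example (the table and column names are assumptions, not from the thread):

```sql
-- Hypothetical sink table for the Copy Data activity;
-- columns should mirror the Excel layout used in the schema mapping step
CREATE TABLE dbo.ExcelImport (
    Id           INT            NOT NULL,
    CustomerName NVARCHAR(100)  NULL,
    OrderDate    DATE           NULL,
    Amount       DECIMAL(18, 2) NULL
);
```

The schema mapping in Step 5 then pairs each source column from the blob dataset with one of these sink columns.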

How can I preserve old data in Power BI?

I'm using a web API to import data into Power BI. After every refresh, the old data is replaced by the new data from the web API, so my question is: how can I keep that old data in Power BI?
Power BI does not store historical data unless you have a query source that supports incremental refresh:
https://learn.microsoft.com/en-us/power-bi/admin/service-premium-incremental-refresh
It would be best to use a tool like Azure Functions, Azure Logic Apps, or Power Automate to get the data and save it as a file in a folder, then import the data from the folder. Another option is to move the data into a database table to preserve the history.
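For the database-table option, a minimal T-SQL sketch of the append-instead-of-overwrite idea (the table and column names are hypothetical): each refresh inserts the latest API results alongside a snapshot timestamp rather than replacing them.

```sql
-- History table: rows are appended on every run, never replaced
CREATE TABLE dbo.ApiSnapshots (
    SnapshotAt DATETIME2      NOT NULL DEFAULT SYSUTCDATETIME(),
    Metric     NVARCHAR(100)  NOT NULL,
    Value      DECIMAL(18, 4) NOT NULL
);

-- Run this from the scheduled Azure Function / Logic App / Power Automate flow
-- after it lands the current API payload in a staging table (hypothetical name)
INSERT INTO dbo.ApiSnapshots (Metric, Value)
SELECT Metric, Value
FROM dbo.ApiLatest;
```

Power BI then imports the whole history table, so earlier refreshes remain queryable.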

Reading data from Oracle storage cloud to external table

Hi, I have a CSV file in Object Storage in Oracle Cloud. I want to store this data in an external table that is outside the cloud. Can anybody guide me on this?
How can I read the data from the cloud and store it in a table? I am using Oracle Gen 2 cloud.
The DBMS_CLOUD package is the solution for the above issue.
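A minimal PL/SQL sketch using DBMS_CLOUD (the credential name, Object Storage URI, and column list are placeholders, and DBMS_CLOUD is available in Oracle Autonomous Database):

```sql
BEGIN
  -- Credential for Object Storage; the password is an auth token, not the account password
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'OBJ_STORE_CRED',
    username        => 'oracle_cloud_user',
    password        => '<auth-token>');

  -- External table defined over the CSV file in Object Storage
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name      => 'SALES_EXT',
    credential_name => 'OBJ_STORE_CRED',
    file_uri_list   => 'https://objectstorage.us-phoenix-1.oraclecloud.com/n/mynamespace/b/mybucket/o/data.csv',
    format          => json_object('type' value 'csv', 'skipheaders' value '1'),
    column_list     => 'id NUMBER, name VARCHAR2(100), amount NUMBER');
END;
/
```

Once created, `SALES_EXT` can be queried directly, or copied into a regular table with `CREATE TABLE ... AS SELECT`.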

How to move a SharePoint list or Excel file to Azure SQL DW?

I want to copy data from SharePoint to Microsoft Azure SQL DW using Azure Data Factory or an alternative service. Can I do this? Can anyone help me with this?
You can do this by setting up a data pipeline using Azure Data Factory to copy the data to Azure blob storage. Afterwards you can use Azure's fast PolyBase technology to load the data from blob storage into your SQL Data Warehouse instance.
Can I ask how much data you intend to load into the DW? Azure SQL Data Warehouse is intended for at least terabyte-level data, scaling up to petabyte-level compute and storage. I only ask because each SharePoint list or Excel file has a maximum of 2 GB per file.
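The PolyBase load from blob storage mentioned above can be sketched in T-SQL (the storage account, container, paths, and columns are placeholders):

```sql
-- Credential and external data source over the staging blob container
CREATE DATABASE SCOPED CREDENTIAL BlobCred
WITH IDENTITY = 'user', SECRET = '<storage-account-key>';

CREATE EXTERNAL DATA SOURCE BlobStage
WITH (TYPE = HADOOP,
      LOCATION = 'wasbs://staging@mystorageaccount.blob.core.windows.net',
      CREDENTIAL = BlobCred);

-- File format for the exported CSV data
CREATE EXTERNAL FILE FORMAT CsvFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2));

-- External table over the exported SharePoint/Excel data (hypothetical columns)
CREATE EXTERNAL TABLE dbo.SharePointListExt (
    Id    INT,
    Title NVARCHAR(255)
)
WITH (LOCATION = '/sharepoint/',
      DATA_SOURCE = BlobStage,
      FILE_FORMAT = CsvFormat);

-- Load into the warehouse with CREATE TABLE AS SELECT
CREATE TABLE dbo.SharePointList
WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM dbo.SharePointListExt;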