Tableau Incremental Refresh Using SQL Query

I have a question regarding incremental refresh with SQL Query on a Tableau Server.
My plan was the following:
Run the query for data up to yesterday (i.e. 20/7/2021). After this, I will have the full extract up to that date.
The next day (22/7/2021), I will build a flow that does this: each day, it will run the query for the previous day (21/7/2021) and UNION the data with the extract. That way, I will have an incremental extract using the SQL query.
Problem:
For that procedure, I must use the output extract that the flow will produce.
I tried this procedure on my local machine, but Tableau Prep gives me the following error.
What's the best way to approach this problem? Is there a better way?
I also attach the full flow.
Thank you in advance.
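The plan described above (one full load, then a daily UNION of the previous day's rows) can be sketched with SQLite standing in for the real source. The table and column names here are invented for illustration; they are not from the question:

```python
import sqlite3

# Hypothetical names: "sales" stands in for the source table,
# "extract" for the accumulated extract the flow maintains.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales  (day TEXT, amount INTEGER);
    CREATE TABLE extract(day TEXT, amount INTEGER);
    INSERT INTO sales VALUES ('2021-07-19', 10), ('2021-07-20', 20),
                             ('2021-07-21', 30);
""")

# Day 1: full load of everything up to and including yesterday.
con.execute("INSERT INTO extract SELECT * FROM sales WHERE day <= '2021-07-20'")

# Each following day: append only the previous day's rows (the UNION step).
con.execute("INSERT INTO extract SELECT * FROM sales WHERE day = '2021-07-21'")

rows = con.execute("SELECT day, amount FROM extract ORDER BY day").fetchall()
print(rows)  # three rows, one per day
```

The key constraint in Tableau Prep is the one the question hits: the flow's input and output would be the same extract, which is what the daily append requires.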

Related

How to add new row data to a Power BI table on refresh

On our SQL Server, our org has a table that contains the current instance of records. I need to query that table and append the output row(s) to a Power BI data table.
I have researched doing this in Power Automate with the "Add rows to a dataset" step. Unfortunately, I cannot find a way to use the aforementioned SQL query as the payload.
Has anyone else encountered this use case? Is there an alternative way to continuously add rows to a table based on a SQL query?
I would start with this stock template:
https://powerautomate.microsoft.com/en-us/templates/details/ab50f2c1faa44e149265e45f72575a61/add-sql-server-table-rows-to-power-bi-data-set-on-a-recurring-basis/
There are a few ways:
1. Incremental refresh: https://learn.microsoft.com/en-us/power-bi/connect-data/incremental-refresh-overview
2. Duplicate removal: download the whole database, then remove the duplicates.
3. Create a SQL-side VIEW that does the same thing, and use that VIEW on the Power BI side.
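The view-based option can be sketched like this, with SQLite standing in for SQL Server; the table, columns, and "latest row per id" rule are hypothetical stand-ins, since the real schema isn't shown in the question:

```python
import sqlite3

# Illustrative only: "records" and its columns are invented stand-ins
# for the org's SQL Server table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE records(id INTEGER, loaded_at TEXT, value TEXT);
    INSERT INTO records VALUES
        (1, '2021-07-20', 'old'),
        (1, '2021-07-21', 'new'),
        (2, '2021-07-21', 'only');

    -- The view keeps just the latest row per id, so the client
    -- (here, Power BI) can read it directly instead of deduplicating
    -- after downloading everything.
    CREATE VIEW current_records AS
        SELECT id, value FROM records r
        WHERE loaded_at = (SELECT MAX(loaded_at) FROM records
                           WHERE id = r.id);
""")
rows = con.execute("SELECT id, value FROM current_records ORDER BY id").fetchall()
print(rows)
```

Pointing the dataset at the view keeps the dedup logic on the database side, where it is easiest to maintain.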

IBM SPSS How to import a Custom SQL Database Query

I am looking to see whether a custom SQL query (written in SSMS) can be imported into SPSS (Statistical Package for the Social Sciences). I want to build syntax that runs this query to generate my new dataset, which I can then use to continue my scripted analysis. I see the basic capability of querying one table from a SQL Server, but I would like to create a query that joins many tables. I anticipate the query will be fairly complex, with many joins and perhaps data transformations.
Has anybody had experience or a solution to this situation?
I know I could materialize the query as a table that SPSS can connect to, but my data changes daily, so I would need a job in another application to refresh that table before my SPSS syntax pulls it. I would like to eliminate that first step by having a query that grabs the data at the beginning of my syntax.
Ultimately I am looking to build out my SPSS syntax and schedule it in the Production Facility to run daily.
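One way to do this in syntax is SPSS's GET DATA command with an ODBC connection, which accepts an arbitrary SQL query (joins included) as the data source. A minimal sketch, where the DSN, database, tables, and columns are all placeholders:

```spss
GET DATA
  /TYPE=ODBC
  /CONNECT='DSN=MyDsn;Database=MyDb;Trusted_Connection=yes;'
  /SQL='SELECT a.id, a.score, b.label '
       'FROM dbo.tableA a '
       'JOIN dbo.tableB b ON a.id = b.id'.
CACHE.
EXECUTE.
DATASET NAME main.
```

Because this runs inside the syntax itself, it fits a Production Facility job that pulls fresh data at the start of each scheduled run, with no intermediate table refresh required.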

Adding new data to the end of a table using Power Query

I've got a query that pulls data from SQL Server 2012 into Excel using Power Query. However, when I change the date range in my query, I'd like to pull the new data into my table and store it below the previously pulled data without deleting it. So far, all I've found is the refresh button, which reruns my query with the new dates but replaces the old data. Is there a way to accomplish this? I'm using it to build an automated QA testing program that will compare this period to the last. Thank you.
This sounds like an incremental load. If your table doesn't exceed 1.1 million rows, you can use the technique described here: http://www.thebiccountant.com/2016/02/09/how-to-create-a-load-history-or-load-log-in-power-query-or-power-bi/
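The technique at that link is a self-referencing query: the query reads its own previous output back from the workbook, appends the newly queried rows, and removes duplicates. A rough M sketch under that assumption, with the output table name and source details as placeholders:

```m
let
    // Read the query's own previous output from the Excel table it loads to.
    Existing = Excel.CurrentWorkbook(){[Name = "QueryOutput"]}[Content],
    // New rows for the current date range (server/db/table are placeholders).
    New = Sql.Database("myserver", "mydb"){[Schema = "dbo", Item = "MyTable"]}[Data],
    // Append and drop any rows that were already loaded previously.
    Combined = Table.Combine({Existing, New}),
    Deduped = Table.Distinct(Combined)
in
    Deduped
```

Note that both tables must share the same column names and types for Table.Combine to stack them cleanly.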

Pentaho ETL : Database Join vs Table Input

I need to write a database table data to a text file with some transformation.
There are two steps available to retrieve data from a table, namely Table input and Database join. I don't see much difference between them except the "outer join?" option (correct me if I've misunderstood). So which is better to use?
Environment:
Database: Oracle
Pentaho Spoon: 5.3.* (Community Edition)
Thanks in advance.
The Table Input step in PDI is used to read data from your database tables. The query is executed once and returns the result set. Check the wiki.
Database Join works slightly differently: it lets you execute your query based on the data received from the previous step. For every row coming in from the previous step, the parameters in the query are substituted and the query is executed. Check the wiki.
The choice between the two depends on your requirement.
If you need to fetch a data set from a database table, the Table Input step is the best choice.
If you need to run a query in the database for every incoming row, Database Join is the best choice.
Hope it helps :)
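The distinction can be illustrated outside PDI: Table Input behaves like one query returning a whole result set, while Database Join behaves like a parameterized query executed once per incoming row. A small Python/SQLite sketch (all table and column names invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders(customer TEXT, total INTEGER);
    INSERT INTO orders VALUES ('ann', 10), ('ann', 15), ('bob', 7);
""")

# Table Input style: one query, one result set.
all_rows = con.execute(
    "SELECT customer, total FROM orders ORDER BY customer, total"
).fetchall()

# Database Join style: the query runs once per incoming row, with the
# row's value substituted as a parameter.
incoming = ["ann", "bob"]
per_row = [
    con.execute("SELECT SUM(total) FROM orders WHERE customer = ?",
                (c,)).fetchone()[0]
    for c in incoming
]
print(all_rows, per_row)
```

This is also why Database Join can become slow on large inputs: the round trip to the database repeats for every row, whereas Table Input pays it once.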

SSIS/SSRS to SAS

We have a migration task which requires moving ~30 GB of data from SQL Server to a SAS grid. We already have close to 50 reports using SSIS/SSRS, and those need to move entirely to the SAS grid. I am more familiar with SAS and haven't worked with SSIS/SSRS.
How do I go about moving the process to SAS?
I know I can use pass-through in SAS to call the SSIS/SSRS queries from Proc SQL, but since they want everything coded in SAS, I am wondering if there is an easier way.
Is there a tool that can convert the SQL code and write Proc SQL code in SAS?
Thanks!
Park
You should be able to view the queries behind the SSIS/SSRS jobs (run the report, then view the details behind it). Once you have the queries, copy and paste the SQL code into a SAS SQL pass-through statement. The table returned should give the same results as the report. It's then up to you to format the output as desired.
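A SAS SQL pass-through block for a pasted query might look like the following sketch; the DSN, library, table, and columns are placeholders, not taken from the actual reports:

```sas
/* Sketch only: dsn, table, and columns are illustrative placeholders. */
proc sql;
  connect to odbc (dsn='SQLServerDSN');
  create table work.report01 as
    select * from connection to odbc
    ( /* paste the report's T-SQL here, unchanged */
      select OrderID, Total
      from dbo.Orders
      where OrderDate >= '2021-01-01'
    );
  disconnect from odbc;
quit;
```

Because the inner query runs verbatim on SQL Server, the T-SQL lifted from each report usually needs no translation; only the surrounding formatting and output steps are rewritten in SAS.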