Incremental MSBuild - msbuild

New to MSBuild.
I want to create an incremental MSBuild task that does the following:
Take latest by changeset (done)
Create items for all updated/added/deleted files
Create output files based on the input files
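MSBuild handles the third step natively: give a target Inputs and Outputs attributes, and MSBuild skips it (or rebuilds only the out-of-date items) by comparing file timestamps. A minimal sketch, assuming text sources under src\ and one output per input; the item and target names here are hypothetical:

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <ItemGroup>
    <!-- Hypothetical item group: the set of files to process -->
    <SourceFiles Include="src\**\*.txt" />
  </ItemGroup>

  <!-- Inputs/Outputs make the target incremental: it runs only for
       inputs whose corresponding output is missing or older. -->
  <Target Name="GenerateOutputs"
          Inputs="@(SourceFiles)"
          Outputs="@(SourceFiles->'out\%(Filename).out')">
    <MakeDir Directories="out" />
    <Copy SourceFiles="@(SourceFiles)"
          DestinationFiles="@(SourceFiles->'out\%(Filename).out')" />
  </Target>
</Project>

A second run with no changed inputs reports the target as skipped because all outputs are up to date.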

Related

How to reference table name in DBT Model (GitHub)

How do I reference the table name (highlighted cursor) from a .csv file in a dbt model (GitHub)?
My current .yml file only has "models: ...".
The .csv file to be referenced is named orders.csv, uploaded under the tables -> datawarehouse folder.
I think you're referring to a seed, which is a feature where dbt can create a table in your warehouse using a .csv file that is stored alongside the code in your project.
After you add the .csv file to your seeds directory inside your project (or some other directory nested under /seeds/), you run dbt seed in your terminal to create the table from the data in the CSV. From your example, let's say the CSV is called orders.csv and is located at /seeds/tables/datawarehouse/orders.csv.
After that, you can select from the seed in other models by using ref with the seed's filename, so {{ ref('orders') }}.
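For example, a downstream model can select straight from the seed by its filename; the model file and column names here are hypothetical:

-- models/stg_orders.sql (hypothetical model)
select
    order_id,
    order_date
from {{ ref('orders') }}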
If you are using another tool (not dbt seed) to upload the CSV, you need to find the location of the table in your data warehouse, and then add that location as a source, and you will specify the database/schema/table name in the sources.yml file. If you have the table defined as a source, you select from it with {{ source('my_source', 'my_table') }}.
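A sketch of such a sources.yml, with the database/schema/table names as placeholders you would replace with the table's real location:

# sources.yml (names are assumptions)
version: 2
sources:
  - name: my_source
    database: my_database
    schema: my_schema
    tables:
      - name: my_table

A model can then query it with select * from {{ source('my_source', 'my_table') }}.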

Modify my existing SSIS package to perform this specific operation

I have an SSIS package created using SSDT and running as a job on SQL Server 2014.
This SSIS package retrieves an Excel file (.xlsx) from a specific folder and exports its content into a specific table on my SQL Server database. The package runs fine.
My Data Flow is in the following sequence:
Import Excel file from folder
Apply a Conditional Split to split data with today's date
Export the data into the SQL Server table in the database
Here is my problem:
I will now have 4 additional Excel files in that folder, and they will need to be exported into that same SQL Server table.
So what is the best way forward (assuming all of these are possible solutions):
Write 4 additional SSIS packages from scratch?
Use “Save As” on the existing package with a new name (4 times) and modify the file name to be retrieved?
Modify my existing SSIS package to accommodate the additional 4 Excel files?
Any help would be appreciated.
Assuming the 4 Excel files have the same structure and go to the same table, you'll want to use a Foreach Loop over the files in the folder.
SentryOne has a good example of looping through each file in a folder and archiving it. I imagine it can be adapted to your use case; a sketch of the loop setup follows.
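The key pieces (the folder path and variable name below are assumptions): map each enumerated file name to a package variable, then point the Excel connection manager at that variable through a property expression.

Foreach Loop Editor:
  Collection -> Enumerator: Foreach File Enumerator
    Folder: C:\Imports            (assumed path)
    Files:  *.xlsx
    Retrieve file name: Fully qualified
  Variable Mappings: User::CurrentFile -> Index 0

Excel Connection Manager -> Properties -> Expressions:
  ExcelFilePath = @[User::CurrentFile]

Setting DelayValidation = True on the connection manager avoids validation errors before the variable is first populated.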

Pentaho Data Integrator: start and end time of a transformation

I am new to PDI. My project will export data from multiple views in a PostgreSQL database and output multiple files. One requirement is that each generated file include two fields showing the start and end time of the transformation (from querying a view to generating the file). What components/scripts should I use?

Run an initial Liquibase script

This is my 2nd day using Liquibase.
I have a 'backup' or 'repository' with the database that I need to create locally on my PC.
I have looked at the documentation, but I'm really not 100% clear on how to run it.
I've updated the liquibase.properties file to reflect the correct paths, username, and password.
How do I run the update command to generate the tables and test data?
Windows 7
The Liquibase documentation on 'Adding Liquibase to an existing project' is probably the best place to start. Basically, you want to set the properties file so that it refers to the existing 'backup' database, and then run liquibase generateChangeLog.
This will connect to the existing database and generate a file that contains the structure of the existing database, expressed (typically) in an XML file called a changelog. You then create a new properties file that connects to your local database and use liquibase update to apply the changelog to the local database and create the structure there.
Note that this does not typically transfer the data from the existing database to the new database, just the structure - the tables, keys, indexes, etc. If you want to have test data as well, you can either export that data from the existing database or look into crafting the changesets manually. To export the data, a command like this would be used:
java -jar liquibase.jar --changeLogFile="./data/<insert file name>" --diffTypes="data" generateChangeLog
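Once you have a changelog (structure, and optionally data), applying it to the local database is one more command; the file names here are placeholders for whatever you generated:

java -jar liquibase.jar --defaultsFile=local.properties --changeLogFile=changelog.xml update

where local.properties contains the URL, username, and password of the local database.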

Sequential execution of generic SSIS package from multiple files

I have a generic SSIS package that is used to upload multiple files, and there is only one batch file for triggering this package; it expects a call from the scheduler for each different file.
The problem is that all the files arrive in the system at the same time, and ESP triggers the same batch file multiple times, once per file.
I have an Execute SQL Task as the first component in the package, which deletes the data from the table; since the package is called multiple times concurrently, the table loses the data for a single file.
I want to synchronize this behavior: I want the package to run for the next file only once the first file's load has completed.
You could have a schedule table, with a Foreach Loop as the first component called. An OLE DB source would read the schedule table, and the loop would iterate through it to get the list of files; the schedule table would contain a row for each file. You would select which files to upload with something like select * from schedule where updatedatetime < getdate() - .5. Once the first file is uploaded, an Execute SQL Task updates its row in the schedule table with the current updatedatetime. After the first one is done, you execute another SQL task to truncate the table and call the next one in the schedule table.
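A sketch of that schedule table and the statements around it; the object names are assumptions, and getdate() - .5 is datetime arithmetic for half a day, i.e. 12 hours:

-- Hypothetical schedule table: one row per file to load.
CREATE TABLE dbo.schedule (
    filename        NVARCHAR(260) NOT NULL,
    updatedatetime  DATETIME      NULL
);

-- Files still waiting to be loaded (nothing stamped in the last 12 hours).
SELECT filename
FROM dbo.schedule
WHERE updatedatetime IS NULL
   OR updatedatetime < GETDATE() - .5;

-- Run from an Execute SQL Task after a file finishes loading;
-- ? is the parameter mapped from the loop's file-name variable.
UPDATE dbo.schedule
SET updatedatetime = GETDATE()
WHERE filename = ?;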