Liquibase - Test changeset before executing

I have a Jenkins pipeline that executes Liquibase scripts. However, the pipeline often fails because there are errors in the scripts.
I would like to test my scripts locally before running the pipeline, to detect errors (syntax problems, columns that don't exist, etc.) without creating an entry in the databasechangelog table.

One option is to run updateSQL, which displays the SQL that liquibase update would run, without executing it. You can take that SQL and run it in any SQL IDE of your choice to test the syntax.
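For example, a local dry run could look like this (a minimal sketch; the changelog path and connection details are placeholders for your own):

# catch changelog-level problems (unparseable file, missing required attributes) first
liquibase --changeLogFile=db/changelog.xml --url=jdbc:postgresql://localhost:5432/testdb --username=dev --password=dev validate

# write the SQL that update WOULD run to a file, without touching the database
liquibase --changeLogFile=db/changelog.xml --url=jdbc:postgresql://localhost:5432/testdb --username=dev --password=dev updateSQL > preview.sql

Note that validate only catches changelog-level errors; semantic problems such as a column that doesn't exist will only surface when the generated SQL is actually run, which is why reviewing preview.sql in an IDE against a test database helps.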

Related

Any way to validate a Pig script before running it on an HDFS cluster?

I'm new to Pig scripting. The problem I'm facing is the inability to validate how syntactically correct my script is. I have to upload it to the HDFS cluster and run it there just to realize I missed a ';' at the end of a line, which is a big waste of time. I use IntelliJ IDEA with the Pig script plugin, but while it helps highlight Pig statements, it does not validate them. Apache Pig does not seem to have a standalone compiler; you can only run it, and I can't run it locally because the data is not available from my laptop. So I wonder if there is any Pig script syntax validator I can run to check whether my script is syntactically correct before uploading it to the server.
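For what it's worth, the Pig command line itself has a syntax-check mode that parses the script without executing it, which may cover this case (a sketch; script.pig is a placeholder):

# -check (or -c) parses the script and reports syntax errors without running it
pig -check script.pig

# if a small local sample of the data is available, the script can also be run fully in local mode
pig -x local script.pig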

Bat file to run a SQL query on a schedule through Task Scheduler

I am trying to run a .sql script on a schedule. I have created a batch file to run the script. The script runs fine in SQL Server Management Studio and also when I run the batch file contents through cmd.
Contents of the batch file:
sqlcmd -S omfmesql -U OMESRV -P orat -i "\\pvsrv-fsr14\data\Projects\Stat_Table_Creation_unique.sql"
The SQL script is supposed to update a stat table. When I run it through cmd and refresh the stat table, the numbers are updated. But when I run this batch file through Task Scheduler, the only action that seems to be performed is running C:\Windows\SYSTEM32\cmd.exe.
The task is reported as completed successfully, but the SQL query is just not run.
I am not too experienced with Task Scheduler. Any help here would be very much appreciated. Thanks!
Note: I am not intending to use SQL Server Agent
If you have not done so, you need to set the start-in location in Task Scheduler (TS). In at least some versions of TS, this can only be done when you create a basic task, not from the more general "Create Task..." option. Ensure that all the paths in the batch file are absolute or are based in this location, as sketched below.
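One way to make the batch file independent of the scheduler's working directory is to use absolute paths throughout and capture output to a log, e.g. (a sketch; the sqlcmd path is version-specific and the log location is an assumption):

@echo off
rem full path to sqlcmd so the task does not depend on PATH (adjust to your SQL Server version)
"C:\Program Files\Microsoft SQL Server\110\Tools\Binn\SQLCMD.EXE" -S omfmesql -U OMESRV -P orat -i "\\pvsrv-fsr14\data\Projects\Stat_Table_Creation_unique.sql" >> C:\logs\stat_update.log 2>&1

The log file also tells you whether the scheduled run actually reached sqlcmd at all.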

Triggering stored procedure from script task in SSIS - login failed error

I have a 2008 SSIS package with a lot of steps; one of the steps is a script task that runs a stored procedure. Replacing the script task is not an option for now, as it would require a lot of extra development. The procedure runs fine in the production environment, but I need to be able to test it from VS.
The reason for the failure, I have found, is that the connection to the target database is not working; it gives the error "Login failed for user 'user'".
The SSIS package is built on an SSIS framework, and it has an "Initialize Connections" task as an event in the "Initialize" part of the package. I do not know much about what this does. I want to just run the Data Flow (with the "Execute Task" option inside the Data Flow).
For the package's ProtectionLevel I have tried both DontSaveSensitive and EncryptSensitiveWithPassword, without any luck. I manually insert the password in the Connection Manager.
I have read a bit about the need to use a connection manager/XML file, but I'm unsure of how this can be done (I have not done much SSIS before).
Please explain how this debugging in VS should be done.
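Since DontSaveSensitive strips the password from the saved package, one common 2008-era approach is an XML package configuration that supplies the whole connection string when the package loads. A minimal sketch of such a .dtsConfig file (the connection manager name "TargetDB" and all connection values are placeholders):

<?xml version="1.0"?>
<DTSConfiguration>
  <Configuration ConfiguredType="Property"
                 Path="\Package.Connections[TargetDB].Properties[ConnectionString]"
                 ValueType="String">
    <!-- full connection string, including the password, kept outside the .dtsx -->
    <ConfiguredValue>Data Source=myServer;Initial Catalog=myDb;User ID=user;Password=secret;Provider=SQLNCLI10.1;</ConfiguredValue>
  </Configuration>
</DTSConfiguration>

Enable it via the SSIS > Package Configurations... menu in Visual Studio and point it at this file; the password is then loaded at debug time instead of being persisted in the package.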

Scheduling a pentaho job in SQL server agent

I have built out a simple FTP job in Pentaho that places a file in a local directory. I need to be able to call this job from a SQL Server Agent job, which I can then schedule and use; but when I set the Agent job up, it runs through the steps successfully yet does not produce anything to show that it was in fact successful.
I am pretty confident the Pentaho job itself is fine, because it can be run through the UI, the command line, and a .bat file. Everything works as expected except when I try to make this SQL Server Agent job, and I have no idea why!
Here is the only step in the job: [screenshot not included]. When I use this, I get no errors, but nothing actually happens. If I try to enclose it in quotes, I get an error.
Any help would be appreciated.
Figured it out!
Apparently, only the first line of the command was executing, so it was navigating to a different directory but never running any commands there. I remedied this by putting everything on one line, joined with &&.
Command line used: cd c:\pentaho\data-integration && kitchen.bat /file:C:\pentaho\Jobs\BW\FTP_BW_TRN.kjb /level:Basic
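If the Agent step type is CmdExec, the same one-liner can be written slightly more defensively (a sketch; /d only matters if the Agent's default directory is on another drive):

rem /d lets cd change drive as well as directory; && only runs kitchen.bat if the cd succeeded
cmd /c "cd /d C:\pentaho\data-integration && kitchen.bat /file:C:\pentaho\Jobs\BW\FTP_BW_TRN.kjb /level:Basic"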

Execute multiple sql scripts in order using web deploy

I am using an SSDT project to keep my current schema.
I have some scripts that I need to run during deployment of the SQL database.
How can I set up the scripts to execute in order, as in the picture below?
Currently my deploy fails when I have added the publish-data script into post-deploy, which means that steps 1 and 2 did not execute as I hoped.
See the order I need to execute those in below: [screenshot not included]
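In SSDT, the usual way to impose an order on deployment scripts is the single post-deployment script, which pulls in the other scripts with SQLCMD :r includes in the order they should run. A minimal sketch (the file names are placeholders):

/* Script.PostDeployment.sql - runs after the schema deploy; :r includes execute top to bottom */
:r .\Scripts\01_SeedLookupData.sql
:r .\Scripts\02_PublishData.sql
:r .\Scripts\03_GrantPermissions.sql

Each included script should have its Build Action set to None so that only the post-deployment script executes it.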