Liquibase Gradle Plugin -> updateSQL command

I have set up a Gradle project that uses the Liquibase Gradle Plugin.
I am trying to use the functionality described in the Liquibase output documentation.
When I do gradle updateSQL, the task basically outputs every change to the terminal (I tried redirecting the output of the command, as in "gradle updateSQL > changes.sql", but this also includes stuff that I cannot run later on; besides, it has all the changes and not just the updates).
I am trying to use the command updateCountSql (the description says "Writes SQL to apply the next change sets to STDOUT."). I have tried to pass parameters to this task but I can't make it work (I constantly get the error "The Liquibase updateCountSql command requires a value"). Does anyone know how it works?
I just need to keep track of the changes on the database, and be able to create a script with all the changes.
Thanks in advance.

You can specify a target other than STDOUT using outputFile, e.g.:
liquibase {
    activities {
        main {
            changeLogFile 'src/main/db/changelogs.groovy'
            url 'jdbc:mysql://localhost:3306/my_db'
            username 'myusername'
            password 'mypassword'
            outputFile 'path/to/script.sql'
        }
    }
}
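As for the updateCountSql error you mention: with the Liquibase Gradle Plugin, commands that take a value read it from a project property rather than from a task argument. If I remember correctly the property is called liquibaseCommandValue (this is taken from the plugin's README, so double-check against your plugin version), so something along these lines should write the SQL for just the next change set:
    gradle updateCountSql -PliquibaseCommandValue=1
Combined with outputFile above, that should give you a script containing only the pending change sets rather than the whole changelog.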

Related

How to customize nix package builder script

The root problem is that nix uses autoconf to build libxml2-2.9.14 instead of cmake, and a consequence of this is that the cmake configuration is missing (details like the version number, platform-specific dependencies like ws2_32, etc., which are needed by my project's cmake scripts). libxml2-2.9.14 already comes with a cmake configuration and it works nicely, except that nix does not use it (I guess they have their own reasons).
Therefore I would like to reuse the libxml2-2.9.14 nix package and override the builder script with my own (which is a trivial cmake dance).
Here is my attempt:
defaultPackage = forAllSystems (system:
  let
    pkgs = nixpkgsFor.${system};
    cmakeLibxml = pkgs.libxml2.overrideAttrs (o: rec {
      PROJECT_ROOT = builtins.getEnv "PWD";
      builder = "${PROJECT_ROOT}/nix-libxml2-builder.sh";
    });
  in
Where nix-libxml2-builder.sh is my script calling cmake with all the options I need. It fails like this:
last 1 log lines:
> bash: /nix-libxml2-builder.sh: No such file or directory
For full logs, run 'nix log /nix/store/andvld0jy9zxrscxyk96psal631awp01-libxml2-2.9.14.drv'.
As you can see, the issue is that PROJECT_ROOT does not get set (it is ignored) and I do not know how to point the build at my builder script.
What am I doing wrong?
Guessing from the use of defaultPackage in your snippet, you are using flakes. Flakes are evaluated in pure evaluation mode, which means there is no way to influence the build from outside. Hence, getEnv always returns an empty string (unfortunately, this is not properly documented).
There is no need to refer to the builder script via $PWD. The whole flake is copied to the nix store so you can use your files directly. For example:
builder = ./nix-libxml2-builder.sh;
That said, the build will probably still fail, because cmake will not be available in the build environment. You would have to override the nativeBuildInputs attribute to add cmake there.
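A minimal sketch of what that override could look like (the nativeBuildInputs line is my assumption about what your cmake-based builder script will need):
    cmakeLibxml = pkgs.libxml2.overrideAttrs (old: {
      # the flake is copied to the nix store, so a path literal is enough
      builder = ./nix-libxml2-builder.sh;
      # cmake is not in libxml2's default build environment; add it here
      nativeBuildInputs = (old.nativeBuildInputs or [ ]) ++ [ pkgs.cmake ];
    });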

Can Liquibase Autopick new changesets files like Flyway?

I am new to both Liquibase and Flyway. I was trying to do some Hello Worlds, successfully ran basic SQL (create, insert, etc.) using both Liquibase and Flyway, and was interested in running them from the command line.
Flyway:
was kind of easy to start with
I just had to put the SQL file, with the correct naming format 'V1_xxxx.sql', in the correct folder 'flyway/sql' and run 'flyway migrate'
the best part was that it automatically picked up any new SQL file, given the correct file name.
Liquibase:
I had to spend some time to understand and use it
I need to give the correct file name each time:
liquibase --driver=com.mysql.jdbc.Driver --classpath=/path/to/classes --changeLogFile=com/example/db.changelog1.xml --url="jdbc:mysql://localhost/example" --username=dev migrate
liquibase --driver=com.mysql.jdbc.Driver --classpath=/path/to/classes --changeLogFile=com/example/db.changelog2.xml --url="jdbc:mysql://localhost/example" --username=dev migrate
Is there a way in Liquibase to automatically pick up the new XML files? Like Flyway, where I could just give a folder name and Liquibase would use its DATABASECHANGELOG table to find the deltas and execute them.
Second question, for Liquibase only:
On Windows, in order to run the command successfully, I had to change the changeLogFile parameter ... from ...
liquibase --driver=com.mysql.jdbc.Driver --classpath=/path/to/classes --changeLogFile=com/example/db.changelog1.xml --url="jdbc:mysql://localhost/example" --username=dev migrate
to
liquibase --driver=com.mysql.jdbc.Driver --classpath=/path/to/classes --changeLogFile=./db.changelog1.xml --url="jdbc:mysql://localhost/example" --username=dev migrate
i.e. I changed my present working directory to com/example and then modified the changeLogFile param to point to a file in the current folder, and executed the command.
Is there a way I can point to a changeLogFile in another folder (apart from the current folder)?
One thing you can do to make Liquibase a bit easier to use from the command line is to create a file named liquibase.properties and save it in the directory where you are running the command. If I remember correctly, the command line will look for a file with that name and use the properties in it rather than requiring all the options on the command line. See http://www.liquibase.org/documentation/liquibase.properties.html and http://www.liquibase.org/documentation/command_line.html#using_a_liquibase.properties_file for more details. The docs there have this:
If you do not want to always specify options on the command line, you
can create a properties file that contains default values. By default,
Liquibase will look for a file called “liquibase.properties” in the
current working directory, but you can specify an alternate location
with the --defaultsFile flag. If you have specified an option in a
properties file and specify the same option on the command line, the
value on the command line will override the properties file value.
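For example, a liquibase.properties along these lines (the values are just the ones from your own commands) lets you run plain liquibase migrate from that directory:
    driver: com.mysql.jdbc.Driver
    classpath: /path/to/classes
    changeLogFile: com/example/db.changelog1.xml
    url: jdbc:mysql://localhost/example
    username: dev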
Yes, you can have Liquibase automatically load files from a directory. You have to have a simple changelog.xml that is referenced from the command line or your properties file, but that changelog can then just reference another directory that contains more changelog files. The <includeAll> tag is used for this purpose (see http://www.liquibase.org/documentation/includeall.html for more details).
Also, yes, you can put the changelog file wherever you like.
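A minimal master changelog using <includeAll> might look like this (the com/example/changelogs/ path is just an illustration; point it at whatever directory holds your changelog files):
    <?xml version="1.0" encoding="UTF-8"?>
    <databaseChangeLog
        xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.liquibase.org/xml/ns/dbchangelog
            http://www.liquibase.org/xml/ns/dbchangelog/dbchangelog-3.1.xsd">
        <includeAll path="com/example/changelogs/"/>
    </databaseChangeLog>
Liquibase records each change set it has run in the DATABASECHANGELOG table and only executes the ones it has not seen before, which gives you the Flyway-like "drop a new file in the folder" behaviour you are after.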

IntelliJ: Dynamically updated file header

By default, IntelliJ IDEA will insert (something like) the following as the header of a new source file:
/**
* Created by JohnDoe on 2016-04-27.
*/
The corresponding template is:
/**
* Created by ${USER} on ${DATE}.
*/
Is it possible to update this template so that it inserts the last date of modification when the file is changed? For example:
/**
* Created by JohnDoe on 2016-03-27.
* Last modified by JaneDoe on 2016-04-27
*/
It is not supported out of the box. I suggest you do not include information about the author and the last edit/creation time in the file at all.
The reason is that your version control system (Git, SVN) already contains the same information. Manual labelling is just a duplicate of existing info, only more error prone and in need of manual updates.
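For example, the information such a header would duplicate is one command away in Git (the file path here is just an illustration):
    git log -1 --format='%an %ad' -- src/java/test/Test.java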
Here's a working solution similar to what I'm using. Tested on macOS.
Create a bash script which will replace the first occurrence of Last modified by JaneDoe on $DATE, but only if the exact value is not already contained in the file:
#!/bin/bash
FILE=src/java/test/Test.java
DATE=`date '+%Y-%m-%d'`
PREFIX="Last modified by JaneDoe on "
STRING="$PREFIX.*$"
SUBSTITUTE="$PREFIX$DATE"
if ! grep -q "$SUBSTITUTE" "$FILE"; then
  sed -i '' "1,/$(echo "$STRING")/ s/$(echo "$STRING")/$(echo "$SUBSTITUTE")/" $FILE
fi
Install the File Watchers plugin.
Create a file watcher with an appropriate scope (it may be this single file or any other scope, so that any change in the project's source code will update the modified date, version, etc.) and put the path to your bash script into the Program field.
Now every time the file changes, the date will update. If you want to update the date for each file separately, the argument $FilePath$ should be passed to the script.
This might have been just a comment to #oleg-mikhailov's excellent idea, but the code snippet won't fit. Basically, I just tweaked his solution.
I needed a slightly different syntax, but that's not the issue. The issue was that when the script ran automatically upon file save using the File Watchers plugin, if it ran on a file which doesn't include PREFIX it would run over and over forever.
I presume that the issue is with the plugin itself, as it didn't happen when run from the shell, but I'm not sure why it happened.
Anyway, I ended up running the following script (as I said, only a slight change with respect to the original). The new script also raises an error if the prefix doesn't exist. For me this is a feature, as PyCharm prompts me with the error and I can fix the file.
Tested with PyCharm 2021.2.3 on macOS 11.6.
#!/bin/bash
FILE=$1
DATE=`date '+%Y-%m-%d'`
PREFIX="last_modified_date: "
STRING="$PREFIX.*$"
SUBSTITUTE="$PREFIX$DATE"
if ! grep -q "$SUBSTITUTE" "$FILE"; then
  if grep -q "$PREFIX" "$FILE"; then
    sed -i '' "s/$(echo "$STRING")/$(echo "$SUBSTITUTE")/" $FILE
  else
    echo "Error!"
    echo "'$PREFIX' doesn't appear in $FILE"
    exit 1
  fi
fi
PhpStorm does not have a "hook" for launching a task after detecting a change in a file (only for uploading to a server). Code templating is based on the creation of a file, not on changes to it.
The behaviour you want (automatically changing a file after the file is changed manually) can be useful for a lot of things, but it is a circular headache for an editor: if you change a file, it must change the file (and if a file is changed, should it change the file again?).
However, you can perhaps "enable Live Templates" when you launch a "reformat code" action, which is able to rewrite your header template code and in that way rewrite the modification date.
Another solution is to use a tool such as Grunt, but I don't know if it handles PHP files.

Running Grunt Typescript from MSBuild

I have a typescript project set up that is being built with GruntJS using the typescript plugin. I also have a Visual Studio project that I'd like to be able to invoke the build process from.
My first attempt at doing this was adding an <Exec> task to the BeforeBuild target in Visual Studio, with the <Exec> task configured like this:
<Exec Command="grunt --no-color typescript" />
This runs the build fine, however, when errors are output from Grunt and they populate the Error List in VS, the filename is incorrectly listed as EXEC.
Looking at the Exec Documentation I see that CustomErrorRegularExpression is a parameter to the command, but I can't quite grasp how to use it to solve my problem.
I messed around with it a bit and managed to change the reported filename to my .jsproj file, which is also incorrect. Looking at this post I tried forming my own regex:
<Exec CustomErrorRegularExpression="\.ts\([0-9]+,[0-9]+\):(.*)" Command="grunt --no-color typescript" IgnoreExitCode="true" />
Does anyone have any experience using this command with this parameter to achieve this sort of thing? I think part of the problem may be that grunt is printing out errors on two lines.
You are correct about the Exec task only processing single line messages. In addition it also uses Regex.IsMatch to evaluate the error/warning condition, without making use of the pattern's capture groups.
I was not able to find a way to work around this via MSBuild, but the changes to correct the problem were easy to make directly in the grunt task.
I'm using the grunt-typescript task from: https://www.npmjs.org/package/grunt-typescript.
There were 3 trivial changes necessary to make this work.
1) Replace the output utility methods near the top of tasks/typescript.js:
/* Remove the >> markers and extra spacing from the output */
function writeError(str) {
    console.log(str.trim().red);
}

function writeInfo(str) {
    console.log(str.trim().cyan);
}
2) Replace Compiler.prototype.addDiagnostic to write the file and error data on the same line:
Compiler.prototype.addDiagnostic = function (diagnostic) {
    var diagnosticInfo = diagnostic.info();
    if (diagnosticInfo.category === 1)
        this.hasErrors = true;
    var message = " ";
    if (diagnostic.fileName()) {
        message = diagnostic.fileName() +
            "(" + (diagnostic.line() + 1) + "," + (diagnostic.character() + 1) + "): ";
    }
    this.ioHost.stderr.Write(message + diagnostic.message());
};
Once these changes are made, you no longer need the CustomErrorRegularExpression to be set on your Exec task, and your build output should display the error text including the correct source file with line and column information.

Bamboo with tSQLt - Failed to parse test result file

First of all I should point out I'm new to Atlassian's Bamboo and continuous integration in general. This is the first project where I've used either.
I've created a raft of unit tests using the tSQLt framework. I've also configured Bamboo to:
Get a fresh copy of the repository from BitBucket
Drop & re-create the build DB
Use Red-Gate SQL Compare to deploy the DB objects from source to the build DB
Run the tSQLt tests
Output the results of the tests in XML format to a file called TestResults.xml
I've checked and can confirm that the TestResults.xml file is created.
In Bamboo I then added a JUnit Parser task to consume the contents of this TestResults.xml file. However when that task runs it returns this error:
Failed to parse test result file
At first I thought it might have meant that Bamboo could not find the file. I changed the task that created the results file to output a file called TestResults2.xml. When I did that the JUnit Parser returned this error:
Failing task since test cases were expected but none were found.
So I'm assuming that the first error message means Bamboo is finding the file; it just can't parse it.
I have no idea where to start working out what exactly is the problem. Has anyone got any ideas?
I had a similar problem, but it turned out to be weird behavior from Bamboo, which needs the file timestamps to be modified before it picks up the JUnit file.
In a Windows environment you just need to add a "script task" before the "JUnit task":
powershell (ls *.xml).LastWriteTime = Get-Date
Reference
https://jira.atlassian.com/browse/BAM-12768
I have had several cases of this and was able to fix it by removing single quotes and greater than / less than characters from test names inside the *.rb file.
Example
test "make sure 'go_to_world' is removed from header and length < 23"
change to remove single quotes and < symbol
test "make sure go_to_world is removed from header and length less than 23"
Very common are contractions ("won't", "don't", "shouldn't") or possessives ("the vessel's data"), and also < or > characters.
I think there is a bug in the parser that just doesn't escape those characters in a test title appropriately.
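That would be consistent with how JUnit XML works: the test name ends up in an attribute value, and a raw < there is simply invalid XML, so a strict parser can reject the whole file. Illustrative only (not Bamboo's actual output):
    <!-- invalid: raw < inside the attribute value -->
    <testcase name="make sure length < 23"/>
    <!-- valid: the same name with < escaped -->
    <testcase name="make sure length &lt; 23"/>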