I'm having an issue trying to roll back a changeSet by referring to a changeSet in a sibling changelog.
master-changelog.xml
includes v1.changes.xml (this is where the table is created)
includes v2.changes.xml (this is where the table is dropped, and I would like to refer to a changeSet from v1.changes.xml as the rollback)
However, no matter how I reference the changeSet in v1.changes.xml, it's not visible to v2.changes.xml and I'm getting liquibase.exception.SetupException: liquibase.parser.core.ParsedNodeException: Change set not found.
master-changelog.xml
<include file="v1/v1.changes.xml" relativeToChangelogFile="true"/>
<include file="v2/v2.changes.xml" relativeToChangelogFile="true"/>
v1.changes.xml
<changeSet id="1" author="dima">
<createTable tableName="test-table">
<column name="test" type="number"></column>
</createTable>
</changeSet>
v2.changes.xml
<changeSet id="1" author="dima">
<dropTable tableName="test"/>
<rollback changeSetAuthor="dima" changeSetId="1" changeSetPath="src/main/resources/std/v1/v1.changes.xml"/>
</changeSet>
It appears that you're using Maven, so this answer will be Maven-specific, as I have not been able to reproduce the solution on the command line.
First of all, it's possible that Liquibase is confused because you're using the same changeSet id in both files, and I'm not sure whether it correctly scopes those ids to the changelog file or whether the ids need to be global. You might first try changing the id on the second changeSet and see if that clears it up for you.
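For example, a sketch of the second file with a distinct id (everything else as in your original):
<changeSet id="2" author="dima">
    <dropTable tableName="test"/>
    <rollback changeSetAuthor="dima" changeSetId="1" changeSetPath="src/main/resources/std/v1/v1.changes.xml"/>
</changeSet>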
If that's not the issue, then the trick to getting this to work is to make sure your relative references are all in the context of the Java classpath. As I interpret your example, the classpath resources of your files would be:
std/master-changelog.xml
std/v1/v1.changes.xml
std/v2/v2.changes.xml
When running your migration, your changeLogFile setting should reference the classpath resource, not the disk file; i.e. std/master-changelog.xml instead of src/main/resources/std/master-changelog.xml. This puts the origin changelog in a classpath context rather than a file context.
In your v2.changes.xml, you then refer to the first change using the classpath resource name: v1/v1.changes.xml. This should allow Liquibase to find it correctly.
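In other words, a sketch of the rollback reference with only the changeSetPath changed from your original:
<rollback changeSetAuthor="dima" changeSetId="1" changeSetPath="v1/v1.changes.xml"/>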
If you have more than one level of changeLog file inclusion, you might be running into this issue which prevents Liquibase from finding sibling changeLogs below the first level of inclusion. Until the pull request is merged and released, you'll be limited to a single level of file inclusion.
This solution assumes you're using the Maven plugin; Liquibase will still find the changelog because Maven puts your resource files on the classpath by default. I also attach the plugin to the process-resources phase (or later) so that the source resources are in the target/classes directory when the migration is run.
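For reference, a sketch of the plugin configuration described above (the propertyFile name is a placeholder for wherever you keep your connection settings, and you should fill in the plugin version you actually use):
<plugin>
    <groupId>org.liquibase</groupId>
    <artifactId>liquibase-maven-plugin</artifactId>
    <version><!-- your Liquibase plugin version --></version>
    <configuration>
        <!-- classpath resource, not src/main/resources/std/master-changelog.xml -->
        <changeLogFile>std/master-changelog.xml</changeLogFile>
        <propertyFile>liquibase.properties</propertyFile>
    </configuration>
    <executions>
        <execution>
            <!-- run after resources have been copied to target/classes -->
            <phase>process-resources</phase>
            <goals>
                <goal>update</goal>
            </goals>
        </execution>
    </executions>
</plugin>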
Related
I'm writing a CustomSqlChange for the first time and want to test the outcome by running it on my current database. Of course I could start up the application and execute all change sets via liquibase (including the one that executes my CustomSqlChange), but that takes a lot of time.
Is there a way to manually execute the java class implementing CustomSqlChange from my IDE (IntelliJ) as if it would be from liquibase? Could one maybe even debug that execution?
You can create a separate changelog file that includes only your custom change, and point Liquibase at it instead of the base one. This will also give you the ability to debug it.
<?xml version="1.0" encoding="UTF-8"?>
<databaseChangeLog .....>
<changeSet id="custom-change" author="author" runOnChange="true" >
<customChange class="...">
    <param name="..." value="..."/>
</customChange>
</changeSet>
</databaseChangeLog>
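If you also want to step through the change in the IDE, one option (a minimal sketch, assuming a Liquibase 3.x-style API, an H2 driver on the test classpath, and that the changelog above is available on the classpath as custom-change-only.xml; all of these names are placeholders) is a small main method you run under the debugger:
import java.sql.Connection;
import java.sql.DriverManager;

import liquibase.Liquibase;
import liquibase.database.Database;
import liquibase.database.DatabaseFactory;
import liquibase.database.jvm.JdbcConnection;
import liquibase.resource.ClassLoaderResourceAccessor;

public class CustomChangeDebugRunner {
    public static void main(String[] args) throws Exception {
        // Any JDBC connection works; an in-memory H2 database keeps the run cheap.
        try (Connection connection = DriverManager.getConnection("jdbc:h2:mem:testdb", "sa", "")) {
            Database database = DatabaseFactory.getInstance()
                    .findCorrectDatabaseImplementation(new JdbcConnection(connection));
            // Point Liquibase at the changelog that contains only the custom change.
            Liquibase liquibase = new Liquibase(
                    "custom-change-only.xml", new ClassLoaderResourceAccessor(), database);
            // Set a breakpoint inside your CustomSqlChange before calling update.
            liquibase.update("");
        }
    }
}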
As per the documentation, parameter values are looked up in the following order:
Passed as a parameter to your Liquibase runner (see Ant, command_line, etc. documentation for how to pass them)
As a JVM system property
In the parameters block (the <property> tag) of the DatabaseChangeLog file itself.
Can I set these parameters in the standard properties file?
It is possible to put changelog parameters in liquibase.properties, or a custom --defaultsFile. Reading the source indicates that you must prefix the properties with "parameter.".
Your error probably looks something like this:
SEVERE 11/4/16 10:26 AM: liquibase: Unknown parameter: 'read_only_user'
liquibase.exception.CommandLineParsingException: Unknown parameter: 'read_only_user'
at liquibase.integration.commandline.Main.parsePropertiesFile(Main.java:453)
Example liquibase.properties:
driver=oracle.jdbc.driver.OracleDriver
url="jdbc:oracle:thin:#(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=dbhost)(PORT=1600))(CONNECT_DATA=(SERVER=DEDICATED)(SERVICE_NAME=devdb)))"
username=scott
password=tiger
defaultSchemaName=app_admin
promptOnNonLocalDatabase=false
parameter.read_only_user=app_read
Using the parameter in a changeset:
<changeset author="Ryan" id="1">
<sql>GRANT SELECT ON APP.SOME_TABLE TO ${read_only_user}</sql>
</changeset>
Yes, and it is very useful if you want to keep sensitive information under control, like passwords in different environments:
liquibase.properties:
parameter.pass_admin=idonthavepassword
and the changelog:
<changeSet author="rbs" id="create_user_admin" labels="inicial">
<preConditions onFail="CONTINUE">
<sqlCheck expectedResult="0">SELECT COUNT(1) FROM pg_roles WHERE rolname='admin'</sqlCheck>
</preConditions>
<sql>CREATE USER admin PASSWORD '${pass_admin}' LOGIN SUPERUSER NOCREATEDB NOCREATEROLE INHERIT NOREPLICATION CONNECTION LIMIT -1
<comment>Creating admin user</comment>
</sql>
</changeSet>
Yes, it is possible: http://www.liquibase.org/documentation/liquibase.properties.html.
Use --defaultsFile to pass the properties file.
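For example (a sketch; substitute the name of your properties file):
liquibase --defaultsFile=liquibase.properties update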
I would like to classify Liquibase changesets, for example:
"must run" (e.g. add column)
"can run" (e.g. change column size)
Is there a way to do something like that?
The reason to do this is that the execution of changesets should not stop if a changeset from the "can run" class results in a ValidationFailedException or something similar.
Thanks
You can mark your changesets with the failOnError attribute.
Example:
<changeSet id="changeset1" failOnError="true">
<!-- do important stuff here -->
</changeSet>
<changeSet id="changeset2" failOnError="false">
<!-- do not so important stuff here -->
</changeSet>
For our project we like to have most dependencies automatically up to date, so we want to use the latest strategies in Ivy. However, we don't want to run the bleeding edge of the dependencies, i.e. alpha and beta versions.
When using:
<dependency org="org.apache.httpcomponents" name="httpclient" rev="latest.revision" />
or
<dependency org="org.apache.httpcomponents" name="httpclient" rev="latest.release" />
We get revision 4.4-alpha1
This is understandable, as we use the ibiblio resolver, whose maven-metadata.xml contains the following XML:
<metadata>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<versioning>
<latest>4.4-alpha1</latest>
<release>4.4-alpha1</release>
<versions>
<version>4.0-alpha1</version>
<!-- snip -->
<version>4.3-alpha1</version>
<version>4.3-beta1</version>
<version>4.3-beta2</version>
<version>4.3</version>
<version>4.3.1</version>
<version>4.3.2</version>
<version>4.3.3</version>
<version>4.3.4</version>
<version>4.3.5</version>
<version>4.4-alpha1</version>
</versions>
<lastUpdated>20140801101402</lastUpdated>
</versioning>
</metadata>
The metadata indicates the alpha version as both release and latest (not sure if this is actually related).
In this case there is a version in the metadata list that we would like to get, namely 4.3.5.
Now Ivy has a construct with latest strategies and special meanings, but the documentation is quite sparse, and I can't figure out how to make this strategy 'ignore' the alpha release.
I tried variations of the following to no avail (using rev="latest.test"):
Edit:
From the source code of org.apache.ivy.plugins.latest.LatestRevisionStrategy it appears special meanings won't be able to solve this, since the version is first split into parts and then compared on a part-by-part basis.
If there is a way to forbid revisions that contain a specific string, my problem would also be solved.
The source code of org.apache.ivy.plugins.latest.LatestRevisionStrategy indicated it's impossible to fix this with special-meaning strings in a latestStrategy element (thanks to: this post).
We ended up using a version matcher to force Ivy not to use -beta or -alpha releases.
It's not an optimal solution and the regexp probably still needs to be updated a few times.
in ivysettings.xml:
<version-matchers usedefaults="true">
<pattern-vm name="lastest.nobeta">
<match revision="latest.nobeta" pattern="\.*\d+\.\d+\.?\d*(FINAL|RELEASE|STABLE)?" matcher="regexp" />
</pattern-vm>
</version-matchers>
and in ivy.xml:
<dependency org="org.apache.poi" name="poi" rev="latest.nobeta"/>
Not entirely sure if this takes the latest version, but so far that seems to be the case.
The MSBuild documentation hints in several places that Items aren't necessarily the same as files.
"MSBuild items are inputs into the build system, and they typically represent files."
"Items are objects that typically represent files."
However, I can't seem to find any examples where Items do not represent files. In particular, I would like to perform batching over a set of non-file items. But every item that I create, even from a custom build task, somehow acquires file-like metadata (FullPath, RootDir, Filename, Extension, etc.). Furthermore, I'm confused about the ramifications of setting the Inputs of a target to a set of items that aren't files, and what to use as that target's Outputs.
Does anybody have an example of using non-file Items to perform batching in MSBuild?
edit
Sorry for taking so long to come up with an example. I understand things a bit more, but I'm still uncertain (and there seems to be a complete lack of documentation about this). Everything here is going off my recollection; I'm not at my work computer right now, so I can't verify any of it.
In my experience, MSBuild doesn't like to build multiple configurations of a .sln file in one go. So, this:
msbuild.exe SampleMSBuild.sln /p:Configuration=Debug%3BRelease
(The encoded semicolon is necessary so that it doesn't try to define multiple properties.)
Produces this:
"D:\src\SampleMSBuild\SampleMSBuild.sln" (default target) (1) ->
(ValidateSolutionConfiguration target) ->
D:\src\SampleMSBuild\SampleMSBuild.sln.metaproj : error MSB4126: The
specified solution configuration "Debug;Release|Any CPU" is invalid.
Please specify a valid solution configuration using the Configuration
and Platform properties (e.g. MSBuild.exe Solution.sln
/p:Configuration=Debug /p:Platform="Any CPU") or leave those properties
blank to use the default solution configuration.
[D:\src\SampleMSBuild\SampleMSBuild.sln]
So, it seems like it should be possible to use batching and items to handle this.
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003"
DefaultTarget="Rebuild"
ToolsVersion="4.0">
<ItemGroup>
<Configurations Include="Debug" />-->
<Configurations Include="Release" />-->
</ItemGroup>
<UsingTask TaskName="LogMetadata"
TaskFactory="CodeTaskFactory"
AssemblyFile="$(MSBuildToolsPath)\Microsoft.Build.Tasks.v4.0.dll">
<ParameterGroup>
<Items ParameterType="Microsoft.Build.Framework.ITaskItem[]"
Required="true" />
</ParameterGroup>
<Task>
<Code Type="Fragment" Language="cs">
<![CDATA[
foreach (var item in Items) {
Console.Write(item.ItemSpec);
Console.Write(" {");
foreach (string metadataKey in item.MetadataNames) {
Console.Write(metadataKey);
Console.Write("=\"");
Console.Write(item.GetMetadata(metadataKey).
ToString().Replace("\"", "\\\""));
Console.Write("\" ");
}
Console.WriteLine("}");
}]]>
</Code>
</Task>
</UsingTask>
<Target Name="Rebuild">
<LogMetadata Items="%(Configurations.Identity)" />
</Target>
</Project>
Which produces this:
Debug {
FullPath="D:\src\SampleMSBuild\Debug"
RootDir="D:\"
Filename="Debug"
Extension=""
RelativeDir=""
Directory="src\SampleMSBuild\"
RecursiveDir=""
Identity="Debug"
ModifiedTime=""
CreatedTime=""
AccessedTime=""
}
Release {
FullPath="D:\src\SampleMSBuild\Release"
RootDir="D:\"
Filename="Release"
Extension=""
RelativeDir=""
Directory="src\SampleMSBuild\"
RecursiveDir=""
Identity="Release"
ModifiedTime=""
CreatedTime=""
AccessedTime=""
}
As you can see, the items have all kinds of file metadata attached to them. I can't drop the Include attribute, since it's required, but I could synthesize the items in a custom task. HOWEVER, when I do that, they still somehow magically gain all the same file metadata.
What are the ramifications of this? Since I haven't specified these as Inputs or Outputs to a target, will the file metadata cause any problems? Will the build system skip over targets, or build more than it needs to, because the files specified in the Items' FullPath metadata do not exist? What if those files did exist? Would it cause any problems?
I use items to build several solutions one after another and do some "manual" tasks with them, so I define my own items:
<ItemGroup>
<MergeConfigurations Include="project1\project1.SDK.sln">
<MergeOutAssemblyName>product1.dll</MergeOutAssemblyName>
<MergePrimaryAssemblyName>project1.Interfaces.dll</MergePrimaryAssemblyName>
<SolutionBinaries>project1\bin\$(FlavorToBuild)</SolutionBinaries>
</MergeConfigurations>
<MergeConfigurations Include="project1\project1.Plugin.sln">
<MergeOutAssemblyName>product1.dll</MergeOutAssemblyName>
<MergePrimaryAssemblyName>project1.Interfaces.dll</MergePrimaryAssemblyName>
<SolutionBinaries>project1\bin\plugin\$(FlavorToBuild)</SolutionBinaries>
</MergeConfigurations>
</ItemGroup>
Then I use a target to take the information and do what is necessary:
<Target Name="MergeSolution"
Inputs="%(MergeConfigurations.Identity)"
Outputs="%(MergeConfigurations.Identity)\Ignore_this">
<PropertyGroup>
<MergeSolution>%(MergeConfigurations.Identity)</MergeSolution>
<MergeOutAssemblyName>%(MergeConfigurations.MergeOutAssemblyName)</MergeOutAssemblyName>
<MergePrimaryAssemblyName>%(MergeConfigurations.MergePrimaryAssemblyName)</MergePrimaryAssemblyName>
<SolutionBinaries>%(MergeConfigurations.SolutionBinaries)</SolutionBinaries>
</PropertyGroup>
[....]
</Target>
Hope this helps to point you in the direction you need.
Tricky question. I'm an MSBuild noob, and found myself needing to understand batching recently, so figured I'd share some of what I found.
In general, yes, it seems that most everywhere you see samples and discussions about ITaskItems they tend to be about files. But the underlying implementation is very flexible and can deal with many other things as well. In my case, I've been working with strings and XML data.
This MS article gives some great examples of non-file ItemGroups and metadata.
This article was the best summary I could find that talks about the mechanics of Items and how they are different from Properties. It also covers the whole bit about the @ and % syntax, converting between strings and Items, and a hint as to where those file metadata properties are coming from: MSBuild is optimized for it.
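For instance, a minimal sketch of batching over purely string-based items (the Color item type and Hex metadata are invented for illustration):
<ItemGroup>
  <Color Include="Red">
    <Hex>FF0000</Hex>
  </Color>
  <Color Include="Green">
    <Hex>00FF00</Hex>
  </Color>
</ItemGroup>
<Target Name="ShowColors">
  <!-- @() expands the whole list; %() batches the task once per distinct metadata value -->
  <Message Text="All colors: @(Color)" />
  <Message Text="Color %(Color.Identity) has hex %(Color.Hex)" />
</Target>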
Whenever you have an interface used as a parameter to a task or whatever, there is going to be a default implementation of that interface somewhere. My guess is that your code sample is newing up a number of these default objects under the hood and those define the metadata that are created by default. If you were to implement this interface yourself, I'll wager you could change this behavior. Probably beyond the scope of the question though =)