This is a transitional solution.
I don't want to write my own plugin.
I have too many small, scattered makefiles to convert to pom.xml all at once.
I'm trying to move to Maven2 as a skunkworks project and gradually migrate new and changing code to pom.xml files.
The arch-independent C/C++ solutions are not needed - plus the Makefiles run multiple scripts to generate code before compilation.
Any pointers to extending "mvn deploy" to handle "make deploy" would be great.
Thanks
AJ
You should check Codehaus' related CBUILDS project. It may help you with downloading dependencies, unpacking, patching, and invoking make.
I'm a bit confused as to why you're moving towards Maven with C projects, but if I were you, I'd look at the gmaven-plugin to execute an inline Groovy script that invokes what you need. Then, if you want to create a real plugin later, it should be easy to migrate to one.
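If you go the gmaven route, the wiring could look roughly like the sketch below. This is untested and makes assumptions (the plugin version, and that a plain "make deploy" works from the module directory); it binds an inline Groovy script to the deploy phase so that mvn deploy also runs make deploy:

<plugin>
  <groupId>org.codehaus.gmaven</groupId>
  <artifactId>gmaven-plugin</artifactId>
  <version>1.3</version>
  <executions>
    <execution>
      <id>make-deploy</id>
      <!-- run as part of "mvn deploy" -->
      <phase>deploy</phase>
      <goals>
        <goal>execute</goal>
      </goals>
      <configuration>
        <source>
          // assumption: the Makefile sits next to this pom.xml
          def proc = new ProcessBuilder("make", "deploy")
              .directory(project.basedir)
              .start()
          proc.waitForProcessOutput(System.out, System.err)
          if (proc.exitValue() != 0) {
            throw new RuntimeException("make deploy failed")
          }
        </source>
      </configuration>
    </execution>
  </executions>
</plugin>

The same pattern should work for other lifecycle phases if you need other make targets along the way.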
I'm developing a Solr plugin, and using the Solr test framework I place a test SOLR_HOME dir under test/resources with /conf and /lib. The framework then instantiates a SolrCore and loads my plugin from /lib. Outputting the plugin's jar to /lib is not an issue; the issue is that the plugin is not yet available there, since it still needs to pass the tests first (chicken and egg).
How do you recommend solving this? I see these options:
1. Create another project for the tests, with a dependency on the plugin, and run the tests in it. Simple enough, but how do I ensure that every time the plugin is built, the tests of this other project are run as well? The whole point of automated tests at every build is to catch a new plugin jar that breaks them.
2. Do what I do in the dp4j pom.xml: build the project in two passes, where in the first I <include> only the annotation processors, and in the second I compile the tests that rely on the annotation processors compiled in the earlier pass (a sketch of this follows below).
I'm in favor of 2, since copy-pasting the configuration doesn't seem like a bad option and it makes the setup look less complicated than it probably is. I don't remember whether I have asked about this here before - what do you recommend? Any other case studies or working code to look at?
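For reference, the two-pass setup in the dp4j pom.xml looks roughly like this (a simplified sketch; the processor package name here is made up). The trick is to override default-compile so it builds only the processors with annotation processing switched off, then add a second compile execution for everything else:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <executions>
    <!-- pass 1: override default-compile to build only the processor
         sources, with annotation processing switched off -->
    <execution>
      <id>default-compile</id>
      <configuration>
        <includes>
          <include>com/example/processors/**</include> <!-- hypothetical package -->
        </includes>
        <compilerArgument>-proc:none</compilerArgument>
      </configuration>
    </execution>
    <!-- pass 2: compile everything else; the processors built in pass 1
         are already in target/classes, so javac can discover them -->
    <execution>
      <id>compile-everything-else</id>
      <phase>compile</phase>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>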
There's a third, and most probably best, solution: do nothing!
I was under the impression that the Solr test framework needs to load my plugin from /lib, but apparently it doesn't; it can load it from test-classes, all on its own!
I have a lot of jobs on Hudson, most of which are really small and consist of just a few modules. But one is big and consists of several modules.
Whenever I commit to our Subversion repository for any of those modules in that big job, Hudson builds the entire job instead of just the module that has changed.
It doesn't matter whether I use SCM polling or a Subversion hook; the result is the same.
It seems to me it would be better if the modules were built instead of the jobs, since modules in other jobs have dependencies on the modules, not on the jobs.
Can this be configured, or do I have to create several jobs instead of the big one? And if so, can I configure the big job to never build when any of its modules is triggered, but still build when its own pom.xml changes?
Thanks.
Hudson has an "Incremental Build" option in the Maven area of the job configuration.
It's hidden in the "Advanced" area.
You could make use of the reactor plugin. For example:
mvn reactor:make-scm-changes
This will only build those modules that have been changed in the SCM. Follow the link for other examples.
Doesn't your compiler offer an incremental compile option? The Java 1.6 compiler usually searches for class and source files and uses timestamps to decide whether to use the source or the class file. Just leave out the clean goal when building your code.
Another option would be to first run a batch/shell script that determines which files have changed and deletes the corresponding class files, so that the compiler incrementally rebuilds the missing class files.
I'm using PMEase QuickBuild to perform automated builds of our Maven2 projects and a nightly sanity test to ensure nothing is broken.
The test needs to untar packages which are created by the automated Maven2 projects. The problem is that the package names change frequently due to project versions being incremented all the time.
Does anyone know how I can configure QuickBuild to pick up the version (ideally from the POM file of the individual components), if this is possible at all?
I don't know if this is an option for you, but it looks like you can do it the other way around. Quoting Build with Maven:
Control build version
If you want to control the build version from the QuickBuild side, please follow the steps below:
1. Change the POM file and define the project version as ${buildVersion}. Do not forget to commit the file into your SCM after the change.
2. Define a build property like below when defining the Maven build step:
buildVersion=${build.version}
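For step 1, the POM would then contain something like this sketch (coordinates made up; note that Maven may warn that <version> should be a constant, which is expected with this approach):

<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <!-- hypothetical coordinates -->
  <groupId>com.example</groupId>
  <artifactId>my-component</artifactId>
  <!-- resolved from the buildVersion property QuickBuild passes in (step 2) -->
  <version>${buildVersion}</version>
</project>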
There may be other options, but I must admit that my knowledge (zero) of QuickBuild is very limited.
I created a workaround for this issue by having QuickBuild execute a shell script which does the untarring using wildcards (to avoid computing the exact version), similar to the following:
tar xzf filename-*.tar.gz
I couldn't figure out how to do this in QuickBuild, so I offloaded the work to the shell script.
Summary: I'm looking for a way to instruct Maven to search for dependencies in target/classes instead of the jars in the local repository.
Say I have 2 modules, A and B, where A depends on B. Both are listed in an aggregator module S. Normally I need to run 'mvn install' in S. I'm looking for a way to run 'mvn compile' so that when A is compiled, its classpath will contain ../B/target/classes instead of ~/.m2/repository/com/company/b/1.0/b-1.0.jar.
(My reason is so that I can have continuous compilation without the need to go through packaging and installation - more exactly, so that I can use 'mvn scala:cc' on multiple modules.)
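For concreteness, the aggregator S being described would look something like this (names assumed):

<!-- S/pom.xml -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.company</groupId>
  <artifactId>s</artifactId>
  <version>1.0</version>
  <packaging>pom</packaging>
  <modules>
    <module>A</module> <!-- A declares a dependency on com.company:b:1.0 -->
    <module>B</module>
  </modules>
</project>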
I don't think this is possible without horrible hacking; it's just not how Maven works. Maven uses binary dependencies and needs a local repository to resolve them. So the Maven way to handle this is to launch a reactor build on all modules. Just in case, have a look at Maven Tips and Tricks: Advanced Reactor Options.
But during development, can't you just import all your projects into your IDE and use "project references" (i.e. configure your projects to depend on source code instead of a JAR), like most Java developers do? This is the common approach to avoid having to install an artifact just to "see" the modifications.
If this is not possible and you really don't want to install artifacts into your local repository, then you'll have to merge your code into a single module.
I know this is annoying. What helped me here is definitely IDE support. Eclipse and IntelliJ are clever enough to collect all dependencies once a Maven project import is done; even cross-module dependencies are compiled live.
I'm in a bit of a pickle. I'm trying to run some environment-setup scripts before the build in an M2 project, but it seems that no matter how hard I try, the 'pre-build' scripts are never run early enough.
Before the 'pre-build' scripts are run, the project checks to see if the correct files are in the workspace - files that won't be there until the scripts I've written are executed.
To make them 'pre-build', I'm using the M2 Extra Steps plugin - but it's not 'pre' enough.
Has anyone got any suggestions as to how I can carry out what I want to do?
Cheers.
Have you considered breaking it up into two projects, and setting the pre-build project to be upstream of the build project?
e.g.,
Foo Pre-build
Foo Build
After "Foo Pre-build" runs, have it trigger "Foo Build".
I have used this, admittedly in different scenarios than yours, quite successfully. This has the added benefit (if you need it) of allowing you to manually run a build without going through the pre-build steps, if you know they aren't necessary.
You should use the free-form project type and not the Maven project type.
If this is a problem (i.e. there are projects that expect to be triggered by this build, or that trigger it), consider using a custom workspace location and having a free-form project execute in that workspace before the Maven project runs. The free-form project can be used as the trigger for the Maven project.
Does adding another build step as a shell script work?
My problem stemmed from the fact that I wanted to set up my workspace before running anything, due to an issue with Dynamic Views (ClearCase) not being accessible from the workspace - I wanted to add a symlink to fix this.
However, Andrew Bayer has made a change to the plugin, which I'm currently testing, that should fix this... so the question is probably invalid in its current form.
Will edit it once we come to a conclusion.